Sample records for applied approaches modelisation

  1. Modelisation of the SECMin molten salts environment

    NASA Astrophysics Data System (ADS)

    Lucas, M.; Slim, C.; Delpech, S.; di Caprio, D.; Stafiej, J.

    2014-06-01

    We develop a cellular automata modelisation of SECM experiments to study corrosion in molten salt media for Generation IV nuclear reactors. The electrodes used in these experiments are cylindrical glass tips with a coaxial metal wire inside. From the simulations we obtain current approach curves for electrodes whose geometries are characterized by several values of the ratio of glass to metal area at the tip. We compare these results with the predictions of known analytic expressions, solutions of partial differential equations for a flat, uniform substrate geometry. We also present results for other, more complicated substrate surface geometries, e.g. a regular saw-tooth modulated surface, a surface obtained by an Eden model process, ...
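
    The analytic expressions mentioned above, for a flat uniform substrate, are commonly quoted in the SECM literature as rational/exponential fits of the normalized tip current against the normalized tip-substrate distance L = d/a. A minimal sketch in Python, assuming the widely cited RG ≈ 10 approximations (the coefficients are taken from the SECM literature and should be treated as illustrative):

```python
import math

def i_conductive(L):
    """Normalized tip current over a conducting substrate (positive feedback)."""
    return 0.68 + 0.78377 / L + 0.3315 * math.exp(-1.0672 / L)

def i_insulating(L):
    """Normalized tip current over an insulating substrate (negative feedback)."""
    return 1.0 / (0.15 + 1.5385 / L + 0.58 * math.exp(-1.14 / L)
                  + 0.0908 * math.exp((L - 6.3) / (1.017 * L)))

# Far from the substrate both currents tend to the bulk value (~1);
# close to it they diverge in opposite directions.
for L in (0.5, 1.0, 2.0, 10.0):
    print(f"L={L:5.1f}  cond={i_conductive(L):.3f}  ins={i_insulating(L):.3f}")
```

    Comparing such closed-form curves with simulated ones is exactly the kind of check the record describes for more complex substrate geometries.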

  2. Computational approach to estimating the effects of blood properties on changes in intra-stent flow.

    PubMed

    Benard, Nicolas; Perrault, Robert; Coisne, Damien

    2006-08-01

    In this study, various blood rheological assumptions are numerically investigated for the hemodynamic properties of intra-stent flow. Non-Newtonian blood properties have never been implemented in investigations of stented coronary flow, although their effects appear essential for a correct estimation of the distribution of wall shear stress (WSS) exerted by the fluid on the internal vessel surface. Our numerical model is based on a full 3D stent mesh; rigid walls and stationary inflow conditions are applied. Three rheological descriptions are compared: Newtonian behavior, a non-Newtonian model based on the Carreau-Yasuda relation, and a characteristic Newtonian viscosity defined from representative flow parameters. Non-Newtonian flow alters near-wall viscosity compared with the Newtonian case. Maximal WSS values are located in the central part of the stent pattern and minimal values are concentrated on the proximal stent wire surface. A flow rate increase emphasizes fluid perturbations and raises WSS except in the interstrut area. Nevertheless, a local quantitative analysis discloses an underestimation of WSS when a Newtonian blood model is used, with the clinical consequence of overestimating the restenosis risk area. Introducing a characteristic viscosity appears to be a useful option compared with rheological modelisation based on experimental data, saving computer time while giving relevant results for quantitative and qualitative WSS determination.
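
    The Carreau-Yasuda relation referred to above gives the apparent viscosity as η(γ̇) = η∞ + (η0 − η∞)[1 + (λγ̇)^a]^((n−1)/a). A minimal sketch, using parameter values often quoted for blood (e.g. in the Cho & Kensey fit); the numbers are illustrative, not those of this study:

```python
def carreau_yasuda(shear_rate, eta0=0.056, eta_inf=0.00345,
                   lam=3.313, n=0.3568, a=2.0):
    """Apparent viscosity (Pa.s) as a function of shear rate (1/s)."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * shear_rate) ** a) ** ((n - 1.0) / a)

# Blood is shear-thinning: viscosity falls from eta0 at rest toward
# eta_inf at the high shear rates found near stent struts.
for g in (0.01, 1.0, 100.0, 10000.0):
    print(f"shear={g:8.2f} 1/s  eta={carreau_yasuda(g):.5f} Pa.s")
```

    The gap between the low-shear and high-shear plateaus is what makes the Newtonian assumption underestimate WSS in low-shear interstrut regions.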

  3. Bellman Continuum (3rd) International Workshop (13-14 June 1988)

    DTIC Science & Technology

    1988-06-01

    Modelling Uncertain Problem ... 53 David Bensoussan; Asymptotic Linearization of Uncertain Multivariable Systems by Sliding Modes ... K. Ghosh; Robust Model Tracking for a Class of Singularly Perturbed Nonlinear Systems via Composite Control ... 93 F. Garofalo and L. Glielmo; MODELS AND CONTROL POLICIES IN ECONOMICS (Modelisation et commande en economie); Qualitative Differential Games: A Viability Approach ... 117

  4. Etude de pratiques d'enseignement relatives a la modelisation en sciences et technologies avec des enseignants du secondaire

    NASA Astrophysics Data System (ADS)

    Aurousseau, Emmanuelle

    Models are tools widely used in science and technology (S&T) to represent and explain a phenomenon that is difficult to access, or even abstract. The modelling process is presented explicitly in the Quebec school curriculum (PFEQ), notably in the second cycle of secondary school (Quebec. Ministere de l'Education, du Loisir et du Sport, 2007a). It is thus one of the seven processes that students and teachers are expected to use. However, much research highlights the difficulty teachers have in structuring their teaching practices around models and the modelling process, even though these are recognized as indispensable. Indeed, models help reconcile the concrete and abstract domains between which the scientist, even a budding one, moves back and forth, connecting the experimental reference field that is manipulated and observed with the related theoretical field that is constructed. The objective of this research is therefore to understand how models and the modelling process help articulate the concrete and the abstract in the teaching of science and technology (S&T) in the second cycle of secondary school. To answer this question, we worked with teachers in a collaborative perspective through focus groups and classroom observation. These arrangements made it possible to examine the teaching practices that four teachers implement when using models and modelling processes. The analysis of these teaching practices, and of the adjustments the teachers envisage in their practice, allows us to draw out knowledge both for research and for teachers' practice regarding the use of models and of the modelling process in secondary S&T.

  5. Human Behaviour Representation in Constructive Modelling (Representation du comportement humain dans des modelisations creatives)

    DTIC Science & Technology

    2009-09-01

    ... involved in R&T activities. RTO reports both to the Military Committee of NATO and to the Conference of National Armament Directors. It comprises a ... Although progress in modelling human factors has been slow over the past decade, other forums have been reporting a number of theoretical and applied papers on human behaviour

  6. Reduction of Military Vehicle Acquisition Time and Cost through Advanced Modelling and Virtual Simulation (La reduction des couts et des delais d’acquisition des vehicules militaires par la modelisation avancee et la simulation de produit virtuel)

    DTIC Science & Technology

    2003-03-01

    nations, a very thorough examination of current practices. Introduction The Applied Vehicle Technology Panel (AVT) of the Research and Technology...the introduction of new information generated by computer codes required it to be timely and presented in appropriate fashion so that it could...military competition between the NATO allies and the Soviet Union. The second was the introduction of commercial, high capacity transonic aircraft and

  7. Conceptual Modeling (CM) for Military Modeling and Simulation (M&S) (Modelisation conceptuelle (MC) pour la modelisation et la simulation (M&S) militaires)

    DTIC Science & Technology

    2012-07-01

    ... of the modelling and simulation community, and to provide it with implementation guidelines; and to provide ... definition; relationship to standards; specification of a CM management process; specification of CM artefacts. Important considerations ... using the present guideline as a reference. • The VV&A (verification, validation and acceptance) of CMs must be an integral part of the

  8. Le recours aux modeles dans l'enseignement de la biologie au secondaire : Conceptions d'enseignantes et d'enseignants et modes d'utilisation

    NASA Astrophysics Data System (ADS)

    Varlet, Madeleine

    The use of models and modelling is mentioned in the scientific literature as a way to foster constructivist teaching-learning practices and thereby mitigate learning difficulties in science. A prior study of teachers' relationship to models and modelling is therefore relevant to understanding their teaching practices and to identifying elements which, if taken into account in initial and disciplinary training, can contribute to the development of constructivist science teaching. Several studies have examined these conceptions without distinguishing between the subjects taught, such as physics, chemistry or biology, even though models are not necessarily used or understood in the same way in these different disciplines. Our research looked at the conceptions of secondary-school biology teachers regarding scientific models, some forms of representation of these models, and the ways they are used in class. The results, obtained through a series of semi-structured interviews, indicate that overall their conceptions of models are compatible with the scientifically accepted one, but vary as to the forms of representation of models. Examination of these conceptions shows a limited knowledge of models, variable according to the subject taught. Level of education, prior training, teaching experience and a possible compartmentalization of subjects could explain the different conceptions identified. In addition, temporal, conceptual and technical difficulties can hold back their attempts at modelling with students.
    Nevertheless, our results support the hypothesis that teachers' own conceptions of models, of their forms of representation, and of a constructivist approach to teaching represent the greatest obstacles to the construction of models in class. Keywords: models and modelling, biology, conceptions, modes of use, constructivism, teaching, secondary school.

  9. Modelisation de l'historique d'operation de groupes turbine-alternateur

    NASA Astrophysics Data System (ADS)

    Szczota, Mickael

    Because of their ageing fleet, utility managers increasingly need tools that can help them plan maintenance operations efficiently. Hydro-Quebec started a project that aims to foresee the degradation of their hydroelectric runners and to use that information to classify the generating units. That classification will help identify which generating units are most at risk of a major failure. Cracking linked to fatigue is a predominant degradation mode, and the loading sequence applied to the runner is a parameter affecting crack growth. The aim of this thesis is therefore to create a generator able to produce synthetic loading sequences that are statistically equivalent to the observed history. These simulated sequences will be used as input to a life assessment model. First, we describe how the generating units are operated by Hydro-Quebec and analyse the available data; the analysis shows that the data are non-stationary. We then review modelisation and validation methods. In the following chapter, particular attention is given to a precise description of the validation and comparison procedure. We then present a comparison of three kinds of models: Discrete-Time Markov Chains, Discrete-Time Semi-Markov Chains and the Moving Block Bootstrap. For the first two models, we describe how to account for the non-stationarity. Finally, we show that the Markov Chain is not suited to our case, and that Semi-Markov Chains perform better when they include the non-stationarity. The final choice between Semi-Markov Chains and the Moving Block Bootstrap depends on the user, but with a long-term vision we recommend Semi-Markov Chains for their flexibility. Keywords: stochastic models, model validation, reliability, Semi-Markov Chains, Markov Chains, Bootstrap
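
    As a rough illustration of the first model family compared above, a discrete-time Markov chain can be fitted to a discretized operating history and then sampled to produce synthetic loading sequences. This is a sketch only: the state names and data are invented, and it ignores the non-stationarity the thesis insists on:

```python
import random
from collections import Counter, defaultdict

def fit_transitions(history):
    """Estimate the transition probabilities of a discrete-time Markov chain."""
    counts = defaultdict(Counter)
    for a, b in zip(history, history[1:]):
        counts[a][b] += 1
    return {s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
            for s, nxt in counts.items()}

def simulate(P, start, length, rng):
    """Generate a synthetic sequence statistically similar to the history."""
    seq = [start]
    for _ in range(length - 1):
        probs = P[seq[-1]]
        seq.append(rng.choices(list(probs), weights=list(probs.values()))[0])
    return seq

# Hypothetical discretized operating states of a generating unit.
history = ["stop", "start", "full", "full", "partial", "full", "stop",
           "start", "partial", "partial", "full", "stop", "start", "full"]
P = fit_transitions(history)
synthetic = simulate(P, "stop", 50, random.Random(42))
```

    A semi-Markov variant would additionally draw an explicit sojourn time in each state instead of implicitly using geometric holding times, which is one reason it fits operating histories better.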

  10. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of the uncertainties of meteorological and hydrological forecasts and improve the human expertise applied to hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large amount of human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modelisation of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class, streamflow variation and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill and sharpness of ensemble forecasts.
    The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological dynamics and processes, i.e. sample heterogeneity: the same streamflow range corresponds to different processes, such as rising limbs or recessions, where uncertainties differ. The dynamical approach improves the reliability, skill and sharpness of the forecasts and globally reduces confidence interval width. Compared in detail, the dynamical approach allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively low, and a slight increase of confidence intervals during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience, which considered the empirical approach not discriminative enough, improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A. (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A. (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
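
    The empirical approach described above can be sketched as follows: past multiplicative errors of "perfect" forecasts are pooled by streamflow quantile class, and a new deterministic forecast is then dressed with the errors of its class. This is a toy illustration with synthetic numbers, not EDF's operational scheme (and it omits the lead-time and streamflow-variation dimensions):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic archive of perfect-model forecasts and matching observations.
past_fcst = rng.gamma(shape=2.0, scale=50.0, size=2000)
past_obs = past_fcst * rng.lognormal(mean=0.0, sigma=0.2, size=2000)

def build_classes(fcst, obs, n_classes=3):
    """Group multiplicative errors obs/fcst by forecast quantile class."""
    edges = np.quantile(fcst, np.linspace(0.0, 1.0, n_classes + 1))
    k = np.clip(np.searchsorted(edges, fcst, side="right") - 1, 0, n_classes - 1)
    errors = obs / fcst
    return edges, [errors[k == c] for c in range(n_classes)]

def dress(fcst, edges, class_errors):
    """Turn one deterministic forecast into an empirical ensemble."""
    c = int(np.clip(np.searchsorted(edges, fcst, side="right") - 1,
                    0, len(class_errors) - 1))
    return fcst * class_errors[c]

edges, class_errors = build_classes(past_fcst, past_obs)
ensemble = dress(80.0, edges, class_errors)
```

    The dynamical approach would further split each class by the sign and magnitude of the recent streamflow variation, so that recessions and rising limbs draw from different error populations.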

  11. Time Sensitive Course of Action Development and Evaluation

    DTIC Science & Technology

    2010-10-01

    Applications militaires de la modelisation humaine). RTO-MP-HFM-202 14. ABSTRACT The development of courses of action that integrate military with ... routes between the capital town C of the province and a neighboring country M. Both roads are historically significant smuggling routes. There were

  12. Biological Rhythms Modelisation of Vigilance and Sleep in Microgravity State with COSINOR and Volterra's Kernels Methods

    NASA Astrophysics Data System (ADS)

    Gaudeua de Gerlicz, C.; Golding, J. G.; Bobola, Ph.; Moutarde, C.; Naji, S.

    2008-06-01

    Spaceflight under microgravity causes biological and physiological imbalance in human beings. Many studies have already been published on this topic, especially on sleep disturbances and on circadian rhythms (vigilance-sleep alternation, body temperature, ...). Factors like space motion sickness, noise, or excitement can cause severe sleep disturbances. For stays of longer than four months in space, gradual increases in the planned duration of sleep were reported. [1] The average sleep in orbit was more than 1.5 hours shorter than during control periods on Earth, where sleep averaged 7.9 hours. [2] Alertness and calmness yielded a clear circadian pattern of 24 h, but with a phase delay of 4 h. Calmness showed a biphasic component (12 h); mean sleep duration was 6.4 h, structured by 3-5 non-REM/REM cycles. Modelisations of the neurophysiologic mechanisms of stress, and of the interactions between various physiological and psychological rhythm variables, have already been carried out with the COSINOR method. [3
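
    The COSINOR method mentioned above fits a single cosine of fixed period (here 24 h) by ordinary least squares, recovering the MESOR (rhythm-adjusted mean), the amplitude and the acrophase. A self-contained sketch on synthetic data (not the flight measurements):

```python
import numpy as np

def cosinor_fit(t, y, period=24.0):
    """Least-squares fit of y = M + A*cos(w*t + phi), w = 2*pi/period."""
    w = 2.0 * np.pi / period
    # Linearize: A*cos(w*t + phi) = beta*cos(w*t) + gamma*sin(w*t)
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    mesor, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]
    return mesor, np.hypot(beta, gamma), np.arctan2(-gamma, beta)

t = np.arange(0.0, 72.0, 0.5)                        # three days, 30-min sampling
y = 6.4 + 1.5 * np.cos(2.0 * np.pi * t / 24.0 - 0.8) # synthetic rhythm
mesor, amplitude, acrophase = cosinor_fit(t, y)
```

    With noisy data the same fit gives confidence regions for the three parameters, which is how circadian phase delays such as the 4 h shift above are quantified.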

  13. Human Modelling for Military Application (Applications militaires de la modelisation humaine)

    DTIC Science & Technology

    2010-10-01

    techniques (rooted in the mathematics-centered analytic methods arising from World War I analyses by Lanchester 2). Recent requirements for research and ... "Dry Shooting for Airplane Gunners - Popular Science Monthly". January 1919. p. 13-14. 2 Lanchester F.W., Mathematics in Warfare in The World of

  14. Comparison of different 3D wavefront sensing and reconstruction techniques for MCAO

    NASA Astrophysics Data System (ADS)

    Bello, Dolores; Vérinaud, Christophe; Conan, Jean-Marc; Fusco, Thierry; Carbillet, Marcel; Esposito, Simone

    2003-02-01

    The vertical distribution of the turbulence limits the field of view of classical adaptive optics because of anisoplanatism. Multiconjugate adaptive optics (MCAO) uses several deformable mirrors conjugated to different layers in the atmosphere to overcome this effect. In the last few years, many studies and developments have addressed the analysis of the turbulence volume and the choice of wavefront reconstruction techniques. An extensive study of MCAO modelisation and performance estimation has been carried out at OAA and ONERA. The Monte Carlo codes developed allow us to simulate and investigate many aspects: comparison of turbulence analysis strategies (tomography or layer oriented) and comparison of different reconstruction approaches. For instance, in the layer-oriented approach, the control for a given deformable mirror can be deduced either from the whole set of wavefront sensor measurements or only from the associated wavefront sensor. Numerical simulations are presented showing the advantages and disadvantages of these different options for several cases depending on the number, geometry and magnitude of the guide stars.
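
    In both reconstruction approaches compared above, the core operation is inverting a calibrated interaction matrix that maps mirror commands to wavefront-sensor measurements. A minimal least-squares sketch, where a random matrix stands in for a real sensor/mirror calibration (so this illustrates the algebra, not MCAO tomography itself):

```python
import numpy as np

rng = np.random.default_rng(3)

n_meas, n_act = 48, 12
D = rng.normal(size=(n_meas, n_act))   # interaction matrix (from calibration)
R = np.linalg.pinv(D)                  # least-squares reconstructor

a_true = rng.normal(size=n_act)        # unknown mirror commands
s = D @ a_true                         # noiseless sensor measurements
a_hat = R @ s                          # reconstructed commands
```

    With noisy measurements and several mirrors, the pseudo-inverse is typically replaced by a regularized or modal inversion, and the choice of which sensors feed which mirror is exactly the layer-oriented versus global trade-off discussed in the record.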

  15. Team Modelling: Survey of Experimental Platforms (Modelisation d’equipes : Examen de plate-formes experimentales)

    DTIC Science & Technology

    2006-09-01

    Control Force Agility Shared Situational Awareness Attentional Demand Interoperability Network Based Operations Effect Based Operations Speed of...Command Self Synchronization Reach Back Reach Forward Information Superiority Increased Mission Effectiveness Humansystems® Team Modelling...communication effectiveness and Distributed Mission Training (DMT) effectiveness . The NASA Ames Centre - Distributed Research Facilities platform could

  16. Environmental Modeling Packages for the MSTDCL TDP: Review and Recommendations (Trousses de Modelisation Environnementale Pour le PDT DCLTCM: Revue et Recommendations)

    DTIC Science & Technology

    2009-09-01

    frequency shallow water scenarios, and DRDC has ready access to a well-established PE model (PECan). In those spectral areas below 1 kHz, where the PE ... PCs Personal Computers; PE Parabolic Equation; PECan PE Model developed by DRDC; SPADES/ICE Sensor Performance and Acoustic Detection Evaluation

  17. A Designer’s Guide to Human Performance Modelling (La Modelisation des Performances Humaines: Manuel du Concepteur).

    DTIC Science & Technology

    1998-12-01

    failure detection, monitoring, and decision making.) moderator function. Originally, the output from these ... One of the best known OCM implementations, the ... imposed by the tasks themselves, the information and equipment provided, the task environment, operator skills and experience, operator strategies, the ... problem-solving situation, including the knowledge necessary to generate the right problem-solving strategies, the attention that

  18. Future Modelling and Simulation Challenges (Defis futurs pour la modelisation et la simulation)

    DTIC Science & Technology

    2002-11-01

    Language School. Figure 2: Location of the simulation center within the MEC. Military operations research section - simulation lab ... language. This logic can be probabilistic (branching is randomised, which is useful for modelling error), tactical (a branch goes to the task with the ... language and a collection of simulation tools that can be used to create human and team behaviour models to meet users' needs. Hence, different ways of

  19. Hidden Markov random field model and Broyden-Fletcher-Goldfarb-Shanno algorithm for brain image segmentation

    NASA Astrophysics Data System (ADS)

    Guerrout, EL-Hachemi; Ait-Aoudia, Samy; Michelucci, Dominique; Mahiou, Ramdane

    2018-05-01

    Many routine medical examinations produce images of patients suffering from various pathologies. Given the huge number of medical images, manual analysis and interpretation have become a tedious task, so automatic image segmentation has become essential for diagnosis assistance. Segmentation consists in dividing the image into homogeneous and significant regions. We focus on hidden Markov random fields (HMRF) to model the segmentation problem. This modelisation leads to a classical function minimisation problem. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is one of the most powerful methods for solving unconstrained optimisation problems. In this paper, we investigate the combination of HMRF and the BFGS algorithm to perform the segmentation operation. The proposed method shows very good segmentation results compared with well-known approaches. The tests are conducted on brain magnetic resonance image databases (BrainWeb and IBSR) widely used for objective comparison of results. The well-known Dice coefficient (DC) was used as the similarity metric. The experimental results show that, in many cases, our proposed method approaches perfect segmentation, with a Dice coefficient above 0.9. Moreover, it generally outperforms other methods in the tests conducted.
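
    The Dice coefficient used as the similarity metric above is, for binary masks, twice the overlap divided by the total size: DC = 2|A∩B| / (|A| + |B|). A minimal sketch:

```python
import numpy as np

def dice(seg, gt):
    """Dice coefficient between two binary masks (1.0 = perfect overlap)."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    inter = np.logical_and(seg, gt).sum()
    return 2.0 * inter / (seg.sum() + gt.sum())

a = np.array([[1, 1], [0, 0]])
b = np.array([[1, 0], [1, 0]])
# One overlapping pixel, two pixels per mask -> DC = 2*1/(2+2) = 0.5
```

    For multi-class brain segmentations (grey matter, white matter, CSF), the coefficient is computed per tissue class against the reference labelling.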

  20. Etude numerique et experimentale de la reponse vibro-acoustique des structures raidies a des excitations aeriennes et solidiennes

    NASA Astrophysics Data System (ADS)

    Mejdi, Abderrazak

    Aircraft fuselages are generally made of aluminium, or of composites reinforced by longitudinal (stringers) and transverse (frames) stiffeners. The stiffeners may be metallic or composite. During the different phases of flight, aircraft structures are subjected to aerodynamic excitations (turbulent boundary layer: TBL; diffuse acoustic field: DAF) on the outer skin, whose acoustic energy is transmitted into the cabin. The engines, mounted on the structure, produce a significant structure-borne excitation. The objective of this project is to develop and implement strategies for modelling aircraft fuselages subjected to airborne and structure-borne excitations. First, the second chapter updates and classifies the existing TBL models. The vibro-acoustic response properties of finite and infinite flat structures are analysed. In the third chapter, the assumptions underlying the existing models of orthogonally stiffened metallic structures under mechanical, DAF and TBL excitations are first re-examined; a fine and reliable model of these structures is then developed. The model is validated numerically using the finite element (FEM) and boundary element (BEM) methods. Experimental validation tests are carried out on aircraft panels supplied by aeronautical companies. In the fourth chapter, an extension to composite structures reinforced by stiffeners that are also composite and of complex shape is established; a simple analytical model is also implemented and validated numerically. In the fifth chapter, the modelling of periodic stiffened composite structures is further refined by taking into account the coupling between in-plane and transverse displacements.
    The size effect of finite periodic structures is also taken into account. The models developed have made it possible to conduct several parametric studies on the vibro-acoustic properties of aircraft structures, thus easing the designers' task. Within the framework of this thesis, one article was published in the Journal of Sound and Vibration and three others were submitted, respectively, to the Journal of the Acoustical Society of America, the International Journal of Solid Mechanics and the Journal of Sound and Vibration. Keywords: stiffened structures, composites, vibro-acoustics, transmission loss.

  1. Modelisation frequentielle de la permittivite du beton pour le controle non destructif par georadar

    NASA Astrophysics Data System (ADS)

    Bourdi, Taoufik

    Ground penetrating radar (GPR) is an interesting non-destructive testing (NDT) technique for measuring the thickness of concrete slabs and characterizing fractures, owing to its resolution and penetration depth. GPR equipment is becoming easier to use and interpretation software is becoming more readily accessible. However, several conferences and workshops on the application of GPR in civil engineering have concluded that research should continue, in particular on the modelling and measurement techniques for the electrical properties of concrete. With better information on the electrical properties of concrete at GPR frequencies, instrumentation and interpretation techniques could be improved more effectively. The Jonscher model has proven effective in geophysics; its use in civil engineering is presented here for the first time. First, we validated the application of the Jonscher model to the characterization of the dielectric permittivity of concrete. The results clearly showed that this model can faithfully reproduce the variation of the permittivity of different types of concrete over the GPR frequency band (100 MHz-2 GHz). Second, we showed the value of the Jonscher model by comparing it with other models (Debye and extended Debye) already used in civil engineering. We also showed how the Jonscher model can help predict shielding effectiveness and interpret the waves of the GPR technique. It was determined that the Jonscher model gives a good representation of the variation of the permittivity of concrete in the GPR frequency range considered.
    Moreover, this modelling is valid for different types of concrete and at different water contents. In a final part, we presented the use of the Jonscher model for estimating the thickness of a concrete slab by the GPR technique in the frequency domain. Keywords: NDT, concrete, GPR, permittivity, Jonscher
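
    For reference, one common writing of Jonscher's universal dielectric response in the GPR literature parameterizes the effective relative permittivity with three parameters (χr, n, ε∞) at a reference frequency: ε(ω) = ε∞ + χr (ω/ωr)^(n−1) [1 − i·cot(nπ/2)]. A sketch with illustrative (not fitted) parameter values; treat both the exact form and the numbers as assumptions:

```python
import numpy as np

def jonscher_eps(f, chi_r, n, eps_inf, f_ref=100e6):
    """Complex effective relative permittivity, Jonscher universal response."""
    return eps_inf + chi_r * (f / f_ref) ** (n - 1.0) * (1.0 - 1j / np.tan(n * np.pi / 2.0))

f = np.logspace(8, np.log10(2e9), 50)          # 100 MHz to 2 GHz
eps = jonscher_eps(f, chi_r=2.0, n=0.8, eps_inf=4.5)
```

    The real part decreases slowly with frequency (dispersion) while the imaginary part encodes the losses, which is what makes the model suitable for fitting concrete over the whole GPR band with only three parameters.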

  2. 3D Modelling of Urban Terrain (Modelisation 3D de milieu urbain)

    DTIC Science & Technology

    2011-09-01

    Panel • IST Information Systems Technology Panel • NMSG NATO Modelling and Simulation Group • SAS System Analysis and Studies Panel • SCI... Systems Concepts and Integration Panel • SET Sensors and Electronics Technology Panel These bodies are made up of national representatives as well as...of a part of it may be made for individual use only. The approval of the RTA Information Management Systems Branch is required for more than one

  3. Understanding and Modeling Vortical Flows to Improve the Technology Readiness Level for Military Aircraft (Comprehension et Modelisation des Flux de Vortex Pour Ameliorer le Niveau de Maturite Technologique au Profit des Avions Militaires)

    DTIC Science & Technology

    2009-10-01

    [Plot residue removed: convergence histories of log(dρ/dt) versus iterations for the SA, EARSM, EARSM + CC, Hellsten EARSM and Hellsten EARSM + CC models.] VORTEX BREAKDOWN RTO-TR-AVT-113 29-13 [garbled equation (1) omitted] As a vortex passes through a normal shock, the tangential velocity is

  4. Human Behaviour Representation in Constructive Modelling (Representation du comportement humain dans des modelisations creatives)

    DTIC Science & Technology

    2009-09-01

    ordination with other NATO bodies involved in R&T activities. RTO reports both to the Military Committee of NATO and to the Conference of National ... track. Although progress in modelling human factors has been slow over the past decade, other forums have been reporting a number of theoretical and

  5. Modelisation de l'architecture des forets pour ameliorer la teledetection des attributs forestiers

    NASA Astrophysics Data System (ADS)

    Cote, Jean-Francois

    The quality of indirect measurements of canopy structure, from in situ and satellite remote sensing, rests on knowledge of vegetation canopy architecture. Technological advances in ground-based, airborne or satellite remote sensing can now significantly improve the effectiveness of measurement programs on forest resources. The structure of a vegetation canopy describes the position, orientation, size and shape of the elements of the canopy. The complexity of the canopy in forest environments greatly limits our ability to characterize forest structural attributes. Architectural models have been developed to help the interpretation of canopy structural measurements by remote sensing. Recently, terrestrial LiDAR systems, or TLiDAR (Terrestrial Light Detection and Ranging), have been used to gather information on the structure of individual trees or forest stands. TLiDAR allows the extraction of 3D structural information under the canopy at the centimetre scale. The methodology proposed in my Ph.D. thesis is a strategy to overcome the weakness in the structural sampling of vegetation cover. The main objective of the Ph.D. is to develop an architectural model of the vegetation canopy, called L-Architect (LiDAR data to vegetation Architecture), focusing on the ability to document forest sites and to obtain information on canopy structure from remote sensing tools. Specifically, L-Architect reconstructs the architecture of individual conifer trees from TLiDAR data. Quantitative evaluation of L-Architect consisted in investigating (i) the structural consistency of the reconstructed trees and (ii) the radiative coherence obtained by including the reconstructed trees in a 3D radiative transfer model. A methodology was then developed to quasi-automatically reconstruct the structure of individual trees with an optimization algorithm using TLiDAR data and allometric relationships.
    L-Architect thus provides an explicit link between the range measurements of TLiDAR and the structural attributes of individual trees. L-Architect has finally been applied to model the architecture of the forest canopy for better characterization of vertical and horizontal structure with airborne LiDAR data. This project provides a means of supplying detailed canopy architectural data, otherwise difficult to obtain, to reproduce a variety of forest covers. Because of the importance of architectural models, L-Architect makes a significant contribution to improving the capacity for parameter inversion in vegetation cover for optical and lidar remote sensing. Keywords: architectural modelling, terrestrial lidar, forest canopy, structural parameters, remote sensing.

  6. POD and PPP with multi-frequency processing

    NASA Astrophysics Data System (ADS)

    Roldán, Pedro; Navarro, Pedro; Rodríguez, Daniel; Rodríguez, Irma

    2017-04-01

    Precise Orbit Determination (POD) and Precise Point Positioning (PPP) are methods for estimating the orbits and clocks of GNSS satellites and the precise positions and clocks of user receivers. These methods are traditionally based on processing the ionosphere-free combination. With this combination, the delay introduced in the signal when passing through the ionosphere is removed, taking advantage of the fact that this delay depends on the inverse square of the frequency. It is also possible to process the individual frequencies, but in this case the ionospheric delay must be properly modelled. This modelling is usually very challenging, as the electron content of the ionosphere experiences important temporal and spatial variations. These two options define the two main kinds of processing: dual-frequency ionosphere-free processing, typically used in POD and in certain applications of PPP, and single-frequency processing with estimation or modelling of the ionosphere, mostly used in PPP processing. In magicGNSS, a software tool developed by GMV for POD and PPP, a hybrid approach has been implemented. This approach combines observations from any number of individual frequencies and any number of ionosphere-free combinations of these frequencies. In this way, the ionosphere-free observations allow a better estimation of positions and orbits, while the inclusion of observations from individual frequencies allows the ionospheric delay to be estimated and the noise of the solution to be reduced. It is also possible to include other kinds of combinations, such as the geometry-free combination, instead of processing individual frequencies. The joint processing of all frequencies for all constellations requires both the estimation or modelling of the ionospheric delay and the estimation of inter-frequency biases.
The ionospheric delay can be estimated from the single-frequency or dual-frequency geometry-free observations, but it is also possible to use a-priori information based on ionospheric models, on external estimations, or on the expected behavior of the ionosphere. The inter-frequency biases appear because the delay of the signal inside the transmitter and the receiver depends strongly on its frequency. However, it is possible to include constraints on these delays in the estimator, assuming they vary little over time. By using different types of combinations, all the available information from GNSS systems can be included in the processing. This is especially interesting for the Galileo satellites, which transmit on several frequencies, and the GPS IIF satellites, which transmit on L5 in addition to the traditional L1 and L2. Several experiments have been performed to assess the improvement in POD and PPP performance when using all the constellations and all the available frequencies of each constellation. This paper describes the new multi-frequency processing approach, including the estimation of the biases and ionospheric delays affecting GNSS observations, and presents the results of the experimentation activities performed to assess the benefits for POD and PPP algorithms.
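The ionosphere-free and geometry-free combinations referred to above follow directly from the 1/f² dependence of the first-order ionospheric delay. A minimal sketch of both (these are standard textbook relations, not magicGNSS internals; the GPS L1/L2 frequencies are real, the pseudorange numbers are illustrative):

```python
# Sketch of the classic dual-frequency combinations used in POD/PPP.
# The first-order ionospheric delay scales as 1/f^2, so a weighted
# difference of two pseudoranges cancels it.

F_L1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F_L2 = 1227.60e6  # GPS L2 carrier frequency, Hz

def iono_free(p1: float, p2: float, f1: float = F_L1, f2: float = F_L2) -> float:
    """Ionosphere-free pseudorange combination (metres)."""
    return (f1 ** 2 * p1 - f2 ** 2 * p2) / (f1 ** 2 - f2 ** 2)

def geometry_free(p1: float, p2: float) -> float:
    """Geometry-free combination (metres): the geometric range cancels,
    leaving only dispersive terms (ionosphere plus biases)."""
    return p1 - p2

# Illustrative check: a geometric range rho plus a first-order ionospheric
# delay of 5 m on L1 (scaled by (f1/f2)^2 on L2) is recovered exactly.
rho, i1 = 22_000_000.0, 5.0
i2 = i1 * (F_L1 / F_L2) ** 2
p_if = iono_free(rho + i1, rho + i2)
assert abs(p_if - rho) < 1e-6
```

The geometry-free combination of the same pair isolates the ionospheric terms, which is why it can feed the delay estimation described in the abstract.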

  7. The Second NATO Modelling and Simulation Conference(Deuxieme conference OTAN sur la modelisation et la simulation)

    DTIC Science & Technology

    2001-07-01

    Major General A C Figgures, Capability Manager (Manœuvre) UK MOD, provided the Conference with a fitting end message encouraging the SE and M&S... PLENARY SESSION: Welcoming Address, 'Synthetic Environments - Managing the Breakout', by M. Markin; Opening Address for the NATO M&S Conference by G. Sürsal; Keynote Address by G.J. Burrows; 'Industry's Role' by M. Mansell; 'The RMCS SSEL' by J.R. Searle. SESSION 1: POLICY, STRATEGY & MANAGEMENT: A Strategy...

  8. Models for Aircrew Safety Assessment: Uses, Limitations and Requirements (la Modelisation des conditions de securite des equipages: applications, limitations et cahiers des charges)

    DTIC Science & Technology

    1999-08-01

    ...immediately, reducing venous return artifacts during the first beat of the simulation... The effect of network complexity: the aortic pressure is shown in Figure 5 during the fifth beat for the networks with one and three... Mechanical Engineering Department, University of Victoria. [19] Huyghe J.M., 1986, "Nonlinear Finite Element Models of The Beating Left...

  9. Modelling and Simulation as a Service: New Concepts and Service-Oriented Architectures (Modelisation et simulation en tant que service: Nouveaux concepts et architectures orientes service)

    DTIC Science & Technology

    2015-05-01

    ...delivery business model where S&T activities are conducted in a NATO dedicated executive body, having its own personnel, capabilities and infrastructure... Recommendation SD-4: Design for Securability; Recommendations on Simulation Environment Infrastructure; Recommendation IN-1: Harmonize Critical Data and Algorithms; Recommendation IN-2: Establish Permanent Simulation Infrastructure; Recommendation IN-3: Establish...

  10. Modelling of Molecular Structures and Properties in Physical Chemistry and Biophysics, Forty-Fourth International Meeting (Modelisation des Structures et Proprietes Moleculaires en Chimie Physique et en Biophysique, Quarante- Quatrieme Reunion Internationale)

    DTIC Science & Technology

    1989-09-01

    ...pyridone). Previous work on pyridinium, pyrazinium or pyrimidinium salts... 2-pyrimidone salts [43] have shown that some... forces... A Vibrational Molecular Force Field for Macromolecular Modelling, Gerard Vergoten... from a microscopic point of view are (1) understanding, (2) interpretation of experimental results, (3) semiquantitative estimates of experimental results and (4...

  11. Propagation Modelling and Decision Aids for Communications, Radar and Navigation Systems (La Modelisation de la Propagation et Aides a la Decision Pour les Sysemes de elecommunicaions, de Radar et de Navigation)

    DTIC Science & Technology

    1994-09-01

    ...the refractive index can be determined from a simplified form of the Appleton-Hartree equation... density, temperature, ion composition, ionospheric electric field... see Cannon [1994]. The electron density profile is based upon the underlying neutral composition, temperature and wind together with electric field... in many of the newer HF prediction decision aids... NSSDC/WDC-A-R&S 90-19, National Space Science Data Center... They also provide a very useful stand-alone...

  12. Radio Wave Propagation Modeling, Prediction and Assessment (L’Evaluation, la Prevision et la Modelisation des Ondes Hertziennes)

    DTIC Science & Technology

    1990-01-01

    ...modifiers and added an additional set of modifiers to adjust the average VTOP. The original DECO model made use of waveguide excitation factors and... ranges far beyond the horizon. The modified refractivity M is defined by M = N + (h/a) x 10^6 = N + 0.157 h (2.1), where h is the height above the earth's surface (in metres) and a is the earth radius... Figure 2.4. N and M profiles for an elevated duct.

  13. Modelisation and distribution of neutron flux in radium-beryllium source (226Ra-Be)

    NASA Astrophysics Data System (ADS)

    Didi, Abdessamad; Dadouch, Ahmed; Jai, Otman

    2017-09-01

    The Monte Carlo N-Particle code (MCNP-6) is used to analyze the thermal, epithermal and fast neutron fluxes of a 3 millicurie radium-beryllium source, in order to determine many materials qualitatively and quantitatively by the method of neutron activation analysis. The radium-beryllium neutron source is set up for practical work and research in the nuclear field. The main objective of this work is to characterize the flux profile of radium-beryllium irradiation; this theoretical study makes it possible to discuss the design of optimal irradiation conditions and to improve the performance of the facility for research and education in nuclear physics.
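Neutron activation analysis, as referred to above, rests on the standard activation relation A = φ σ N (1 − e^(−λ t)). A sketch with purely illustrative values (the element, flux and cross-section are assumptions, not data from this paper):

```python
import math

# Sketch of the standard neutron-activation relation used in NAA:
#   A = phi * sigma * N * (1 - exp(-lambda * t_irr))
# All numerical values below are illustrative, not taken from the paper.

N_A = 6.022e23  # Avogadro's number, 1/mol

def induced_activity(phi, sigma_barns, n_atoms, half_life_s, t_irr_s):
    """Activity (Bq) induced in n_atoms by a flux phi (n/cm^2/s) with
    capture cross-section sigma (barns) after irradiation time t_irr."""
    sigma_cm2 = sigma_barns * 1e-24          # 1 barn = 1e-24 cm^2
    lam = math.log(2.0) / half_life_s        # decay constant
    return phi * sigma_cm2 * n_atoms * (1.0 - math.exp(-lam * t_irr_s))

# Example: 1 g of a hypothetical element with M = 50 g/mol, sigma = 10 b,
# product half-life 10 min, irradiated 30 min in a weak flux of 1e4 n/cm^2/s
# (the order of magnitude of small isotopic sources).
n = N_A * 1.0 / 50.0
a = induced_activity(phi=1e4, sigma_barns=10.0, n_atoms=n,
                     half_life_s=600.0, t_irr_s=1800.0)
```

With three half-lives of irradiation the saturation factor is 1 − 1/8 = 0.875, which is why irradiating much longer than a few half-lives gains little activity.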

  14. Guide to Modelling & Simulation (M&S) for NATO Network-Enabled Capability (M&S for NNEC) (Guide de la modelisation et de la simulation (M&S) pour las NATO network-enabled capability (M&S de la NNEC))

    DTIC Science & Technology

    2010-02-01

    ...interdependencies, and then modifying plans according to updated projections. This is currently an immature area where further research is required... crosscutting.html. [7] Zeigler, B.P. and Hammonds, P. (2007). "Modelling and Simulation-Based Data Engineering: Introducing Pragmatics and Ontologies for... the optimum benefit to be obtained and, while immature, ongoing research needs to be maintained. 20) Use of M&S to support complex operations needs...

  15. Numerical modelling of a dielectric barrier discharge plasma actuator by the drift-diffusion method (Modelisation numerique d'un actionneur plasma de type decharge a barriere dielectrique par la methode de derive-diffusion)

    NASA Astrophysics Data System (ADS)

    Xing, Jacques

    The dielectric barrier discharge (DBD) plasma actuator is a device proposed for active flow control in order to improve the performance of aircraft and turbomachines. Essentially, these actuators are made of two electrodes separated by a layer of dielectric material, and they convert electricity directly into flow momentum. Because of the high cost of experiments in realistic operating conditions, there is a need for a robust numerical model that can predict the plasma body force and the effects of various parameters on it. Indeed, this body force can be affected by atmospheric conditions (temperature, pressure and humidity), the velocity of the neutral flow, the applied voltage (amplitude, frequency and waveform), and the actuator geometry. The purpose of this thesis is therefore to implement a plasma model for DBD actuators that has the potential to account for the effects of these various parameters. In DBD actuator modelling, two types of approach are commonly proposed: low-order (phenomenological) modelling and high-order (scientific) modelling. However, a critical analysis presented in this thesis showed that phenomenological models are not robust enough to predict the plasma body force without artificial calibration for each specific case; moreover, they are based on erroneous assumptions. Hence, the approach selected to model the plasma body force is a scientific drift-diffusion model with four chemical species (electrons, positive ions, negative ions and neutrals). This model was chosen because it gives numerical results consistent with experimental data. Moreover, it has great potential to include the effects of temperature, pressure and humidity on the plasma body force, and it requires only a reasonable computational time. The model was independently implemented in the C++ programming language and validated on several test cases.
This model was later used to simulate the effect of the plasma body force on laminar-turbulent transition on an airfoil, in order to validate its performance in a practical CFD simulation. Numerical results show that this model gives a better prediction of the effect of the plasma on the fluid flow for a practical aerospace case than a phenomenological model.
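The drift-diffusion description named in the abstract evolves each charged species through a continuity equation whose flux combines drift in the electric field and diffusion. A minimal 1D explicit sketch for an electron-like density (the grid, coefficients and uniform field are illustrative assumptions, not the thesis's discharge model):

```python
import numpy as np

# Minimal 1D explicit drift-diffusion update for an electron-like density:
#   dn/dt = -d(flux)/dx,   flux = -mu * n * E - D * dn/dx
# (electrons drift against the field). All values are illustrative.

NX, DX = 200, 1e-5            # cells and cell size (m)
MU, D = 0.04, 0.1             # mobility (m^2/V/s) and diffusivity (m^2/s)
E = np.full(NX + 1, 1e5)      # uniform electric field at cell faces (V/m)

def step(n: np.ndarray, dt: float) -> np.ndarray:
    """One conservative explicit step with zero-flux walls."""
    n_face = 0.5 * (n[1:] + n[:-1])            # density at interior faces
    grad = (n[1:] - n[:-1]) / DX               # gradient at interior faces
    flux = np.zeros(NX + 1)                    # zero flux at both walls
    flux[1:-1] = -MU * n_face * E[1:-1] - D * grad
    return n - dt * (flux[1:] - flux[:-1]) / DX

x = np.arange(NX) * DX
n = np.exp(-((x - 1e-3) / 2e-4) ** 2)          # initial Gaussian pulse
total0 = n.sum()
for _ in range(100):                           # dt respects CFL/diffusion limits
    n = step(n, dt=2e-10)
assert abs(n.sum() - total0) < 1e-9 * total0   # flux form conserves particles
```

A real discharge model adds the Poisson equation for the self-consistent field and source terms for ionization and attachment; the flux-form update above is what makes the scheme conservative.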

  16. Thermal-hydraulic study of the moderator flow in the CANDU-6 reactor (Etude thermo-hydraulique de l'ecoulement du moderateur dans le reacteur CANDU-6)

    NASA Astrophysics Data System (ADS)

    Mehdi Zadeh, Foad

    Given the size (6.0 m x 7.6 m) and the multiply connected domain that characterize the CANDU-6 reactor vessel (380 channels in the calandria), the physics governing the behaviour of the moderator fluid is still poorly understood today. Sampling data in an operating reactor would require changing the configuration of the reactor vessel in order to insert probes, and the presence of an intense radiation zone prevents the use of ordinary sampling sensors. Consequently, the moderator flow must be studied with either an experimental or a numerical model. As for the experimental model, building and operating such facilities is very expensive, and the scaling parameters required to build a reduced-scale experimental model are mutually contradictory. Numerical modelling therefore remains an important alternative. Currently, the nuclear industry uses a numerical approach known as the porous-medium approach, which approximates the domain by a continuous medium in which the tube bank is replaced by distributed hydraulic resistances. This model can describe the macroscopic phenomena of the flow, but does not account for local effects that have an impact on the global flow, such as the temperature and velocity distributions close to the tubes and hydrodynamic instabilities. In the context of nuclear safety, the local effects around the calandria tubes are of particular interest. Indeed, simulations performed with this approach predict that the flow can adopt several hydrodynamic configurations, for some of which the flow shows asymmetric behaviour within the vessel. This can cause boiling of the moderator on the walls of the channels.
Under such conditions, the reactivity coefficient can vary significantly, resulting in an increase of the reactor power, which can have major consequences for nuclear safety. A detailed CFD (Computational Fluid Dynamics) model taking local effects into account is therefore necessary. The goal of this research is to model the complex behaviour of the moderator flow within the vessel of a CANDU-6 nuclear reactor, particularly close to the calandria tubes. These simulations serve to identify the possible flow configurations in the calandria. The study thus consists in formulating theoretical bases for the macroscopic instabilities of the moderator, i.e. the asymmetric motions that can cause the moderator to boil. The challenge of the project is to determine the impact of these flow configurations on the reactivity of the CANDU-6 reactor.

  17. Finite element modelling of striated muscle (Modelisation par elements finis du muscle strie)

    NASA Astrophysics Data System (ADS)

    Leonard, Mathieu

    This research project created a finite element model of human striated muscle in order to study the mechanisms that produce traumatic muscle injuries. The model is a numerical platform capable of discerning the influence of the mechanical properties of the fasciae and of the muscle cell on the dynamic behaviour of the muscle during an eccentric contraction, notably the Young's modulus and the shear modulus of the connective tissue layer, the orientation of the collagen fibres of this membrane, and the Poisson's ratio of the muscle. In vitro experimental characterization of these parameters at high strain rates, from active human striated muscle, is essential for the study of traumatic muscle injuries. The numerical model developed can represent muscle contraction as a phase transition of the muscle cell, through a change of stiffness and volume, using the material constitutive laws predefined in the LS-DYNA software (v971, Livermore Software Technology Corporation, Livermore, CA, USA). This research project thus introduces a physiological phenomenon that could explain common muscle injuries (cramps, soreness, strains, etc.), but also diseases or disorders affecting connective tissue such as collagen diseases and muscular dystrophy. The predominance of muscle injuries during eccentric contractions is also discussed. The model developed in this project thus brings the concept of phase transition to the fore, opening the door to the development of new technologies for muscle activation in people with paraplegia, or of compact artificial muscles for prostheses and exoskeletons. Keywords: striated muscle, muscle injury, fascia, eccentric contraction, finite element model, phase transition

  18. Unsteady Aerodynamics - Fundamentals and Applications of Aircraft Dynamics. Conference Proceedings of the Joint Symposium of the Fluid Dynamics and Flight Mechanics Panels Held in Goettingen, Federal Republic of Germany on 6-9 May 1985.

    DTIC Science & Technology

    1985-11-01

    ...(vortices with axes perpendicular to the main flow) shed from an airfoil oscillating under dynamic stall conditions... From this experimental analysis, an attempt at theoretical modelling of the nonlinear aerodynamic effects observed at... "Wall shear stress on an airfoil undergoing harmonic motion parallel or perpendicular to the undisturbed flow", EUROMECH...

  19. Modelisation of an unspecialized quadruped walking mammal.

    PubMed

    Neveu, P; Villanova, J; Gasc, J P

    2001-12-01

    Kinematics and structural analyses were used as basic data to elaborate a dynamic quadruped model that may represent an unspecialized mammal. Hedgehogs were filmed on a treadmill with a cinefluorographic system, providing the trajectories of skeletal elements during locomotion. Body parameters such as limb segment masses and lengths, and segment centres of mass, were measured on cadavers. These biological parameters were compiled in order to build a virtual quadruped robot. The robot's locomotor behaviour was compared with that of the actual hedgehog to improve the model and to disclose the necessary changes. Apart from its use in robotics, the resulting model may be useful for simulating the locomotion of extinct mammals.

  20. Scaling forecast models for wind turbulence and wind turbine power intermittency

    NASA Astrophysics Data System (ADS)

    Duran Medina, Olmo; Schmitt, Francois G.; Calif, Rudy

    2017-04-01

    The intermittency of wind turbine power remains an important issue for the massive development of this renewable energy. The energy peaks injected into the electric grid cause difficulties in energy distribution management. Hence, a correct forecast of wind power in the short and middle term is needed, given the high unpredictability of the intermittency phenomenon. We consider a statistical approach through the analysis and characterization of stochastic fluctuations. The theoretical framework is the multifractal modelling of wind velocity fluctuations. Here we consider data from three wind turbines, two of which use direct-drive technology. These turbines produce energy in real operating conditions and allow us to test our forecast models of power production at different time horizons. Two forecast models were developed, based on two physical features observed in the wind and power time series: their scaling properties on the one hand, and the intermittency of the wind power increments on the other. The first tool is related to intermittency through a multifractal lognormal fit of the power fluctuations. The second tool is based on an analogy between the scaling properties of the power and a fractional Brownian motion; indeed, long-term memory is found in both time series. Both models show encouraging results, since the correct tendency of the signal is reproduced over different time scales. These tools are first steps in the search for efficient forecasting approaches for grid adaptation to wind energy fluctuations.
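The scaling properties invoked above are commonly checked through structure functions S_q(τ) = ⟨|X(t+τ) − X(t)|^q⟩ ∝ τ^ζ(q), with ζ(q) = qH for a monofractal signal. A sketch on synthetic data (an ordinary random walk with H = 0.5 stands in for the wind or power series, which we do not have):

```python
import numpy as np

# Structure-function scaling check of the kind used in multifractal analysis:
#   S_q(tau) = <|X(t+tau) - X(t)|^q> ~ tau^zeta(q), zeta(q) = q*H (monofractal).
# The series here is a synthetic ordinary random walk (H = 0.5).

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(200_000))    # Brownian walk, H = 0.5

taus = np.array([1, 2, 4, 8, 16, 32, 64])
s2 = np.array([np.mean((x[t:] - x[:-t]) ** 2) for t in taus])

# log-log slope estimates zeta(2) = 2H, i.e. 1.0 for this signal
zeta2 = np.polyfit(np.log(taus), np.log(s2), 1)[0]
assert abs(zeta2 - 1.0) < 0.1
```

For a multifractal (e.g. lognormal) signal, ζ(q) bends below qH at large q; estimating that curvature is the essence of the lognormal fit the abstract mentions.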

  1. Modelling of photodetectors based on single-photon avalanche diode arrays for positron emission tomography (Modelisation de photodetecteurs a base de matrices de diodes avalanche monophotoniques pour tomographie d'emission par positrons)

    NASA Astrophysics Data System (ADS)

    Corbeil Therrien, Audrey

    Positron emission tomography (PET) is a valuable tool in preclinical research and medical diagnosis. This technique produces a quantitative image of specific metabolic functions through the detection of annihilation photons. These photons are detected with two components: first, a scintillator converts the energy of the 511 keV photon into photons of the visible spectrum; then a photodetector converts the light into an electrical signal. Recently, single-photon avalanche diodes (SPADs) arranged in arrays have attracted much interest for PET. These arrays form sensitive, robust and compact detectors with outstanding timing resolution. These qualities make them a promising photodetector for PET, but the parameters of the array and of the readout electronics must be optimized to reach the best performance for PET. Optimizing the array quickly becomes difficult, because the various parameters interact in complex ways with the avalanche and noise-generation processes. Moreover, readout electronics for SPAD arrays are still rudimentary, and it would be profitable to analyze different readout strategies. The most economical way to address these questions is to use a simulator to converge toward the configuration giving the best performance. This thesis presents the development of such a simulator. It models the behaviour of a SPAD array on the basis of semiconductor physics equations and probabilistic models. It includes the three main noise sources: thermal noise (dark counts), correlated afterpulsing and optical crosstalk. The simulator also makes it possible to test and compare new readout approaches better suited to this type of detector.
Ultimately, the simulator aims to quantify the impact of the photodetector parameters on the energy and timing resolutions, and thus to optimize the performance of the SPAD array. For example, increasing the active-area (fill-factor) ratio improves performance, but only up to a point: other phenomena linked to the active area, such as thermal noise, then degrade the result, and the simulator makes it possible to find a compromise between these two extremes. Simulations with the initial parameters show a detection efficiency of 16.7%, an energy resolution of 14.2% FWHM and a timing resolution of 0.478 ns FWHM. Finally, although aimed at PET, the proposed simulator can be adapted to other applications by changing the photon source and adjusting the performance objectives. Keywords: photodetectors, single-photon avalanche photodiodes, semiconductors, positron emission tomography, simulation, modelling, single-photon detection, scintillators, quenching circuit, SPAD, SiPM, Geiger-mode avalanche photodiodes
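Two of the noise sources such a simulator must reproduce can be sketched in a toy model: primary dark counts as a Poisson process (exponential inter-arrival times) and optical crosstalk as a fixed probability that a primary avalanche immediately triggers a neighbour. The rate and probability below are illustrative assumptions, not parameters extracted from this thesis:

```python
import random

# Toy model of SPAD-array noise: Poisson primary dark counts plus a fixed
# per-avalanche optical-crosstalk probability. Values are illustrative.

def avalanche_times(rate_hz, window_s, p_crosstalk, rng):
    """Sorted avalanche times (s) in one acquisition window."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate_hz)          # exponential gap -> Poisson counts
        if t >= window_s:
            return sorted(times)
        times.append(t)
        if rng.random() < p_crosstalk:         # crosstalk fires a neighbour
            times.append(t)

rng = random.Random(1)
n_total = sum(len(avalanche_times(1e5, 1e-3, 0.1, rng)) for _ in range(200))
# mean ~ 200 windows * 100 primaries * 1.1 ~ 22000 avalanches
```

A full simulator adds afterpulsing (delayed, trap-release correlated events) and dead-time/quenching behaviour on top of this arrival process.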

  2. Methodology for the aerostructural modelling of a wing using aerodynamic analysis software and finite element analysis software (Methodologie de modelisation aerostructurelle d'une aile utilisant un logiciel de calcul aerodynamique et un logiciel de calcul par elements finis)

    NASA Astrophysics Data System (ADS)

    Communier, David

    In the structural analysis of an aircraft wing, it is difficult to model faithfully the aerodynamic forces the wing experiences. To simplify the analysis, the theoretical maximum lift of the wing is usually distributed over its main spar or over its ribs. This distribution implies that the whole wing will be stronger than necessary, and therefore that the structure will not be fully optimized. To overcome this problem, an aerodynamic distribution of the lift should be applied over the complete surface of the wing, yielding a much more reliable load distribution. To achieve this, the results of a software tool that computes the aerodynamic loads on the wing must be coupled with those of a tool for its design and structural analysis. In this project, the software used to compute the pressure coefficients on the wing is XFLR5, and the design and structural analysis are done in CATIA V5. XFLR5 allows a rapid analysis of a wing based on the analysis of its airfoils. It computes airfoil performance in the same way as XFOIL and offers a choice of three computational methods for the wing: Lifting Line Theory (LLT), the Vortex Lattice Method (VLM) and 3D Panels. In our methodology we use the 3D Panels method, whose validity was tested in a wind tunnel to confirm the XFLR5 computations. For the finite element design and analysis of the structure, CATIA V5 is widely used in the aerospace field and allows automation of the wing design steps. This thesis therefore describes a methodology for the aerostructural study of an aircraft wing.

  3. Numerical modelling of metro tunnels in the sedimentary rock masses of the Montreal area (Modelisation numerique de tunnels de metro dans les massifs rocheux sedimentaires de la region de Montreal)

    NASA Astrophysics Data System (ADS)

    Lavergne, Catherine

    The geological formations of the Montreal area are mostly made of limestones. The usual design approach is based on rock mass classification systems that consider the rock mass as an equivalent continuous and isotropic material. However, for shallow excavations, stability is generally controlled by geological structures, which in Montreal are bedding planes that give the rock mass a strong stress-strain anisotropy. The objectives of this research are to build a numerical model that accounts for the anisotropy of sedimentary rocks, and to determine the influence of the design parameters on displacements, stresses and failure around unsupported underground metro excavations. The geotechnical data used for this study come from a metro extension project and were made available to the author. The excavation geometries analyzed are a tunnel, a station and a garage consisting of three (3) parallel tunnels, for rock cover between 4 and 16 m. The numerical modelling was done with the FLAC software, which represents a continuous medium, using the ubiquitous-joint constitutive model to simulate the strength anisotropy of sedimentary rock masses. The model considers gravity stresses for an anisotropic material as well as pore pressures. In total, eleven (11) design parameters were analyzed. Results show that the unconfined compressive strength of intact rock, fault zones and pore pressures in soils have an important influence on the stability of the numerical model. The excavation geometry, the thickness of rock cover, the RQD, Poisson's ratio and the horizontal tectonic stresses have a moderate influence. Finally, the ubiquitous-joint parameters, pore pressures in the rock mass, the width of the garage pillars and the damage linked to the excavation method have a low impact. The FLAC results were compared with those of UDEC, a software package based on the distinct element method; similar conclusions were obtained on displacements, stress state and failure modes.
However, the UDEC model gives slightly less conservative results than FLAC. This study is distinguished by its local character and by the large amount of available geotechnical data used to determine the parameters of the numerical model. The results led to recommendations for laboratory tests that can be applied to characterize the anisotropy of sedimentary rocks more specifically.

  4. Modelling process and study of geomagnetic storms in power networks: impact on the Hydro-Quebec transmission system (Processus de modelisation et etude des orages geomagnetiques dans les reseaux electriques: Impact sur le reseau de transport d'Hydro-Quebec)

    NASA Astrophysics Data System (ADS)

    Abdellaoui, Amr

    This research project presents a complete modelling process of the effects of GIC on the Hydro-Quebec power system for system planning studies. The advantage of the presented method is that it enables planning engineers to simulate the effects of geomagnetic disturbances on the Hydro-Quebec system under different conditions and contingencies within a reasonable calculation time. This modelling method of GIC in electric power systems has been applied to the Hydro-Quebec system. An equivalent HQ DC model has been developed. A numerical method for calculating DC sources from a non-uniform geoelectric field has been developed and implemented on the HQ DC model. Harmonics and the increased reactive power losses of saturated transformers have been defined as a function of GIC through a binary search algorithm using a chosen HQ magnetization curve. The evolution in time of each transformer's saturation, according to its effective GIC, has been evaluated using analytical formulas. The reactive power losses of saturated transformers have been modeled in the PSS/E [1] HQ network as constant reactive current loads assigned to the corresponding transformer buses. Finally, time-domain simulations have been performed with PSS/E, taking transformer saturation times into account. This has been achieved by integrating the HQ DC model results and the analytical calculations of transformer saturation times into an EMTP load model; an interface links the EMTP load model to the HQ PSS/E network. Different aspects of GIC effects on the Hydro-Quebec system have been studied, including the influence of uniform and non-uniform geoelectric fields, the comparison of the reactive power losses of the 735 kV HQ system with those of the Montreal network, the risks to voltage levels and the importance of the dynamic reactive power reserve. This dissertation presents a new GIC modelling approach for power systems for planning and operations purposes.
This methodology could be further enhanced, particularly the aspect regarding transformer saturation times; more research therefore remains to be pursued in this area.
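The first step of the GIC chain described above, from geoelectric field to quasi-DC line current, is often illustrated with the uniform-field approximation: the horizontal field is integrated along the line to give a driving voltage, and Ohm's law gives the current. A minimal sketch (the field, line geometry and loop resistance are illustrative values, not Hydro-Quebec data):

```python
# Uniform-field approximation for GIC: the induced series EMF of a line is
# the dot product of the geoelectric field with the line's end-to-end vector,
# and the quasi-DC current follows from the total loop resistance
# (line plus transformer windings plus grounding). Values are illustrative.

def line_gic(ex_v_per_km, ey_v_per_km, dx_km, dy_km, r_loop_ohm):
    """Quasi-DC current (A) in a line of east/north extent (dx_km, dy_km)."""
    emf = ex_v_per_km * dx_km + ey_v_per_km * dy_km   # induced series voltage, V
    return emf / r_loop_ohm                           # DC Ohm's law for the loop

# A roughly 200 km, mostly north-south line during a storm with a
# 2 V/km northward field component and a 6 ohm total loop resistance:
gic = line_gic(ex_v_per_km=0.5, ey_v_per_km=2.0, dx_km=50.0, dy_km=190.0,
               r_loop_ohm=6.0)
```

For a non-uniform field, as in the dissertation, the EMF becomes a path integral of E along the actual line route, and the single loop is replaced by a full DC network solution.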

  5. NURBS geometric modelling for the aerodynamic design of aircraft wings (Modelisation geometrique par NURBS pour le design aerodynamique des ailes d'avion)

    NASA Astrophysics Data System (ADS)

    Bentamy, Anas

    The constant evolution of computer science gives rise to many research areas, especially in computer-aided design. This study is part of the advancement of numerical methods in engineering computer-aided design, specifically in aerospace science. Geometric modelling based on NURBS has been applied successfully to generate a parametric wing surface for aerodynamic design while satisfying manufacturing constraints. The goal of providing a smooth geometry described with few parameters has been achieved: a wing design including ruled surfaces at the leading edge slat and at the flap, and curved central surfaces with intrinsic geometric properties coming from conic curves, requires 130 control points and 15 geometric design variables. The 3D character of the wing needs to be analyzed by surface-interrogation techniques in order to judge the visual aspect properly and to detect any curvature sign inversion in both parametric directions u and v. Colour mapping of the Gaussian curvature appears to be a very effective visualization tool. The automation of the construction has been attained using a heuristic optimization algorithm, simulated annealing; its relatively high speed of convergence to the solutions confirms its practical interest for present-day engineering problems. The robustness of the geometric model has been tested successfully on an academic inverse design problem. The results obtained suggest multiple possible applications, from an extension to a complete geometric description of an airplane to interaction with the other disciplines belonging to a preliminary aeronautical design process.
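A NURBS curve point is the rational combination C(u) = Σ N_{i,p}(u) w_i P_i / Σ N_{i,p}(u) w_i, with B-spline basis functions N_{i,p} given by the Cox-de Boor recursion; the conic (circular-arc) behaviour the abstract exploits comes from the weights. A small sketch (the quarter-circle data are a standard textbook example, unrelated to the thesis's wing surfaces):

```python
import numpy as np

# Rational (NURBS) curve evaluation via the Cox-de Boor recursion.
# Control points, weights and knot vector below are a textbook example.

def basis(i, p, u, knots):
    """Cox-de Boor B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    out = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:
        out += (u - knots[i]) / d1 * basis(i, p - 1, u, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        out += (knots[i + p + 1] - u) / d2 * basis(i + 1, p - 1, u, knots)
    return out

def nurbs_point(u, ctrl, w, knots, p):
    """C(u) = sum(N_i * w_i * P_i) / sum(N_i * w_i)."""
    n = [basis(i, p, u, knots) for i in range(len(ctrl))]
    num = sum(ni * wi * np.asarray(pi) for ni, wi, pi in zip(n, w, ctrl))
    den = sum(ni * wi for ni, wi in zip(n, w))
    return num / den

# Quadratic NURBS quarter circle: with these weights the arc is exact,
# which is why conics are representable only in rational form.
ctrl = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
w = [1.0, 1.0 / np.sqrt(2.0), 1.0]
knots = [0, 0, 0, 1, 1, 1]
pt = nurbs_point(0.5, ctrl, w, knots, p=2)
assert abs(np.hypot(pt[0], pt[1]) - 1.0) < 1e-9  # lies on the unit circle
```

A wing surface generalizes this to a tensor product in two parameters (u, v), which is where the thesis's 130 control points live.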

  6. Charge Transport in Carbon Nanotubes-Polymer Composite Photovoltaic Cells

    PubMed Central

    Ltaief, Adnen; Bouazizi, Abdelaziz; Davenas, Joel

    2009-01-01

    We investigate the dark and illuminated current density-voltage (J-V) characteristics of poly(2-methoxy-5-(2'-ethylhexyloxy)-1,4-phenylenevinylene) (MEH-PPV)/single-walled carbon nanotube (SWNT) composite photovoltaic cells. Using an exponential band-tail model, the conduction mechanism has been analysed for polymer-only devices and composite devices in terms of space-charge-limited current (SCLC), and we determine the power parameters and the threshold voltages. Devices elaborated from MEH-PPV:SWNT (1:1) composites showed a photoresponse with an open-circuit voltage Voc of 0.4 V, a short-circuit current density Jsc of 1 µA/cm² and a fill factor FF of 43%. We have modelled the organic photovoltaic devices with an equivalent circuit, from which we calculated the series and shunt resistances.
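    The reported figures of merit follow directly from a J-V sweep via FF = Pmax / (Voc · Jsc). A short sketch with an idealized one-diode characteristic; the parameters are illustrative, not the paper's fitted values, and the authors' equivalent circuit additionally includes the series and shunt resistances they extract.

```python
import numpy as np

# Idealized one-diode J(V); JSC, J0 in A/cm^2 and VT (thermal voltage) in V.
JSC, J0, VT = 1.0e-6, 1.0e-12, 0.04

v = np.linspace(0.0, 0.6, 2001)
j = JSC - J0 * np.expm1(v / VT)   # current density delivered by the cell

voc = np.interp(0.0, -j, v)       # open-circuit voltage: J(voc) = 0
ff = (v * j).max() / (voc * JSC)  # fill factor from the maximum power point
```

    Adding a finite shunt resistance (a linear leakage term in J) and a series resistance (a voltage drop proportional to J) lowers the computed FF toward values like the 43% reported.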

  7. LLR data analysis and impact on lunar dynamics from recent developments at OCA LLR Station

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vishnu; Fienga, Agnes; Courde, Clement; Torre, Jean-Marie; Exertier, Pierre; Samain, Etienne; Feraudy, Dominique; Albanese, Dominique; Aimar, Mourad; Mariey, Hervé; Viot, Hervé; Martinot-Lagarde, Gregoire

    2016-04-01

    Since late 2014, the OCA LLR station has been able to range at an infrared wavelength (1064 nm). IR ranging improves the LLR observations both temporally and spatially. Thanks to a better signal-to-noise ratio, IR detection also permits a densification of normal points, including on the L1 and L2 retroreflectors, which contributes to a better modelling of the lunar libration. The hypothesis of lunar dust and environmental effects, suggested by the chromatic behavior noticed in returns from the L2 retroreflector, is discussed. In addition, data analysis shows that accounting for retroreflector tilt and using a calibration profile in the normal-point deduction algorithm improve the precision of normal points, thereby benefiting lunar dynamical models and interior physics.

  8. Snow cover volumes dynamic monitoring during melting season using high topographic accuracy approach for a Lebanese high plateau witness sinkhole

    NASA Astrophysics Data System (ADS)

    Abou Chakra, Charbel; Somma, Janine; Elali, Taha; Drapeau, Laurent

    2017-04-01

    Climate change and its negative impact on water resources are well described. For countries like Lebanon, facing a major population rise and already-decreasing precipitation, effective water resources management is crucial, and continuous, systematic monitoring over long periods of time is an important activity for investigating drought-risk scenarios for the Lebanese territory. Snow cover on the Lebanese mountains is the most important water reserve. Consequently, systematic observation of snow cover dynamics plays a major role in supporting hydrologic research with accurate data on snow cover volumes over the melting season. Over the last 20 years, few studies have been conducted on the Lebanese snow cover; they focused on estimating the snow cover surface using remote sensing and terrestrial measurement, without obtaining accurate maps of the sampled locations. Indeed, estimating both snow cover area and volume is difficult because of the very high variability of snow accumulation and the topographic heterogeneity of the slopes of the Lebanese mountain chains. Therefore, measuring the snow cover relief in its three-dimensional aspect and computing its Digital Elevation Model is essential to estimate snow cover volume. Despite the need to cover the whole Lebanese territory, we favored an experimental terrestrial topographic site approach, because of the cost of high-resolution satellite imagery, its limited accessibility and its acquisition restrictions; modelling the snow cover at the national scale also remains very challenging. We therefore selected a representative witness sinkhole located at Ouyoun el Siman, where we undertook systematic and continuous observations based on a topographic approach using a total station. After four years of continuous observations, we established the relation between the snow melt rate, the date of total melting and the discharge of neighboring springs. 
Consequently, we are able to forecast, early in the season, the dates of total snowmelt and of the low-water flows of springs that are essentially fed by snowmelt. Simulations were run to predict the snow level between two sampled dates; they provided promising results for extrapolation to the national scale.
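    The volume estimate behind such a topographic survey reduces to differencing two elevation grids: the snow surface against the snow-free ground. A minimal sketch on synthetic grids (not the survey data):

```python
import numpy as np

def snow_volume(dem_snow, dem_ground, cell_area):
    """Volume (m^3) between a snow-surface DEM and the snow-free ground DEM,
    both regular grids of elevations in metres; negative depths count as no snow."""
    depth = np.clip(dem_snow - dem_ground, 0.0, None)
    return float(depth.sum() * cell_area)

ground = np.zeros((4, 4))          # flat reference surface
snow = np.full((4, 4), 0.5)        # 0.5 m of snow everywhere
vol = snow_volume(snow, ground, cell_area=4.0)   # 2 m x 2 m grid cells
```

    Repeating the differencing for DEMs surveyed on successive dates gives the melt-season volume dynamics described above.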

  9. A Simplified and Reliable Damage Method for the Prediction of the Composites Pieces

    NASA Astrophysics Data System (ADS)

    Viale, R.; Coquillard, M.; Seytre, C.

    2012-07-01

    Structural engineers are often faced with test results on composite structures that are considerably tougher than predicted. In an attempt to reduce this frequent gap, a survey of some extensive synthesis works on prediction methods and failure criteria was carried out; this inquiry deals with the plane stress state only. All classical methods have strong and weak points with respect to practicality and reliability. The main conclusion is that, in the plane stress case, the best usual industrial methods give rather similar predictions, but in general they do not explain the often large discrepancies with respect to the tests, mainly in cases of strong stress gradients or of bi-axial laminate loadings. It seems that only the methods that consider the complexity of composite damage (so-called physical methods, or Continuum Damage Mechanics, "CDM") bring a clear improvement over the usual methods. The only drawback of these methods is their relative intricacy, especially under pressing industrial conditions. A method offering an approximate but simplified representation of the CDM phenomenology is presented. Compared with tests and with other methods, it brings a fair improvement in the correlation with tests relative to the usual industrial methods, and it gives results very similar to those of the painstaking CDM methods and very close to the test results. Several examples are provided. In addition, this method is really economical with respect to material characterization as well as to the modelling and computation effort.

  10. Contribution to study of interfaces instabilities in plane, cylindrical and spherical geometry

    NASA Astrophysics Data System (ADS)

    Toque, Nathalie

    1996-12-01

    This thesis proposes several experiments on hydrodynamic instabilities, which are studied numerically and theoretically. The experiments are in plane and cylindrical geometry. Their X-ray radiographs show the evolution of an interface between two media crossed by a detonation wave. These materials are initially solid; under the shock wave they become liquid or remain between the two phases, solid and liquid. The numerical study aims at simulating, with the codes EAD and Ouranos, the interface instabilities that appear in the experiments. The experimental radiographs and the numerical pictures are in quite good agreement. The theoretical study proposes modelling a spatio-temporal part of the experiments to obtain the quantitative growth of perturbations at the interfaces and in the flows. The models are linear, in plane, cylindrical and spherical geometry. They precede the upcoming study of the transition between linear and non-linear growth of instabilities in multifluid flows crossed by shock waves.

  11. Generation of 238U Covariance Matrices by Using the Integral Data Assimilation Technique of the CONRAD Code

    NASA Astrophysics Data System (ADS)

    Privas, E.; Archier, P.; Bernard, D.; De Saint Jean, C.; Destouche, C.; Leconte, P.; Noguère, G.; Peneliau, Y.; Capote, R.

    2016-02-01

    A new IAEA Coordinated Research Project (CRP) aims to test, validate and improve the IRDF library. Among the isotopes of interest, modelling the 238U capture and fission cross sections represents a challenging task. A new description of 238U neutron-induced reactions in the fast energy range is in progress within the framework of an IAEA evaluation consortium. The Nuclear Data group of Cadarache participates in this effort using the 238U spectral-indices measurements and Post-Irradiation Experiments (PIE) carried out in the fast reactors MASURCA (CEA Cadarache) and PHENIX (CEA Marcoule). Such a collection of experimental results provides reliable integral information on the (n,γ) and (n,f) cross sections. This paper presents the Integral Data Assimilation (IDA) technique of the CONRAD code, used to propagate the uncertainties of the integral data onto the 238U cross sections of interest for dosimetry applications.

  12. Ground observations and remote sensing data for integrated modelisation of water budget in the Merguellil catchment, Tunisia

    NASA Astrophysics Data System (ADS)

    Mougenot, Bernard

    2016-04-01

    The Mediterranean region is affected by water scarcity. Some countries, such as Tunisia, have reached the limit of 550 m³/year/capita due to overexploitation of low water resources for irrigation, domestic uses and industry. Many programs aim to evaluate strategies to improve water consumption at the regional level. In central Tunisia, on the Merguellil catchment, we develop integrated water-resource models based on social investigations, ground observations and remote sensing data. The main objective is to close the water budget at the regional level and to estimate irrigation and water pumping, in order to test scenarios with end-users. Our work benefits from French, bilateral and European projects (ANR, MISTRALS/SICMed, FP6, FP7…, GMES/GEOLAND-ESA) and also from network projects such as JECAM and AERONET, for which the Merguellil site is a reference. The site has specific characteristics, associating irrigated and rainfed crops mixing cereals, market gardening and orchards, and will be proposed as a new environmental observing system connected to the OMERE, TENSIFT and OSR systems in Tunisia, Morocco and France respectively. We show here an original and large set of ground and remote sensing data, mainly acquired from 2008 to the present, to be used for calibration/validation of water-budget processes and integrated models for the present situation and for scenarios: - Ground data: meteorological stations; water budget at the local scale (flux towers, soil fluxes, soil and surface temperature, soil moisture, drainage, flow, water level in lakes, aquifer); vegetation parameters on selected fields each month (LAI, height, biomass, yield); land cover three times per year; bare-soil roughness; irrigation and pumping estimates; soil texture. - Remote sensing data: products from multi-platform (MODIS, SPOT, LANDSAT, ASTER, PLEIADES, ASAR, COSMO-SkyMed, TerraSAR X…), multi-wavelength (solar, micro-wave and thermal) and multi-resolution (0.5 m to 1 km) sensors. 
Ground observations are used (1) to calibrate soil-vegetation-atmosphere models at the field scale on different compartments, for irrigated and rainfed land, during a limited time (seasons or a set of dry and wet years), and (2) to calibrate and validate, in particular, evapotranspiration derived from multi-wavelength satellite data at the watershed level in relation to the aquifer conditions: pumping and recharge rate. We will point out some examples.

  13. Prediction du profil de durete de l'acier AISI 4340 traite thermiquement au laser

    NASA Astrophysics Data System (ADS)

    Maamri, Ilyes

    Surface heat treatments are processes designed to give the core and the surface of mechanical parts different properties. They improve wear and fatigue resistance by hardening critical surface zones through short, localized thermal input. Among the processes distinguished by their surface power density, laser surface heat treatment offers fast, localized and precise thermal cycles while limiting the risk of undesirable distortion. The mechanical properties of the hardened zone obtained by this process depend on the physico-chemical properties of the material to be treated and on several process parameters. To exploit the resources of this process adequately, strategies must be developed to control and adjust the parameters so as to produce the desired characteristics of the hardened surface precisely, without resorting to the classic long and costly trial-and-error process. The objective of this project is therefore to develop models to predict the hardness profile in the laser heat treatment of AISI 4340 steel parts. To understand the behavior of the process and evaluate the effects of the various parameters on treatment quality, a sensitivity study was conducted, based on a structured experimental design combined with proven statistical analysis techniques. The results of this study identified the most relevant variables to use for modelling. Following this analysis, and in order to build a first model, two modelling techniques were considered: multiple regression and neural networks. Both techniques led to models of acceptable quality, with an accuracy of about 90%. 
To improve the performance of the neural network models, two new approaches based on the geometric characterization of the hardness profile were considered. Unlike the first models, which predict the hardness profile from the process parameters alone, the new models combine these parameters with geometric attributes of the hardness profile that reflect the treatment quality. The resulting models show that this strategy leads to very promising results.
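    The multiple-regression branch of such a study can be sketched as an ordinary least-squares fit of a response to process variables. The variables, the linear law and the noise level below are hypothetical placeholders, not the thesis's experimental data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical process variables: laser power (W) and scanning speed (mm/s).
power = rng.uniform(500, 2000, 80)
speed = rng.uniform(5, 50, 80)
# Synthetic hardened-depth response: a known linear law plus measurement noise.
depth = 0.4 + 0.0008 * power - 0.01 * speed + rng.normal(0, 0.02, 80)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones_like(power), power, speed])
coef, *_ = np.linalg.lstsq(X, depth, rcond=None)
pred = X @ coef
r2 = 1 - ((depth - pred) ** 2).sum() / ((depth - depth.mean()) ** 2).sum()
```

    On real data, the same design matrix would simply hold the variables retained by the sensitivity study, and the R² plays the role of the reported model accuracy.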

  14. Codigestion of solid wastes: a review of its uses and perspectives including modeling.

    PubMed

    Mata-Alvarez, Joan; Dosta, Joan; Macé, Sandra; Astals, Sergi

    2011-06-01

    The last two years have witnessed a dramatic increase in the number of papers published on the subject of codigestion, highlighting the relevance of this topic within anaerobic digestion research. Consequently, it seems appropriate to undertake a review of codigestion practices starting from the late 1970s, when the first papers related to this concept were published, and continuing to the present day, demonstrating the exponential growth in the interest shown in this approach in recent years. Following a general analysis of the situation, state-of-the-art codigestion is described, focusing on the two most important areas as regards publication: codigestion involving sewage sludge and the organic fraction of municipal solid waste (including a review of the secondary advantages for wastewater treatment plants related to biological nutrient removal), and codigestion in the agricultural sector, including agricultural and farm wastes and energy crops. Within these areas, a large number of oversized digesters exist which can be used to codigest other substrates, resulting in economic and environmental advantages. Although the situation may be changing, there is still a need for good examples on an industrial scale, particularly with regard to wastewater treatment plants, in order to extend this beneficial practice. In the last section, a detailed analysis of papers addressing the important aspect of modelling is included; this analysis covers the first codigestion models to be developed as well as recent applications of the standardised anaerobic digestion model ADM1 to codigestion. (This review includes studies ranging from laboratory to industrial scale.)

  15. Impact of upper-level fine-scale structures in the deepening of a Mediterranean "hurricane"

    NASA Astrophysics Data System (ADS)

    Claud, C.; Chaboureau, J.-P.; Argence, S.; Lambert, D.; Richard, E.; Gauthier, N.; Funatsu, B.; Arbogast, P.; Maynard, K.; Hauchecorne, A.

    2009-09-01

    Subsynoptic-scale vortices that have been likened to tropical cyclones or polar lows (Medicanes) are occasionally observed over the Mediterranean Sea. They are usually associated with strong winds and heavy precipitation and can thus have highly destructive effects in densely populated regions; only precise forecasting of such systems can mitigate these effects. In this study, the role of an approaching upper-level Potential Vorticity (PV) maximum in the vicinity of a Medicane, which appeared early in the morning of 26 September 2006 over the Ionian Sea and moved north-eastwards affecting Apulia, is evaluated using the anelastic non-hydrostatic model Méso-NH initialized with forecasts from ARPEGE, the French operational forecasting system. To this end, in a first step, high-resolution PV fields were determined using a semi-Lagrangian advection model, MIMOSA (Modelisation Isentrope du transport Meso-echelle de l'Ozone Stratospherique par Advection). MIMOSA PV fields at and around 320 K for 25 September 2006 at 1800 UTC clearly show a stratospheric intrusion in the form of a filament crossing the UK, western Europe and the Tyrrhenian Sea. The MIMOSA fields show a number of details that do not appear in the ECMWF analysed PV fields, in particular an area of high PV values just west of Italy over the Tyrrhenian Sea. While the overall structure of the filament is well described by the ARPEGE analysis, the high PV values in the Tyrrhenian Sea close to the coast of Italy are missing. To take these differences into account, the ARPEGE upper-level fields were corrected through a PV inversion guided by the MIMOSA fields. The modifications of PV in ARPEGE lead to a deeper system and improved rain fields (both in location and intensity) when evaluated against ground-based observations. In a second step, Méso-NH simulations coupled with corrected and non-corrected ARPEGE forecasts were performed. 
The impact of the corrections on the intensity, the trajectory and the associated precipitation was evaluated using in situ and satellite observations, in the latter case through a model-to-satellite approach. When the PV corrections are applied, the track of the simulated Medicane is closer to the observed one; the deepening of the low is also better reproduced, even if over-estimated (982 hPa instead of 986 hPa), as is the precipitation. This study confirms the role of fine-scale upper-level structures in the short-range forecasting of sub-synoptic vortices over the Mediterranean Sea. It also suggests that ensemble prediction models should include perturbations related to upper-level coherent structures.

  16. Modelisation agregee de chauffe-eau electriques commandes par champ moyen pour la gestion des charges dans un reseau

    NASA Astrophysics Data System (ADS)

    Losseau, Romain

    The ongoing energy transition is about to entail important changes in the way we use and manage energy. In this view, smart grids are expected to play a significant part through the use of intelligent storage techniques. Initiated in 2014, the SmartDesc project follows this trend to create an innovative load management program exploiting the thermal storage of the electric water heaters already present in residential households. The device control algorithms rely on the recent theory of mean field games to achieve a decentralized control of the water heater temperatures that produces an aggregate optimal trajectory, designed to smooth the electric demand of a neighborhood. Currently, this theory does not include the power and temperature constraints imposed by the tank heating system or required for the user's safety and comfort; a trajectory violating these constraints would not be feasible and would not deliver the forecast load smoothing. This master's thesis presents a method to detect the non-feasibility of a target trajectory, based on the Kolmogorov equations associated with the controlled electric water heaters, and suggests a way to correct it so as to make it achievable under constraints. First, a model of the water heaters under temperature constraints, based on partial differential equations, is presented. Subsequently, a numerical scheme is developed to simulate it and applied to the mean field control. The results of the mean field control with and without constraints are compared, and non-feasibilities of the target trajectory are highlighted when violations occur. The last part of the thesis is dedicated to an accelerated version of the mean field algorithm and to a method for correcting the target trajectory so as to enlarge the set of achievable profiles as much as possible.
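    The temperature and comfort constraints at stake can be illustrated with a toy single-tank model: first-order standby losses toward ambient plus an on/off heating element kept inside a comfort band by hysteresis. This is a simulation sketch only, far simpler than the thesis's PDE model, and every parameter value is an illustrative assumption, not a SmartDesc figure.

```python
def simulate_tank(hours=24.0, dt=1.0 / 60, t_min=55.0, t_max=65.0):
    """Toy electric water heater: cooling toward ambient plus on/off heating,
    with the comfort band [t_min, t_max] enforced by hysteresis."""
    T, T_amb = 60.0, 20.0   # initial tank and ambient temperatures (degC)
    loss = 0.02             # standby-loss rate toward ambient (1/h)
    heat = 8.0              # heating rate when the element is on (degC/h)
    on = False
    trace = []
    t = 0.0
    while t < hours:
        if T <= t_min:
            on = True       # too cold for comfort: switch the element on
        if T >= t_max:
            on = False      # hot enough: switch off
        T += dt * (-loss * (T - T_amb) + (heat if on else 0.0))
        trace.append(T)
        t += dt
    return trace

trace = simulate_tank()
```

    A mean-field target trajectory is infeasible precisely when no admissible switching policy of this kind can keep every tank inside its band while tracking the aggregate demand profile.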

  17. Modelisation numerique de l'hydrologie pour l'aide a la gestion des bassins versants, par l'utilisation conjointe des systemes d'information geographique et de la methode des elements finis un nouvel outil pour le developpement durable SAGESS

    NASA Astrophysics Data System (ADS)

    Bel Hadj Kacem, Mohamed Salah

    All hydrological processes are affected by the spatial variability of the physical parameters of the watershed, and also by human intervention on the landscape. The water outflow from a watershed depends strictly on the spatial and temporal variability of these physical parameters. It is now apparent that the integration of mathematical models into GIS can benefit both GIS and three-dimensional environmental models: a true modelling capability can help the modelling community bridge the gap between planners, scientists, decision-makers and end-users. The main goal of this research is to design a practical tool that simulates surface run-off by combining Geographic Information Systems with a simulation of the hydrological behavior by the Finite Element Method.
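    The Finite Element Method component can be illustrated on the simplest possible case: a one-dimensional steady diffusion problem, -u'' = f on (0,1) with u(0) = u(1) = 0, assembled with linear elements. This is a generic teaching sketch of FEM assembly, not the thesis's watershed model.

```python
import numpy as np

def fem_poisson_1d(n, f=1.0):
    """Linear-element FEM for -u'' = f on (0,1), u(0) = u(1) = 0."""
    h = 1.0 / n
    nodes = n + 1
    K = np.zeros((nodes, nodes))
    b = np.zeros(nodes)
    for e in range(n):                         # assemble element by element
        i, j = e, e + 1
        ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h   # element stiffness
        K[np.ix_([i, j], [i, j])] += ke
        b[[i, j]] += f * h / 2.0               # element load vector
    K[0, :] = K[-1, :] = 0.0                   # enforce Dirichlet conditions
    K[0, 0] = K[-1, -1] = 1.0
    b[0] = b[-1] = 0.0
    return np.linalg.solve(K, b)

u = fem_poisson_1d(16)
# Exact solution is u(x) = x(1 - x)/2, so the midpoint value is 1/8.
```

    A runoff model follows the same assembly pattern in two dimensions, with element matrices built on the triangles of a terrain mesh derived from the GIS layers.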

  18. Realisation et Applications D'un Laser a Fibre a L'erbium Monofrequence

    NASA Astrophysics Data System (ADS)

    Larose, Robert

    The incorporation of rare-earth ions into the glass matrix of an optical fiber has enabled the emergence of all-fiber amplifying components. The goal of this thesis is, on the one hand, to analyze and model such a device and, on the other hand, to fabricate and characterize a fiber amplifier and a fiber oscillator. Using a custom-made, heavily erbium-doped fiber, a tunable fiber laser is built that operates both in a longitudinally multimode regime with a linewidth of 1.5 GHz and as a single-frequency source with a linewidth of 70 kHz. The laser is then used to characterize a Bragg grating written by photosensitivity in an optical fiber. The tuning technique also allows locking to the bottom of an acetylene resonance; the laser then holds the central position of the line with an error of less than 1 MHz, thereby correcting the mechanical drifts of the cavity.

  19. Modelisation des emissions de particules microniques et nanometriques en usinage

    NASA Astrophysics Data System (ADS)

    Khettabi, Riad

    Machining parts emits particles of microscopic and nanometric sizes that can be hazardous to health. The purpose of this work is to study these particle emissions with a view to prevention and reduction at the source. The approach adopted is both experimental and theoretical, at the microscopic and macroscopic scales. The work begins with tests to determine the influence of the material, the tool and the machining parameters on particle emissions. A new parameter characterizing the emissions, named Dust unit, is then developed and a predictive model is proposed. This model is based on a new hybrid theory that integrates energetic, tribological and plastic-deformation approaches, and includes the tool geometry, the material properties, the cutting conditions and chip segmentation. It was validated in turning on four materials: Al6061-T6, AISI 1018, AISI 4140 and grey cast iron.

  20. Regard epistemique sur une evolution conceptuelle en physique au secondaire

    NASA Astrophysics Data System (ADS)

    Potvin, Patrice

    The thesis, in continuity with Legendre's (1993) work, deals with the qualitative understanding of physics notions at the secondary level. It attempts to identify and label, in the verbalizations of 12- to 16-year-old students, the tendencies that guide their cognitive itineraries through the exploration of problem-situations. The working hypotheses concern modelling, conceptions and p-prims. The latter are seen, in diSessa's epistemological perspective, as a type of habit that influences the determination of links between the parameters of a problem; in other words, they coordinate logically and mathematically. The methodology is based on explicitation interviews, a type of interview that elicits verbalizations involving an "intuitive sense" of mechanics. Twenty students were invited to share their evocations as they explored the logic of a computerized microworld. This microworld was programmed with the "Interactive Physics(TM)" software and consists of five different situations involving speed, acceleration, mass, force and inertia, presented to the students from the least to the most complex. An analysis of the students' verbalizations shows the existence of elements that play a role in modelling and in the qualitative construction of comprehension, as well as in its qualitative/quantitative articulation. The results indicate the presence of easily discernible coordinative habits. P-prims appear to play an important part in the construction of models and in the determination of links between the variables of a problem. The analysis of the results shows that conceptions are not such important pieces of comprehension; as such, they seem phenotypic. The analysis also reveals the difficulty of properly understanding the inverse relation (1/x) and its asymptotic nature. 
The p-prim analysis further establishes the possibility of analyzing not only efficient and inefficient intuitions, but also the cognitive itineraries of students working to construct the logic of the movement of a "ball" as a whole. The implications of the thesis lie, among others, at the practical level: it becomes possible to imagine sequences of learning and teaching physics that are based on the consideration of p-prims, despite the implicit nature of these objects. This is a truly constructivist practice that establishes bridges between novice and expert knowledge, because there are p-prims in both. As such, the thesis adopts a perspective of learning inscribed in "continuity", and it proposes fertile theoretical ground for the comprehension of physics.

  1. Algorithmes de couplage RANS et ecoulement potentiel

    NASA Astrophysics Data System (ADS)

    Gallay, Sylvain

    In the aircraft development process, the chosen solution must satisfy numerous criteria in many domains, such as structures, aerodynamics, stability and control, performance and safety, while respecting tight schedules and minimizing costs. Candidate geometries are numerous during the early product-definition and preliminary-design stages, and multidisciplinary optimization environments are being developed across the aeronautical industry. Different methods, involving different levels of modelling, are needed for the different phases of project development. During the preliminary definition and design phases, fast methods are required in order to study the candidates efficiently. Developing methods that improve the accuracy of existing ones while keeping a low computational cost provides a higher level of fidelity in the early phases of a project and thus greatly reduces the associated risks. In aerodynamics, the development of viscous/inviscid coupling algorithms upgrades linear inviscid calculation methods into non-linear methods that account for viscous effects. These methods make it possible to characterize the viscous flow over configurations and to predict, among other things, stall mechanisms or the position of shock waves on the lifting surfaces. This thesis focuses on the coupling between a three-dimensional potential-flow method and two-dimensional viscous section data. The existing methods are implemented and their limits identified; an original method is then developed and validated. 
Results on an elliptic wing demonstrate the capability of the algorithm at high angles of attack and in the post-stall region. The coupling algorithm was compared with higher-fidelity data on configurations from the literature. A fuselage model based on empirical relations and RANS simulations was tested and validated. The lift, drag and pitching-moment coefficients, as well as the pressure coefficients extracted along the span, showed good agreement with wind-tunnel data and RANS models for transonic configurations. A high-lift configuration allowed the modelling of high-lift surfaces in the potential-flow method to be studied, showing that the camber can be taken into account in the viscous data alone.

  2. Modelling a radiology department service using a VDL integrated approach.

    PubMed

    Guglielmino, Maria Gabriella; Celano, Giovanni; Costa, Antonio; Fichera, Sergio

    2009-01-01

    The healthcare industry is facing several challenges, such as cost reduction and quality improvement of the services provided. Engineering studies can be very useful in supporting organizational and management processes. Healthcare service efficiency depends on a strong collaboration between clinical and engineering experts, especially when analyzing the system and its constraints in detail and, subsequently, when deciding on the reengineering of some key activities. The purpose of this paper is to propose a case study showing how a mix of representation tools allows the manager of a radiology department to solve human- and technological-resource reorganization issues arising from the introduction of a new technology and a new portfolio of services. In order to simulate the activities within the radiology department and examine the relationship between human and technological resources, different visual diagrammatic language (VDL) techniques have been implemented to capture the heterogeneous factors related to healthcare service delivery. In particular, flow charts, IDEF0 diagrams and Petri nets have been successfully integrated with one another as modelling tools. The simulation study performed through the application of these VDL techniques suggests re-organizing the nurse activities within the radiology department. The re-organization of a healthcare service, and in particular of a radiology department, by means of joint flow charts, IDEF0 diagrams and Petri nets is a poorly investigated topic in the literature. This paper demonstrates how flow charts and IDEF0 can help people working within the department understand the weak points of their organization, and how they constitute an efficient base of knowledge for the implementation of a Petri net aimed at improving departmental performance.

  3. Étude de la réponse photoacoustique d'objets massifs en 3D

    NASA Astrophysics Data System (ADS)

    Séverac, H.; Mousseigne, M.; Franceschi, J. L.

    1996-11-01

In sectors where reliability is of capital importance, such as microelectronics or materials physics, it is particularly useful to access information on a material's behaviour without resorting to a destructive test such as chemical analysis or other mechanical tests. The non-destructive testing method presented here is based on the generation of waves by a laser beam focused on the surface of a sample, without reaching the ablation regime. Studying the propagation of the various waves in three-dimensional space provides quantitative information on the response of the materials under test. Modeling of the thermoelastic phenomena allowed a rigorous analytic approach and gave rise to simulation software, written in Turbo Pascal, for more general studies.

  4. Adsorption de gaz sur les materiaux microporeux modelisation, thermodynamique et applications

    NASA Astrophysics Data System (ADS)

    Richard, Marc-Andre

    2009-12-01

Our work on gas adsorption in microporous materials is part of research aimed at increasing the efficiency of on-board hydrogen storage for vehicles. Our objective was to study the possibility of using adsorption to improve the efficiency of small-scale hydrogen liquefaction systems. We also evaluated the performance of a cryogenic hydrogen storage system based on physisorption. Since we are dealing with particularly wide temperature ranges and high pressures in the supercritical region of the gas, we first had to work on the modeling and thermodynamics of adsorption. Representing the quantity of adsorbed gas as a function of temperature and pressure with a semi-empirical model is a useful tool for determining the mass of gas adsorbed in a system, but also for calculating the thermal effects associated with adsorption. We adapted the Dubinin-Astakhov (D-A) model to fit adsorption isotherms of hydrogen, nitrogen and methane on activated carbon at high pressure and over a wide range of supercritical temperatures, assuming an invariant adsorption volume. With five regression parameters (including the adsorption volume Va), the model we developed represents experimental adsorption isotherms on activated carbon very well for hydrogen (30 to 293 K, up to 6 MPa), nitrogen (93 to 298 K, up to 6 MPa) and methane (243 to 333 K, up to 9 MPa). We calculated the internal energy of the adsorbed phase from the model using solution thermodynamics, without neglecting the adsorption volume. We then presented the mass and energy conservation equations for an adsorption system and validated our approach by comparing simulations with adsorption and desorption tests.
In addition to the internal energy, we evaluated the entropy, the differential energy of adsorption and the isosteric heat of adsorption. We studied the performance of an adsorption-based hydrogen storage system for vehicles. The hydrogen storage capacity and thermal performance of a 150 L tank containing Maxsorb MSC-30(TM) activated carbon (specific surface area ~3000 m2/g) were studied over a temperature range of 60 to 298 K and at pressures up to 35 MPa. The system was considered globally, without focusing on a particular design. It is possible to store 5 kg of hydrogen at pressures of 7.8, 15.2 and 29 MPa at temperatures of 80, 114 and 172 K respectively, when the residual hydrogen is recovered at 2.5 bar by heating. Simulation of the thermal phenomena allowed us to analyze the cooling required during filling, the heating required during discharge, and the dormancy time. We developed a hydrogen liquefaction cycle based on adsorption with mechanical compression (ACM) and evaluated its feasibility. The objective was to significantly increase the efficiency of small-scale hydrogen liquefaction systems (less than one tonne/day) without increasing their capital cost. We adapted the ACM refrigeration cycle so that it could later be added to a hydrogen liquefaction cycle, and then simulated idealized ACM refrigeration cycles. Even under these ideal conditions, the specific refrigeration is low; moreover, the maximum theoretical efficiency of these refrigeration cycles is about 20 to 30% of the ideal. We experimentally realized an ACM refrigeration cycle with the nitrogen/activated carbon pair. (Abstract shortened by UMI.)
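The Dubinin-Astakhov form discussed above can be sketched as follows. This is a minimal illustration of the classic D-A isotherm with a temperature-dependent characteristic energy written as alpha + beta*T; the parameter values used below are invented for illustration and are not the thesis's fitted values.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def da_adsorption(P, T, n_max, alpha, beta, P0, m=2.0):
    """Sketch of a Dubinin-Astakhov isotherm: adsorbed amount (mol/kg)
    as a function of pressure P (Pa) and temperature T (K).

    n_max : limiting adsorbed amount (mol/kg)
    alpha, beta : characteristic energy written as alpha + beta*T (J/mol)
    P0 : pseudo-saturation pressure (Pa), treated as a fit parameter
    m : structural heterogeneity exponent
    All numbers passed in below are illustrative, not fitted values.
    """
    if P <= 0 or P >= P0:
        raise ValueError("model valid for 0 < P < pseudo-saturation P0")
    A = R * T * math.log(P0 / P)  # adsorption potential (J/mol)
    return n_max * math.exp(-((A / (alpha + beta * T)) ** m))

# Illustrative comparison: the same pressure at a cryogenic and an ambient
# temperature -- the adsorbed amount should fall sharply as T rises.
n_77K = da_adsorption(1e5, 77.0, n_max=30.0, alpha=3000.0, beta=18.0, P0=1e8)
n_293K = da_adsorption(1e5, 293.0, n_max=30.0, alpha=3000.0, beta=18.0, P0=1e8)
```

Fitting n_max, alpha, beta, P0 and the adsorption volume against measured isotherms is what gives the five-parameter regression described in the abstract.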

  5. Modelisation spatio-temporelle de la vulnerabilite du milieu a la degradation des sols en milieu semi-aride a partir de donnees radar

    NASA Astrophysics Data System (ADS)

    Sylla, Daouda

Defined as a process that reduces the productive potential of soils or the usefulness of natural resources, soil degradation is a major environmental problem: it affects over 41% of the land, and over 80% of the people affected by this phenomenon live in developing countries. The general objective of the present project is to characterise different types of land use and land cover and to detect their spatio-temporal changes from radar data (ERS-1, RADARSAT-1 and ENVISAT), for spatio-temporal modeling of environmental vulnerability to soil degradation in a semi-arid area. Because of the high sensitivity of the radar signal to the observing conditions of the sensor and the target, the radar images were first partitioned by angular configuration (23° and [33°-35°-47°]) and by environmental condition (wet and dry). A good characterisation and temporal tracking of the four types of land use and land cover of interest are obtained, with different levels of contrast depending on the incidence angles and environmental conditions. In addition to the pixel-based change-detection approaches (image differencing, principal component analysis), land-cover monitoring based on an object-oriented approach, focused on two types of land cover, is developed. The method allows detailed mapping of bare-soil occurrences as a function of environmental conditions. Finally, combining different sources of information, environmental vulnerability to soil degradation is modeled for south-west Niger using the Dempster-Shafer evidential fusion rule. The resulting decision maps are statistically acceptable at 93% and 91%, with Kappa values of 86% and 84%, for dry and wet conditions respectively. They are then used to produce a global map of environmental vulnerability to soil degradation in this semi-arid area.
Keywords: environmental vulnerability to soil degradation; data fusion; radar images; land use changes; semi-arid environment; south-west Niger.
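The Dempster-Shafer fusion rule used above combines mass functions from independent sources and renormalizes away their conflict. The sketch below uses a two-hypothesis frame ("vulnerable" vs "stable") with invented masses standing in for radar-derived evidence sources; the study's actual frame and masses are not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over the same frame of discernment."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict  # renormalization constant
    return {h: v / k for h, v in combined.items()}

V = frozenset({"vulnerable"})
S = frozenset({"stable"})
theta = V | S  # ignorance (mass on the whole frame)

m_radar = {V: 0.6, S: 0.1, theta: 0.3}  # hypothetical source 1
m_soil = {V: 0.5, S: 0.2, theta: 0.3}   # hypothetical source 2
fused = dempster_combine(m_radar, m_soil)
```

Two concordant sources reinforce each other: the fused belief in "vulnerable" exceeds either input mass, which is why the fused decision maps can outperform any single data layer.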

  6. Caracterisation et modelisation de la degradation des proprietes fonctionnelles des AMF soumis a un chargement cyclique

    NASA Astrophysics Data System (ADS)

    Paradis, Alexandre

The principal objective of this thesis is to elaborate a computational model describing the mechanical properties of NiTi under different loading conditions. A secondary objective is to build an experimental database of NiTi under stress, strain and temperature in order to validate the versatility of the new model proposed here. The simulation model currently used at the Laboratoire sur les alliages à mémoire et les systèmes intelligents (LAMSI) of ETS behaves well under quasi-static loading; under dynamic loading, however, the same model does not allow degradation to be included. The goal of the present thesis is to build a model capable of describing such degradation with reasonable accuracy. Experimental tests and results are presented; in particular, new results on the behaviour of NiTi when cycling is paused are given in chapter 2. A model based on Likhachev's micromechanical model is developed in chapter 3, and good agreement with experimental data is found. Finally, an adaptation of the model is presented in chapter 4, allowing it to be eventually implemented in commercial finite-element software.

  7. Towards self-consistent plasma modelisation in presence of neoclassical tearing mode and sawteeth: effects on transport coefficients

    NASA Astrophysics Data System (ADS)

    Basiuk, V.; Huynh, P.; Merle, A.; Nowak, S.; Sauter, O.; Contributors, JET; the EUROfusion-IM Team

    2017-12-01

Neoclassical tearing modes (NTMs) increase the effective radial heat and particle transport inside the plasma, leading to a flattening of the electron and ion temperature and density profiles at a location set by the rational surface of the safety factor q (Hegna and Callen 1997 Phys. Plasmas 4 2940). In a burning plasma such as ITER, this NTM-induced transport could significantly reduce fusion performance and even lead to a disruption. Validating models of NTM-induced transport against present experiments is thus important for quantifying this effect on future devices. In this work, we apply an NTM model to an integrated simulation of current, heat and particle transport in JET discharges using the European Transport Simulator. In this model, the radial heat and particle transport coefficients are modified by a Gaussian function locally centered at the NTM position, with a full width proportional to the island size through a constant parameter adjusted to best reproduce the experimental profiles. In the simulation, the NTM model is turned on at the time the mode is triggered in the experiment. The island evolution itself is determined by the modified Rutherford equation, using self-consistent plasma parameters from the transport evolution. The simulation reproduces the experimental measurements within the error bars, before and during the NTM. A small discrepancy is observed in the radial location of the island, due to a shift of the computed q = 3/2 surface with respect to the experimental one. To explain this small shift (up to about 12% with respect to the position inferred from the experimental electron temperature profiles), sensitivity studies of the NTM location as a function of the initialization parameters are presented. These first results validate both the transport model and the transport modification calculated by the NTM model.
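The Gaussian modification of the transport coefficients described above can be sketched as follows. The functional form (a localized bump added to the background diffusivity, with width tied to the island size) follows the abstract; the amplitude and width constants are illustrative free parameters, not the fitted values.

```python
import math

def chi_with_ntm(rho, chi0, rho_ntm, island_width, c_amp=2.0, c_w=1.0):
    """Effective heat diffusivity with an NTM-induced Gaussian enhancement.

    rho : normalized radial coordinate
    chi0 : background diffusivity (m^2/s)
    rho_ntm : radial position of the island (q = 3/2 surface)
    island_width : island full width (same units as rho)
    c_amp, c_w : tunable constants (illustrative values, not fitted ones)
    """
    fwhm = c_w * island_width
    sigma = fwhm / 2.355  # convert full width at half maximum to std dev
    return chi0 * (1.0 + c_amp * math.exp(-((rho - rho_ntm) ** 2) / (2.0 * sigma ** 2)))
```

Far from the island the coefficient reduces to the background value, so the profile flattening stays localized around the rational surface, as in the model described above.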

  8. Fate of Organic Matters in a Soil Erosion Context: Qualitative and Quantitative Monitoring in a Karst Hydrosystem

    NASA Astrophysics Data System (ADS)

    Quiers, M.; Gateuille, D.; Perrette, Y.; Naffrechoux, E.; David, B.; Malet, E.

    2017-12-01

Soils are a key compartment of hydrosystems, especially in karst aquifers, which are characterized by fast hydrologic responses to rainfall. In the steady state, soils are efficient filters protecting karst water from pollution, but agricultural or forestry land use can weaken or even reverse this role, so that soils act as pollution sources rather than filters. In order to manage water quality together with human activities in karst environments, new tools and procedures designed to monitor the fate of soil organic matter are needed. This study reports two complementary methods applied in a mountain karst system affected by anthropogenic activities and environmental stresses. Continuous monitoring of water fluorescence, coupled with point sampling and analyzed by chemometric methods, allowed us to discriminate the type of organic matter transferred through the karst system over the year (winter/summer) and across hydrological stages. As a main result, the modeled organic carbon flux is dominated by a colloidal or particulate fraction during high waters and by a dissolved fraction during low waters, demonstrating a change of organic carbon source. To confirm this result, a second method was used, based on the observation of polycyclic aromatic hydrocarbon (PAH) profiles. Two previous studies (Perrette et al. 2013, Schwarz et al. 2011) reached opposite conclusions about the fate of PAHs from soil to groundwater; this opposition suggests using PAH profiles (low-molecular-weight, less hydrophobic compounds versus high-molecular-weight, more hydrophobic ones) as an indicator of soil erosion. We validate this use by analyzing these PAH profiles for low and high waters (floods). These results demonstrate the high vulnerability of karst systems to soil erosion, and propose a new proxy to record soil erosion in groundwater and in natural archives such as stalagmites or sediments.
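The PAH-profile proxy discussed above amounts to comparing light, more soluble compounds against heavy, particle-bound ones. The sketch below computes such a low/high molecular-weight ratio; the compound grouping and all concentrations are invented for illustration and are not data from the study.

```python
# Hypothetical compound groups (illustrative, not the study's exact lists).
LMW = {"naphthalene", "fluorene", "phenanthrene"}   # low molecular weight
HMW = {"pyrene", "chrysene", "benzo[a]pyrene"}      # high molecular weight

def lmw_hmw_ratio(conc):
    """Ratio of low- to high-molecular-weight PAH concentrations.

    conc: mapping compound name -> concentration (e.g. ng/L).
    """
    low = sum(v for c, v in conc.items() if c in LMW)
    high = sum(v for c, v in conc.items() if c in HMW)
    if high == 0:
        raise ValueError("no HMW PAHs quantified")
    return low / high

# Invented samples: heavy, hydrophobic PAHs surge during a flood as
# soil particles are eroded into the aquifer.
base_flow = {"naphthalene": 12.0, "phenanthrene": 8.0, "pyrene": 2.0, "chrysene": 1.0}
flood = {"naphthalene": 10.0, "phenanthrene": 9.0, "pyrene": 14.0, "benzo[a]pyrene": 6.0}
```

A drop in the ratio during floods would flag particle-bound PAHs eroded from soils, which is the erosion signal the proxy is built on.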

  9. Modelisation de la diffusion sur les surfaces metalliques: De l'adatome aux processus de croissance

    NASA Astrophysics Data System (ADS)

    Boisvert, Ghyslain

This thesis is devoted to the study of surface diffusion processes, with the ultimate goal of understanding, and modeling, the growth of a thin film. Mastering growth is of prime importance given its role in the miniaturization of electronic circuits. We study here the surfaces of the noble metals and of the late transition metals. We first consider the diffusion of a single adatom on a metallic surface. Among other results, we demonstrated the appearance of a correlation between successive events when the temperature is comparable to the diffusion barrier, i.e., diffusion can no longer be treated as a random walk. We propose a simple phenomenological model that reproduces the simulation results well. These calculations also allowed us to show that diffusion obeys the Meyer-Neldel law, which states that, for an activated process, the prefactor increases exponentially with the barrier. In addition, this work clarifies the physical origin of this law. Comparing dynamical with static results, we find that the barrier extracted from the dynamical calculations is essentially the same as that obtained from a much simpler static approach. The barrier can therefore be obtained with more precise, i.e., ab initio, methods such as density functional theory, which are unfortunately also much more computationally demanding. This is what we did for several metallic systems, and the results of this latter approach compare very well with experiment. We examined the Pt(111) surface at greater length. This surface abounds in interesting peculiarities, such as the non-hexagonal equilibrium shape of islands and two different adsorption sites for the adatom.
Moreover, previous ab initio calculations failed to confirm the equilibrium shape and greatly overestimated the barrier. Our calculations, more complete and in a formalism better suited to this kind of problem, correctly predict the equilibrium shape, which is in fact due to a different relief of the surface stress at the two types of steps forming the island edges. Our value for the barrier is also strongly reduced when the forces on the surface atoms are relaxed, bringing the theoretical result much closer to the experimental value. Our calculations for copper show that the diffusion of small islands during growth cannot be neglected in this case, calling into question the interpretation of the experimental measurements. (Abstract shortened by UMI.)
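The Meyer-Neldel compensation law discussed above can be illustrated numerically: the hop rate follows an Arrhenius form whose prefactor itself grows exponentially with the barrier. The attempt frequency nu00 and the isokinetic temperature T_MN below are illustrative values, not results from the thesis.

```python
import math

kB = 8.617e-5  # Boltzmann constant, eV/K

def hop_rate(E_b, T, nu00=1e12, T_MN=2000.0):
    """Adatom hop rate (1/s) with a Meyer-Neldel prefactor.

    E_b : diffusion barrier (eV)
    T : temperature (K)
    nu00, T_MN : illustrative base attempt frequency and isokinetic
                 temperature -- hypothetical values, not fitted ones.
    """
    nu0 = nu00 * math.exp(E_b / (kB * T_MN))  # prefactor grows with barrier
    return nu0 * math.exp(-E_b / (kB * T))    # Arrhenius activation
```

Below the isokinetic temperature T_MN a higher barrier still means slower hopping; the compensation only partially offsets the Boltzmann factor, which is why barriers extracted from static calculations remain meaningful.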

  10. Modelisation de la Propagation des Ondes Sonores dans un Environnement Naturel Complexe

    NASA Astrophysics Data System (ADS)

    L'Esperance, Andre

This work is devoted to outdoor sound propagation in a complex natural environment, i.e. in the presence of real conditions of wind, temperature gradient and atmospheric turbulence. More specifically, it has two objectives: first, to develop a heuristic sound propagation model (MHP) that accounts for the full set of meteorological and acoustic phenomena influencing outdoor sound propagation; second, to identify under which circumstances, and to what extent, meteorological conditions affect sound propagation. The work is divided into five parts. After a brief introduction stating the motivations of this study (chapter 1), chapter 2 reviews previous work in the field of outdoor sound propagation. This chapter also presents the foundations of geometrical acoustics on which the acoustic part of the heuristic propagation model was built, and describes how refraction and atmospheric turbulence can be handled within ray theory. Chapter 3 presents the heuristic propagation model (MHP) developed in this work. The first section of this chapter describes the acoustic propagation model, which assumes a linear sound-speed gradient and is based on a hybrid solution combining geometrical acoustics and residue theory. The second section of chapter 3 deals more specifically with the modeling of the meteorological aspects and the determination of the sound-speed profiles and fluctuation indices associated with the meteorological conditions. Section 3 describes how the resulting sound-speed profiles are linearized for the calculations in the acoustic model, and finally section 4 gives the general trends obtained with the model.
Chapter 4 describes the measurement campaigns carried out at Rock-Spring (Pennsylvania, United States) during the summer of 1990 and at Bouin (Vendée, France) during the autumn of 1991. The Rock-Spring campaign focused more specifically on the effects of refraction, while the Bouin campaign focused more specifically on the effects of turbulence. Section 4.1 describes the equipment and the processing of the meteorological data in each case, and section 4.2 does the same for the acoustic results. Finally, chapter 5 compares the experimental results with those given by the MHP, for both the meteorological and the acoustic results. Comparisons with another model (the Fast Field Program) are also presented.
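The linear sound-speed gradient assumed by the acoustic model above has a convenient geometric consequence: rays become circular arcs. The sketch below encodes that textbook relation; the profile coefficients are illustrative, not values from the campaigns.

```python
def sound_speed(z, c0=340.0, a=-0.1):
    """Linear effective sound-speed profile c(z) = c0 + a*z (m/s).

    a < 0 gives an upward-refracting atmosphere (shadow zones),
    a > 0 a downward-refracting one. Values are illustrative.
    """
    return c0 + a * z

def ray_curvature_radius(cos_theta0, c0=340.0, a=-0.1):
    """Radius of the circular ray arc in a linear gradient,
    R = c0 / (|a| * cos(theta0)), with theta0 the launch elevation angle;
    the sign of a sets whether the arc bends up or down."""
    if a == 0:
        return float("inf")  # homogeneous medium: straight rays
    return c0 / (abs(a) * cos_theta0)
```

With this closed form, ray tracing reduces to intersecting circular arcs with the ground and receiver heights, which is what makes the linearized-profile approach fast enough for a heuristic model.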

  11. La conception, la modelisation et la simulation du systeme VSC-HVDC offshore

    NASA Astrophysics Data System (ADS)

    Benhalima, Seghir

Wind energy is recognized worldwide as a proven technology to meet growing demands for green, sustainable energy. Several research projects have addressed how to exploit this stochastic energy source alongside conventional sources without degrading the performance of existing electrical grids. Offshore, wind energy has great potential, and its share of world production is set to increase. Optimal extraction of this energy source requires connection to the grid through a voltage source converter acting as an interface. To minimise losses when transporting energy over very long distances, High Voltage Direct Current transmission based on Voltage Source Converters (VSC-HVDC) is used. To this end, a new topology is designed, with a new control algorithm based on controlling the power generated by the wind farm, regulating the DC voltage, and synchronizing the wind farm with the NPC-based VSC-HVDC link. The proposed topology and its control technique are validated in MATLAB/Simulink. The results are promising: the THD is less than 5% and the power factor is close to one.

  12. MONET: multidimensional radiative cloud scene model

    NASA Astrophysics Data System (ADS)

    Chervet, Patrick

    1999-12-01

All cloud fields exhibit variable structure (bulges) and heterogeneous water distributions. With the development of multidimensional radiative models by the atmospheric community, it is now possible to describe horizontal heterogeneities of the cloud medium and to study their influence on radiative quantities. We have developed a complete radiative cloud scene generator, called MONET (French acronym for MOdelisation des Nuages En Tridim.), to compute radiative cloud scenes from visible to infrared wavelengths for various viewing and solar conditions, different spatial scales, and various locations on the Earth. MONET is composed of two parts: a cloud medium generator (CSSM, the Cloud Scene Simulation Model) developed by the Air Force Research Laboratory, and a multidimensional radiative code (SHDOM, the Spherical Harmonic Discrete Ordinate Method) developed at the University of Colorado by Evans. MONET computes images for scenarios defined by user inputs: date, location, viewing angles, wavelength, spatial resolution, and meteorological conditions (atmospheric profiles, cloud types). For the same cloud scene, we can output different viewing conditions and/or various wavelengths. Shadowing effects on clouds and the ground are taken into account. This code is useful for studying heterogeneity effects on satellite data for various cloud types and spatial resolutions, and for determining the specifications of new imaging sensors.

  13. Etude vibroacoustique d'un systeme coque-plancher-cavite avec application a un fuselage simplifie

    NASA Astrophysics Data System (ADS)

    Missaoui, Jemai

The objective of this work is to develop semi-analytical models to study the structural, acoustic and vibro-acoustic behaviour of a shell-floor-cavity system. The connection between the shell and the floor is enforced using the concept of artificial stiffness. This flexible modeling concept eases the choice of the expansion functions describing the motion of each substructure. The results of this study provide insight into the basic physical phenomena encountered in an aircraft structure. An integro-modal approach is developed to compute the acoustic modal characteristics. It discretizes the irregular cavity into acoustic sub-cavities whose expansion bases are known a priori. This physically motivated approach has the advantage of being both efficient and accurate, and its validity has been demonstrated against results available in the literature. A vibro-acoustic model is then developed to analyze and understand the structural and acoustic effects of the floor in this configuration. The validity of the results, in terms of resonances and transfer functions, is verified against experimental measurements carried out in the laboratory.

  14. Multi-scale properties of large eddy simulations: correlations between resolved-scale velocity-field increments and subgrid-scale quantities

    NASA Astrophysics Data System (ADS)

    Linkmann, Moritz; Buzzicotti, Michele; Biferale, Luca

    2018-06-01

We provide analytical and numerical results concerning multi-scale correlations between the resolved velocity field and the subgrid-scale (SGS) stress tensor in large eddy simulations (LES). Following previous studies of the Navier-Stokes equations, we derive the exact hierarchy of LES equations governing the spatio-temporal evolution of velocity structure functions of any order. The aim is to assess the influence of the subgrid model on inertial-range intermittency. We provide a series of predictions, within multifractal theory, for the scaling of correlations involving the SGS stress, and we compare them against numerical results from high-resolution Smagorinsky LES and from a priori filtered data generated from direct numerical simulations (DNS). We find that the LES data generally agree very well with the filtered DNS results and with the multifractal prediction for all leading terms in the balance equations. Discrepancies are measured for some of the sub-leading terms involving cross-correlations between resolved velocity increments and the SGS tensor or the SGS energy transfer, suggesting that there is room to improve SGS modeling so as to further extend the inertial-range properties at any fixed LES resolution.
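The Smagorinsky closure referenced above models the SGS stress through an eddy viscosity built from the resolved strain rate. The sketch below evaluates it at a single point; the filter width and Smagorinsky constant are illustrative values.

```python
import numpy as np

def smagorinsky_nu_t(grad_u, delta, c_s=0.17):
    """Eddy viscosity of the standard Smagorinsky SGS model:
    nu_t = (c_s * delta)^2 * |S|, with |S| = sqrt(2 S_ij S_ij)
    and S_ij the resolved rate-of-strain tensor.

    grad_u : 3x3 resolved velocity-gradient tensor at a point
    delta : filter width (m); c_s, delta values here are illustrative
    """
    S = 0.5 * (grad_u + grad_u.T)          # symmetric strain-rate part
    S_norm = np.sqrt(2.0 * np.sum(S * S))  # |S| = sqrt(2 S_ij S_ij)
    return (c_s * delta) ** 2 * S_norm

# Pure shear du/dy = 1 s^-1: |S| = 1, so nu_t = (c_s * delta)^2.
grad_shear = np.array([[0.0, 1.0, 0.0],
                       [0.0, 0.0, 0.0],
                       [0.0, 0.0, 0.0]])
nu_t = smagorinsky_nu_t(grad_shear, delta=0.1)
```

The SGS stress deviator is then tau_ij = -2 nu_t S_ij; correlating it with resolved velocity increments across scales is exactly the kind of statistic the hierarchy above constrains.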

  15. Modelisation de l'erosion et des sources de pollution dans le bassin versant Iroquois/Blanchette dans un contexte de changements climatiques

    NASA Astrophysics Data System (ADS)

    Coulibaly, Issa

As the main source of drinking water for the municipality of Edmundston, the Iroquois/Blanchette watershed is of critical importance to the city, hence the constant efforts to preserve its water quality. Several studies have been carried out there; the most recent identified pollution threats of various origins, including those associated with climate change (e.g. Maaref 2012). Given the climate-change impacts projected for New Brunswick, the Iroquois/Blanchette watershed could be strongly affected, in several ways. Several impact scenarios are conceivable, notably risks of flooding, erosion and pollution through increased precipitation and runoff. In the face of these potential threats, the objective of this study is to evaluate the potential impacts of climate change on erosion and pollution risks at the scale of the Iroquois/Blanchette watershed. To this end, the Canadian version of the Revised Universal Soil Loss Equation, RUSLE-CAN, and the hydrological model SWAT (Soil and Water Assessment Tool) were used to model erosion and pollution risks in the study area. The data used in this work come from various sources (remote sensing, pedological, topographic, meteorological, etc.). The simulations were carried out in two distinct steps: first under current conditions, with 2013 chosen as the reference year, then for 2025 and 2050. The results show an upward trend in sediment production over the coming years.
The maximum annual production increases by 8.34% and 8.08% in 2025 and 2050 respectively under our most optimistic scenario, and by 29.99% in 2025 and 29.72% in 2050 under the most pessimistic scenario, relative to 2013. As for pollution, the observed concentrations (sediment, nitrate and phosphorus) evolve with climate change. The maximum sediment concentration decreases in 2025 and 2050 relative to 2013: from 11.20 mg/L in 2013, it falls to 9.03 mg/L in 2025 and then to 6.25 mg/L in 2050. The maximum nitrate concentration is also expected to decrease over the years, more markedly in 2025: from 4.12 mg/L in 2013, it falls to 1.85 mg/L in 2025 and then rises to 2.90 mg/L in 2050. The phosphorus concentration, on the other hand, increases in the coming years relative to 2013, rising from 0.056 mg/L in 2013 to 0.234 mg/L in 2025 and then falling to 0.144 mg/L in 2050.
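RUSLE, on which the RUSLE-CAN tool used above is based, is a simple multiplicative model, which makes the mechanics of a climate scenario easy to see: a wetter future raises the rainfall erosivity factor R while the soil, slope and cover factors stay fixed. All factor values below are illustrative, not those of the Iroquois/Blanchette watershed.

```python
def rusle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A = R * K * LS * C * P (t/ha/yr).

    R : rainfall-runoff erosivity factor
    K : soil erodibility factor
    LS : slope length and steepness factor
    C : cover-management factor
    P : support-practice factor
    """
    return R * K * LS * C * P

# Hypothetical scenario: +30 % rainfall erosivity by 2050, other factors fixed.
A_baseline = rusle_soil_loss(R=1000.0, K=0.03, LS=1.2, C=0.2, P=1.0)
A_future = rusle_soil_loss(R=1300.0, K=0.03, LS=1.2, C=0.2, P=1.0)
```

Because the model is a pure product, soil loss scales linearly with R, so the relative increase in A equals the assumed relative increase in erosivity.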

  16. Modelisation 0D/1D des emissions de particules de suie dans les turbines a gaz aeronautiques

    NASA Astrophysics Data System (ADS)

    Bisson, Jeremie

Because of more stringent regulations on aircraft particle emissions, as well as strong uncertainties about their formation and their effects on the atmosphere, a better understanding of particle microphysical mechanisms and their interactions with the engine components is required. This thesis focuses on the development of a 0D/1D combustion model with soot production in an aeronautical gas turbine. A major objective of this study is to assess the quality of soot particle emission predictions for different flight configurations. The model should eventually allow parametric studies on current or future engines with minimal computation time. It represents the combustor, the turbines and the nozzle with a chemical reactor network (CRN) coupled with detailed combustion chemistry for kerosene (Jet A-1) and a soot particle dynamics model using the method of moments. The CRN was applied to the CFM56-2C1 engine during the flight configurations of the LTO (Landing-Take-Off) cycle, as in the APEX-1 study on aircraft particle emissions. The model was first validated against gas turbine thermodynamic data and pollutant concentrations (H2O, COx, NOx, SOx) measured in that study, and then used to compute mass- and number-based emission indices of the soot particle population as well as the average diameter. Overall, the model is representative of the thermodynamic conditions and succeeds in predicting the emissions of major pollutants, particularly at high power. Concerning soot particulate emissions, the model's ability to predict the emission indices and the mean diameter simultaneously has been only partially validated: the mass emission indices remain higher than the experimental results, particularly at high power.
These differences in the particulate emission index may result from uncertainties in the thermodynamic parameters of the CRN and in the air mass-flow distribution in the combustion chamber. The analysis of the number-based emission index profile along the CRN also highlights the need to revise the nucleation model used and to consider, in the future, the implementation of a particle aggregation mechanism.

  17. Applying SF-Based Genre Approaches to English Writing Class

    ERIC Educational Resources Information Center

    Wu, Yan; Dong, Hailin

    2009-01-01

By exploring genre approaches in systemic functional linguistics and examining the analytic tools that can be applied to the process of English learning and teaching, this paper seeks a way of applying genre approaches to the English writing class.

  18. Utilization of the multiple scattering model for the conception of heterogeneous materials with specific electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Vignéras-Lefebvre, V.; Miane, J. L.; Parneix, J. P.

    1993-03-01

A model of the behaviour of heterogeneous structures composed of spherical particles and subjected to microwave fields, based on multiple scattering, is presented. First, we set out the principle of scattering by a single sphere, using Mie's equations written in transfer-matrix form. This matrix, generalized to a distribution of particles, captures the average behaviour of the material. Within the Rayleigh approximation, the effective permittivity thus obtained is compared with experimental results.
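In the Rayleigh (quasi-static) limit against which the multiple-scattering result above is compared, a standard benchmark for spherical inclusions in a host matrix is the Maxwell Garnett mixing rule. The sketch below implements that classical formula as a point of comparison; it is not claimed to be the paper's own expression, and the permittivity values are illustrative.

```python
def maxwell_garnett(eps_m, eps_i, f):
    """Maxwell Garnett effective permittivity for spherical inclusions.

    eps_m : permittivity of the host matrix
    eps_i : permittivity of the inclusions
    f : inclusion volume fraction (0 <= f < 1, dilute regime)

    eps_eff = eps_m * (1 + 3 f b / (1 - f b)),
    with b = (eps_i - eps_m) / (eps_i + 2 eps_m).
    """
    b = (eps_i - eps_m) / (eps_i + 2.0 * eps_m)
    return eps_m * (1.0 + 3.0 * f * b / (1.0 - f * b))

# Illustrative composite: 20 % volume fraction of eps = 10 spheres in an
# eps = 2 matrix.
eps_eff = maxwell_garnett(eps_m=2.0, eps_i=10.0, f=0.2)
```

The effective value always lies between the matrix and inclusion permittivities, which is the sanity check usually applied before comparing with full multiple-scattering or experimental results.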

  19. Modelisations et inversions tri-dimensionnelles en prospections gravimetrique et electrique

    NASA Astrophysics Data System (ADS)

    Boulanger, Olivier

The aim of this thesis is the application of gravity and resistivity methods to mining prospecting. The objectives of the present study are: (1) to build a fast gravity inversion method to interpret surface data; (2) to develop a tool for modelling the electrical potential acquired at the surface and in boreholes when the resistivity distribution is heterogeneous; and (3) to define and implement a stochastic inversion scheme allowing the estimation of subsurface resistivity from electrical data. The first technique is the development of a three-dimensional (3D) inversion program that interprets gravity data using a selection of constraints such as minimum distance, flatness, smoothness, and compactness. These constraints are integrated into a Lagrangian formulation. A multi-grid technique is also implemented to resolve long and short gravity wavelengths separately. The subsurface in the survey area is divided into juxtaposed rectangular prismatic blocks, and the problem is solved by calculating the model parameters, i.e. the density of each block. Weights are assigned to each block depending on depth, a priori information on density, and the density range allowed for the region under investigation. The code is tested on synthetic data, and the advantages and behaviour of each method are compared in the 3D reconstruction. Recovery of the geometry (depth, size) and density distribution of the original model depends on the set of constraints used; for multiple bodies, the best combination tested appears to be flatness together with minimum volume. The inversion method is also tested on real gravity data. The second tool developed in this thesis is a three-dimensional electrical resistivity modelling code to interpret surface and subsurface data.
Based on the integral equation, it calculates the charge density caused by conductivity gradients at each interface of the mesh, allowing an exact estimation of the potential. The modelling generates a very large matrix of Green's functions, which is stored using pyramidal compression. The third method consists in interpreting electrical potential measurements with a non-linear geostatistical approach that includes new constraints. This method estimates an analytical covariance model for the resistivity parameters from the potential data. (Abstract shortened by UMI.)
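The constrained inversion over prismatic blocks can be sketched, in a heavily simplified form, as a depth-weighted, regularized linear least-squares problem. The sensitivity matrix, weighting, and data below are synthetic stand-ins, not the thesis's actual Lagrangian multi-constraint formulation:

```python
import numpy as np

# Heavily simplified sketch of depth-weighted, regularized linear gravity
# inversion over prismatic blocks. G, the weights, and the data are synthetic
# stand-ins for the thesis's multi-constraint Lagrangian formulation.
rng = np.random.default_rng(0)
n_data, n_blocks = 20, 50
G = rng.random((n_data, n_blocks))               # stand-in sensitivity matrix
rho_true = np.zeros(n_blocks); rho_true[20:25] = 1.0
d = G @ rho_true                                 # synthetic surface gravity data

depths = np.linspace(1.0, 10.0, n_blocks)
W = np.diag(depths ** -1.5)                      # deep cells penalized less, to
                                                 # counteract kernel depth decay
lam = 1e-2                                       # regularization (constraint) weight

# Solve min ||G rho - d||^2 + lam ||W rho||^2 via the normal equations
rho = np.linalg.solve(G.T @ G + lam * (W.T @ W), G.T @ d)
print(np.linalg.norm(G @ rho - d) / np.linalg.norm(d))  # relative data misfit
```

Swapping the quadratic penalty for compactness or minimum-volume terms, as in the thesis, makes the problem nonlinear and is typically handled with iteratively reweighted solves.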

  20. Engaging Students in Applied Electromagnetics at the University of San Diego

    ERIC Educational Resources Information Center

    Lumori, M. L. D.; Kim, E. M.

    2010-01-01

    Two possible topical approaches that have been applied to teaching an upper-division undergraduate electrical engineering applied electromagnetics course are presented. Each approach was applied to one of two offerings of the course, taught in different semesters. In either case, the course includes the study of electromagnetic theory and…

  1. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    ERIC Educational Resources Information Center

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  2. On the connection between financial processes with stochastic volatility and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Queirós, S. M. D.; Tsallis, C.

    2005-11-01

The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by a time-dependent and correlated variance, or volatility. Besides a memory parameter, b (present in ARCH), and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c = 0, the standard ARCH process is recovered. In this manuscript we use a generalised noise following a distribution characterised by an index q_n, such that q_n = 1 recovers the Gaussian distribution. Matching low-order statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained by maximising the entropy S_q = (1 - Σ_i p_i^q)/(q - 1), the basis of nonextensive statistical mechanics, we obtain a single analytical connection between q and (b, c, q_n) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution of the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on S_q, we also analyse the degree of dependence between successive returns, z_t and z_{t+1}, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, q_op. Our analysis points to the existence of a unique relation between the three entropic indexes q_op, q and q_n of the problem, independent of the value of (b, c).
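For reference, a minimal GARCH(1,1) simulation with Gaussian noise (the q_n = 1 case) can be written as follows; parameter values are illustrative, not taken from the paper:

```python
import numpy as np

# Simulate a GARCH(1,1) process: sigma_t^2 = a0 + b*z_{t-1}^2 + c*sigma_{t-1}^2.
# Setting c = 0 recovers ARCH(1). Parameter values are illustrative.
def simulate_garch(n, a0=0.1, b=0.2, c=0.7, seed=0):
    rng = np.random.default_rng(seed)
    z = np.zeros(n)
    var = a0 / (1 - b - c)            # start at the stationary variance
    for t in range(n):
        z[t] = np.sqrt(var) * rng.standard_normal()
        var = a0 + b * z[t] ** 2 + c * var
    return z

returns = simulate_garch(100_000)
print(returns.var())                   # ≈ a0 / (1 - b - c) = 1.0
```

The resulting series shows volatility clustering and a sample kurtosis above the Gaussian value of 3, the fat-tail behaviour that motivates the q-Gaussian matching described above.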

  3. Fluctuations Magnetiques des Gaz D'electrons Bidimensionnels: Application AU Compose Supraconducteur LANTHANE(2-X) Strontium(x) Cuivre OXYGENE(4)

    NASA Astrophysics Data System (ADS)

    Benard, Pierre

We present a study of the magnetic fluctuations of the normal phase of the superconducting copper oxide La_{2-x}Sr_{x}CuO_4. The compound is modelled by the two-dimensional Hubbard Hamiltonian with a second-neighbour hopping term (the tt'U model). The model is studied within the GRPA (Generalized Random Phase Approximation), including the renormalization of the Hubbard interaction by Brueckner-Kanamori diagrams. In the approach presented in this work, the maxima of the magnetic structure factor observed in neutron scattering experiments are associated with the lattice 2k_F anomalies of the structure factor of non-interacting two-dimensional electron gases. These anomalies arise from scattering between particles located at points of the Fermi surface where the Fermi velocities are tangent, and lead to divergences whose nature depends on the geometry of the Fermi surface in the vicinity of these points. These results are then applied to the tt'U model, of which the usual tU Hubbard model is a special case. In most cases, the interactions do not determine the position of the maxima of the structure factor. The role of the interaction is to enhance the intensity of the structures of the magnetic structure factor associated with the magnetic instability of the system. These structures are often already present in the imaginary part of the non-interacting susceptibility. The intensity ratio between the absolute maxima and the other structures of the magnetic structure factor makes it possible to determine the ratio U_rn/U_c, which measures the proximity of a magnetic instability. The phase diagram is then studied in order to delimit the range of validity of the approximation.
After a discussion of the collective modes and of the effect of a non-zero imaginary part of the self-energy, the origin of the energy scale of the magnetic fluctuations is examined. It is then shown that the three-band model predicts the same positions for the structures of the magnetic structure factor as the one-band model, in the limit where the hybridization of the oxygen orbitals of the Cu-O_2 planes and the second-neighbour hopping amplitude vanish. It is further found that the effect of the hybridization of the oxygen orbitals is well modelled by the second-neighbour hopping term. Even though they correctly describe the qualitative behaviour of the maxima of the magnetic structure factor, the three-band and one-band models do not yield positions of these structures consistent with experimental measurements if the band is assumed rigid, that is, if the parameters of the Hamiltonian are independent of the strontium concentration. This may be caused by a dependence of the Hamiltonian parameters on the strontium concentration. Finally, the results are compared with neutron scattering experiments and with other theories, in particular those of Littlewood et al. (1993) and Q. Si et al. (1993). The comparison with the experimental results for the lanthanum compound suggests that the Fermi liquid has a disjoint Fermi surface and lies close to an incommensurate magnetic instability.
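The GRPA enhancement of the non-interacting susceptibility takes a standard form; in conventional notation (an assumption, since the thesis's exact conventions are not reproduced here), the interacting susceptibility and the tt' dispersion read:

```latex
\chi_{\mathrm{GRPA}}(\mathbf{q},\omega)
  = \frac{\chi_0(\mathbf{q},\omega)}{1 - U_{\mathrm{rn}}\,\chi_0(\mathbf{q},\omega)},
\qquad
\varepsilon_{\mathbf{k}} = -2t\,(\cos k_x + \cos k_y) - 4t'\cos k_x \cos k_y .
```

A magnetic instability occurs when U_rn χ_0(q, 0) → 1, consistent with the ratio U_rn/U_c used above as a measure of proximity to the instability.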

  4. Les mousses adaptatives pour l'amelioration de l'absorption acoustique: Modelisation, mise en oeuvre, mecanismes de controle

    NASA Astrophysics Data System (ADS)

    Leroy, Pierre

The objective of this thesis is to conduct a thorough numerical and experimental analysis of the smart foam concept, in order to highlight the physical mechanisms and the technological limitations for the control of acoustic absorption. A smart foam is made of an absorbing material with an embedded actuator able to compensate for the lack of effectiveness of this material at low frequencies (<500 Hz). In this study, the absorbing material is a melamine foam and the actuator is a piezoelectric PVDF film. A 3D finite element model coupling poroelastic, acoustic, elastic and piezoelectric fields is proposed. The model uses quadratic volume and surface elements and the improved (u,p) formulation, and an orthotropic porous element is proposed. The power balance in the porous media is established. This model is a powerful, general tool for modelling all hybrid configurations combining poroelastic and piezoelectric fields. Three smart foam prototypes were built with the aim of validating the numerical model and setting up experimental active control. The comparison of numerical calculations and experimental measurements shows the validity of the model for passive aspects, transducer behaviour, and control configurations. Active control of acoustic absorption is carried out at normal incidence under the plane-wave assumption in the frequency range 0-1500 Hz. The minimization criterion is the reflected pressure measured by a unidirectional microphone. Three control cases were tested: offline control with a sum of pure tones, and adaptive control with the nFX-LMS algorithm for a pure tone and for random broadband noise. The results show that a pressure of 1 Pa can be absorbed at 100 Hz with 100 V, and a 94 dB broadband noise from 250 Hz upwards with about a hundred Vrms. These results were obtained with a mean foam thickness of 4 cm. The control ability of the prototypes is directly connected to the acoustic flow.
An important limitation for broadband control comes from the high distortion level through the system at low and high frequencies (<500 Hz, >1500 Hz). The numerical model, supplemented by an analytical study, made it possible to clarify the mode of action and the dissipation mechanisms in smart foams. The PVDF moves with the same phase and amplitude as the residual incident pressure that is not dissipated in the foam. Viscous dissipation is therefore very weak at low frequencies and becomes more important at high frequencies. The wave that is not dissipated in the porous material is transmitted by the PVDF into the back cavity. The outlook of this study is, on the one hand, the improvement of the model and the prototypes and, on the other hand, the extension of the research to the control of acoustic transmission and the acoustic radiation of surfaces. The model could be improved by integrating viscoelastic elements able to account for the behaviour of the adhesive layer between the PVDF and the foam. A model of electro-elastomer materials should also be implemented in the code; this new type of actuator could make it possible to overcome the displacement limitations of PVDF. Finally, with industrial integration in mind, it would be interesting to seek configurations that maximize acoustic absorption while limiting the transmission and radiation of surfaces.
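The adaptive control described above belongs to the filtered-x LMS family. The sketch below is a generic single-channel FxLMS loop with toy primary and secondary paths; all filter coefficients are hypothetical, and the thesis's normalized nFX-LMS variant is not reproduced:

```python
import numpy as np

# Generic single-channel filtered-x LMS (FxLMS) sketch. The primary path,
# secondary path, and step size are toy stand-ins; the thesis uses a
# normalized variant (nFX-LMS) with measured acoustic paths.
rng = np.random.default_rng(1)
n, taps, mu = 20_000, 16, 0.01
p_path = np.array([0.0, 0.0, 0.8, -0.4, 0.2])  # assumed primary path (noise -> error mic)
s_path = np.array([0.0, 0.5, 0.3])             # assumed secondary path (actuator -> error mic)

x = rng.standard_normal(n)                     # reference noise signal
d = np.convolve(x, p_path)[:n]                 # disturbance at the error microphone
x_f = np.convolve(x, s_path)[:n]               # reference filtered by secondary-path model

w = np.zeros(taps)                             # adaptive controller coefficients
xbuf = np.zeros(taps); fbuf = np.zeros(taps); ybuf = np.zeros(len(s_path))
e_hist = np.zeros(n)
for i in range(n):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[i]
    fbuf = np.roll(fbuf, 1); fbuf[0] = x_f[i]
    y = w @ xbuf                               # anti-noise drive signal
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = d[i] - s_path @ ybuf                   # residual at the error microphone
    w += mu * e * fbuf                         # FxLMS gradient-descent update
    e_hist[i] = e
print(e_hist[:1000].var(), e_hist[-1000:].var())  # residual variance should drop
```

In the thesis's setup the error signal is the reflected pressure measured by a unidirectional microphone; here it is simulated through the assumed paths.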

  5. Developpement D'un Modele Climatique Regional: Fizr Simulation des Conditions de Janvier de la Cote Ouest Nord Americaine

    NASA Astrophysics Data System (ADS)

    Goyette, Stephane

    1995-11-01

This thesis concerns regional climate numerical modelling. Its main objective is to develop a regional climate model capable of simulating phenomena at the spatial mesoscale. Our study area is the West Coast of North America, chosen for the complexity of its relief and the control it exerts on climate. The motivations for this study are twofold: on the one hand, the coarse spatial resolution of atmospheric general circulation models (GCMs) cannot, in practice, be increased without prohibitive computational cost; on the other hand, environmental management increasingly demands regional climate data at finer spatial resolution. Until now, GCMs have been the models most valued for their ability to simulate climate and global climate change. However, fine-scale climate phenomena still escape GCMs because of their coarse spatial resolution. Moreover, the socio-economic repercussions of possible climate change are closely tied to phenomena imperceptible to current GCMs. To circumvent some of the problems inherent in resolution, a practical approach is to take a limited spatial domain of a GCM and nest within it another numerical model with a high-resolution grid. This nesting process then implies a new numerical simulation, guided over the restricted domain by information supplied by the GCM and forced by mechanisms handled solely by the nested model.
Thus, to refine the spatial precision of large-scale climate predictions, we develop here a numerical model called FIZR, which provides regional climate information valid at fine spatial scale. This new class of nested "intelligent" model-interpolators belongs to the family of so-called "driven" models. The guiding hypothesis of our study is that fine-scale climate is often governed by surface forcings rather than by large-scale atmospheric transport. The technique we propose therefore drives FIZR with the sampled dynamics of a GCM and forces it with the GCM physics as well as a mesoscale orographic forcing, at each node of the fine computational grid. To validate the robustness and accuracy of our regional climate model, we chose the West Coast region of the North American continent, notably characterized by a geographic distribution of precipitation and temperature strongly influenced by the underlying relief. The results of a one-month (January) simulation with FIZR show that we can simulate precipitation and screen-level temperature fields much closer to climate observations than those simulated by a GCM. This performance is clearly attributable to the mesoscale orographic forcing as well as to surface characteristics determined at fine scale. A model similar to FIZR can, in principle, be implemented on any GCM, so any research organization involved in global large-scale numerical modelling could adopt such a regionalization tool.
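FIZR's premise, that fine-scale climate is strongly shaped by surface forcing, can be illustrated by the simplest possible orographic correction: adjusting a coarse GCM temperature to fine-grid elevations with a fixed lapse rate. Everything below is a synthetic placeholder, not FIZR's actual formulation:

```python
import numpy as np

# Illustrative surface-forcing downscaling in the spirit of FIZR: correct a
# coarse-grid temperature to fine-grid elevations with a fixed lapse rate.
# All fields and values are synthetic placeholders.
LAPSE = 6.5e-3  # K per metre, standard-atmosphere lapse rate

def downscale_temperature(t_coarse, z_coarse, z_fine):
    """Adjust a coarse-cell temperature to fine-grid elevations."""
    return t_coarse - LAPSE * (z_fine - z_coarse)

z_fine = np.array([200.0, 800.0, 1500.0])   # fine-grid elevations (m)
t_fine = downscale_temperature(t_coarse=280.0, z_coarse=500.0, z_fine=z_fine)
print(t_fine)  # warmer below the coarse-cell mean elevation, colder above
```

A real driven model adds, at minimum, orographic precipitation enhancement and fine-scale surface properties, as the abstract describes.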

  6. Evaluating Perry's Structured Approach for Professional Doctorate Theses

    ERIC Educational Resources Information Center

    Charles, Michael; Farr-Wharton, Ben; von der Heidt, Tania; Sheldon, Neroli

    2017-01-01

    Purpose: The purpose of this paper is to investigate examiner reactions to doctorate of business administration (DBA) theses at an Australian university applying Perry's structured approach to thesis presentation, which had its origin in the marketing discipline, but is now widely applied to other business disciplines. Design/methodology/approach:…

  7. Applying Bayesian Item Selection Approaches to Adaptive Tests Using Polytomous Items

    ERIC Educational Resources Information Center

    Penfield, Randall D.

    2006-01-01

    This study applied the maximum expected information (MEI) and the maximum posterior-weighted information (MPI) approaches of computer adaptive testing item selection to the case of a test using polytomous items following the partial credit model. The MEI and MPI approaches are described. A simulation study compared the efficiency of ability…

  8. A novel family of DG methods for diffusion problems

    NASA Astrophysics Data System (ADS)

    Johnson, Philip; Johnsen, Eric

    2017-11-01

We describe and demonstrate a novel family of numerical schemes for handling elliptic/parabolic PDE behavior within the discontinuous Galerkin (DG) framework. Starting from the mixed-form approach commonly applied for handling diffusion (examples include Local DG and BR2), the new schemes apply the Recovery concept of Van Leer to handle cell interface terms. By applying recovery within the mixed-form approach, we have designed multiple schemes that show better accuracy than other mixed-form approaches while being more flexible and easier to implement than the Recovery DG schemes of Van Leer. While typical mixed-form approaches converge at rate 2p in the cell-average or functional error norms (where p is the order of the solution polynomial), many of our approaches achieve order 2p + 2 convergence. In this talk, we will describe multiple schemes, including both compact and non-compact implementations; the compact approaches use only interface-connected neighbors to form the residual for each element, while the non-compact approaches add one extra layer to the stencil. In addition to testing the schemes on purely parabolic PDE problems, we apply them to handle the diffusive flux terms in advection-diffusion systems, such as the compressible Navier-Stokes equations.
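The mixed-form starting point mentioned above rewrites the second-order diffusion operator as a first-order system; schematically (standard notation, not taken verbatim from the talk):

```latex
\partial_t u = \nabla\!\cdot(\kappa\,\nabla u)
\;\;\Longrightarrow\;\;
\mathbf{q} = \nabla u, \qquad \partial_t u = \nabla\!\cdot(\kappa\,\mathbf{q}).
```

Schemes in this family differ in how they define the interface values of u and q; the approaches above replace those numerical fluxes with values of a locally recovered polynomial, in the spirit of Van Leer's Recovery DG.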

  9. A preliminary study of mechanistic approach in pavement design to accommodate climate change effects

    NASA Astrophysics Data System (ADS)

    Harnaeni, S. R.; Pramesti, F. P.; Budiarto, A.; Setyawan, A.

    2018-03-01

Road damage is caused by several factors, including climate change, overloading, and inappropriate material and construction procedures. Meanwhile, climate change is a phenomenon that cannot be avoided; its observed effects include rising air temperature, sea level rise, changing rainfall patterns, and increased intensity of extreme weather events. Previous studies have shown the impacts of climate change on road damage. Therefore, several measures to anticipate the damage should be considered during planning and construction in order to reduce the cost of road maintenance. Three approaches are generally applied in the design of flexible pavement thickness: the mechanistic approach, the mechanistic-empirical (ME) approach, and the empirical approach. The advantages of the mechanistic and mechanistic-empirical (ME) approaches are their efficiency and reliability in the design of flexible pavement thickness, as well as their capacity to accommodate climate change, compared to the empirical approach. However, the design of flexible pavement thickness in Indonesia generally still applies the empirical approach. This preliminary study aims to emphasize the importance of shifting towards a mechanistic approach in the design of flexible pavement thickness.

  10. Applying the Cultural Formulation Approach to Career Counseling with Latinas/os

    ERIC Educational Resources Information Center

    Flores, Lisa Y.; Ramos, Karina; Kanagui, Marlen

    2010-01-01

    In this article, the authors present two hypothetical cases, one of a Mexican American female college student and one of a Mexican immigrant adult male, and apply a culturally sensitive approach to career assessment and career counseling with each of these clients. Drawing from Leong, Hardin, and Gupta's cultural formulation approach (CFA) to…

  11. Views on Montessori Approach by Teachers Serving at Schools Applying the Montessori Approach

    ERIC Educational Resources Information Center

    Atli, Sibel; Korkmaz, A. Merve; Tastepe, Taskin; Koksal Akyol, Aysel

    2016-01-01

    Problem Statement: Further studies on Montessori teachers are required on the grounds that the Montessori approach, which, having been applied throughout the world, holds an important place in the alternative education field. Yet it is novel for Turkey, and there are only a limited number of studies on Montessori teachers in Turkey. Purpose of…

  12. Applying the Subject "Cell" through Constructivist Approach during Science Lessons and the Teacher's View

    ERIC Educational Resources Information Center

    Dogru, Mustafa; Kalender, Suna

    2007-01-01

In this study, our purpose is to determine how teachers apply the constructivist approach in their classes, classifying the teachers according to the faculty they graduated from, their department, and their years of service. Besides understanding the difference between the effects of the constructivist approach and traditional instruction on students' success…

  14. Applying the Cognitive Information Processing Approach to Career Problem Solving and Decision Making to Women's Career Development.

    ERIC Educational Resources Information Center

    McLennan, Natasha A.; Arthur, Nancy

    1999-01-01

    Outlines an expanded framework of the Cognitive Information Processing (CIP) approach to career problem solving and decision making for career counseling with women. Addresses structural and individual barriers in women's career development and provides practical suggestions for applying and evaluating the CIP approach in career counseling.…

  15. How to anticipate the assessment of the public health benefit of new medicines?

    PubMed

    Massol, Jacques; Puech, Alain; Boissel, Jean-Pierre

    2007-01-01

The Public Health Benefit (PHB) of new medicines is a recent, French-specific criterion (October 1999 decree) that is often only partially documented in transparency files owing to a lack of timely information. At the time of the first reimbursement application for a new medicine to the "Transparency Committee", the file is based exclusively on data from randomised clinical trials. These data are generated by a global clinical development plan designed long before the new medicine's submission for reimbursement, and this plan does not systematically provide the data needed to assess the PHB. One therefore easily understands the difficulty of anticipating and documenting this recent French criterion. In France, the PHB is both a necessary criterion for the reimbursement submission and an indicator for national health policy management. Its assessment also helps to identify the needs and objectives of post-registration studies (nowadays within the remit of the "Drug Economics Committee"). The assessment of the PHB criterion is carried out after the marketing authorization process and complements it. To understand how to anticipate the assessment of a new medicine's PHB, one needs to consider how it differs from the preliminary step of marketing authorization. Whereas the evaluation for marketing authorization seeks to determine whether the new medicine could be useful in a specific indication, the PHB assessment aims at quantifying the therapeutic benefit in a population, taking into account the reference treatments in this population. A new medicine receives a marketing authorization based on the data of the registration file, which provides information on the clinical benefit of the new medicine in the trial populations and in the context of the trials. The PHB, by contrast, looks at the effects of the new medicine at the scale of the general population, in real practice.
The PHB components of a new medicine at first submission are the expected response of the new medicine to a public health need, the expected benefit on the health status of the population and, ultimately, the expected impact on the health care system. The benefit of a new medicine on the health status of a population is based on public health criteria, which can be morbidity-mortality or quality-of-life criteria. However, few registration files contain these public health criteria from the beginning, and the predictive value of the surrogate criteria used in the trials is not always precisely assessed. It is thus difficult to quantify the expected benefit on these public health criteria. Moreover, the data needed to quantify the new medicine's effects according to the various characteristics of the target population are rarely available. Similarly, French population epidemiological data related to the indication of the new medicine are often not available at the time of the assessment. It is therefore difficult to evaluate the expected number of events that could be avoided if the new medicine reached the market. The authors suggest adapting the clinical development plan for better documentation of the PHB. They specifically recommend integrating into the trials' endpoints criteria that are relevant in terms of public health, and ensuring adequate heterogeneity of the trial populations. They also suggest starting early enough to collect reliable national epidemiological data and the elements necessary to assess the transposability of the trial results to the French population (ability to target the patients to be treated, adaptation of the healthcare system...). Regarding the epidemiological data, the authors consider that the needs are covered in various ways depending on the disease.
To meet the needs of evaluating new medicines' target populations in specific indications, they recommend using ad hoc studies as needed. In addition, epidemiological studies designed for marketing purposes with an acceptable methodology should not be systematically rejected, but deserve to be presented. To assess the magnitude of the expected theoretical benefit of a new medicine in a population, the authors underline the necessity of having access to study results with criteria related to this objective. They suggest first defining and listing these criteria by disease. Regarding the representativity of the populations, it appears that it would be advisable, but unrealistic, to include in trials a population 100% representative of the population to be treated. The effect of the new medicine must therefore be modelled (the "effect model") to be evaluated in the general population. Yet to obtain a reliable effect model, the study population must be sufficiently heterogeneous, which legitimates the demand for good population heterogeneity when trial methodology is decided. When the criteria assessed during the development plan do not correspond to the PHB criteria, the only way to evaluate the number of events related to the PHB criterion is, again, to use modelling. However, modelling is only possible when the scientific literature has established a reliable correlation between the two types of criteria. In that case, the model should be applied to a French target population to assess the expected benefit. In conclusion, the possibilities of estimating the expected benefit of a new medicine on the health status of a specific population are currently limited. These limitations are regrettable because such an estimate is feasible without disrupting the development plans.
The authors' general recommendations to update development plans seem especially appropriate, as the additions would benefit not only France but all health authorities wishing to assess the expected benefit of a new medicine on their territories. Anticipating the lack of clinical and epidemiological data, and of the data needed to evaluate the transposability of trial results to real clinical practice, is a sine qua non condition for improving the PHB assessment. These needs should be anticipated early enough by the pharmaceutical companies, which could, to this end, consult with the health authorities and the heads of French public health policy. Finally, because of the PHB's universal dimension, it is suggested that the necessary actions and publications be initiated so that the PHB can be acknowledged at the European level.
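The "effect model" transposition described above can be sketched numerically: apply a trial-estimated relative risk to the baseline risks of a stratified target population and sum the absolute risk reductions. All numbers below are hypothetical, not taken from any actual submission file:

```python
# Illustrative "effect model": transpose a trial-estimated relative risk to a
# target population's baseline risks to count expected avoided events.
# All numbers are hypothetical.

def expected_avoided_events(baseline_risks, relative_risk):
    """Sum of per-patient absolute risk reductions under a constant RR."""
    return sum(r * (1 - relative_risk) for r in baseline_risks)

# Hypothetical target population stratified by baseline event risk
population = [0.02] * 50_000 + [0.10] * 10_000   # low- and high-risk strata
avoided = expected_avoided_events(population, relative_risk=0.8)
print(round(avoided))  # → 400
```

A constant relative risk is the simplest effect model; the abstract's point is that such a model is only trustworthy when the trial population was heterogeneous enough to estimate how the effect varies with baseline risk.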

  16. A one-model approach based on relaxed combinations of inputs for evaluating input congestion in DEA

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, Mohammad

    2009-08-01

This paper provides a one-model approach to input congestion based on the input relaxation model developed in data envelopment analysis (e.g. [G.R. Jahanshahloo, M. Khodabakhshi, Suitable combination of inputs for improving outputs in DEA with determining input congestion -- Considering textile industry of China, Applied Mathematics and Computation (1) (2004) 263-273; G.R. Jahanshahloo, M. Khodabakhshi, Determining assurance interval for non-Archimedean element in improving outputs model in DEA, Applied Mathematics and Computation 151 (2) (2004) 501-506; M. Khodabakhshi, A super-efficiency model based on improved outputs in data envelopment analysis, Applied Mathematics and Computation 184 (2) (2007) 695-703; M. Khodabakhshi, M. Asgharian, An input relaxation measure of efficiency in stochastic data analysis, Applied Mathematical Modelling 33 (2009) 2010-2023]). This approach reduces the three problems to be solved under the two-model approach, introduced in the first of the above references, to two problems, which is certainly important from a computational point of view. The model is applied to a set of data extracted from the ISI database to estimate the input congestion of 12 Canadian business schools.
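For context, the envelopment side of DEA reduces to a small linear program per decision-making unit. The sketch below computes plain input-oriented CCR efficiency scores with scipy on synthetic data; it is not the paper's one-model congestion formulation:

```python
import numpy as np
from scipy.optimize import linprog

# Minimal input-oriented CCR DEA efficiency scores (envelopment form); a
# sketch on synthetic data, not the paper's one-model congestion approach.
X = np.array([[2.0, 4.0, 4.0],    # inputs  (rows) x DMUs (columns)
              [3.0, 1.0, 3.0]])
Y = np.array([[1.0, 1.0, 1.0]])   # outputs (rows) x DMUs (columns)

def ccr_efficiency(o: int) -> float:
    """min theta s.t. X@lam <= theta*X[:,o], Y@lam >= Y[:,o], lam >= 0."""
    m, n = X.shape
    c = np.r_[1.0, np.zeros(n)]                      # minimize theta
    A_in = np.c_[-X[:, o], X]                        # X@lam - theta*x_o <= 0
    A_out = np.c_[np.zeros(Y.shape[0]), -Y]          # -Y@lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print([round(ccr_efficiency(j), 3) for j in range(3)])  # third DMU is inefficient
```

Congestion analyses such as the paper's add input-relaxation variables to this LP; the one-model contribution is to extract the congestion amounts from a single solve rather than a sequence of them.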

  17. Modeling of Complex Mixtures: JP-8 Toxicokinetics

    DTIC Science & Technology

    2008-10-01

Abstract fragments: generic tissue compartments in which we have combined diffusion limitation and deep tissue (global tissue model). We also applied a QSAR approach for… …necessary, to apply to the interaction of specific compounds with specific tissues. We have also applied a QSAR approach for estimating blood and tissue… Subject terms: jet fuel, JP-8, PBPK modeling, complex mixtures, nonane, decane, naphthalene, QSAR, alternative fuels.

  18. Development of a precision multimodal surgical navigation system for lung robotic segmentectomy

    PubMed Central

    Soldea, Valentin; Lachkar, Samy; Rinieri, Philippe; Sarsam, Mathieu; Bottet, Benjamin; Peillon, Christophe

    2018-01-01

    Minimally invasive sublobar anatomical resection is becoming more and more popular to manage early lung lesions. Robotic-assisted thoracic surgery (RATS) is unique in comparison with other minimally invasive techniques. Indeed, RATS is able to better integrate multiple streams of information, including advanced imaging techniques, in an immersive experience at the level of the robotic console. Our aim was to describe three-dimensional (3D) imaging throughout the surgical procedure, from preoperative planning to intraoperative assistance and complementary investigations such as radial endobronchial ultrasound (R-EBUS) and virtual bronchoscopy for pleural dye marking. All cases were performed using the da Vinci System™. Modelisation was provided by Visible Patient™ (Strasbourg, France). Image integration in the operative field was achieved using the TilePro multi-display input of the da Vinci console. Our experience was based on 114 robotic segmentectomies performed between January 2012 and October 2017. The clinical value of 3D imaging integration was evaluated in 2014 in a pilot study. Progressively, we have reached the conclusion that the use of such an anatomic model improves the safety and reliability of procedures. The multimodal system including 3D imaging has been used in more than 40 patients so far and demonstrated perfect operative anatomic accuracy. Currently, we are developing an original virtual reality experience by exploring 3D imaging models at the robotic console level. The act of operating is being transformed and the surgeon now oversees a complex system that improves decision making. PMID:29785294

  19. Development of a precision multimodal surgical navigation system for lung robotic segmentectomy.

    PubMed

    Baste, Jean Marc; Soldea, Valentin; Lachkar, Samy; Rinieri, Philippe; Sarsam, Mathieu; Bottet, Benjamin; Peillon, Christophe

    2018-04-01

    Minimally invasive sublobar anatomical resection is becoming more and more popular to manage early lung lesions. Robotic-assisted thoracic surgery (RATS) is unique in comparison with other minimally invasive techniques. Indeed, RATS is able to better integrate multiple streams of information, including advanced imaging techniques, in an immersive experience at the level of the robotic console. Our aim was to describe three-dimensional (3D) imaging throughout the surgical procedure, from preoperative planning to intraoperative assistance and complementary investigations such as radial endobronchial ultrasound (R-EBUS) and virtual bronchoscopy for pleural dye marking. All cases were performed using the da Vinci System™. Modelisation was provided by Visible Patient™ (Strasbourg, France). Image integration in the operative field was achieved using the TilePro multi-display input of the da Vinci console. Our experience was based on 114 robotic segmentectomies performed between January 2012 and October 2017. The clinical value of 3D imaging integration was evaluated in 2014 in a pilot study. Progressively, we have reached the conclusion that the use of such an anatomic model improves the safety and reliability of procedures. The multimodal system including 3D imaging has been used in more than 40 patients so far and demonstrated perfect operative anatomic accuracy. Currently, we are developing an original virtual reality experience by exploring 3D imaging models at the robotic console level. The act of operating is being transformed and the surgeon now oversees a complex system that improves decision making.

  20. Mechanical behavior and modelisation of Ti-6Al-4V titanium sheet under hot stamping conditions

    NASA Astrophysics Data System (ADS)

    Sirvin, Q.; Velay, V.; Bonnaire, R.; Penazzi, L.

    2017-10-01

    The Ti-6Al-4V titanium alloy is widely used for the manufacture of aeronautical and automotive parts (solid parts). In aeronautics, this alloy is employed for its excellent mechanical behavior associated with low density, outstanding corrosion resistance and good mechanical properties up to 600°C. It is especially used for the manufacture of fuselage frames, on the pylon for the primary structure (machined forged blocks) and for the secondary structure in sheet form. In this last case, sheet metal forming can be done through various methods: at room temperature by drawing, at very high temperature (≃900°C) by superplastic forming (SPF) and at intermediate temperature (≥750°C) by hot forming (HF). In order to reduce production costs and environmental impact, reducing cycle times together with lowering temperature levels is relevant. This study focuses on modelling the behavior of the Ti-6Al-4V alloy at temperatures above room temperature, to obtain greater formability, and below SPF conditions, to reduce tooling and energy costs. The displacement field measurement obtained by Digital Image Correlation (DIC) is based on an innovative surface preparation pattern adapted to high temperature exposure. Different material parameters are identified to define a model able to predict the mechanical behavior of the Ti-6Al-4V alloy under hot stamping conditions. The identified plastic hardening model is introduced in FEM to simulate an omega-shape forming operation.

  1. Applying Mixed Methods Research at the Synthesis Level: An Overview

    ERIC Educational Resources Information Center

    Heyvaert, Mieke; Maes, Bea; Onghena, Patrick

    2011-01-01

    Historically, qualitative and quantitative approaches have been applied relatively separately in synthesizing qualitative and quantitative evidence, respectively, in several research domains. However, mixed methods approaches are becoming increasingly popular nowadays, and practices of combining qualitative and quantitative research components at…

  2. Comparison of different phase retrieval algorithms

    NASA Astrophysics Data System (ADS)

    Kaufmann, Rolf; Plamondon, Mathieu; Hofmann, Jürgen; Neels, Antonia

    2017-09-01

    X-ray phase contrast imaging is attracting more and more interest. Since the phase cannot be measured directly, an indirect method, e.g. using a grating interferometer, has to be applied. This contribution compares three different approaches to calculating the phase from Talbot-Lau interferometer measurements using a phase-stepping approach. Besides the usually applied Fourier coefficient method, a linear fitting technique and a Taylor series expansion method are applied and compared.
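    The Fourier coefficient method mentioned above can be sketched in a few lines: for an N-point phase-stepping curve, the retrieved phase is the argument of the first harmonic of its discrete Fourier transform (a generic sketch with synthetic data, not the authors' implementation):

```python
import numpy as np

def phase_from_stepping(intensity):
    """Fourier coefficient method: the phase of the first FFT
    harmonic of an N-point phase-stepping curve."""
    return np.angle(np.fft.fft(intensity)[1])

# synthetic stepping curve with known phase
N, phi_true = 8, 0.7
k = np.arange(N)
curve = 5.0 + 2.0 * np.cos(2 * np.pi * k / N + phi_true)
print(phase_from_stepping(curve))  # ≈ 0.7
```

The linear fitting and Taylor expansion alternatives compared in the paper fit the same sinusoidal model by other means, trading robustness against noise for computational cost.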

  3. Hamiltonian approach to slip-stacking dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S. Y.; Ng, K. Y.

    Hamiltonian dynamics has been applied to study the slip-stacking dynamics. The canonical-perturbation method is employed to obtain the second-harmonic correction term in the slip-stacking Hamiltonian. The Hamiltonian approach provides a clear optimal method for choosing the slip-stacking parameter and improving stacking efficiency. The dynamics are applied specifically to the Fermilab Booster-Recycler complex, but can also be applied to other accelerator complexes.
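    The slip-stacking Hamiltonian is built from pendulum-like RF buckets. As a toy sketch of the kind of dynamics involved (a single-bucket pendulum Hamiltonian H = p²/2 − cos φ in arbitrary units, not the authors' two-RF model), symplectic leapfrog tracking conserves the Hamiltonian well over long times:

```python
import math

def leapfrog(phi, p, steps, dt):
    """Symplectic (kick-drift-kick leapfrog) tracking in the pendulum
    Hamiltonian H = p^2/2 - cos(phi), the single-RF building block of
    the slip-stacking Hamiltonian (toy sketch, arbitrary units)."""
    for _ in range(steps):
        p -= 0.5 * dt * math.sin(phi)   # half kick: dp/dt = -sin(phi)
        phi += dt * p                   # drift:     dphi/dt = p
        p -= 0.5 * dt * math.sin(phi)   # half kick
    return phi, p

phi, p = leapfrog(1.0, 0.0, 10000, 0.01)
H0 = -math.cos(1.0)                 # initial energy (p = 0)
H1 = 0.5 * p * p - math.cos(phi)    # energy after tracking
print(abs(H1 - H0))  # small: leapfrog energy error is O(dt^2)
```

In the actual slip-stacking problem two such RF systems at slightly different frequencies are superposed, which is what makes the perturbative treatment in the paper necessary.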

  4. Hamiltonian approach to slip-stacking dynamics

    DOE PAGES

    Lee, S. Y.; Ng, K. Y.

    2017-06-29

    Hamiltonian dynamics has been applied to study the slip-stacking dynamics. The canonical-perturbation method is employed to obtain the second-harmonic correction term in the slip-stacking Hamiltonian. The Hamiltonian approach provides a clear optimal method for choosing the slip-stacking parameter and improving stacking efficiency. The dynamics are applied specifically to the Fermilab Booster-Recycler complex, but can also be applied to other accelerator complexes.

  5. Ethical analysis in HTA of complex health interventions.

    PubMed

    Lysdahl, Kristin Bakke; Oortwijn, Wija; van der Wilt, Gert Jan; Refolo, Pietro; Sacchini, Dario; Mozygemba, Kati; Gerhardus, Ansgar; Brereton, Louise; Hofmann, Bjørn

    2016-03-22

    In the field of health technology assessment (HTA), there are several approaches that can be used for ethical analysis. However, there is a scarcity of literature that critically evaluates and compares the strengths and weaknesses of these approaches when they are applied in practice. In this paper, we analyse the applicability of some selected approaches for addressing ethical issues in HTA in the field of complex health interventions. Complex health interventions have been the focus of methodological attention in HTA; however, the potential methodological challenges for ethical analysis are as yet unknown. Six of the most frequently described and applied ethical approaches in HTA were critically assessed against a set of five characteristics of complex health interventions: multiple and changing perspectives, indeterminate phenomena, uncertain causality, unpredictable outcomes, and ethical complexity. The assessments are based on literature and the authors' experiences of developing, applying and assessing the approaches. The interactive, participatory HTA approach is, by its nature and flexibility, applicable across most complexity characteristics. Wide Reflective Equilibrium is also flexible, and its openness to different perspectives makes it better suited to complex health interventions than more rigid conventional approaches, such as Principlism and Casuistry. Approaches developed for HTA purposes, such as the HTA Core Model® and the Socratic approach, are fairly applicable to complex health interventions, as one could expect, because they include various ethical perspectives. This study shows how the applicability of the selected ethical approaches for addressing ethical issues in HTA of complex health interventions differs between them. Knowledge about these differences may be helpful when choosing and applying an approach for ethical analyses in HTA. We believe that the study contributes to increasing awareness of, and interest in, the ethical aspects of complex health interventions in general.

  6. Augmented neural networks and problem structure-based heuristics for the bin-packing problem

    NASA Astrophysics Data System (ADS)

    Kasap, Nihat; Agarwal, Anurag

    2012-08-01

    In this article, we report on a research project in which we applied the augmented-neural-network (AugNN) approach to solving the classical bin-packing problem (BPP). AugNN is a metaheuristic that combines a priority-rule heuristic with the iterative search approach of neural networks to generate good solutions fast. This is the first time this approach has been applied to the BPP. We also propose a decomposition approach for solving harder BPP instances, in which subproblems are solved using a combination of the AugNN approach and heuristics that exploit the problem structure. We discuss the characteristics of problems to which such problem structure-based heuristics can be applied. We empirically show the effectiveness of the AugNN and decomposition approaches on many benchmark problems from the literature. For the 1210 benchmark problems tested, 917 were solved to optimality; the average gap between the obtained solution and the upper bound for all problems was reduced to under 0.66%, and computation time averaged below 33 s per problem. We also discuss the computational complexity of our approach.
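    AugNN seeds its iterative search with a priority-rule heuristic. A classical example of such a rule for the BPP is first-fit decreasing (a generic sketch; the paper's actual heuristics and neural weight updates are more elaborate):

```python
def first_fit_decreasing(items, capacity):
    """Priority-rule heuristic of the kind AugNN starts from:
    place each item (largest first) in the first open bin with room,
    opening a new bin when none fits. Returns the bin count."""
    bins = []  # remaining free capacity per open bin
    for size in sorted(items, reverse=True):
        for i, free in enumerate(bins):
            if size <= free:
                bins[i] -= size
                break
        else:
            bins.append(capacity - size)  # open a new bin
    return len(bins)

print(first_fit_decreasing([7, 5, 3, 3, 2], 10))  # 2 bins
```

AugNN's contribution is to perturb the priorities that drive such a rule between iterations, so that repeated constructions escape the heuristic's fixed behavior.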

  7. Estimating impacts of plantation forestry on plant biodiversity in southern Chile-a spatially explicit modelling approach.

    PubMed

    Braun, Andreas Christian; Koch, Barbara

    2016-10-01

    Monitoring the impacts of land-use practices is of particular importance with regard to biodiversity hotspots in developing countries. Here, conserving the high level of unique biodiversity is challenged by limited possibilities for data collection on site. Especially for such scenarios, assisting biodiversity assessments by remote sensing has proven useful. Remote sensing techniques can be applied to interpolate between biodiversity assessments taken in situ. Through this approach, estimates of biodiversity for entire landscapes can be produced, relating land-use intensity to biodiversity conditions. Such maps are a valuable basis for developing biodiversity conservation plans. Several approaches have been published so far to interpolate local biodiversity assessments in remote sensing data. In the following, a new approach is proposed. Instead of inferring biodiversity using environmental variables or the variability of spectral values, a hypothesis-based approach is applied. Empirical knowledge about biodiversity in relation to land-use is formalized and applied as ascription rules for image data. The method is exemplified for a large study site (over 67,000 km²) in central Chile, where forest industry heavily impacts plant diversity. The proposed approach yields a coefficient of correlation of 0.73 and produces a convincing estimate of regional biodiversity. The framework is broad enough to be applied to other study sites.

  8. An Iterative Approach for the Optimization of Pavement Maintenance Management at the Network Level

    PubMed Central

    Torres-Machí, Cristina; Chamorro, Alondra; Videla, Carlos; Yepes, Víctor

    2014-01-01

    Pavement maintenance is one of the major issues of public agencies. Insufficient investment or inefficient maintenance strategies lead to high economic expenses in the long term. Under budgetary restrictions, the optimal allocation of resources becomes a crucial aspect. Two traditional approaches (sequential and holistic) and four classes of optimization methods (selection based on ranking, mathematical optimization, near optimization, and other methods) have been applied to solve this problem. They vary in the number of alternatives considered and how the selection process is performed. Therefore, a prior understanding of the problem is mandatory to identify the most suitable approach and method for a particular network. This study aims to assist highway agencies, researchers, and practitioners on when and how to apply available methods, based on a comparative analysis of the current state of the practice. The holistic approach tackles the problem by considering the overall network condition, while the sequential approach is easier to implement and understand but may lead to solutions far from optimal. Scenarios defining the suitability of these approaches are defined. Finally, an iterative approach gathering the advantages of the traditional approaches is proposed and applied in a case study. The proposed approach considers the overall network condition in a simpler and more intuitive manner than the holistic approach. PMID:24741352

  9. An iterative approach for the optimization of pavement maintenance management at the network level.

    PubMed

    Torres-Machí, Cristina; Chamorro, Alondra; Videla, Carlos; Pellicer, Eugenio; Yepes, Víctor

    2014-01-01

    Pavement maintenance is one of the major issues of public agencies. Insufficient investment or inefficient maintenance strategies lead to high economic expenses in the long term. Under budgetary restrictions, the optimal allocation of resources becomes a crucial aspect. Two traditional approaches (sequential and holistic) and four classes of optimization methods (selection based on ranking, mathematical optimization, near optimization, and other methods) have been applied to solve this problem. They vary in the number of alternatives considered and how the selection process is performed. Therefore, a prior understanding of the problem is mandatory to identify the most suitable approach and method for a particular network. This study aims to assist highway agencies, researchers, and practitioners on when and how to apply available methods, based on a comparative analysis of the current state of the practice. The holistic approach tackles the problem by considering the overall network condition, while the sequential approach is easier to implement and understand but may lead to solutions far from optimal. Scenarios defining the suitability of these approaches are defined. Finally, an iterative approach gathering the advantages of the traditional approaches is proposed and applied in a case study. The proposed approach considers the overall network condition in a simpler and more intuitive manner than the holistic approach.
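    Among the four method classes, "selection based on ranking" is the simplest to sketch: rank candidate treatments by a benefit-cost ratio and fund them greedily until the budget is exhausted (hypothetical data; the paper's iterative approach refines this kind of sequential selection with network-level feedback):

```python
def ranking_selection(treatments, budget):
    """Selection based on ranking: sort candidate treatments by
    benefit/cost ratio and fund them greedily under the budget."""
    ranked = sorted(treatments,
                    key=lambda t: t["benefit"] / t["cost"],
                    reverse=True)
    chosen, spent = [], 0.0
    for t in ranked:
        if spent + t["cost"] <= budget:
            chosen.append(t["id"])
            spent += t["cost"]
    return chosen

# hypothetical road sections: cost of treatment vs condition benefit
sections = [
    {"id": "A", "cost": 40.0, "benefit": 90.0},
    {"id": "B", "cost": 30.0, "benefit": 80.0},
    {"id": "C", "cost": 50.0, "benefit": 60.0},
]
print(ranking_selection(sections, 75.0))  # ['B', 'A']
```

Such a ranking ignores interactions between sections and across years, which is exactly the shortcoming that holistic and iterative formulations address.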

  10. Islamic approach in counseling.

    PubMed

    Hanin Hamjah, Salasiah; Mat Akhir, Noor Shakirah

    2014-02-01

    A religious approach is one of the matters emphasized in counseling today. Many researchers find that there is a need to apply the religious element in counseling because religion is important in a client's life. The purpose of this research is to identify aspects of the Islamic approach applied in counseling clients by counselors at Pusat Kaunseling Majlis Agama Islam Negeri Sembilan (PKMAINS). In addition, this research also analyses the Islamic approach applied in counseling at PKMAINS with reference to al-Quran and al-Sunnah. This is a qualitative research in the form of case study at PKMAINS. The main method used in this research is interview. The research instrument used is interview protocol. The respondents in this study include 9 counselors who serve in one of the counseling centers in Malaysia. This study also uses questionnaire as an additional instrument, distributed to 36 clients who receive counseling service at the center. The findings of the study show that the Islamic approach applied in counseling at PKMAINS may be categorized into three main aspects: aqidah (faith), ibadah (worship/ultimate devotion and love for God) and akhlaq (moral conduct). Findings also show that the counseling in these aspects is in line with Islamic teachings as contained in al-Quran and al-Sunnah.

  11. Transformative Learning Approaches for Public Relations Pedagogy

    ERIC Educational Resources Information Center

    Motion, Judy; Burgess, Lois

    2014-01-01

    Public relations educators are frequently challenged by students' flawed perceptions of public relations. Two contrasting case studies are presented in this paper to illustrate how socially-oriented paradigms may be applied to a real-client project to deliver a transformative learning experience. A discourse-analytic approach is applied within the…

  12. Health Care Communication: A Problematic Site for Applied Linguistics Research.

    ERIC Educational Resources Information Center

    Candlin, Christopher N.; Candlin, Sally

    2003-01-01

    Addresses how applied linguists and those concerned with discourse analysis in particular have recently approached the study of health care communication, especially in intercultural contexts, and relates these approaches to studies undertaken by researchers in other academic disciplines, such as the sociology of medicine and by health care…

  13. [A comprehensive approach to designing of magnetotherapy techniques based on the Atos device].

    PubMed

    Raĭgorodskiĭ, Iu M; Semiachkin, G P; Tatarenko, D A

    1995-01-01

    The paper determines how to apply a comprehensive approach to designing magnetic therapeutical techniques based on concomitant exposures to two or more physical factors. It shows the advantages of the running pattern of a magnetic field and photostimuli in terms of optimization of physiotherapeutical exposures. An Atos apparatus with an Amblio-1 attachment is used as an example to demonstrate how to apply the comprehensive approach for ophthalmology.

  14. An Autonomous Sensor Tasking Approach for Large Scale Space Object Cataloging

    NASA Astrophysics Data System (ADS)

    Linares, R.; Furfaro, R.

    The field of Space Situational Awareness (SSA) has progressed over the last few decades with new sensors coming online, the development of new approaches for making observations, and new algorithms for processing them. Although there has been success in the development of new approaches, a missing piece is the translation of SSA goals into sensor and resource allocation, otherwise known as the Sensor Management Problem (SMP). This work solves the SMP using an artificial intelligence approach called Deep Reinforcement Learning (DRL). Stable methods for training DRL approaches based on neural networks exist, but most of these approaches are not suitable for high dimensional systems. The Asynchronous Advantage Actor-Critic (A3C) method is a recently developed and effective approach for high dimensional systems, and this work leverages these results and applies this approach to decision making in SSA. The decision space for SSA problems can be high dimensional, even for the tasking of a single telescope. Since the number of space objects (SOs) in orbit is relatively high, each sensor will have a large number of possible actions at a given time. Therefore, efficient DRL approaches are required when solving the SMP for SSA. This work develops an A3C-based method for DRL applied to SSA sensor tasking. One of the key benefits of DRL approaches is the ability to handle high dimensional data; for example, DRL methods have been applied to image processing for the autonomous car application, where a 256x256 RGB image has 196,608 input values (256*256*3), which is very high dimensional, and deep learning approaches routinely take such images as inputs. Therefore, when applied to the whole catalog, the DRL approach offers the ability to solve this high dimensional problem. This work has the potential to, for the first time, solve the non-myopic sensor tasking problem for the whole SO catalog (over 22,000 objects), providing a truly revolutionary result.

  15. Sustainable intensification: a multifaceted, systemic approach to international development.

    PubMed

    Himmelstein, Jennifer; Ares, Adrian; van Houweling, Emily

    2016-12-01

    Sustainable intensification (SI) is a term increasingly used to describe a type of approach applied to international agricultural projects. Despite its widespread use, there is still little understanding or knowledge of the various facets of this composite paradigm. A review of the literature has led to the formalization of three principles that convey the current characterization of SI, comprising a whole-system, participatory, agroecological approach. Specific examples of potential bottlenecks to the SI approach are cited, in addition to various technologies and techniques that can be applied to overcome these obstacles. Models of similar, successful approaches to agricultural development are examined, along with higher-level processes. Additionally, this review explores the desired end points of SI and argues for the inclusion of gender and nutrition throughout the process. To properly apply the SI approach, its various aspects need to be understood and adapted to different cultural and geographic situations. New modeling systems and examples of the effective execution of SI strategies can assist with the successful application of the SI paradigm within complex developing communities. © 2016 Society of Chemical Industry.

  16. Force approach to radiation reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    López, Gustavo V., E-mail: gulopez@udgserv.cencar.udg.mx

    The difficulty of the usual approach to dealing with the radiation reaction is pointed out, and under the condition that the radiation force must be a function of the external force and is zero whenever the external force is zero, a new and straightforward approach to the radiation reaction force and damping is proposed. Starting from the Larmor formula for the power radiated by an accelerated charged particle, written in terms of the applied force instead of the acceleration, an expression for the radiation force is established in general, and applied to examples of the linear and circular motion of a charged particle. This expression is quadratic in the magnitude of the applied force, inversely proportional to the speed of the charged particle, and directed opposite to the velocity vector. This force approach may contribute to the solution of the very old problem of incorporating the radiation reaction into the motion of charged particles, and future experiments may tell us whether or not this approach is in the right direction.
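    The stated properties of the proposed radiation force (quadratic in the applied force, inversely proportional to speed, directed opposite to the velocity) follow from rewriting the Larmor power with $a = F_{\mathrm{ext}}/m$ and dividing by the particle speed. In SI units this gives (a reconstruction consistent with the abstract's description, not necessarily the author's exact expression):

```latex
P = \frac{q^2 a^2}{6\pi\varepsilon_0 c^3}
  = \frac{q^2 F_{\mathrm{ext}}^2}{6\pi\varepsilon_0 m^2 c^3},
\qquad
\vec{F}_{\mathrm{rad}} = -\frac{P}{v}\,\hat{v}
  = -\frac{q^2 F_{\mathrm{ext}}^2}{6\pi\varepsilon_0 m^2 c^3\, v}\,\hat{v},
```

where $q$ and $m$ are the particle's charge and mass, $F_{\mathrm{ext}}$ the applied force, and $v$ its speed.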

  17. Scenario-based modeling for multiple allocation hub location problem under disruption risk: multiple cuts Benders decomposition approach

    NASA Astrophysics Data System (ADS)

    Yahyaei, Mohsen; Bashiri, Mahdi

    2017-12-01

    The hub location problem arises in a variety of domains such as transportation and telecommunication systems. In many real-world situations, hub facilities are subject to disruption. This paper deals with the multiple allocation hub location problem in the presence of facility failures. To model the problem, a two-stage stochastic formulation is developed. In the proposed model, the number of scenarios grows exponentially with the number of facilities. To alleviate this issue, two approaches are applied simultaneously. The first is to apply sample average approximation (SAA) to approximate the two-stage stochastic problem via sampling. Then, computational performance is enhanced by applying the multiple-cuts Benders decomposition approach. Numerical studies show the effective performance of the SAA in terms of optimality gap for small problem instances with numerous scenarios. Moreover, the performance of multi-cut Benders decomposition is assessed through comparison with the classic version, and the computational results reveal the superiority of the multi-cut approach regarding computational time and number of iterations.
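    The sample average approximation step can be sketched generically: replace the expectation over exponentially many disruption scenarios with a mean over a modest Monte Carlo sample (toy numbers below, not the paper's hub model):

```python
import random

def saa_estimate(cost_fn, sample_scenario, n):
    """Sample average approximation: estimate E[cost(scenario)]
    by averaging the second-stage cost over n sampled scenarios."""
    return sum(cost_fn(sample_scenario()) for _ in range(n)) / n

random.seed(0)
# toy second stage: a hub is disrupted with probability 0.1,
# incurring a rerouting penalty instead of the nominal cost
sample = lambda: random.random() < 0.1
cost = lambda disrupted: 100.0 if disrupted else 10.0
est = saa_estimate(cost, sample, 20000)
print(est)  # close to the true expectation 0.1*100 + 0.9*10 = 19
```

In the full method, each sampled scenario set defines a deterministic equivalent problem, which is then solved by Benders decomposition with multiple cuts per iteration.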

  18. Minitheories: A Modular Approach to Learning Applied to Science.

    ERIC Educational Resources Information Center

    Claxton, Guy

    Perspectives on a psychological approach to learning are offered in this paper. Specific emphasis is directed to the assumption that children possess "minitheories." Minitheories are defined as attempts to make sense of particular kinds of experiences and are explained and delimited by the domain of experience to which they currently apply. This…

  19. Using the SCR Specification Technique in a High School Programming Course.

    ERIC Educational Resources Information Center

    Rosen, Edward; McKim, James C., Jr.

    1992-01-01

    Presents the underlying ideas of the Software Cost Reduction (SCR) approach to requirements specifications. Results of applying this approach to the teaching of programming to high school students indicate that students perform better in writing programs. An appendix provides two examples of how the method is applied to problem solving. (MDH)

  20. Applying Agrep to r-NSA to solve multiple sequences approximate matching.

    PubMed

    Ni, Bing; Wong, Man-Hon; Lam, Chi-Fai David; Leung, Kwong-Sak

    2014-01-01

    This paper addresses the approximate matching problem in a database consisting of multiple DNA sequences, where the proposed approach applies Agrep to a new truncated suffix array, r-NSA. The construction time of the structure is linear in the database size, and indexing a substring in the structure takes constant time. The number of characters processed in applying Agrep is analysed theoretically, and the theoretical upper bound closely approximates the empirical number of characters, obtained by enumerating the characters in the actual structure built. Experiments are carried out using (synthetic) random DNA sequences, as well as (real) genome sequences including Hepatitis-B Virus and X-chromosome. Experimental results show that, compared to the straightforward approach of applying Agrep to the multiple sequences individually, the proposed approach solves the matching problem in much shorter time. The speed-up of our approach depends on the sequence patterns, and for highly similar homologous genome sequences, which are the common cases in real-life genomes, it can be up to several orders of magnitude.
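    Agrep's core task is approximate substring matching within k edits. The same matching semantics can be sketched with Sellers' dynamic programming, which reports the end positions where the pattern occurs within k edits (a plain sketch of the matching problem itself; Agrep computes the same matches bit-parallel, and the paper's r-NSA structure is not modeled here):

```python
def approx_find(text, pattern, k):
    """End positions i in text where pattern matches some substring
    ending at i with edit distance <= k (Sellers' algorithm)."""
    m = len(pattern)
    prev = list(range(m + 1))  # DP column before any text char
    hits = []
    for i, tc in enumerate(text):
        cur = [0]  # 0: a match may start at any text position
        for j, pc in enumerate(pattern, 1):
            cur.append(min(prev[j] + 1,                # skip text char
                           cur[j - 1] + 1,             # skip pattern char
                           prev[j - 1] + (tc != pc)))  # match / substitute
        if cur[m] <= k:
            hits.append(i)
        prev = cur
    return hits

print(approx_find("hello world", "wrld", 1))  # [10]
```

Applied naively this costs O(nm) per sequence; the paper's contribution is to avoid rescanning shared substrings across the multiple sequences via the r-NSA index.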

  1. The contribution of applied social sciences to obesity stigma-related public health approaches.

    PubMed

    Bombak, Andrea E

    2014-01-01

    Obesity is viewed as a major public health concern, and obesity stigma is pervasive. Such marginalization renders obese persons a "special population." Weight bias arises in part due to popular sources' attribution of obesity causation to individual lifestyle factors. This may not accurately reflect the experiences of obese individuals or their perspectives on health and quality of life. A powerful role may exist for applied social scientists, such as anthropologists or sociologists, in exploring the lived and embodied experiences of this largely discredited population. This novel research may aid in public health intervention planning. Through these studies, applied social scientists could help develop a nonstigmatizing, salutogenic approach to public health that accurately reflects the health priorities of all individuals. Such an approach would call upon applied social science's strengths in investigating the mundane, problematizing the "taken for granted" and developing emic (insiders') understandings of marginalized populations.

  2. A Multifaceted Mathematical Approach for Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexander, F.; Anitescu, M.; Bell, J.

    2012-03-07

    Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.

  3. In-flight calibration/validation of the ENVISAT/MWR

    NASA Astrophysics Data System (ADS)

    Tran, N.; Obligis, E.; Eymard, L.

    2003-04-01

    Retrieval algorithms for the wet tropospheric correction, integrated vapor and liquid water contents, and atmospheric attenuations of the backscattering coefficients in Ku and S band have been developed using a database of geophysical parameters from global analyses of a meteorological model and the corresponding brightness temperatures and backscattering cross-sections simulated by a radiative transfer model. The meteorological data correspond to 12-hour predictions from the European Centre for Medium-range Weather Forecasts (ECMWF) model. Relationships between satellite measurements and geophysical parameters are determined using a statistical method. The quality of the retrieval algorithms therefore depends on the representativity of the database, the accuracy of the radiative transfer model used for the simulations and, finally, the quality of the inversion model. The database has been built using the latest version of the ECMWF forecast model, which has been operationally run since November 2000. The 60 levels in the model allow a complete description of the troposphere/stratosphere profiles, and the horizontal resolution is now half a degree. The radiative transfer model is the emissivity model developed at the Université Catholique de Louvain [Lemaire, 1998], coupled to an atmospheric model [Liebe et al., 1993] for gaseous absorption. For the inversion, we have replaced the classical log-linear regression with a neural network inversion. For Envisat, the backscattering coefficient in Ku band is used in the different algorithms to take into account the surface roughness, as is done with the 18 GHz channel for the TOPEX algorithms or with an additional wind speed term for the ERS2 algorithms. The in-flight calibration/validation of the Envisat radiometer has been performed with the tuning of three internal parameters (the transmission coefficient of the reflector, the sky horn feed transmission coefficient and the main antenna transmission coefficient).
First, an adjustment of the ERS2 brightness temperatures to the simulations for the 2000/2001 version of the ECMWF model was applied. Then, the Envisat brightness temperatures were calibrated on these adjusted ERS2 values. The advantages of this calibration approach are that: i) the method provides the relative discrepancy with respect to the simulation chain, and the results, obtained simultaneously for several radiometers (we repeated the same analysis with the TOPEX and JASON radiometers), can be used to detect significant calibration problems (more than 2-3 K); ii) the retrieval algorithms have been developed using the same meteorological model (the 2000/2001 version of the ECMWF model) and the same radiative transfer model as the calibration process, ensuring consistency between calibration and retrieval processing, after which the retrieval parameters are optimized; iii) the calibration of the Envisat brightness temperatures against the 2000/2001 version of the ECMWF model, together with the recommendation to use the same model as a reference to correct the ERS2 brightness temperatures, allows the same retrieval algorithms to be used for both missions, providing continuity between them; iv) compared with other calibration methods (such as systematically calibrating an instrument or its products against those of a previous mission), this method is more satisfactory since improvements in technology, modelling and retrieval processing are taken into account. For the validation of the brightness temperatures, we use either a direct comparison with measurements provided by other instruments in similar channels, or monitoring over stable areas (coldest ocean points, stable continental areas).
The validation of the wet tropospheric correction can also be provided by comparison with other radiometer products, but the only real validation relies on the comparison between in-situ measurements (performed by radiosondes) and coincident retrieved products.
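
    The statistical inversion step, learning a mapping from simulated observations to a geophysical parameter with a neural network in place of the classical log-linear regression, can be sketched in miniature. The data, network size and learning rate below are synthetic and illustrative, not the operational Envisat algorithm or ECMWF data:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "database": two normalized inputs (think brightness temperatures)
# and one target (think wet tropospheric correction); values are made up
X = rng.uniform(-1.0, 1.0, (500, 2))
y = 0.3 * X[:, 0] ** 2 - 0.5 * X[:, 1] + 0.1

# one-hidden-layer network trained by full-batch gradient descent
W1 = 0.5 * rng.standard_normal((2, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal(16); b2 = 0.0

lr = 0.05
losses = []
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)           # hidden layer
    pred = h @ W2 + b2                 # network output
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # backpropagation of the mean-squared-error gradient
    gW2 = h.T @ err / len(X); gb2 = err.mean()
    gh = np.outer(err, W2) * (1.0 - h ** 2)
    gW1 = X.T @ gh / len(X); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
```

    After training, the loss falls well below its initial value, illustrating why a network can outperform a fixed log-linear form on a nonlinear relationship.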

  4. Motivational interviewing as a pedagogical approach in behavioral science education: "walking the talk".

    PubMed

    Triana, A Catalina; Olson, Michael

    2013-01-01

    Motivational Interviewing (MI) is an evidence-based approach to facilitating behavior change. This approach has been applied in multiple settings (e.g., healthcare, drug and alcohol treatment, psychotherapy, health and wellness coaching, etc.). This article applies MI in a pedagogical context with medical residents as a semi-directive, learner-centered teaching style for eliciting clinical behavior change. Herein we present the foundational theories that inform this approach, describe the process of teaching, address barriers and challenges, and conclude with a review of performance to date including residents' narrative accounts of their experience with the curriculum.

  5. Predicting relationship between magnetostriction and applied field of magnetostrictive composites

    NASA Astrophysics Data System (ADS)

    Guan, Xinchun; Dong, Xufeng; Ou, Jinping

    2008-03-01

    Taking the demagnetization effect into consideration, a model for calculating the magnetostriction of a single particle under an applied field is first built. Then, treating the particle magnetostriction as an eigenstrain and drawing on Eshelby's equivalent-inclusion method and the Mori-Tanaka method, an approach is developed to calculate the average magnetostriction of the composites under any applied field as well as at saturation. Results calculated with this approach indicate that the saturation magnetostriction of magnetostrictive composites increases with increasing particle aspect ratio and particle volume fraction and with decreasing Young's modulus of the matrix, and that the influence of the applied field on the magnetostriction of the composites becomes more significant at larger particle volume fractions or aspect ratios.

  6. Embracing uncertainty in applied ecology.

    PubMed

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  7. A modelling methodology to assess the effect of insect pest control on agro-ecosystems.

    PubMed

    Wan, Nian-Feng; Ji, Xiang-Yun; Jiang, Jie-Xian; Li, Bo

    2015-04-23

    The extensive use of chemical pesticides for pest management in agricultural systems can entail risks to complex ecosystems consisting of economic, ecological and social subsystems. To analyze the negative and positive effects of external or internal disturbances on complex ecosystems, we previously proposed an ecological two-sidedness approach, which has been applied to the design of pest-control strategies for pesticide pollution management; catastrophe theory, however, had not been incorporated into that approach. We therefore integrated ecological two-sidedness with a multi-criterion evaluation method based on catastrophe theory to analyze the complexity of agro-ecosystems disturbed by insecticides and to screen out the best insect pest-control strategy in cabbage production. The results showed that the order of the evaluation-index values (RCC/CP) for three strategies in cabbage production was "applying frequency vibration lamps and environment-friendly insecticides 8 times" (0.80) < "applying trap devices and environment-friendly insecticides 9 times" (0.83) < "applying common insecticides 14 times" (1.08). The treatment "applying frequency vibration lamps and environment-friendly insecticides 8 times" was thus considered the best insect pest-control strategy in cabbage production in Shanghai, China.
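
    The strategy comparison reported above reduces to ranking on the RCC/CP index, with lower values preferred. A minimal sketch using the values quoted in the abstract (strategy labels abbreviated here):

```python
# RCC/CP evaluation-index values reported in the abstract (lower is better)
strategies = {
    "frequency vibration lamps + environment-friendly insecticides, 8 applications": 0.80,
    "trap devices + environment-friendly insecticides, 9 applications": 0.83,
    "common insecticides, 14 applications": 1.08,
}
best = min(strategies, key=strategies.get)  # strategy with the lowest index
```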

  8. Computer simulation of morphological evolution and rafting of {gamma}{prime} particles in Ni-based superalloys under applied stresses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, D.Y.; Chen, L.Q.

    Mechanical properties of Ni-based superalloys are strongly affected by the morphology, distribution, and size of {gamma}{prime} precipitates in the {gamma} matrix. The main purpose of this paper is to propose a continuum field approach for modeling the morphology and rafting kinetics of coherent precipitates under applied stresses. This approach can be used to simulate the temporal evolution of arbitrary morphologies and microstructures without any a priori assumption. Recently, the authors applied this approach to the selected variant growth in Ni-Ti alloys under applied stresses using an inhomogeneous modulus approximation. For the {gamma}{prime} precipitates in Ni-based superalloys, the eigenstrain is dilatational, and hence the {gamma}{prime} morphological evolution can be affected by applied stresses only when the elastic modulus is inhomogeneous. In the present work, the elastic inhomogeneity was taken into account by reformulating a sharp-interface elasticity theory developed recently by Khachaturyan et al. in terms of diffuse interfaces. Although the present work is for a {gamma}{prime} {minus} {gamma} system, this model is general in a sense that it can be applied to other alloy systems containing coherent ordered intermetallic precipitates with elastic inhomogeneity.

  9. A modelling methodology to assess the effect of insect pest control on agro-ecosystems

    PubMed Central

    Wan, Nian-Feng; Ji, Xiang-Yun; Jiang, Jie-Xian; Li, Bo

    2015-01-01

    The extensive use of chemical pesticides for pest management in agricultural systems can entail risks to complex ecosystems consisting of economic, ecological and social subsystems. To analyze the negative and positive effects of external or internal disturbances on complex ecosystems, we previously proposed an ecological two-sidedness approach, which has been applied to the design of pest-control strategies for pesticide pollution management; catastrophe theory, however, had not been incorporated into that approach. We therefore integrated ecological two-sidedness with a multi-criterion evaluation method based on catastrophe theory to analyze the complexity of agro-ecosystems disturbed by insecticides and to screen out the best insect pest-control strategy in cabbage production. The results showed that the order of the evaluation-index values (RCC/CP) for three strategies in cabbage production was “applying frequency vibration lamps and environment-friendly insecticides 8 times” (0.80) < “applying trap devices and environment-friendly insecticides 9 times” (0.83) < “applying common insecticides 14 times” (1.08). The treatment “applying frequency vibration lamps and environment-friendly insecticides 8 times” was thus considered the best insect pest-control strategy in cabbage production in Shanghai, China. PMID:25906199

  10. Identifying interactions in the time and frequency domains in local and global networks - A Granger Causality Approach.

    PubMed

    Zou, Cunlu; Ladroue, Christophe; Guo, Shuixia; Feng, Jianfeng

    2010-06-21

    Reverse-engineering approaches such as Bayesian network inference, ordinary differential equations (ODEs), information theory and Granger causality are widely applied to derive causal relationships among elements such as genes, proteins, metabolites, neurons and brain areas from multi-dimensional spatial and temporal data. Here we focused on Granger causality, both in the time and frequency domains and in local and global networks, and applied our approach to experimental data (genes and proteins). For a small gene network, Granger causality outperformed the other three approaches. A global protein network of 812 proteins was reconstructed using a novel approach. The obtained results fitted well with known experimental findings and predicted many experimentally testable results. In addition to interactions in the time domain, interactions in the frequency domain were also recovered. The results on the proteomic and gene data confirm that Granger causality is a simple and accurate approach for recovering network structure. Our approach is general and can easily be applied to other types of temporal data.
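
    In the time domain, pairwise Granger causality reduces to comparing two autoregressions with an F-test. A minimal sketch on toy data (the frequency-domain and network extensions studied in the paper are not reproduced):

```python
import numpy as np

def granger_f(x, y, lag=2):
    """F-statistic for the null 'y does not Granger-cause x': compare an
    autoregression of x on its own lags (restricted model) with one that
    also includes lags of y (unrestricted model)."""
    n = len(x)
    target = x[lag:]
    own = [x[lag - j:n - j] for j in range(1, lag + 1)]
    other = [y[lag - j:n - j] for j in range(1, lag + 1)]
    Xr = np.column_stack([np.ones(n - lag)] + own)           # restricted design
    Xu = np.column_stack([np.ones(n - lag)] + own + other)   # unrestricted design
    rss = lambda X: np.sum((target - X @ np.linalg.lstsq(X, target, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df_den = (n - lag) - Xu.shape[1]
    return ((rss_r - rss_u) / lag) / (rss_u / df_den)

# toy pair of time series in which y drives x with a one-step delay
rng = np.random.default_rng(0)
y = rng.standard_normal(500)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.6 * y[t - 1] + 0.1 * rng.standard_normal()

f_y_to_x = granger_f(x, y)   # large: lags of y sharply reduce x's residuals
f_x_to_y = granger_f(y, x)   # small: lags of x add nothing for y
```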

  11. A partial Hamiltonian approach for current value Hamiltonian systems

    NASA Astrophysics Data System (ADS)

    Naz, R.; Mahomed, F. M.; Chaudhry, Azam

    2014-10-01

    We develop a partial Hamiltonian framework to obtain reductions and closed-form solutions, via first integrals, of current value Hamiltonian systems of ordinary differential equations (ODEs). The approach is algorithmic and applies to current value Hamiltonians with many state and costate variables, although we apply the method to models with one control, one state and one costate variable to illustrate its effectiveness. Current value Hamiltonian systems arise in economic growth theory and other economic models. We explain our approach with the help of a simple illustrative example and then apply it to two widely used economic growth models: the Ramsey model with a constant relative risk aversion (CRRA) utility function and Cobb-Douglas technology, and a one-sector AK model of endogenous growth. We show that our newly developed systematic approach can be used both to deduce results given in the literature and to find new solutions.
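
    For orientation, a textbook sketch (standard notation, not the authors' derivation) of the current value Hamiltonian system for the Ramsey model mentioned above:

```latex
% Current value Hamiltonian for the Ramsey model with CRRA utility
% u(c) = c^{1-\sigma}/(1-\sigma) and Cobb-Douglas technology f(k) = k^{\alpha}:
H(k, c, \lambda) = \frac{c^{1-\sigma}}{1-\sigma} + \lambda\left(k^{\alpha} - c - \delta k\right)

% Necessary conditions (discount rate \rho):
\frac{\partial H}{\partial c} = c^{-\sigma} - \lambda = 0, \qquad
\dot{\lambda} = \rho\lambda - \frac{\partial H}{\partial k}
             = \lambda\left(\rho + \delta - \alpha k^{\alpha-1}\right), \qquad
\dot{k} = k^{\alpha} - c - \delta k
```

    The partial Hamiltonian method seeks first integrals of the state-costate ODE pair above to reduce and solve the system in closed form.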

  12. Experimental study and modelling of the anaerobic digestion of residual organic matter under hyperthermophilic conditions

    NASA Astrophysics Data System (ADS)

    Altamirano, Felipe Ignacio Castro

    This dissertation focuses on the problem of designing rates in the utility sector. It is motivated by recent developments in the electricity industry, where renewable generation technologies and distributed energy resources are becoming increasingly relevant. Both technologies disrupt the sector in unique ways. While renewables make grid operations more complex, and potentially more expensive, distributed energy resources enable consumers to interact two-ways with the grid. Both developments present challenges and opportunities for regulators, who must adapt their techniques for evaluating policies to the emerging technological conditions. The first two chapters of this work make the case for updating existing techniques to evaluate tariff structures. They also propose new methods which are more appropriate given the prospective technological characteristics of the sector. The first chapter constructs an analytic tool based on a model that captures the interaction between pricing and investment. In contrast to previous approaches, this technique allows consistently comparing portfolios of rates while enabling researchers to model with a significantly greater level of detail the supply side of the sector. A key theoretical implication of the model that underlies this technique is that, by properly updating the portfolio of tariffs, a regulator could induce the welfare maximizing adoption of distributed energy resources and enrollment in rate structures. We develop an algorithm to find globally optimal solutions of this model, which is a nonlinear mathematical program. The results of a computational experiment show that the performance of the algorithm dominates that of commercial nonlinear solvers. In addition, to illustrate the practical relevance of the method, we conduct a cost benefit analysis of implementing time-variant tariffs in two electricity systems, California and Denmark. 
Although portfolios with time-varying rates create value in both systems, these improvements differ enough to advise very different policies. While time-varying tariffs appear unattractive in Denmark, they deserve further consideration in California. This conclusion is beyond the reach of previous techniques to analyze rates, as they do not capture the interplay between an intermittent supply and a price-responsive demand. While useful, the method we develop in the first chapter has two important limitations. One is the lack of transparency of the parameters that determine demand substitution patterns and demand heterogeneity; the other is the narrow range of rate structures that could be studied with the technique. Both limitations stem from taking a demand function as a primitive. Following an alternative path, in the second chapter we develop a technique based on a pricing model that has as a fundamental building block the consumer utility maximization problem. Because researchers do not have to limit themselves to problems with unique solutions, this approach significantly increases the flexibility of the model and, in particular, addresses the limitations of the technique we develop in the first chapter. This gain in flexibility decreases the practicality of our method since the underlying model becomes a bilevel problem. To be able to handle realistic instances, we develop a decomposition method based on a non-linear variant of the Alternating Direction Method of Multipliers, which combines Conic and Mixed Integer Programming. A numerical experiment shows that the performance of the solution technique is robust to instance sizes and a wide combination of parameters. We illustrate the relevance of the new method with another applied analysis of rate structures. Our results highlight the value of being able to model distributed energy resources in detail.
They also show that ignoring transmission constraints can have meaningful impacts on the analysis of rate structures. In addition, we conduct a distributional analysis, which portrays how our method permits regulators and policy makers to study the impacts of a rate update on a heterogeneous population. While a switch in rates could have a positive impact on the aggregate of households, it could benefit some more than others, and even harm some customers. Our technique permits us to anticipate these impacts, letting regulators decide among rate structures with considerably more information than would be available with alternative approaches. In the third chapter, we conduct an empirical analysis of rate structures in California, which is currently undergoing a rate reform. To contribute to the ongoing regulatory debate about the future of rates, we analyze in depth a set of plausible tariff alternatives. In our analysis, we focus on a scenario in which advanced metering infrastructure and home energy management systems are widely adopted. Our modeling approach allows us to capture a wide variety of temporal and spatial demand substitution patterns without the need to estimate a large number of parameters. (Abstract shortened by ProQuest.)

  13. One-loop quantum gravity repulsion in the early Universe.

    PubMed

    Broda, Bogusław

    2011-03-11

    The perturbative quantum gravity formalism is applied to compute the lowest-order corrections to the classical spatially flat cosmological Friedmann-Lemaître-Robertson-Walker solution (for radiation). The presented approach is analogous to the approach applied to compute quantum corrections to the Coulomb potential in electrodynamics, or rather to the approach applied to compute quantum corrections to the Schwarzschild solution in gravity. In the framework of standard perturbative quantum gravity, it is shown that the corrections to the classical deceleration coming from the one-loop graviton vacuum polarization (self-energy) are free of the UV cutoff, have repulsive properties opposite to the classical behavior, and are not negligible in the very early Universe. The repulsive "quantum forces" resemble those known from loop quantum cosmology.

  14. [Ideal type and history--a critical review of applied criminology].

    PubMed

    Köchel, Stefan

    2013-01-01

    Applied Criminology is an established criminological school in the German-speaking area, founded by Hans Göppinger and Michael Bock, criminologists at Tübingen, in the 1980s, which has meanwhile published a number of comprehensive methodological papers. The conceptual centrepiece of its interdisciplinary approach is the formation and application of concepts referring to the so-called ideal type, essentially inspired by the epistemology of Max Weber. However, a critical reconstruction of these fundamentals shows that the claimed interdisciplinary approach comes into conflict with a second, much more phenomenological approach within Applied Criminology, which is unable to comply with the political implications of criminological research and thus disavows the necessary historical relationality of ideal-type concepts.

  15. [Relapse: causes and consequences].

    PubMed

    Thomas, P

    2013-09-01

    Relapse after a first episode of schizophrenia is the recurrence of acute symptoms after a period of partial or complete remission. Because of its variable presentations, there is no operational definition of relapse able to model the outcome of schizophrenia and measure how treatment modifies the disease. Follow-up studies based on proxies such as hospital admission revealed that 7 out of 10 patients relapse after a first episode of schizophrenia. The effectiveness of antipsychotic medication in relapse prevention has been widely demonstrated, and recent studies point to advantages of atypical over first-generation antipsychotic medication. Non-adherence to antipsychotics, together with addictions, represents the main cause of relapse, long before non-consensual factors such as premorbid functioning, duration of untreated psychosis and associated personality disorders. The consequences of relapse are multiple: psychological, biological and social. Pharmaco-clinical studies have demonstrated that treatment response decreases with each relapse. Relapse, even the first one, worsens the outcome of the disease and reduces general functioning capacity. Accepting the idea of continuing treatment is a complex decision in which the psychiatrist plays a central role besides patients and their families. The development of integrated actions on modifiable risk factors, such as psychosocial support, addictive comorbidities, access to care and the therapeutic alliance, should be promoted. Relapse prevention is a major goal of the treatment of first-episode schizophrenia. It is based on adherence to maintenance treatment, identification of prodromes, active information of families and therapeutic education of patients. Copyright © 2013 L’Encéphale. Published by Elsevier Masson SAS. All rights reserved.

  16. Refining mortality estimates in shark demographic analyses: a Bayesian inverse matrix approach.

    PubMed

    Smart, Jonathan J; Punt, André E; White, William T; Simpfendorfer, Colin A

    2018-01-18

    Leslie matrix models are an important analysis tool in conservation biology that is applied to a diversity of taxa. The standard approach estimates the finite rate of population growth (λ) from a set of vital rates. In some instances an estimate of λ is available but the vital rates are poorly understood, in which case they can be solved for using an inverse matrix approach. However, such approaches are rarely attempted because they require information on the structure of age or stage classes. This study addressed the issue by using a combination of Monte Carlo simulations and the sample-importance-resampling (SIR) algorithm to solve the inverse matrix problem without data on population structure. The approach was applied to the grey reef shark (Carcharhinus amblyrhynchos) from the Great Barrier Reef (GBR) in Australia to determine the demography of this population. The outputs were additionally applied to another heavily fished population, from Papua New Guinea (PNG), that requires estimates of λ for fisheries management. The SIR analysis determined that natural mortality (M) and total mortality (Z) based on indirect methods had previously been overestimated for C. amblyrhynchos, leading to an underestimated λ. The updated Z distributions determined using SIR provided λ estimates that matched an empirical λ for the GBR population and corrected obvious error in the demographic parameters for the PNG population. This approach provides an opportunity for the inverse matrix approach to be applied more broadly to situations where information on population structure is lacking. © 2018 by the Ecological Society of America.
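
    A minimal sketch of the two ingredients named above: λ as the dominant eigenvalue of a Leslie matrix, and an SIR step that tackles the inverse problem by drawing mortality from a prior and resampling draws whose implied λ matches an observed λ. The vital rates and prior below are invented for illustration, not grey reef shark data:

```python
import numpy as np

def leslie_lambda(fecundity, survival):
    """Finite rate of population growth λ = dominant eigenvalue of a Leslie matrix."""
    k = len(fecundity)
    L = np.zeros((k, k))
    L[0, :] = fecundity                              # age-specific fecundities
    L[np.arange(1, k), np.arange(k - 1)] = survival  # sub-diagonal survivals
    return max(np.linalg.eigvals(L).real)

def sir_mortality(lambda_obs, fecundity, sigma=0.02, n_draws=5000, seed=1):
    """Sample-importance-resampling for the inverse problem: draw natural
    mortality M from a prior, weight each draw by how closely its implied λ
    matches the observed λ, then resample in proportion to the weights."""
    rng = np.random.default_rng(seed)
    M = rng.uniform(0.05, 0.6, n_draws)              # vague prior on mortality
    lams = np.array([leslie_lambda(fecundity, [np.exp(-m)] * (len(fecundity) - 1))
                     for m in M])                    # survival = exp(-M)
    w = np.exp(-0.5 * ((lams - lambda_obs) / sigma) ** 2)
    keep = rng.choice(n_draws, size=1000, p=w / w.sum())
    return M[keep]                                   # posterior sample of M

# illustrative three-age-class vital rates (hypothetical numbers)
M_post = sir_mortality(lambda_obs=1.3, fecundity=[0.0, 1.5, 2.0])
```

    The resampled mortality values concentrate around the value consistent with the observed λ, which is the essence of estimating poorly known vital rates from a known growth rate.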

  17. A Systems Approach to Nitrogen Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goins, Bobby

    A systems-based approach will be used to evaluate the nitrogen delivery process. This approach involves principles found in Lean, Reliability, Systems Thinking, and Requirements. This unique combination of principles and thought processes yields a very in-depth look into the system to which it is applied. By applying a systems-based approach to the nitrogen delivery process there should be improvements in cycle time and efficiency, and a reduction in the required number of personnel needed to sustain the delivery process. This will in turn reduce the amount of demurrage charges that the site incurs. In addition, there should be less frustration associated with the delivery process.

  18. A simplified approach to quasi-linear viscoelastic modeling

    PubMed Central

    Nekouzadeh, Ali; Pryse, Kenneth M.; Elson, Elliot L.; Genin, Guy M.

    2007-01-01

    The fitting of quasi-linear viscoelastic (QLV) constitutive models to material data often involves somewhat cumbersome numerical convolution. A new approach to treating quasi-linearity in one dimension is described and applied to characterize the behavior of reconstituted collagen. This approach is based on a new principle for including nonlinearity and requires considerably less computation than other comparable models for both model calibration and response prediction, especially for smoothly applied stretching. Additionally, the approach allows relaxation to adapt with the strain history. The modeling approach is demonstrated through tests on pure reconstituted collagen. Sequences of “ramp-and-hold” stretching tests were applied to rectangular collagen specimens. The relaxation force data from the “hold” was used to calibrate a new “adaptive QLV model” and several models from literature, and the force data from the “ramp” was used to check the accuracy of model predictions. Additionally, the ability of the models to predict the force response on a reloading of the specimen was assessed. The “adaptive QLV model” based on this new approach predicts collagen behavior comparably to or better than existing models, with much less computation. PMID:17499254
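
    For contrast with the adaptive model, the numerical convolution that standard QLV fitting requires can be sketched as follows. The two-term Prony relaxation function and the linear instantaneous elastic response are made-up stand-ins (the paper's models use a nonlinear elastic response):

```python
import numpy as np

def qlv_force(times, strain, g=(0.5, 0.5), tau=(0.1, 10.0), k=1.0):
    """Discretized one-dimensional QLV convolution: the force is the hereditary
    integral of the reduced relaxation function G(t) against increments of the
    instantaneous elastic response sigma_e(strain)."""
    sigma_e = k * np.asarray(strain, dtype=float)   # elastic response (linear stand-in)
    G = lambda t: sum(gi * np.exp(-t / ti) for gi, ti in zip(g, tau))
    dsig = np.diff(sigma_e, prepend=0.0)            # increments of sigma_e
    out = np.zeros_like(sigma_e)
    for i, t in enumerate(times):
        out[i] = np.sum(G(t - times[:i + 1]) * dsig[:i + 1])
    return out

# step stretch applied at t = 0 and held: the force simply relaxes as G(t)
t = np.linspace(0.0, 5.0, 51)
force = qlv_force(t, np.ones_like(t))
```

    The inner loop over the full history at every time step is exactly the computational burden that an adaptive formulation aims to avoid.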

  19. Approach for Autonomous Control of Unmanned Aerial Vehicle Using Intelligent Agents for Knowledge Creation

    NASA Technical Reports Server (NTRS)

    Dufrene, Warren R., Jr.

    2004-01-01

    This paper describes the development of a planned approach for autonomous operation of an Unmanned Aerial Vehicle (UAV). A hybrid approach will seek to provide knowledge generation through the application of Artificial Intelligence (AI) and Intelligent Agents (IA) for UAV control. The applications of several different types of AI techniques for flight are explored during this research effort. The research concentration is directed to the application of different AI methods within the UAV arena. By evaluating AI and biological system approaches, which include Expert Systems, Neural Networks, Intelligent Agents, Fuzzy Logic, and Complex Adaptive Systems (CAS), new insight may be gained into the benefits of AI and CAS techniques applied to achieving true autonomous operation of these systems. Although flight systems were explored, the benefits should apply to many unmanned vehicles such as rovers, ocean explorers, robots, and autonomous operation systems. A portion of the flight system is broken down into control agents that represent the intelligent-agent approach used in AI. After the completion of a successful approach, a framework for applying an intelligent agent is presented. Initial results from simulation of a security agent for communication are presented.

  20. Comparison of Science-Technology-Society Approach and Textbook Oriented Instruction on Students' Abilities to Apply Science Concepts

    ERIC Educational Resources Information Center

    Kapici, Hasan Ozgur; Akcay, Hakan; Yager, Robert E.

    2017-01-01

    It is important for students to learn concepts and to use them for solving problems and for further learning. In this respect, the purpose of this study is to investigate students' abilities to apply science concepts that they have learned through a Science-Technology-Society based approach or textbook-oriented instruction. The current study is based on…

  1. Applying Program Theory-Driven Approach to Design and Evaluate a Teacher Professional Development Program

    ERIC Educational Resources Information Center

    Lin, Su-ching; Wu, Ming-sui

    2016-01-01

    This study was the first year of a two-year project which applied a program theory-driven approach to evaluating the impact of teachers' professional development interventions on students' learning by using a mix of methods, qualitative inquiry, and quasi-experimental design. The current study was to show the results of using the method of…

  2. A New Approach to Automated Labeling of Internal Features of Hardwood Logs Using CT Images

    Treesearch

    Daniel L. Schmoldt; Pei Li; A. Lynn Abbott

    1996-01-01

    The feasibility of automatically identifying internal features of hardwood logs using CT imagery has been established previously. Features of primary interest are bark, knots, voids, decay, and clear wood. Our previous approach filtered the original CT images, applied histogram segmentation, grew volumes to extract 3-D regions, and applied a rule base with Dempster-...
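
    The histogram-segmentation step can be illustrated with Otsu's threshold, a common histogram-based choice; the paper's exact segmentation method and parameters are not specified in this abstract:

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Histogram segmentation via Otsu's method: choose the gray-level
    threshold that maximizes the between-class variance of the histogram."""
    hist, edges = np.histogram(img.ravel(), bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # class-0 probability up to each bin
    mu = np.cumsum(p * centers)       # class-0 mean mass up to each bin
    mu_t = mu[-1]                     # overall mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b[~np.isfinite(sigma_b)] = 0.0
    return centers[np.argmax(sigma_b)]

# synthetic two-material "CT slice": dark background vs bright clear wood
rng = np.random.default_rng(3)
img = np.concatenate([rng.normal(20, 2, 4000), rng.normal(200, 2, 4000)])
thresh = otsu_threshold(img)          # lands between the two intensity modes
```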

  3. Associations between the Classroom Learning Environment and Student Engagement in Learning 1: A Rasch Model Approach

    ERIC Educational Resources Information Center

    Cavanagh, Rob

    2012-01-01

    This report is about one of two phases in an investigation into associations between student engagement in classroom learning and the classroom learning environment. Both phases applied the same instrumentation to the same sample. The difference between the phases was in the measurement approach applied. This report is about application of the…

  4. A Microworld-Based Role-Playing Game Development Approach to Engaging Students in Interactive, Enjoyable, and Effective Mathematics Learning

    ERIC Educational Resources Information Center

    Wang, Sheng-Yuan; Chang, Shao-Chen; Hwang, Gwo-Jen; Chen, Pei-Ying

    2018-01-01

    In traditional teacher-centered mathematics instruction, students might show low learning motivation owing to the lack of applied contexts. Game-based learning has been recognized as a potential approach to addressing this issue; however, without proper alignment between the gaming and math-applied contexts, the benefits of game-based learning…

  5. Are your covariates under control? How normalization can re-introduce covariate effects.

    PubMed

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
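
    The ordering effect can be demonstrated in a few lines. The generative model, sample size and Blom offset below are illustrative choices, not the study's simulation design:

```python
import numpy as np
from statistics import NormalDist

def rank_int(v, c=3.0 / 8):
    """Rank-based inverse normal transform (Blom offset)."""
    ranks = np.argsort(np.argsort(v)) + 1
    p = (ranks - c) / (len(v) - 2 * c + 1)
    inv = NormalDist().inv_cdf
    return np.array([inv(pi) for pi in p])

def residuals(y, x):
    """OLS residuals of y on an intercept and a single covariate x."""
    X = np.column_stack([np.ones_like(x), x])
    return y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

rng = np.random.default_rng(2)
covar = rng.standard_normal(1000)
y = np.exp(covar + rng.standard_normal(1000))   # skewed, covariate-dependent outcome

# problematic order: regress the covariate out first, then INT the residuals;
# the nonlinear rank transform generally leaves a nonzero linear correlation
r_after = rank_int(residuals(y, covar))
# recommended order: INT the dependent variable first, then regress;
# OLS residuals are exactly orthogonal to the covariate by construction
r_before = residuals(rank_int(y), covar)

corr_after = np.corrcoef(r_after, covar)[0, 1]
corr_before = np.corrcoef(r_before, covar)[0, 1]
```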

  6. Mass spectrometry-based protein identification by integrating de novo sequencing with database searching.

    PubMed

    Wang, Penghao; Wilson, Susan R

    2013-01-01

    Mass spectrometry-based protein identification is a very challenging task. The main identification approaches include de novo sequencing and database searching. Both approaches have shortcomings, so an integrative approach has been developed. The integrative approach firstly infers partial peptide sequences, known as tags, directly from tandem spectra through de novo sequencing, and then puts these sequences into a database search to see if a close peptide match can be found. However the current implementation of this integrative approach has several limitations. Firstly, simplistic de novo sequencing is applied and only very short sequence tags are used. Secondly, most integrative methods apply an algorithm similar to BLAST to search for exact sequence matches and do not accommodate sequence errors well. Thirdly, by applying these methods the integrated de novo sequencing makes a limited contribution to the scoring model which is still largely based on database searching. We have developed a new integrative protein identification method which can integrate de novo sequencing more efficiently into database searching. Evaluated on large real datasets, our method outperforms popular identification methods.

  7. Developpement energetique par modelisation et intelligence territoriale: Un outil de prise de decision participative pour le developpement durable des projets eoliens

    NASA Astrophysics Data System (ADS)

    Vazquez Rascon, Maria de Lourdes

    This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods, multicriteria decision aid (MCDA) and participatory geographical information systems (GIS), making it possible to represent these value systems by criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of the environmental, economic and socio-cultural aspects of wind energy projects and, thus, a sustainable wind project development vision. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were set to ensure a logical progression of the work and the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology on a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective produced a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of co-constructed decisions, with all stakeholders at the focus of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows for the analysis of different wind turbine implementation scenarios in order to choose the best one through a participatory and transparent decision-making process that takes stakeholders' concerns into account. 
The second objective enabled the testing of TIMED in an ex-post study of a wind farm in operation since 2006. In this test, 11 people participated, representing four stakeholder categories: the private sector, the public sector, experts and civil society. This test allowed us to analyze the context in which wind projects are currently developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through the third objective, which allowed us to run simulations taking strategic-level assumptions into account. Examples of the strategic level are the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified. 
Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes socio-cultural, environmental and economic variables into account; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; using the proposed methodology for renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and writing a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy project development, renewable energy, social participation, robustness concern, SWOT analysis.

  8. Etude de la transmission sonore a travers un protecteur de type "coquilles" : modelisation numerique et validation experimentale

    NASA Astrophysics Data System (ADS)

    Boyer, Sylvain

    It is estimated that of the 3.7 million workers in Quebec, more than 500,000 are exposed daily to noise levels that can damage the auditory system. When it is not possible to reduce the ambient noise level, by modifying the noise sources or limiting sound propagation, wearing individual hearing protectors, such as earmuffs, remains the last resort. Although viewed as a short-term solution, it is commonly used because it is inexpensive, easy to implement, and adaptable to most operations in noisy environments. However, hearing protectors can be both ill-suited to workers and their environment and uncomfortable, which limits wearing time and reduces effective protection. To address these difficulties, a research project on hearing protection entitled "Developpement d'outils et de methodes pour ameliorer et mieux evaluer la protection auditive individuelle des travailleurs" was launched in 2010, joining the Ecole de technologie superieure (ETS) and the Institut de recherche Robert-Sauve en sante et en securite du travail (IRSST). Within this research program, the present doctoral work focuses specifically on hearing protection by means of "passive" earmuff-type protectors, whose use raises three specific problems presented in the following paragraphs. The first specific problem is the discomfort caused, for example, by the static pressure induced by the headband clamping force, which can reduce the wearing time recommended to limit noise exposure. The user should therefore be given a comfortable protector suited to his or her work environment and activity. The second specific problem is the evaluation of the actual protection provided by the protector. 
The REAT (Real Ear Attenuation Threshold) method, often regarded as the "gold standard", is used to quantify noise reduction but generally overestimates protector performance. Field measurement techniques such as F-MIRE (Field Measurement in Real Ear) may in the future be better tools for evaluating individual attenuation. While these techniques exist for earplugs, they must be adapted and improved for earmuffs by determining the optimal placement of the acoustic sensors and the individual compensation factors that relate the microphone measurement to the measurement that would have been taken at the eardrum. The third specific problem is optimizing earmuff attenuation to suit the individual and the work environment. Indeed, earmuff design is generally based on empirical concepts and trial-and-error methods on prototypes. Predictive tools have received little study to date and deserve further investigation. Virtual prototyping would make it possible to optimize the design before production, accelerate product development, and reduce costs. The overall objective of this thesis is to address these problems by developing a model of the sound attenuation of an earmuff-type hearing protector. Because of the complexity of the geometry of these protectors, the primary modeling method chosen a priori is the finite element method (FEM). To achieve this overall objective, three specific objectives were established and are presented in the following three paragraphs. (Abstract shortened by ProQuest.).

  9. APPLICATION OF THE SURFACE COMPLEXATION CONCEPT TO COMPLEX MINERAL ASSEMBLAGES

    EPA Science Inventory

    Two types of modeling approaches are illustrated for describing inorganic contaminant adsorption in aqueous environments: (a) the component additivity approach and (b) the generalized composite approach. Each approach is applied to simulate Zn2+ adsorption by a well-characterize...

  10. Approach to proliferation risk assessment based on multiple objective analysis framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  11. Thermal Assisted In Vivo Gene Electrotransfer

    PubMed Central

    Donate, Amy; Bulysheva, Anna; Edelblute, Chelsea; Jung, Derrick; Malik, Mohammad A.; Guo, Siqi; Burcus, Niculina; Schoenbach, Karl; Heller, Richard

    2016-01-01

    Gene electrotransfer is an effective approach for delivering plasmid DNA to a variety of tissues. Delivery of molecules with electric pulses requires control of the electrical parameters to achieve effective delivery. Since discomfort or tissue damage may occur at high applied voltages, reducing the applied voltage while achieving the desired expression would be an important improvement. One possible approach is to combine electrotransfer with exogenously applied heat. Previous work performed in vitro demonstrated that increasing temperature before pulsing can enhance gene expression and made it possible to reduce electric fields while maintaining expression levels. In the study reported here, this combination was evaluated in vivo using a novel electrode device designed with an inserted laser for application of heat. The results obtained in this study demonstrated that increased temperature during electrotransfer increased expression or maintained expression with a reduction in applied voltage. With further optimization this approach may provide the basis for both a novel method and a novel instrument that may greatly enhance translation of gene electrotransfer. PMID:27029944

  12. Applying Current Approaches to the Teaching of Reading

    ERIC Educational Resources Information Center

    Villanueva de Debat, Elba

    2006-01-01

    This article discusses different approaches to reading instruction for EFL learners based on theoretical frameworks. The author starts with the bottom-up approach to reading instruction, and briefly explains phonics and behaviorist ideas that inform this instructional approach. The author then explains the top-down approach and the new cognitive…

  13. A machine learning approach for predicting the relationship between energy resources and economic development

    NASA Astrophysics Data System (ADS)

    Cogoljević, Dušan; Alizamir, Meysam; Piljan, Ivan; Piljan, Tatjana; Prljić, Katarina; Zimonjić, Stefan

    2018-04-01

    The linkage between energy resources and economic development is a topic of great interest. Research in this area is also motivated by contemporary concerns about global climate change, carbon emissions, fluctuating crude oil prices, and the security of energy supply. The purpose of this research is to develop and apply a machine learning approach to predict gross domestic product (GDP) based on the mix of energy resources. Our results indicate that GDP predictive accuracy can be improved slightly by applying a machine learning approach.
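As a hedged illustration of the kind of model such a study might use (the abstract does not specify the algorithm), a k-nearest-neighbor regressor can map an energy-mix feature vector to a GDP estimate; the feature layout and all numbers below are invented:

```python
import math

def knn_predict(X_train, y_train, x_new, k=3):
    """Predict a target (e.g. GDP) as the mean of the targets of the
    k nearest training points in feature space, using Euclidean distance."""
    dists = sorted((math.dist(x, x_new), y) for x, y in zip(X_train, y_train))
    return sum(y for _, y in dists[:k]) / k

# invented energy-mix rows: [coal, oil, gas, renewables] shares per country
X = [[0.5, 0.3, 0.1, 0.1],
     [0.2, 0.4, 0.3, 0.1],
     [0.1, 0.2, 0.3, 0.4],
     [0.4, 0.3, 0.2, 0.1]]
gdp = [1.0, 1.8, 2.5, 1.2]   # invented GDP values (trillions)
estimate = knn_predict(X, gdp, [0.45, 0.3, 0.15, 0.1], k=2)
```

In practice such a study would compare several learners and validate on held-out years; this sketch only shows the prediction mechanics.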

  14. An alternative regionalization scheme for defining nutrient criteria for rivers and streams

    USGS Publications Warehouse

    Robertson, Dale M.; Saad, David A.; Wieben, Ann M.

    2001-01-01

    The environmental nutrient zone approach can be applied to specific states or nutrient ecoregions and used to develop criteria as a function of stream type. This approach can also be applied on the basis of environmental characteristics of the watershed alone rather than the general environmental characteristics from the region in which the site is located. The environmental nutrient zone approach will enable states to refine the basic nutrient criteria established by the USEPA by developing attainable criteria given the environmental characteristics where the streams are located.

  15. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  16. Safe-life and damage-tolerant design approaches for helicopter structures

    NASA Technical Reports Server (NTRS)

    Reddick, H. K., Jr.

    1983-01-01

    The safe-life and damage-tolerant design approaches discussed apply to both metallic and fibrous composite helicopter structures. The application of these design approaches to fibrous composite structures is emphasized. Safe-life and damage-tolerant criteria are applied to all helicopter flight critical components, which are generally categorized as: dynamic components with a main and tail rotor system, which includes blades, hub and rotating controls, and drive train which includes transmission, and main and interconnecting rotor shafts; and the airframe, composed of the fuselage, aerodynamic surfaces, and landing gear.

  17. Identifying and assessing the application of ecosystem services approaches in environmental policies and decision making.

    PubMed

    Van Wensem, Joke; Calow, Peter; Dollacker, Annik; Maltby, Lorraine; Olander, Lydia; Tuvendal, Magnus; Van Houtven, George

    2017-01-01

    The presumption is that ecosystem services (ES) approaches provide a better basis for environmental decision making than do other approaches because they make explicit the connection between human well-being and ecosystem structures and processes. However, the existing literature does not provide a precise description of ES approaches for environmental policy and decision making, nor does it assess whether these applications will make a difference in terms of changing decisions and improving outcomes. We describe 3 criteria that can be used to identify whether and to what extent ES approaches are being applied: 1) connect impacts all the way from ecosystem changes to human well-being, 2) consider all relevant ES affected by the decision, and 3) consider and compare the changes in well-being of different stakeholders. As a demonstration, we then analyze retrospectively whether and how the criteria were met in different decision-making contexts. For this assessment, we have developed an analysis format that describes the type of policy, the relevant scales, the decisions or questions, the decision maker, and the underlying documents. This format includes a general judgment of how far the 3 ES criteria have been applied. It shows that the criteria can be applied to many different decision-making processes, ranging from the supranational to the local scale and to different parts of decision-making processes. In conclusion we suggest these criteria could be used for assessments of the extent to which ES approaches have been and should be applied, what benefits and challenges arise, and whether using ES approaches made a difference in the decision-making process, decisions made, or outcomes of those decisions. Results from such studies could inform future use and development of ES approaches, draw attention to where the greatest benefits and challenges are, and help to target integration of ES approaches into policies, where they can be most effective. 
Integr Environ Assess Manag 2017;13:41-51. © 2016 SETAC.

  18. A whole-of-curriculum approach to improving nursing students' applied numeracy skills.

    PubMed

    van de Mortel, Thea F; Whitehair, Leeann P; Irwin, Pauletta M

    2014-03-01

    Nursing students often perform poorly on numeracy tests. Whilst one-off interventions have been trialled with limited success, a whole-of-curriculum approach may provide a better means of improving applied numeracy skills. The objective of the study is to assess the efficacy of a whole-of-curriculum approach in improving nursing students' applied numeracy skills. Two cycles of assessment, implementation and evaluation of strategies were conducted following a high fail rate in the final applied numeracy examination in a Bachelor of Nursing (BN) programme. Strategies included an early diagnostic assessment followed by referral to remediation, setting the pass mark at 100% for each of six applied numeracy examinations across the programme, and employing a specialist mathematics teacher to provide consistent numeracy teaching. The setting of the study is one Australian university. 1035 second and third year nursing students enrolled in four clinical nursing courses (CNC III, CNC IV, CNC V and CNC VI) were included. Data on the percentage of students who obtained 100% in their applied numeracy examination in up to two attempts were collected from CNCs III, IV, V and VI between 2008 and 2011. A four by two χ² contingency table was used to determine if the differences in the proportion of students achieving 100% across two examination attempts in each CNC were significantly different between 2008 and 2011. The percentage of students who obtained 100% correct answers on the applied numeracy examinations was significantly higher in 2011 than in 2008 in CNC III (χ²(3)=272, p<0.001), IV (χ²(3)=94.7, p<0.001) and VI (χ²(3)=76.3, p<0.001). A whole-of-curriculum approach to developing applied numeracy skills in BN students resulted in a substantial improvement in these skills over four years. Copyright © 2013 Elsevier Ltd. All rights reserved.
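The reported test can be reproduced in miniature: a Pearson χ² statistic computed over a contingency table of pass counts. The table below is wholly hypothetical, not the study's data:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table:
    sum over cells of (observed - expected)^2 / expected, where
    expected = row_total * column_total / grand_total."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand
            stat += (obs - exp) ** 2 / exp
    return stat

# hypothetical counts for one course: [achieved 100%, did not]
table = [[60, 40],   # 2008 cohort
         [85, 15]]   # 2011 cohort
stat = chi_square(table)
```

For real analyses, `scipy.stats.chi2_contingency` returns the same statistic together with degrees of freedom and a p-value.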

  19. Bootstrapping in Applied Linguistics: Assessing Its Potential Using Shared Data

    ERIC Educational Resources Information Center

    Plonsky, Luke; Egbert, Jesse; Laflair, Geoffrey T.

    2015-01-01

    Parametric analyses such as t tests and ANOVAs are the norm--if not the default--statistical tests found in quantitative applied linguistics research (Gass 2009). Applied statisticians and one applied linguist (Larson-Hall 2010, 2012; Larson-Hall and Herrington 2010), however, have argued that this approach may not be appropriate for small samples…
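The percentile bootstrap, the nonparametric alternative argued for above, is straightforward to sketch in stdlib Python; the scores below are invented, not shared data from the article:

```python
import random

def bootstrap_ci(data, stat=lambda xs: sum(xs) / len(xs),
                 n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic (the mean
    by default): resample with replacement, recompute the statistic, and
    take the empirical alpha/2 and 1 - alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(data)
    boots = sorted(stat([rng.choice(data) for _ in range(n)])
                   for _ in range(n_boot))
    return (boots[int(n_boot * alpha / 2)],
            boots[int(n_boot * (1 - alpha / 2)) - 1])

scores = [55, 61, 64, 68, 70, 72, 75, 79, 83, 90]  # invented small sample
lo, hi = bootstrap_ci(scores)
```

Because no distributional assumption is made, the interval remains usable for the small, skewed samples common in applied linguistics research.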

  20. Estimation of Carcinogenicity using Hierarchical Clustering and Nearest Neighbor Methodologies

    EPA Science Inventory

    Previously a hierarchical clustering (HC) approach and a nearest neighbor (NN) approach were developed to model acute aquatic toxicity end points. These approaches were developed to correlate the toxicity for large, noncongeneric data sets. In this study these approaches applie...
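A minimal sketch of the hierarchical-clustering side of such an approach, using naive single-linkage agglomeration on one invented 1-D chemical descriptor (not the study's descriptors, data, or algorithm): once chemicals are grouped, a cluster's mean toxicity could serve as the estimate for new chemicals assigned to it.

```python
def single_linkage(points, n_clusters):
    """Naive agglomerative clustering: repeatedly merge the two clusters
    whose closest members are nearest (single linkage) until only
    n_clusters remain. O(n^3); fine for a small illustration."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

logp = [0.5, 0.6, 2.1, 2.3, 5.0]  # invented descriptor values
groups = single_linkage(logp, 3)
```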

  1. Antigen identification starting from the genome: a "Reverse Vaccinology" approach applied to MenB.

    PubMed

    Palumbo, Emmanuelle; Fiaschi, Luigi; Brunelli, Brunella; Marchi, Sara; Savino, Silvana; Pizza, Mariagrazia

    2012-01-01

    Most of the vaccines available today, albeit very effective, have been developed using traditional "old-style" methodologies. Technologies developed in recent years have opened up new perspectives in the field of vaccinology and novel strategies are now being used to design improved or new vaccines against infections for which preventive measures do not exist. The Reverse Vaccinology (RV) approach is one of the most powerful examples of biotechnology applied to the field of vaccinology for identifying new protein-based vaccines. RV combines the availability of genomic data, the analyzing capabilities of new bioinformatic tools, and the application of high throughput expression and purification systems combined with serological screening assays for a coordinated screening process of the entire genomic repertoire of bacterial, viral, or parasitic pathogens. The application of RV to Neisseria meningitidis serogroup B represents the first success of this novel approach. In this chapter, we describe how this revolutionary approach can be easily applied to any pathogen.

  2. Structure-Based Druggability Assessment of the Mammalian Structural Proteome with Inclusion of Light Protein Flexibility

    PubMed Central

    Loving, Kathryn A.; Lin, Andy; Cheng, Alan C.

    2014-01-01

    Advances reported over the last few years and the increasing availability of protein crystal structure data have greatly improved structure-based druggability approaches. However, in practice, nearly all druggability estimation methods are applied to protein crystal structures as rigid proteins, with protein flexibility often not directly addressed. The inclusion of protein flexibility is important in correctly identifying the druggability of pockets that would be missed by methods based solely on the rigid crystal structure. These include cryptic pockets and flexible pockets often found at protein-protein interaction interfaces. Here, we apply an approach that uses protein modeling in concert with druggability estimation to account for light protein backbone movement and protein side-chain flexibility in protein binding sites. We assess the advantages and limitations of this approach on widely-used protein druggability sets. Applying the approach to all mammalian protein crystal structures in the PDB results in identification of 69 proteins with potential druggable cryptic pockets. PMID:25079060

  3. The Expanding Role of Applications in the Development and Validation of CFD at NASA

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    2010-01-01

    This paper focuses on the recent escalation in application of CFD to manned and unmanned flight projects at NASA and the need to often apply these methods to problems for which little or no previous validation data directly applies. The paper discusses the evolution of NASA's CFD development from a strict Develop, Validate, Apply strategy to sometimes allowing for a Develop, Apply, Validate approach. The risks of this approach and some of its unforeseen benefits are discussed and tied to specific operational examples. There are distinct advantages for the CFD developer that is able to operate in this paradigm, and recommendations are provided for those inclined and willing to work in this environment.

  4. Role of Soft Computing Approaches in HealthCare Domain: A Mini Review.

    PubMed

    Gambhir, Shalini; Malik, Sanjay Kumar; Kumar, Yugal

    2016-12-01

    In the present era, soft computing approaches play a vital role in solving different kinds of problems and provide promising solutions. Owing to their popularity, these approaches have also been applied to healthcare data for effectively diagnosing diseases, obtaining better results than traditional approaches. Soft computing approaches can adapt themselves to the problem domain; another aspect is a good balance between exploration and exploitation processes. These aspects make soft computing approaches powerful, reliable, efficient, and well suited to healthcare data. The first objective of this review paper is to identify the various soft computing approaches used for diagnosing and predicting diseases. The second objective is to identify the various diseases to which these approaches are applied. The third objective is to categorize the soft computing approaches for clinical support systems. In the literature, a large number of soft computing approaches have been applied to effectively diagnose and predict diseases from healthcare data, including particle swarm optimization, genetic algorithms, artificial neural networks, and support vector machines. A detailed discussion of these approaches is presented in the literature section. This work summarizes the soft computing approaches used in the healthcare domain in the last decade. These approaches are grouped into five categories based on methodology: classification-model-based systems, expert systems, fuzzy and neuro-fuzzy systems, rule-based systems and case-based systems. Many techniques are discussed within these categories and are also summarized in tables. 
This work also focuses on the accuracy rates of soft computing techniques, and tabular information is provided for each category, including author details, technique, disease and utility/accuracy.

  5. The technology-science relationship: Some curriculum implications

    NASA Astrophysics Data System (ADS)

    Gardner, Paul L.

    1990-01-01

    Technology encompasses the goods and services which people make and provide to meet human needs, and the processes and systems used for their development and delivery. Although technology and science are related, a distinction can be made between their purposes and outcomes. This paper considers four possible approaches to teaching students about the relationship between technology and science. A technology-as-illustration approach treats technology as if it were applied science; artefacts are presented to illustrate scientific principles. A cognitive-motivational approach also treats technology as applied science, but presents technology early in the instructional sequence in order to promote student interest and understanding. In an artefact approach, learners study artefacts as systems in order to understand the scientific principles which explain their workings. Finally, a technology-as-process approach emphasises the role of technological capability; in this approach, scientific concepts do not have privileged status as a basis for selecting curriculum content.

  6. Student Improvement by Applying the Numbered Heads Together (NHT) Approach to Basic Subjects of Vocational Competence in a Vocational High School in Indonesia

    ERIC Educational Resources Information Center

    Wora, Veronika Marta; Hadisaputro, Ranto; Rohman, Ngatou; Bugis, Husin; Pambudi, Suharno Nugroho Agung

    2017-01-01

    This research aims to improve the learning activity and achievement of a 10th grade class made up of 30 students in a vocational high school located in the city of Surakarta, Indonesia, by applying the Numbered Heads Together (NHT) approach. The experiment was divided into two stages of four activities each: planning, implementation, observation,…

  7. Control system estimation and design for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Stefani, R. T.; Williams, T. L.; Yakowitz, S. J.

    1972-01-01

    The selection of an estimator which is unbiased when applied to structural parameter estimation is discussed. The mathematical relationships for structural parameter estimation are defined. It is shown that a conventional weighted least squares (CWLS) estimate is biased when applied to structural parameter estimation. Two approaches to bias removal are suggested: (1) changing the CWLS estimator, or (2) changing the objective function. The advantages of each approach are analyzed.
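For context, the CWLS estimator under discussion reduces, in the single-parameter case y = βx + noise, to a weighted ratio of sums. The sketch below shows the basic estimator itself with invented data; it does not implement the report's bias-removal approaches:

```python
def wls_slope(x, y, w):
    """Weighted least squares estimate of a single structural parameter
    beta in y = beta * x + noise:
        beta_hat = sum(w x y) / sum(w x^2)."""
    return (sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
            / sum(wi * xi * xi for wi, xi in zip(w, x)))

x = [1.0, 2.0, 3.0, 4.0]    # invented regressor values
y = [2.1, 3.9, 6.2, 7.8]    # invented measurements, roughly y = 2x
w = [1.0, 1.0, 0.5, 0.5]    # down-weight the noisier measurements
beta_hat = wls_slope(x, y, w)
```

Bias arises in the structural setting when the weights or regressors depend on noisy quantities, which is what motivates modifying the estimator or the objective function.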

  8. Toward a new methodological paradigm for testing theories of health behavior and health behavior change.

    PubMed

    Noar, Seth M; Mehrotra, Purnima

    2011-03-01

    Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  9. The hybrid thermography approach applied to architectural structures

    NASA Astrophysics Data System (ADS)

    Sfarra, S.; Ambrosini, D.; Paoletti, D.; Nardi, I.; Pasqualoni, G.

    2017-07-01

    This work contains an overview of the infrared thermography (IRT) method and its applications to the investigation of architectural structures. The passive approach is usually used in civil engineering, since it provides a panoramic view of the thermal anomalies to be interpreted, aided by photographs focused on the region of interest (ROI). The active approach is more suitable for laboratory or indoor inspections, as well as for small objects. The external stress applied is thermal, coming from non-natural apparatus such as lamps or hot/cold air jets; in addition, it makes it possible to obtain quantitative information on defects not detectable to the naked eye. Very recently, the hybrid thermography (HIRT) approach has been brought to the attention of the scientific community. It can be applied when radiation coming from the sun arrives directly (i.e., without a shadow cast effect) on a surface exposed to the air. A large number of thermograms must be collected and a post-processing analysis is subsequently applied via advanced algorithms. An appraisal of the defect depth can then be obtained via the calculation of the combined thermal diffusivity of the materials above the defect. The approach is validated herein by working, in a first stage, on a mosaic sample having known defects and, in a second stage, on a church built in L'Aquila (Italy) and covered with a particular masonry structure called apparecchio aquilano. The results obtained appear promising.
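The defect-depth reasoning rests on heat diffusion. A standard order-of-magnitude relation (a generic textbook estimate, not necessarily the authors' exact algorithm) is the thermal diffusion length L = sqrt(alpha * t):

```python
import math

def diffusion_length(alpha, t):
    """Thermal diffusion length L = sqrt(alpha * t): the depth scale a
    heat pulse reaches after time t in a material of diffusivity alpha
    (m^2/s). A generic order-of-magnitude estimate only."""
    return math.sqrt(alpha * t)

# illustrative masonry-like diffusivity and one hour of solar heating
L = diffusion_length(5e-7, 3600)   # ~0.042 m, i.e. a few centimetres
```

This is why deeper defects only become visible in thermograms collected over longer heating cycles, and why an estimate of the combined diffusivity lets the defect depth be appraised.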

  10. Human ergology that promotes participatory approach to improving safety, health and working conditions at grassroots workplaces: achievements and actions.

    PubMed

    Kawakami, Tsuyoshi

    2011-12-01

    Participatory approaches are increasingly applied to improve safety, health and working conditions in grassroots workplaces in Asia. The core concepts and methods of human ergology research, such as promoting real work-life studies, relying on the positive efforts of local people (daily life-technology), promoting active participation of local people to identify practical solutions, and learning from local human networks to reach grassroots workplaces, have provided useful viewpoints for devising such participatory training programmes. This study aimed to analyze how human ergology approaches were applied in the actual development and application of three typical participatory training programmes: WISH (Work Improvement for Safe Home) with home workers in Cambodia, WISCON (Work Improvement in Small Construction Sites) with construction workers in Thailand, and WARM (Work Adjustment for Recycling and Managing Waste) with waste collectors in Fiji. The results revealed that all three programmes, in the course of their development, applied direct observation of the work of target workers before devising the training programmes, learned from existing local good examples and efforts, and emphasized local human networks for cooperation. These methods and approaches were repeatedly applied in grassroots workplaces, building on their sustainability and impact. It was concluded that human ergology approaches contributed substantially to the development and expansion of participatory training programmes and could continue to support the self-help initiatives of local people in promoting human-centred work.

  11. Think Pair Share Using Realistic Mathematics Education Approach in Geometry Learning

    NASA Astrophysics Data System (ADS)

    Afthina, H.; Mardiyana; Pramudya, I.

    2017-09-01

    This research aims to determine the impact of mathematics learning applying Think Pair Share (TPS) using the Realistic Mathematics Education (RME) approach, viewed from mathematical-logical intelligence, in geometry learning. The method used in this research is quasi-experimental. The results show that (1) mathematics achievement applying TPS with the RME approach is better than that achieved with the direct learning model; (2) students with high mathematical-logical intelligence reach better mathematics achievement than those with average or low intelligence, while students with average mathematical-logical intelligence reach better achievement than those with low intelligence; (3) there is no interaction between the learning model and the level of students' mathematical-logical intelligence with respect to mathematics achievement. The implication of this research is that the TPS model with the RME approach can be applied in mathematics learning so that students learn more actively, understand the material better, and find mathematics learning more meaningful. At the same time, students' internal factors must be considered for the success of their mathematical achievement, particularly in geometry.

  12. Nonlinear flap-lag axial equations of a rotating beam

    NASA Technical Reports Server (NTRS)

    Kaza, K. R. V.; Kvaternik, R. G.

    1977-01-01

    It is possible to identify essentially four approaches by which analysts have established either the linear or nonlinear governing equations of motion for a particular problem related to the dynamics of rotating elastic bodies. The approaches include the effective applied load artifice in combination with a variational principle and the use of Newton's second law, written as D'Alembert's principle, applied to the deformed configuration. A third approach is a variational method in which nonlinear strain-displacement relations and a first-degree displacement field are used. The method introduced by Vigneron (1975) for deriving the linear flap-lag equations of a rotating beam constitutes the fourth approach. The reported investigation shows that all four approaches make use of the geometric nonlinear theory of elasticity. An alternative method for deriving the nonlinear coupled flap-lag-axial equations of motion is also discussed.

  13. Identification Approach to Alleviate Effects of Unmeasured Heat Gains for MIMO Building Thermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jie; Kim, Donghun; Braun, James E.

    It is important to have practical methods for constructing a good mathematical model of a building's thermal system for energy audits, retrofit analysis and advanced building controls, e.g. model predictive control. Identification approaches based on semi-physical model structures are popular in building science for these purposes. However, conventional gray-box identification approaches applied to thermal networks fail when significant unmeasured heat gains are present in the estimation data. Although this situation is common in practice, there has been little research tackling the issue in building science. This paper presents an overall identification approach that alleviates the influence of unmeasured disturbances and hence yields improved gray-box building models. The approach was applied to an existing open-space building and its performance is demonstrated.
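    The gray-box idea can be illustrated with a minimal sketch, assuming a single-zone 1R1C thermal network (one resistance R to outdoors, one capacitance C): parameters are fitted so that the simulated indoor temperature matches "measured" data. The model structure, values and grid-search fit below are illustrative, not the paper's method, and the data are simulated rather than from a real building.

    ```python
    import numpy as np

    def simulate(R, C, T0, T_out, Q, dt=300.0):
        """Forward-Euler simulation of C*dT/dt = (T_out - T)/R + Q."""
        T = np.empty(len(T_out) + 1)
        T[0] = T0
        for k in range(len(T_out)):
            T[k + 1] = T[k] + dt / C * ((T_out[k] - T[k]) / R + Q[k])
        return T

    # synthetic "measurements" generated from known parameters (noise-free)
    T_out = 10 + 5 * np.sin(np.linspace(0, 4 * np.pi, 200))  # outdoor temp, degC
    Q = np.full(200, 500.0)                                  # known internal gain, W
    T_meas = simulate(R=0.01, C=5e6, T0=20, T_out=T_out, Q=Q)

    # crude grid search for (R, C) minimizing the sum of squared residuals
    best = min(((R, C) for R in np.linspace(0.005, 0.02, 31)
                        for C in np.linspace(2e6, 8e6, 31)),
               key=lambda p: np.sum((simulate(p[0], p[1], 20, T_out, Q) - T_meas) ** 2))
    print(best)  # best is (0.01, 5e6) on this noise-free data
    ```

    The failure mode the paper addresses appears if part of Q is unmeasured: the residual minimum then shifts and the fitted R and C absorb the missing gain, which is why a disturbance-alleviating identification step is needed.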

  14. Weaknesses in Applying a Process Approach in Industry Enterprises

    NASA Astrophysics Data System (ADS)

    Kučerová, Marta; Mĺkva, Miroslava; Fidlerová, Helena

    2012-12-01

    The paper deals with the process approach as one of the main principles of quality management. Quality management systems based on the process approach currently represent a proven way to manage an organization. Sales volumes, costs and profit levels are influenced by the quality of processes and by efficient process flow. As the results of the research project showed, there are weaknesses in applying the process approach in industrial practice: in many organizations in Slovakia it has often amounted only to a formal relabelling of functional management as process management. For efficient process management it is essential that companies pay attention to how they organize their processes and seek their continuous improvement.

  15. Numerical approach in defining milling force taking into account curved cutting-edge of applied mills

    NASA Astrophysics Data System (ADS)

    Bondarenko, I. R.

    2018-03-01

    The paper tackles the task of applying a numerical approach to determine the cutting forces when machining carbon steel with a curved-cutting-edge mill. To solve this task, the curved surface of the cutting edge was step-approximated and the chip section was split into discrete elements. The cutting force was then obtained as the sum of the elementary forces acting during the cut of each element. Comparison of calculations by the proposed method with those based on the Kienzle dependence showed sufficient accuracy, which makes it possible to apply the method in practice.
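    The summation idea can be sketched by applying the Kienzle relation F = kc1.1 · h^(−mc) · b · h to each discrete element of the chip cross-section and summing. The coefficient values and element geometry below are illustrative for a carbon steel, not taken from the paper.

    ```python
    KC1_1 = 2100.0   # specific cutting force at h = b = 1 mm, N/mm^2 (assumed)
    MC = 0.25        # Kienzle exponent (assumed)

    def elemental_force(b_mm, h_mm):
        """Cutting force on one discrete element of the chip cross-section."""
        return KC1_1 * h_mm ** (-MC) * b_mm * h_mm

    def total_force(elements):
        """Sum elementary forces over the step-approximated cutting edge."""
        return sum(elemental_force(b, h) for b, h in elements)

    # chip section split into 4 elements (width b, uncut thickness h), mm;
    # the varying h reflects the curved cutting edge after step approximation
    elements = [(0.5, 0.10), (0.5, 0.12), (0.5, 0.12), (0.5, 0.08)]
    print(round(total_force(elements), 1))
    ```

    Refining the discretization (more, thinner elements) makes the step approximation of the curved edge converge, which is the trade-off the numerical approach exploits.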

  16. The Connection Between Forms of Guidance for Inquiry-Based Learning and the Communicative Approaches Applied—a Case Study in the Context of Pre-service Teachers

    NASA Astrophysics Data System (ADS)

    Lehtinen, Antti; Lehesvuori, Sami; Viiri, Jouni

    2017-09-01

    Recent research has argued that inquiry-based science learning should be guided by providing the learners with support. Research on guidance for inquiry-based learning has concentrated on how providing guidance affects learning through inquiry; how guidance could promote learning about inquiry (e.g. epistemic practices) still needs exploration. A dialogic approach to classroom communication and pedagogical link-making offers possibilities for learners to acquire these practices. The focus of this paper is to analyse the role of different forms of guidance for inquiry-based learning in shaping the communicative approach applied in classrooms. The data come from an inquiry-based physics lesson implemented by a group of five pre-service primary science teachers with a class of sixth graders. The lesson was video recorded and the discussions were transcribed. The data were analysed by applying two existing frameworks: one for the forms of guidance provided and another for the communicative approaches applied. The findings illustrate that providing non-specific forms of guidance, such as prompts, made the communicative approach dialogic, whereas providing the learners with specific forms of guidance, such as explanations, shifted the communication towards the authoritative. These results imply that the forms of guidance provided by pre-service teachers can affect the communicative approach applied in inquiry-based science lessons, which in turn affects the possibilities learners are given to connect their existing ideas to the scientific view. Future research should validate these results by also analysing in-service teachers' lessons.

  17. Systemic multimodal approach to speech therapy treatment in autistic children.

    PubMed

    Tamas, Daniela; Marković, Slavica; Milankov, Vesela

    2013-01-01

    The conditions in which speech therapy treatment is applied to autistic children are often not matched to the characteristic thinking and learning styles of people with autism. A systemic multimodal approach motivates autistic children to develop speech and language skills through procedures that let them relive personal experience of the content presented in their natural social environment. This research aimed to evaluate the efficiency of speech therapy treatment based on the systemic multimodal approach to working with autistic children. The study sample consisted of 34 children, aged 8 to 16 years, diagnosed with different autistic disorders, whose results showed a moderate to severe clinical picture of autism on the Childhood Autism Rating Scale. The instruments applied for the evaluation of ability were the Childhood Autism Rating Scale and the Ganzberg II test. The subjects were divided into two groups according to the type of treatment: children covered by continuing treatment with the systemic multimodal approach, and children covered by classical speech therapy. The systemic multimodal approach in teaching autistic children is shown to stimulate communication, socialization, self-care and work, and the progress achieved in these areas of functioning was retained over the long term. By applying the systemic multimodal approach and comparing the children's achievements on tests applied before, during and after its application, it was concluded that improvement was achieved in functioning within the diagnosed category. The results point to a possible direction for creating new methods, plans and programmes for working with autistic children based on empirical and interactive learning.

  18. Surface chemistry at Swiss Universities of Applied Sciences.

    PubMed

    Brodard, Pierre; Pfeifer, Marc E; Adlhart, Christian D; Pieles, Uwe; Shahgaldian, Patrick

    2014-01-01

    In the Swiss Universities of Applied Sciences, a number of research groups are involved in surface science, with different methodological approaches and a broad range of sophisticated characterization techniques. A snapshot of the current research going on in different groups from the University of Applied Sciences and Arts Western Switzerland (HES-SO), the Zurich University of Applied Sciences (ZHAW) and the University of Applied Sciences and Arts Northwestern Switzerland (FHNW) is given.

  19. A Comparison of Seventh Grade Thai Students' Reading Comprehension and Motivation to Read English through Applied Instruction Based on the Genre-Based Approach and the Teacher's Manual

    ERIC Educational Resources Information Center

    Sawangsamutchai, Yutthasak; Rattanavich, Saowalak

    2016-01-01

    The objective of this research is to compare the English reading comprehension and motivation to read of seventh grade Thai students taught with applied instruction through the genre-based approach and teachers' manual. A randomized pre-test post-test control group design was used through the cluster random sampling technique. The data were…

  20. A Research Planning Assessment for Applications of Artificial Intelligence in Manufacturing.

    DTIC Science & Technology

    1986-01-01

    to apply, based on their needs. PROJECT DESCRIPTION/APPROACH: The project will apply AI in …of this project are essential for the practical implementation of AI-based approaches to improving unit processes. This work will enable advances in …July 1985 to 1 August 1985. The authors wish to thank all workshop participants for their contributions to this effort. In particular, we wish to

  1. Selection of a turbine cooling system applying multi-disciplinary design considerations.

    PubMed

    Glezer, B

    2001-05-01

    The presented paper describes a multi-disciplinary cooling selection approach applied to major gas turbine engine hot section components, including turbine nozzles, blades, discs, combustors and support structures, which maintain blade tip clearances. The paper demonstrates benefits of close interaction between participating disciplines starting from early phases of the hot section development. The approach targets advancements in engine performance and cost by optimizing the design process, often requiring compromises within individual disciplines.

  2. A network approach for identifying and delimiting biogeographical regions.

    PubMed

    Vilhena, Daril A; Antonelli, Alexandre

    2015-04-24

    Biogeographical regions (geographically distinct assemblages of species and communities) constitute a cornerstone for ecology, biogeography, evolution and conservation biology. Species turnover measures are often used to quantify spatial biodiversity patterns, but algorithms based on similarity can be sensitive to common sampling biases in species distribution data. Here we apply a community detection approach from network theory that incorporates complex, higher-order presence-absence patterns. We demonstrate the performance of the method by applying it to all amphibian species in the world (c. 6,100 species), all vascular plant species of the USA (c. 17,600) and a hypothetical data set containing a zone of biotic transition. In comparison with current methods, our approach tackles the challenges posed by transition zones and succeeds in retrieving a larger number of commonly recognized biogeographical regions. This method can be applied to generate objective, data-derived identification and delimitation of the world's biogeographical regions.
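    The core idea, that bioregions emerge as modules of a bipartite site-species occurrence network, can be shown with a toy, deterministic stand-in: sites linked by shared species are grouped by a union-find, so disjoint groups play the role that proper community detection plays in the paper. The occurrence data below are invented for illustration.

    ```python
    from collections import defaultdict

    occurrences = {            # site -> species observed there (hypothetical data)
        "site1": {"frog_a", "frog_b"},
        "site2": {"frog_b", "frog_c"},
        "site3": {"frog_d"},
        "site4": {"frog_d", "frog_e"},
    }

    # union-find over sites, merging sites that share at least one species
    parent = {s: s for s in occurrences}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    by_species = defaultdict(list)
    for site, spp in occurrences.items():
        for sp in spp:
            by_species[sp].append(site)
    for sites in by_species.values():
        for other in sites[1:]:
            parent[find(other)] = find(sites[0])

    # each remaining root is one "region" of the occurrence network
    regions = defaultdict(set)
    for site in occurrences:
        regions[find(site)].add(site)
    print(sorted(map(sorted, regions.values())))
    ```

    Real analyses use higher-order community detection rather than simple connectivity, precisely so that transition zones (sites weakly linked to two regions) are assigned sensibly instead of fusing the regions.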

  3. Applying Sociology to the Teaching of Applied Sociology.

    ERIC Educational Resources Information Center

    Wallace, Richard Cheever

    A college-level applied sociology course in which students use sociological theory or research methodology to solve social problems is described. Guidelines for determining appropriate projects are: (1) the student must feel there is a substantial need for the project; (2) the project must be approachable through recognized sociological…

  4. Becoming an Expert: Developing Expertise in an Applied Discipline

    ERIC Educational Resources Information Center

    Kuhlmann, Diane Orlich; Ardichvili, Alexandre

    2015-01-01

    Purpose: This paper aims to examine the development of expertise in an applied discipline by addressing the research question: How is professional expertise developed in an applied profession? Design/methodology/approach: Using a grounded theory methodology (GTM), nine technical-tax experts, and three experienced, non-expert tax professionals were…

  5. Building "Applied Linguistic Historiography": Rationale, Scope, and Methods

    ERIC Educational Resources Information Center

    Smith, Richard

    2016-01-01

    In this article I argue for the establishment of "Applied Linguistic Historiography" (ALH), that is, a new domain of enquiry within applied linguistics involving a rigorous, scholarly, and self-reflexive approach to historical research. Considering issues of rationale, scope, and methods in turn, I provide reasons why ALH is needed and…

  6. The Contextual Interference Effect in Applied Settings

    ERIC Educational Resources Information Center

    Barreiros, Joao; Figueiredo, Teresa; Godinho, Mario

    2007-01-01

    This paper analyses the research literature that approaches the contextual interference effect in applied settings. In contrast to the laboratory settings, in which high interference conditions depress acquisition and promote learning evaluated in retention and transfer tests, in applied settings most of the studies (60%) fail to observe positive…

  7. A Computer-Assisted Approach for Conducting Information Technology Applied Instructions

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Hwang, Gwo-Jen; Tsai, Pei Jin; Yang, Tzu-Chi

    2009-01-01

    The growing popularity of computer and network technologies has attracted researchers to investigate the strategies and the effects of information technology applied instructions. Previous research has not only demonstrated the benefits of applying information technologies to the learning process, but has also revealed the difficulty of applying…

  8. Characterization of Copper Corrosion Products in Drinking Water by Combining Electrochemical and Surface Analyses

    EPA Science Inventory

    This study focuses on the application of electrochemical approaches to drinking-water copper corrosion problems. Electrochemical approaches combined with copper solubility measurements and solid-surface analysis are discussed. Tafel extrapolation and Electro…

  10. An Approach for Autonomy: A Collaborative Communication Framework for Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Dufrene, Warren Russell, Jr.

    2005-01-01

    Research done during the last three years has studied the emergent properties of Complex Adaptive Systems (CAS). The deployment of Artificial Intelligence (AI) techniques applied to remote Unmanned Aerial Vehicles has led the author to investigate applications of CAS within the field of Autonomous Multi-Agent Systems. The core objective of current research efforts is the simplicity of Intelligent Agents (IA) and the modeling of these agents within complex systems. This research effort looks at the communication, interaction, and adaptability of multi-agents as applied to complex systems control. The embodiment concept applied to robotics has application possibilities within multi-agent frameworks. A new framework for agent awareness within a virtual 3D world becomes possible when the vehicle is composed of collaborative agents. This approach has many possible applications to complex systems. This paper describes an approach that applies this virtual framework to the NASA Goddard Space Flight Center (GSFC) tetrahedron structure developed under the Autonomous Nano Technology Swarm (ANTS) program and the Super Miniaturized Addressable Reconfigurable Technology (SMART) architecture program. These projects represent an innovative set of novel concepts deploying adaptable, self-organizing structures composed of many tetrahedrons. This technology is pushing current applied agent concepts to new levels of requirements and adaptability.

  11. Predicting the future: opportunities and challenges for the chemical industry to apply 21st-century toxicity testing.

    PubMed

    Settivari, Raja S; Ball, Nicholas; Murphy, Lynea; Rasoulpour, Reza; Boverhof, Darrell R; Carney, Edward W

    2015-03-01

    Interest in applying 21st-century toxicity testing tools for safety assessment of industrial chemicals is growing. Whereas conventional toxicology uses mainly animal-based, descriptive methods, a paradigm shift is emerging in which computational approaches, systems biology, high-throughput in vitro toxicity assays, and high-throughput exposure assessments are beginning to be applied to mechanism-based risk assessments in a time- and resource-efficient fashion. Here we describe recent advances in predictive safety assessment, with a focus on their strategic application to meet the changing demands of the chemical industry and its stakeholders. The opportunities to apply these new approaches are extensive and include screening of new chemicals, informing the design of safer and more sustainable chemical alternatives, filling information gaps on data-poor chemicals already in commerce, strengthening read-across methodology for categories of chemicals sharing similar modes of action, and optimizing the design of reduced-risk product formulations. Finally, we discuss how these predictive approaches dovetail with in vivo integrated testing strategies within repeated-dose regulatory toxicity studies, which are in line with 3Rs principles to refine, reduce, and replace animal testing. Strategic application of these tools is the foundation for informed and efficient safety assessment testing strategies that can be applied at all stages of the product-development process.

  12. Capteur de CO2 à fibres optiques par absorption moléculaire à 4,3 μm

    NASA Astrophysics Data System (ADS)

    Bendamardji, S.; Alayli, Y.; Huard, S.

    1996-04-01

    This paper describes a remote optical-fibre sensor for carbon dioxide detection by molecular absorption in the mid-infrared (4.3 μm), corresponding to the fundamental ν3 mode. To overcome the strong signal attenuation of optical fibre at this wavelength, the link between the measurement site and the control site is provided by a standard 50/125 optical fibre after an opto-powered wavelength transposition from 4.3 μm to 860 nm. The absorption was simulated with an original model of the absorption spectrum, and the established calibration curves allow the sensor to detect partial pressures greater than 100 μbar with a minimum error margin of 100 μbar, which is acceptable for the intended use of the device: monitoring the CO2 level in enriched agricultural greenhouses.
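    The sensing principle can be sketched with the Beer-Lambert law: CO2 partial pressure is inferred from the attenuation of the 4.3 μm band over a known path length. The absorption coefficient per unit pressure and the path length below are illustrative assumptions, not the paper's calibration.

    ```python
    import math

    ALPHA = 0.02     # absorption per (cm * mbar) at 4.3 um, assumed
    PATH_CM = 10.0   # optical path length through the gas cell, assumed

    def transmitted_fraction(p_mbar):
        """Beer-Lambert transmission I/I0 for CO2 partial pressure p (mbar)."""
        return math.exp(-ALPHA * PATH_CM * p_mbar)

    def pressure_from_transmission(t):
        """Invert the calibration curve: I/I0 -> partial pressure in mbar."""
        return -math.log(t) / (ALPHA * PATH_CM)

    p = 0.5                           # 500 ubar of CO2
    t = transmitted_fraction(p)       # what the detector would measure
    print(round(pressure_from_transmission(t), 6))
    ```

    The reported 100 μbar error margin corresponds to the smallest change in I/I0 the detection chain can resolve after the 4.3 μm to 860 nm transposition.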

  13. Commerce de détail de l'essence automobile: Modélisation de l'impact à court terme des facteurs endogènes et exogènes sur les ventes d'essence dans les stations-service à Montréal

    NASA Astrophysics Data System (ADS)

    Nguimbus, Raphael

    Determining the impact of the controllable and uncontrollable factors that influence the sales volumes of retail outlets selling homogeneous, highly substitutable products is the core of this thesis. The aim is to estimate a set of stable, asymptotically efficient coefficients uncorrelated with the random site-specific effects of gasoline stations in the Montreal market (Quebec, Canada) over the period 1993-1997. The econometric model thus specified and tested isolates a set of four variables: the retail price posted at a regular gasoline site, the site's service capacity during peak hours, the hours of service, and the number of competing sites within a two-kilometre radius. These four factors influence gasoline sales at service stations. Panel data and robust estimation methods (a minimum-distance estimator) are used to estimate the parameters of the sales model. We start from the general hypothesis that each site develops an attraction force that draws motorist customers to it and allows it to generate sales. This attraction capacity varies from site to site owing to the combination of the site's marketing effort and the competitive environment around it. The notions of neighbourhood and spatial competition explain the behaviour of the decision-makers who manage the sites. The goal of this thesis is to develop a decision-support tool (an analytical model) that allows managers of service-station chains to allocate commercial resources efficiently across points of sale.

  14. Étude de la dynamique des porteurs dans des nanofils de silicium par spectroscopie térahertz

    NASA Astrophysics Data System (ADS)

    Beaudoin, Alexandre

    This thesis presents a study of the electrical-conduction properties and the temporal dynamics of charge carriers in silicon nanowires probed by terahertz radiation. Unintentionally doped and n-type doped silicon nanowires are compared for different configurations of the experimental setup. Transmission terahertz spectroscopy measurements show that the presence of dopants in the nanowires can be detected through their absorption of terahertz radiation (~1-12 meV). The difficulties of modelling the transmission of an electromagnetic pulse through a nanowire system are also discussed. Differential detection, a modification of the terahertz spectroscopy system, is tested and its performance compared with the standard characterization setup; instructions and recommendations for implementing this type of measurement are included. Results of an optical-pump/terahertz-probe experiment are also presented. In this experiment, the charge carriers created temporarily by absorption of the optical pump (λ ~ 800 nm) in the nanowires (the photocarriers) add to the carriers initially present and therefore increase the terahertz absorption. First, the anisotropy of the absorption of both the terahertz radiation and the optical pump by the nanowires is demonstrated. Second, the photocarrier recombination time is studied as a function of the number of injected photocarriers, and a hypothesis explaining the behaviour observed for the undoped and n-doped nanowires is presented. Third, the photoconductivity is extracted for the undoped and n-doped nanowires over the range 0.5 to 2 THz; a fit to the photoconductivity provides an estimate of the number of dopants in the n-doped nanowires. Keywords: nanowire, silicon, terahertz, conductivity, spectroscopy, photoconductivity.

  15. A Null Space Control of Two Wheels Driven Mobile Manipulator Using Passivity Theory

    NASA Astrophysics Data System (ADS)

    Shibata, Tsuyoshi; Murakami, Toshiyuki

    This paper describes a control strategy for the null-space motion of a two-wheel-driven mobile manipulator. Robots are now used in many industrial fields, and it is preferable for a robot manipulator to have multiple degrees of freedom. Several kinematic treatments of null-space motion have been proposed, but their stability analysis is insufficient; moreover, these approaches apply to stable systems and do not extend to unstable ones. In this research the base of the manipulator is a two-wheel-driven mobile robot, so the combined system, called a two-wheel-driven mobile manipulator, is unstable. In the proposed approach, the null-space control design uses passivity-based stabilization: the controller is chosen so that the closed-loop robot dynamics satisfy passivity. The control strategy then stabilizes the robot system through a workspace-observer-based approach and null-space control while keeping the end-effector position. The validity of the proposed approach is verified by simulations and experiments on a two-wheel-driven mobile manipulator.

  16. The Tao of Supervision: Taoist Insights into the Theory and Practice of Educational Supervision.

    ERIC Educational Resources Information Center

    Glanz, Jeffrey

    1997-01-01

    There are three approaches to educational supervision: the applied science approach, the interpretive-practical approach, and the critical/emancipatory approach. From a Taoist perspective, conflicting supervision theories or proposals should be welcomed, not resisted. By accepting a diversity of views to inform practice, a balance or centeredness…

  17. A Model for Applying Lexical Approach in Teaching Russian Grammar.

    ERIC Educational Resources Information Center

    Gettys, Serafima

    The lexical approach to teaching Russian grammar is explained, an instructional sequence is outlined, and a classroom study testing the effectiveness of the approach is reported. The lexical approach draws on research on cognitive psychology, second language acquisition theory, and research on learner language. Its bases in research and its…

  18. Developing a Materialist Anti-Racist Approach to Language Activism

    ERIC Educational Resources Information Center

    Flores, Nelson

    2017-01-01

    The aim of this paper is to propose a materialist anti-racist approach to language activism. This approach combines Joshua Fishman's pioneering work on language activism with critical race theory and the recent materialist turn in applied linguistics. A materialist anti-racist approach to language activism positions language policy within broader…

  19. Causal Models for Mediation Analysis: An Introduction to Structural Mean Models.

    PubMed

    Zheng, Cheng; Atkins, David C; Zhou, Xiao-Hua; Rhew, Isaac C

    2015-01-01

    Mediation analyses are critical to understanding why behavioral interventions work. To yield a causal interpretation, common mediation approaches must make an assumption of "sequential ignorability." The current article describes an alternative approach to causal mediation called structural mean models (SMMs). A specific SMM called a rank-preserving model (RPM) is introduced in the context of an applied example. Particular attention is given to the assumptions of both approaches to mediation. Applying both mediation approaches to the college-student drinking data yields notable differences in the magnitude of effects. Simulated examples reveal instances in which the traditional approach can yield strongly biased results, whereas the RPM approach remains unbiased in these cases. At the same time, the RPM approach has its own assumptions that must be met for correct inference, such as the existence of a covariate that strongly moderates the effect of the intervention on the mediator, and no unmeasured confounders that also moderate the effect of the intervention or the mediator on the outcome. The RPM approach to mediation offers an alternative way to perform mediation analysis when there may be unmeasured confounders.
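    For contrast with the SMM approach, the "traditional" product-of-coefficients mediation the article refers to can be sketched in a few lines: fit M ~ X and Y ~ X + M by ordinary least squares and take the indirect effect as a·b. The data are simulated here (not the college-student drinking data), with the data-generating values chosen so the population indirect effect is 0.8 × 0.5 = 0.4.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000
    x = rng.integers(0, 2, n).astype(float)     # intervention indicator
    m = 0.8 * x + rng.normal(size=n)            # mediator    (a = 0.8)
    y = 0.5 * m + 0.3 * x + rng.normal(size=n)  # outcome     (b = 0.5)

    def ols(X, target):
        """Least-squares coefficients for target ~ X."""
        return np.linalg.lstsq(X, target, rcond=None)[0]

    ones = np.ones(n)
    a = ols(np.column_stack([ones, x]), m)[1]       # effect of X on M
    b = ols(np.column_stack([ones, x, m]), y)[2]    # effect of M on Y given X
    print(round(a * b, 2))  # estimated indirect effect
    ```

    The bias the article warns about appears when an unmeasured confounder drives both m and y: sequential ignorability then fails and a·b no longer estimates the causal indirect effect, which is the situation SMMs are designed to handle.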

  20. Deriving a no expected sensitization induction level for fragrance ingredients without animal testing: An integrated approach applied to specific case studies.

    PubMed

    Natsch, Andreas; Emter, Roger; Haupt, Tina; Ellis, Graham

    2018-06-01

    Cosmetic regulations prohibit animal testing for the purpose of safety assessment and recent REACH guidance states that the local lymph node assay (LLNA) in mice shall only be conducted if in vitro data cannot give sufficient information for classification and labelling. However, Quantitative Risk Assessment (QRA) for fragrance ingredients requires a NESIL, a dose not expected to cause induction of skin sensitization in humans. In the absence of human data, this is derived from the LLNA, and it remains a key challenge for risk assessors to derive this value from non-animal data. Here we present a workflow using structural information, reactivity data and KeratinoSens results to predict an LLNA result as a point of departure. Specific additional tests (metabolic activation, complementary reactivity tests) are applied in selected cases depending on the chemical domain of a molecule. Finally, in vitro and in vivo data on close analogues are used to estimate uncertainty of the prediction in the specific chemical domain. This approach was applied to three molecules which were subsequently tested in the LLNA and 22 molecules with available and sometimes discordant human and LLNA data. Four additional case studies illustrate how this approach is being applied to recently developed molecules in the absence of animal data. Estimation of uncertainty and how this can be applied to determine a final NESIL for risk assessment is discussed. We conclude that, in the data-rich domain of fragrance ingredients, sensitization risk assessment without animal testing is possible in most cases by this integrated approach.

  1. Development of theoretical approach for describing electronic properties of hetero-interface systems under applied bias voltage.

    PubMed

    Iida, Kenji; Noda, Masashi; Nobusada, Katsuyuki

    2017-02-28

    We have developed a theoretical approach for describing the electronic properties of hetero-interface systems under an applied electrode bias. The finite-temperature density functional theory is employed for controlling the chemical potential in their interfacial region, and thereby the electronic charge of the system is obtained. The electric field generated by the electronic charging is described as a saw-tooth-like electrostatic potential. Because of the continuum approximation of dielectrics sandwiched between electrodes, we treat dielectrics with thicknesses in a wide range from a few nanometers to more than several meters. Furthermore, the approach is implemented in our original computational program named grid-based coupled electron and electromagnetic field dynamics (GCEED), facilitating its application to nanostructures. Thus, the approach is capable of comprehensively revealing electronic structure changes in hetero-interface systems with an applied bias that are practically useful for experimental studies. We calculate the electronic structure of a SiO2-graphene-boron nitride (BN) system in which an electrode bias is applied between the graphene layer and an electrode attached on the SiO2 film. The electronic energy barrier between graphene and BN is varied with an applied bias, and the energy variation depends on the thickness of the BN film. This is because the density of states of graphene is so low that the graphene layer cannot fully screen the electric field generated by the electrodes. We have demonstrated that the electronic properties of hetero-interface systems are well controlled by the combination of the electronic charging and the generated electric field.

  2. Ecological hierarchies and self-organisation - Pattern analysis, modelling and process integration across scales

    USGS Publications Warehouse

    Reuter, H.; Jopp, F.; Blanco-Moreno, J. M.; Damgaard, C.; Matsinos, Y.; DeAngelis, D.L.

    2010-01-01

    A continuing discussion in applied and theoretical ecology focuses on the relationship of different organisational levels and on how ecological systems interact across scales. We address principal approaches to cope with complex across-level issues in ecology by applying elements of hierarchy theory and the theory of complex adaptive systems. A top-down approach, often characterised by the use of statistical techniques, can be applied to analyse large-scale dynamics and identify constraints exerted on lower levels. Current developments are illustrated with examples from the analysis of within-community spatial patterns and large-scale vegetation patterns. A bottom-up approach allows one to elucidate how interactions of individuals shape dynamics at higher levels in a self-organisation process; e.g., population development and community composition. This may be facilitated by various modelling tools, which provide the distinction between focal levels and resulting properties. For instance, resilience in grassland communities has been analysed with a cellular automaton approach, and the driving forces in rodent population oscillations have been identified with an agent-based model. Both modelling tools illustrate the principles of analysing higher level processes by representing the interactions of basic components. The focus of most ecological investigations on either top-down or bottom-up approaches may not be appropriate, if strong cross-scale relationships predominate. Here, we propose an 'across-scale-approach', closely interweaving the inherent potentials of both approaches. This combination of analytical and synthesising approaches will enable ecologists to establish a more coherent access to cross-level interactions in ecological systems. © 2010 Gesellschaft für Ökologie.

  3. Tennis Coaching: Applying the Game Sense Approach

    ERIC Educational Resources Information Center

    Pill, Shane; Hewitt, Mitchell

    2017-01-01

    This article demonstrates the game sense approach for teaching tennis to novice players. In a game sense approach, learning is positioned within modified games to emphasize the way rules shape game behavior, tactical awareness, decision-making and the development of contextualized stroke mechanics.

  4. Long-Term Memory: A State-Space Approach

    ERIC Educational Resources Information Center

    Kiss, George R.

    1972-01-01

    Some salient concepts derived from the information sciences and currently used in theories of human memory are critically reviewed. The application of automata theory is proposed as a new approach in this field. The approach is illustrated by applying it to verbal memory. (Author)

  5. The adaptive, cut-cell Cartesian approach (warts and all)

    NASA Technical Reports Server (NTRS)

    Powell, Kenneth G.

    1995-01-01

    Solution-adaptive methods based on cutting bodies out of Cartesian grids are gaining popularity now that the ways of circumventing the accuracy problems associated with small cut cells have been developed. Researchers are applying Cartesian-based schemes to a broad class of problems now, and, although there is still development work to be done, it is becoming clearer which problems are best suited to the approach (and which are not). The purpose of this paper is to give a candid assessment, based on applying Cartesian schemes to a variety of problems, of the strengths and weaknesses of the approach as it is currently implemented.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goins, Bobby

    A systems based approach will be used to evaluate the nitrogen delivery process. This approach involves principles found in Lean, Reliability, Systems Thinking, and Requirements. This unique combination of principles and thought process yields a very in depth look into the system to which it is applied. By applying a systems based approach to the nitrogen delivery process there should be improvements in cycle time, efficiency, and a reduction in the required number of personnel needed to sustain the delivery process. This will in turn reduce the amount of demurrage charges that the site incurs. In addition there should be less frustration associated with the delivery process.

  7. Economic order quantity (EOQ) by game theory approach in probabilistic supply chain system under service level constraint for items with imperfect quality

    NASA Astrophysics Data System (ADS)

    Setiawan, R.

    2018-03-01

    In this paper, the Economic Order Quantity (EOQ) of a probabilistic two-level supply-chain system for items with imperfect quality is analyzed under a service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. Mathematical analysis of the optimal result is delivered using two equilibrium concepts from the game theory approach: Stackelberg equilibrium for the cooperative strategy and Stackelberg equilibrium for the noncooperative strategy. This is a new game theory result for inventory systems in which a service level constraint is applied by a firm in its moves.
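    For context, the classical deterministic EOQ that probabilistic two-level models of this kind generalize can be sketched in a few lines; the input values below are illustrative assumptions, not figures from the record:

```python
from math import sqrt

def eoq(demand_rate, order_cost, holding_cost):
    """Classical deterministic EOQ: Q* = sqrt(2 * D * K / h)."""
    return sqrt(2 * demand_rate * order_cost / holding_cost)

# Illustrative inputs: 1200 units/year demand, $50 per order, $6/unit/year holding
q_star = eoq(1200, 50, 6)  # about 141.4 units per order
```

    The paper's contribution lies in what this formula omits: imperfect quality, probabilistic demand, a service level constraint, and the strategic interaction between the two supply-chain levels.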

  8. Classroom Interaction: A Sociological Approach

    ERIC Educational Resources Information Center

    Calonico, James M.; Calonico, Beth Ann

    1972-01-01

    The authors employ Bales' IPA and apply hypotheses from Homans' "Human Group" to present a sociological approach to the scientific study of classroom interaction at the elementary school level. (Authors)

  9. What methods are used to apply positive deviance within healthcare organisations? A systematic review

    PubMed Central

    Baxter, Ruth; Taylor, Natalie; Kellar, Ian; Lawton, Rebecca

    2016-01-01

    Background The positive deviance approach focuses on those who demonstrate exceptional performance, despite facing the same constraints as others. ‘Positive deviants’ are identified and hypotheses about how they succeed are generated. These hypotheses are tested and then disseminated within the wider community. The positive deviance approach is being increasingly applied within healthcare organisations, although limited guidance exists and different methods, of varying quality, are used. This paper systematically reviews healthcare applications of the positive deviance approach to explore how positive deviance is defined, the quality of existing applications and the methods used within them, including the extent to which staff and patients are involved. Methods Peer-reviewed articles, published prior to September 2014, reporting empirical research on the use of the positive deviance approach within healthcare, were identified from seven electronic databases. A previously defined four-stage process for positive deviance in healthcare was used as the basis for data extraction. Quality assessments were conducted using a validated tool, and a narrative synthesis approach was followed. Results 37 of 818 articles met the inclusion criteria. The positive deviance approach was most frequently applied within North America, in secondary care, and to address healthcare-associated infections. Research predominantly identified positive deviants and generated hypotheses about how they succeeded. The approach and processes followed were poorly defined. Research quality was low, articles lacked detail and comparison groups were rarely included. Applications of positive deviance typically lacked staff and/or patient involvement, and the methods used often required extensive resources. Conclusion Further research is required to develop high quality yet practical methods which involve staff and patients in all stages of the positive deviance approach. 
The efficacy and efficiency of positive deviance must be assessed and compared with other quality improvement approaches. PROSPERO registration number CRD42014009365. PMID:26590198

  10. Jointly modeling longitudinal proportional data and survival times with an application to the quality of life data in a breast cancer trial.

    PubMed

    Song, Hui; Peng, Yingwei; Tu, Dongsheng

    2017-04-01

    Motivated by the joint analysis of longitudinal quality of life data and recurrence free survival times from a cancer clinical trial, we present in this paper two approaches to jointly model the longitudinal proportional measurements, which are confined in a finite interval, and survival data. Both approaches assume a proportional hazards model for the survival times. For the longitudinal component, the first approach applies the classical linear mixed model to logit transformed responses, while the second approach directly models the responses using a simplex distribution. A semiparametric method based on a penalized joint likelihood generated by the Laplace approximation is derived to fit the joint model defined by the second approach. The proposed procedures are evaluated in a simulation study and applied to the analysis of the breast cancer data that motivated this research.
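    The logit transformation used by the first approach can be sketched as a generic helper (not the authors' code; the clipping constant is an assumption to keep boundary proportions finite):

```python
import numpy as np

def logit(p, eps=1e-6):
    """Map proportions in [0, 1] to the real line; eps avoids infinities at the bounds."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    return np.log(p / (1 - p))

scores = logit([0.5, 0.9, 1.0])  # 1.0 is clipped rather than mapped to +inf
```

    After this transform the responses are unbounded, so a classical linear mixed model can be fit to them; the second approach avoids the transform by modeling the proportions directly with a simplex distribution.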

  11. Shifting visions: "delegation" policies and the building of a "rights-based" approach to maternal mortality.

    PubMed

    Freedman, Lynn P

    2002-01-01

    "Rights-based" approaches fold human rights principles into the ongoing work of health policy making and programming. The example of delegation of anesthesia provision for emergency obstetric care is used to demonstrate how a rights-based approach, applied to this problem in the context of high-mortality countries, requires decision makers to shift from an individual, ethics-based, clinical perspective to a structural, rights-based, public health perspective. This fluid and context-sensitive approach to human rights also applies at the international level, where the direction of overall maternal mortality reduction strategy is set. By contrasting family planning programs and maternal mortality programs, this commentary argues for choosing the human rights approach that speaks most effectively to the power dynamics underlying the particular health problem being addressed. In the case of maternal death in high-mortality countries, this means a strategic focus on the health care system itself.

  12. Monitoring hemodynamics and oxygenation of the kidney in rats by a combined near-infrared spectroscopy and invasive probe approach

    NASA Astrophysics Data System (ADS)

    Grosenick, Dirk; Cantow, Kathleen; Arakelyan, Karen; Wabnitz, Heidrun; Flemming, Bert; Skalweit, Angela; Ladwig, Mechthild; Macdonald, Rainer; Niendorf, Thoralf; Seeliger, Erdmann

    2015-07-01

    We have developed a hybrid approach to investigate the dynamics of perfusion and oxygenation in the kidney of rats under pathophysiologically relevant conditions. Our approach combines near-infrared spectroscopy to quantify hemoglobin concentration and oxygen saturation in the renal cortex, and an invasive probe method for measuring total renal blood flow by an ultrasonic probe, perfusion by laser-Doppler fluxmetry, and tissue oxygen tension via fluorescence quenching. Hemoglobin concentration and oxygen saturation were determined from experimental data by a Monte Carlo model. The hybrid approach was applied to investigate and compare temporal changes during several types of interventions such as arterial and venous occlusions, as well as hyperoxia, hypoxia and hypercapnia induced by different mixtures of the inspired gas. The approach was also applied to study the effects of the x-ray contrast medium iodixanol on the kidney.

  13. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  14. Comparison of Pixel-Based and Object-Based Classification Using Parameters and Non-Parameters Approach for the Pattern Consistency of Multi Scale Landcover

    NASA Astrophysics Data System (ADS)

    Juniati, E.; Arrofiqoh, E. N.

    2017-09-01

    Information extraction from remote sensing data, especially land cover, can be obtained by digital classification. In practice, some people are more comfortable using visual interpretation to retrieve land cover information. However, visual interpretation is highly influenced by the subjectivity and knowledge of the interpreter, and it also takes time. Digital classification can be done in several ways, depending on the defined mapping approach and the assumptions on data distribution. This study compared several classification methods for different data types at the same location. The data used were Landsat 8 satellite imagery, SPOT 6 imagery and orthophotos. In practice, these data are used to produce land cover maps at 1:50,000 scale for Landsat, 1:25,000 scale for SPOT and 1:5,000 scale for orthophotos, but using visual interpretation to retrieve the information. Maximum likelihood classifiers (MLC), which use a pixel-based and parametric approach, were applied to these data, as were artificial neural network classifiers, which use a pixel-based and non-parametric approach. Moreover, this study applied object-based classifiers to the data. The classification system implemented is the land cover classification of the Indonesian topographic map. The classification was applied to each data source, with the aim of recognizing the pattern and assessing the consistency of the land cover map produced by each data set. Furthermore, the study analyses the benefits and limitations of each method.
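    A maximum-likelihood (Gaussian, parametric, pixel-based) classifier of the kind compared in the study can be sketched on toy two-band "pixels"; the class signatures and sample values below are invented for illustration, not taken from the record:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy "pixels": two spectral bands, two land-cover classes (0 = water, 1 = forest)
water = rng.normal([0.1, 0.2], 0.05, size=(100, 2))
forest = rng.normal([0.4, 0.6], 0.05, size=(100, 2))
X = np.vstack([water, forest])
labels = np.array([0] * 100 + [1] * 100)

def fit_gaussian_ml(X, labels):
    """Estimate per-class mean and covariance (the MLC's parametric assumption)."""
    return {c: (X[labels == c].mean(axis=0), np.cov(X[labels == c].T))
            for c in np.unique(labels)}

def classify(params, pixel):
    """Assign the pixel to the class with the highest Gaussian log-likelihood."""
    def loglik(mean, cov):
        d = pixel - mean
        return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))
    return max(params, key=lambda c: loglik(*params[c]))

params = fit_gaussian_ml(X, labels)
pred = classify(params, np.array([0.12, 0.22]))  # spectrally close to the water class
```

    A non-parametric alternative such as an artificial neural network drops the Gaussian assumption, while object-based classification first segments the image and classifies segments rather than individual pixels.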

  15. Applying the Network Simulation Method for testing chaos in a resistively and capacitively shunted Josephson junction model

    NASA Astrophysics Data System (ADS)

    Bellver, Fernando Gimeno; Garratón, Manuel Caravaca; Soto Meca, Antonio; López, Juan Antonio Vera; Guirao, Juan L. G.; Fernández-Martínez, Manuel

    In this paper, we explore the chaotic behavior of resistively and capacitively shunted Josephson junctions via the so-called Network Simulation Method. Such a numerical approach establishes a formal equivalence between physical transport processes and electrical networks, and hence, it can be applied to deal efficiently with a wide range of differential systems. The generality underlying that electrical equivalence allows one to apply circuit theory to several scientific and technological problems. In this work, the Fast Fourier Transform has been applied for chaos detection purposes and the calculations have been carried out in PSpice, an electrical circuit simulation package. Overall, it holds that such a numerical approach leads to fast computational solutions of Josephson differential models. An empirical application regarding the study of the Josephson model completes the paper.
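    FFT-based chaos detection of the kind mentioned rests on the fact that a limit cycle concentrates spectral power in a few discrete peaks, while chaotic dynamics spread it across a broad band. A minimal sketch, using a logistic map as a stand-in chaotic series rather than a simulated Josephson junction:

```python
import numpy as np

def spectral_concentration(x):
    """Fraction of total power in the strongest FFT bin (near 1 => periodic)."""
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2
    return power.max() / power.sum()

n = 4096
periodic = np.sin(2 * np.pi * np.arange(n) / 64)  # clean limit cycle, 64 cycles

# Logistic map at r = 4: a standard chaotic series with a broadband spectrum
x, chaotic = 0.3, np.empty(n)
for i in range(n):
    x = 4.0 * x * (1.0 - x)
    chaotic[i] = x
```

    The periodic signal puts essentially all of its power in one bin, whereas the chaotic series spreads power across the spectrum, which is the signature the FFT-based test looks for.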

  16. Diagnostics and Active Control of Aircraft Interior Noise

    NASA Technical Reports Server (NTRS)

    Fuller, C. R.

    1998-01-01

    This project deals with developing advanced methods for investigating and controlling interior noise in aircraft. The work concentrates on developing and applying the techniques of Near Field Acoustic Holography (NAH) and Principal Component Analysis (PCA) to the aircraft interior noise dynamic problem. This involves investigating the current state of the art, developing new techniques and then applying them to the particular problem being studied. The knowledge gained under the first part of the project was then used to develop and apply new, advanced noise control techniques for reducing interior noise. A new fully active control approach based on the PCA was developed and implemented on a test cylinder. Finally an active-passive approach based on tunable vibration absorbers was to be developed and analytically applied to a range of test structures from simple plates to aircraft fuselages.

  17. Toward methodological emancipation in applied health research.

    PubMed

    Thorne, Sally

    2011-04-01

    In this article, I trace the historical groundings of what have become methodological conventions in the use of qualitative approaches to answer questions arising from the applied health disciplines and advocate an alternative logic more strategically grounded in the epistemological orientations of the professional health disciplines. I argue for an increasing emphasis on the modification of conventional qualitative approaches to the particular knowledge demands of the applied practice domain, challenging the merits of what may have become unwarranted attachment to theorizing. Reorienting our methodological toolkits toward the questions arising within an evidence-dominated policy agenda, I encourage my applied health disciplinary colleagues to make themselves useful to that larger project by illuminating that which quantitative research renders invisible, problematizing the assumptions on which it generates conclusions, and filling in the gaps in knowledge needed to make decisions on behalf of people and populations.

  18. Carrying BioMath education in a Leaky Bucket.

    PubMed

    Powell, James A; Kohler, Brynja R; Haefner, James W; Bodily, Janice

    2012-09-01

    In this paper, we describe a project-based mathematical lab implemented in our Applied Mathematics in Biology course. The Leaky Bucket Lab allows students to parameterize and test Torricelli's law and develop and compare their own alternative models to describe the dynamics of water draining from perforated containers. In the context of this lab students build facility in a variety of applied biomathematical tools and gain confidence in applying these tools in data-driven environments. We survey analytic approaches developed by students to illustrate the creativity this encourages as well as prepare other instructors to scaffold the student learning experience. Pedagogical results based on classroom videography support the notion that the Biology-Applied Math Instructional Model, the teaching framework encompassing the lab, is effective in encouraging and maintaining high-level cognition among students. Research-based pedagogical approaches that support the lab are discussed.

  19. “Nobody tosses a dwarf!” The relation between the empirical and normative reexamined

    PubMed Central

    Leget, C.; Borry, P.; De Vries, R.

    2009-01-01

    This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as point of departure because it challenges the reader to look upon several central bioethical themes – including human dignity, autonomy, and the protection of vulnerable people – with fresh eyes. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theorist ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. The approach we endorse acknowledges that a social practice can and should be judged by both the gathering of empirical data and by the normative ethics. Critical applied ethics uses a five stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical poles that, in our view, should operate as two independent focuses of the ellipse that is called bioethics. We conclude by applying our five stage critical applied ethics to the example of dwarf tossing. PMID:19338523

  20. 'Nobody tosses a dwarf!' The relation between the empirical and the normative reexamined.

    PubMed

    Leget, Carlo; Borry, Pascal; de Vries, Raymond

    2009-05-01

    This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as a point of departure because it challenges the reader to look with fresh eyes upon several central bioethical themes, including human dignity, autonomy, and the protection of vulnerable people. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theoretical ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. Against efforts fully to integrate the normative and the empirical into one synthesis, we propose that the two should stand in tension and relation to one another. The approach we endorse acknowledges that a social practice can and should be judged both by the gathering of empirical data and by normative ethics. Critical applied ethics uses a five stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical point of view. We conclude by applying our five-stage critical applied ethics to the example of dwarf tossing.

  1. Evaluation of different approaches to modeling the second-order ionospheric delay on GPS measurements

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, M.; Desai, S. D.; Butala, M. D.; Komjathy, A.

    2013-12-01

    This work evaluates various approaches to compute the second order ionospheric correction (SOIC) to Global Positioning System (GPS) measurements. When estimating the reference frame using GPS, applying this correction is known to primarily affect the realization of the origin of the Earth's reference frame along the spin axis (Z coordinate). Therefore, the Z translation relative to the International Terrestrial Reference Frame 2008 is used as the metric to evaluate various published approaches to determining the slant total electron content (TEC) for the SOIC: obtaining the slant TEC directly from GPS measurements, or using the vertical TEC given by a Global Ionospheric Model (GIM) and transforming it to slant TEC via a mapping function. All of these approaches agree to 1 mm if the ionospheric shell height needed in GIM-based approaches is set to 600 km. The commonly used shell height of 450 km introduces an offset of 1 to 2 mm. When the SOIC is not applied, the Z axis translation can be reasonably modeled with a ratio of +0.23 mm/TEC unit of the daily median GIM vertical TEC. Also, precise point positioning (PPP) solutions (positions and clocks) determined with and without the SOIC differ by less than 1 mm only if they are based upon GPS orbit and clock solutions that have consistently applied or not applied the correction, respectively. Otherwise, deviations of a few millimeters in the north component of the PPP solutions can arise due to inconsistencies with the satellite orbit and clock products, and those deviations exhibit a dependency on solar cycle conditions.
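    The quoted +0.23 mm/TEC unit ratio lets one roughly estimate the Z-translation error incurred by omitting the SOIC; the 20 TECU input below is an illustrative value, not one taken from the record:

```python
def z_translation_mm(median_vtec_tecu, ratio_mm_per_tecu=0.23):
    """Approximate Z-axis translation (mm) when the SOIC is omitted,
    using the ratio quoted in the abstract."""
    return ratio_mm_per_tecu * median_vtec_tecu

dz = z_translation_mm(20.0)  # assume a daily median vertical TEC of 20 TECU
```

    At 20 TECU this gives a few millimeters of Z translation, which is why the correction matters for millimeter-level reference frame work.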

  2. Spatial-temporal event detection in climate parameter imagery.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKenna, Sean Andrew; Gutierrez, Karen A.

    Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.
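    Regression-based anomaly detection of this general kind can be sketched for a single pixel's weekly NDVI series: fit a trend, then flag residuals beyond a robust threshold. All values below are synthetic illustrations, not the report's data or exact method:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(52, dtype=float)                    # 52 weekly images
ndvi = 0.5 + 0.002 * t + rng.normal(0, 0.01, 52)  # one pixel's NDVI series
ndvi[40] -= 0.2                                   # injected anomaly (e.g. vegetation loss)

# Fit a linear trend, then flag residuals beyond a robust 3-sigma threshold
coeffs = np.polyfit(t, ndvi, 1)
resid = ndvi - np.polyval(coeffs, t)
mad = np.median(np.abs(resid - np.median(resid)))
flags = np.abs(resid - np.median(resid)) > 3 * 1.4826 * mad
```

    Applied per pixel across an image stack, this kind of residual test yields spatial maps of where and when the observed signal departs from its regression model.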

  3. From a Reductionist to a Holistic Approach in Preventive Nutrition to Define New and More Ethical Paradigms

    PubMed Central

    Fardet, Anthony; Rock, Edmond

    2015-01-01

    This concept paper intends to define four new paradigms for improving nutrition research. First, the consequences of applying a reductionist versus a holistic approach to nutrition science will be discussed. The need for a more focused preventive nutrition approach, as opposed to a curative one, will then be presented on the basis of the ‘healthy core metabolism’ concept. This will lead us to propose a new classification of food products based on processing for future epidemiological studies. As a result of applying the holistic approach, health food potential will be redefined based on both food structure and nutrient density. These new paradigms should help define a more ethical preventive nutrition for humans to improve public recommendations while preserving the environment. PMID:27417812

  4. From a Reductionist to a Holistic Approach in Preventive Nutrition to Define New and More Ethical Paradigms.

    PubMed

    Fardet, Anthony; Rock, Edmond

    2015-10-28

    This concept paper intends to define four new paradigms for improving nutrition research. First, the consequences of applying a reductionist versus a holistic approach to nutrition science will be discussed. The need for a more focused preventive nutrition approach, as opposed to a curative one, will then be presented on the basis of the 'healthy core metabolism' concept. This will lead us to propose a new classification of food products based on processing for future epidemiological studies. As a result of applying the holistic approach, health food potential will be redefined based on both food structure and nutrient density. These new paradigms should help define a more ethical preventive nutrition for humans to improve public recommendations while preserving the environment.

  5. Instantaneous, phase-averaged, and time-averaged pressure from particle image velocimetry

    NASA Astrophysics Data System (ADS)

    de Kat, Roeland

    2015-11-01

    Recent work on pressure determination using velocity data from particle image velocimetry (PIV) resulted in approaches that allow for instantaneous and volumetric pressure determination. However, applying these approaches is not always feasible (e.g. due to resolution, access, or other constraints) or desired. In those cases pressure determination approaches using phase-averaged or time-averaged velocity provide an alternative. To assess the performance of these different pressure determination approaches against one another, they are applied to a single data set and their results are compared with each other and with surface pressure measurements. For this assessment, the data set of a flow around a square cylinder (de Kat & van Oudheusden, 2012, Exp. Fluids 52:1089-1106) is used. RdK is supported by a Leverhulme Trust Early Career Fellowship.

  6. Financial Planning for Information Technology: Conventional Approaches Need Not Apply.

    ERIC Educational Resources Information Center

    Falduto, Ellen F.

    1999-01-01

    Rapid advances in information technology have rendered conventional approaches to planning and budgeting useless, and no single method is universally appropriate. The most successful planning efforts are consistent with the institution's overall plan, and may combine conventional, opportunistic, and entrepreneurial approaches. Chief financial…

  7. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  8. Predictive failure analysis: planning for the worst so that it never happens!

    PubMed

    Hipple, Jack

    2008-01-01

This article reviews an alternative approach to failure analysis involving a deliberate saboteur's mindset rather than a checklist approach to disaster and emergency preparedness. The process takes the form of an algorithm that is easily applied to any planning situation.

  9. Procedures for Selecting Items for Computerized Adaptive Tests.

    ERIC Educational Resources Information Center

    Kingsbury, G. Gage; Zara, Anthony R.

    1989-01-01

    Several classical approaches and alternative approaches to item selection for computerized adaptive testing (CAT) are reviewed and compared. The study also describes procedures for constrained CAT that may be added to classical item selection approaches to allow them to be used for applied testing. (TJH)
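The maximum-information rule that underlies most classical CAT item selection can be sketched as follows; the 2PL model and all item parameters here are illustrative assumptions, not taken from the study:

```python
import numpy as np

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta (illustrative)."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

def select_next_item(theta_hat, a, b, administered):
    """Classical maximum-information rule: pick the unused item with the
    highest information at the current ability estimate."""
    info = item_information(theta_hat, a, b)
    info[list(administered)] = -np.inf   # exclude already-administered items
    return int(np.argmax(info))
```

Constrained CAT variants modify this rule, e.g. by restricting the candidate pool before taking the argmax.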

  10. Discovery of novel Pim-1 kinase inhibitors by a hierarchical multistage virtual screening approach based on SVM model, pharmacophore, and molecular docking.

    PubMed

    Ren, Ji-Xia; Li, Lin-Li; Zheng, Ren-Lin; Xie, Huan-Zhang; Cao, Zhi-Xing; Feng, Shan; Pan, You-Li; Chen, Xin; Wei, Yu-Quan; Yang, Sheng-Yong

    2011-06-27

In this investigation, we describe the discovery of novel potent Pim-1 inhibitors by employing a proposed hierarchical multistage virtual screening (VS) approach, which combines support vector machine-based VS (SB-VS), pharmacophore-based VS (PB-VS), and docking-based VS (DB-VS) methods. In this approach, the three VS methods are applied in increasing order of complexity, so that the first filter (SB-VS) is fast and simple, while successive ones (PB-VS and DB-VS) are more time-consuming but are applied only to a small subset of the entire database. Evaluation of this approach indicates that it can be used to screen a large chemical library rapidly with a high hit rate and a high enrichment factor. This approach was then applied to screen several large chemical libraries, including PubChem, Specs, and Enamine, as well as an in-house database. From the final hits, 47 compounds were selected for further in vitro Pim-1 inhibitory assays, and 15 compounds show nanomolar-level or low micromolar inhibition potency against Pim-1. In particular, four of them were found to have new scaffolds with potential for the chemical development of Pim-1 inhibitors.
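The cascade idea (a cheap filter first, an expensive score only on the survivors) can be sketched on toy data; the descriptors, class sizes, and the nearest-centroid stand-in for the SVM stage are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy compound "descriptors": 100 actives and 900 decoys (stand-ins for
# real fingerprints; all numbers are assumptions for illustration).
X = np.vstack([rng.normal(1.0, 1.0, (100, 8)), rng.normal(-1.0, 1.0, (900, 8))])
y = np.array([1] * 100 + [0] * 900)

# Stage 1: a cheap linear filter standing in for the SVM stage (SB-VS).
# Here a nearest-centroid rule; the paper trains a real SVM classifier.
w = X[y == 1].mean(0) - X[y == 0].mean(0)
b = -0.5 * w @ (X[y == 1].mean(0) + X[y == 0].mean(0))
survivors = np.where(X @ w + b > 0)[0]

# Stage 2: a costlier score (stand-in for the pharmacophore and docking
# stages), applied only to the much smaller surviving subset.
def expensive_score(i):
    return float(np.linalg.norm(X[i] - 1.0))

top_hits = sorted(survivors.tolist(), key=expensive_score)[:50]
```

The point of the hierarchy is visible in the sizes: the expensive score runs on roughly a tenth of the library.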

  11. The Aortic Bifurcation Angle as a Factor in Application of the Outback for Femoropopliteal Lesions in Ipsilateral Versus Contralateral Approaches.

    PubMed

    Raskin, Daniel; Khaitovich, Boris; Balan, Shmuel; Silverberg, Daniel; Halak, Moshe; Rimon, Uri

    2018-01-01

To assess the technical success of the Outback reentry device in contralateral versus ipsilateral approaches for femoropopliteal arterial occlusion. A retrospective review of patients treated for critical limb ischemia (CLI) using the Outback between January 2013 and July 2016 was performed. Age, gender, length and site of the occlusion, approach site, aortic bifurcation angle, and reentry site were recorded. A calcification score was assigned at both the aortic bifurcation and the reentry site, and technical success was assessed. During the study period, a total of 1300 endovascular procedures were performed on 489 patients for CLI. The Outback was applied to 50 femoropopliteal chronic total occlusions, via 39 contralateral and 11 ipsilateral antegrade femoral accesses. The device was used successfully in 41 patients (82%). There were nine failures, all in the contralateral approach group: six due to inability to deliver the device across an acute aortic bifurcation angle and three due to failure to achieve luminal reentry. Procedural success was significantly affected by the aortic bifurcation angle (p = 0.013). The Outback has high technical success rates in the treatment of femoropopliteal occlusion when applied from either an ipsilateral or a contralateral approach. With contralateral access, an acute aortic bifurcation angle predicts procedural failure.

  12. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment and apply it to the search for neutrinos from point sources. We discuss a test statistic defined within a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate. As such, we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust for low-count observations. Furthermore, it also allows previous upper limits obtained by other analyses to be taken into account via the concept of prior information, without the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
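For a generic counting experiment, the Bayesian upper limit described above reduces, under a flat prior on the signal rate and a known background, to integrating a Poisson likelihood; this minimal sketch is an illustration of that special case, not the authors' full method:

```python
import numpy as np
from scipy.stats import poisson

def bayesian_upper_limit(n_obs, background, cl=0.90, s_max=50.0, n_grid=50001):
    """Credible upper limit on a Poisson signal rate s with known
    background b, assuming a flat prior on s (illustrative)."""
    s = np.linspace(0.0, s_max, n_grid)
    post = poisson.pmf(n_obs, s + background)  # flat prior: posterior ∝ likelihood
    cdf = np.cumsum(post)
    cdf /= cdf[-1]                             # normalise the discrete CDF
    return float(s[np.searchsorted(cdf, cl)])  # smallest s with CDF >= cl
```

For zero observed counts and zero background the posterior is exp(-s), so the 90% limit is -ln(0.1) ≈ 2.30, a standard cross-check.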

  13. First-Year Students' Approaches to Learning, and Factors Related to Change or Stability in Their Deep Approach during a Pharmacy Course

    ERIC Educational Resources Information Center

    Varunki, Maaret; Katajavuori, Nina; Postareff, Liisa

    2017-01-01

    Research shows that a surface approach to learning is more common among students in the natural sciences, while students representing the "soft" sciences are more likely to apply a deep approach. However, findings conflict concerning the stability of approaches to learning in general. This study explores the variation in students'…

  14. The Right Approach in Practice: A Discussion of the Applicability of EFL Writing Practices in a Saudi Context

    ERIC Educational Resources Information Center

    Oraif, Iman M.

    2016-01-01

    The aim of this paper is to describe the different approaches applied to teaching writing in the L2 context and the way these different methods have been established so far. The perspectives include a product approach, genre approach and process approach. Each has its own merits and objectives for application. Regarding the study context, it may…

  15. Engine Data Interpretation System (EDIS)

    NASA Technical Reports Server (NTRS)

    Cost, Thomas L.; Hofmann, Martin O.

    1990-01-01

    A prototype of an expert system was developed which applies qualitative or model-based reasoning to the task of post-test analysis and diagnosis of data resulting from a rocket engine firing. A combined component-based and process theory approach is adopted as the basis for system modeling. Such an approach provides a framework for explaining both normal and deviant system behavior in terms of individual component functionality. The diagnosis function is applied to digitized sensor time-histories generated during engine firings. The generic system is applicable to any liquid rocket engine but was adapted specifically in this work to the Space Shuttle Main Engine (SSME). The system is applied to idealized data resulting from turbomachinery malfunction in the SSME.

  16. The Application of the Monte Carlo Approach to Cognitive Diagnostic Computerized Adaptive Testing With Content Constraints

    ERIC Educational Resources Information Center

    Mao, Xiuzhen; Xin, Tao

    2013-01-01

    The Monte Carlo approach which has previously been implemented in traditional computerized adaptive testing (CAT) is applied here to cognitive diagnostic CAT to test the ability of this approach to address multiple content constraints. The performance of the Monte Carlo approach is compared with the performance of the modified maximum global…

  17. Some Thoughts on Applied Geography.

    ERIC Educational Resources Information Center

    Gritzner, Charles F.

    1979-01-01

    The geography student should be offered the option of applied geography courses as well as the more conservative humanistic approach, in order to respond to the challenges presented by existing societal needs and vocational opportunities. (Author/CK)

  18. A Constructive Neural-Network Approach to Modeling Psychological Development

    ERIC Educational Resources Information Center

    Shultz, Thomas R.

    2012-01-01

    This article reviews a particular computational modeling approach to the study of psychological development--that of constructive neural networks. This approach is applied to a variety of developmental domains and issues, including Piagetian tasks, shift learning, language acquisition, number comparison, habituation of visual attention, concept…

  19. Learning and Information Approaches for Inference in Dynamic Data-Driven Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Ravela, S.

    2015-12-01

Many geophysical inference problems are characterized by non-linear processes, high-dimensional models and complex uncertainties. A dynamic coupling between models, estimation, and sampling is typically sought to efficiently characterize and reduce uncertainty. This process is, however, fraught with difficulties, chief among them the ability to deal with model errors and the efficacy of uncertainty quantification and data assimilation. In this presentation, we describe three key ideas from learning and intelligent systems theory and apply them to two geophysical applications. The first idea is the use of ensemble learning to compensate for model error, the second is to develop tractable information-theoretic learning to deal with non-Gaussianity in inference, and the third is a manifold resampling technique for effective uncertainty quantification. We apply these methods first to the development of a cooperative autonomous observing system using sUAS for studying coherent structures, and second to the problem of quantifying risk from hurricanes and storm surges in a changing climate. Results indicate that learning approaches can enable new effectiveness in cases where standard approaches to model reduction, uncertainty quantification and data assimilation fail.

  20. Predicting the Future: Opportunities and Challenges for the Chemical Industry to Apply 21st-Century Toxicity Testing

    PubMed Central

    Settivari, Raja S; Ball, Nicholas; Murphy, Lynea; Rasoulpour, Reza; Boverhof, Darrell R; Carney, Edward W

    2015-01-01

Interest in applying 21st-century toxicity testing tools for safety assessment of industrial chemicals is growing. Whereas conventional toxicology uses mainly animal-based, descriptive methods, a paradigm shift is emerging in which computational approaches, systems biology, high-throughput in vitro toxicity assays, and high-throughput exposure assessments are beginning to be applied to mechanism-based risk assessments in a time- and resource-efficient fashion. Here we describe recent advances in predictive safety assessment, with a focus on their strategic application to meet the changing demands of the chemical industry and its stakeholders. The opportunities to apply these new approaches are extensive and include screening of new chemicals, informing the design of safer and more sustainable chemical alternatives, filling information gaps on data-poor chemicals already in commerce, strengthening read-across methodology for categories of chemicals sharing similar modes of action, and optimizing the design of reduced-risk product formulations. Finally, we discuss how these predictive approaches dovetail with in vivo integrated testing strategies within repeated-dose regulatory toxicity studies, which are in line with 3Rs principles to refine, reduce, and replace animal testing. Strategic application of these tools is the foundation for informed and efficient safety assessment testing strategies that can be applied at all stages of the product-development process. PMID:25836969

  1. ITS evaluation -- phase 3 (2010)

    DOT National Transportation Integrated Search

    2011-05-01

This report documents the results of applying a previously developed, standardized approach for evaluating intelligent transportation systems (ITS) projects to 17 ITS earmark projects. The evaluation approach was based on a questionnaire to inves...

  2. The Applied Behavior Analytic Heritage of PBS: A Dynamic Model of Action-Oriented Research

    ERIC Educational Resources Information Center

    Dunlap, Glen; Horner, Robert H., Ed.

    2006-01-01

    In the past two decades, positive behavior support (PBS) has emerged from applied behavior analysis (ABA) as a newly fashioned approach to problems of behavioral adaptation. ABA was established in the 1960s as a science in which learning principles are systematically applied to produce socially important changes in behavior, whereas PBS was…

  3. Rethinking Diffusion Theory in an Applied Context: Role of Environmental Values in Adoption of Home Energy Conservation

    ERIC Educational Resources Information Center

    Priest, Susanna Hornig; Greenhalgh, Ted; Neill, Helen R.; Young, Gabriel Reuben

    2015-01-01

    Diffusion theory, developed and popularized within communication research by Everett Rogers, is a venerable approach with much to recommend it as a theoretical foundation for applied communication research. In developing an applied project for a home energy conservation (energy efficiency retrofit) program in the state of Nevada, we utilized key…

  4. Applying Western Organization Development in China: Lessons from a Case of Success

    ERIC Educational Resources Information Center

    Wang, Jia

    2010-01-01

    Purpose: The purpose of this paper is to explore a successful case of a Chinese state-owned enterprise (SOE) as it applied western organization development (OD) approaches. Specifically, this study seeks to answer two questions: How has western organization development and change (OD/C) been applied in one Chinese SOE? and What lessons can be…

  5. Teachers' Reflective Practice in the Context of Twenty-First Century Learning: Applying Vagle's Five-Component Post-Intentional Plan for Phenomenological Research

    ERIC Educational Resources Information Center

    Benade, Leon

    2016-01-01

    Vagle's "post-intentional phenomenological research approach" applies post-structural thinking to intentionality. I apply his five-component research process, reflect on some initial findings of semi-structured interview discussions with 25 participants, and consider a meta-reflection by some participants on those findings. My larger…

  6. Optimization of coupled systems: A critical overview of approaches

    NASA Technical Reports Server (NTRS)

    Balling, R. J.; Sobieszczanski-Sobieski, J.

    1994-01-01

    A unified overview is given of problem formulation approaches for the optimization of multidisciplinary coupled systems. The overview includes six fundamental approaches upon which a large number of variations may be made. Consistent approach names and a compact approach notation are given. The approaches are formulated to apply to general nonhierarchic systems. The approaches are compared both from a computational viewpoint and a managerial viewpoint. Opportunities for parallelism of both computation and manpower resources are discussed. Recommendations regarding the need for future research are advanced.

  7. Old and new approaches to the interpretation of acid-base metabolism, starting from historical data applied to diabetic acidosis.

    PubMed

    Mioni, Roberto; Marega, Alessandra; Lo Cicero, Marco; Montanaro, Domenico

    2016-11-01

The approach to acid-base chemistry in medicine includes several methods. Currently, the two most popular procedures derive from Stewart's studies and from the bicarbonate/BE-based classical formulation. Another method, unfortunately little known, follows the Kildeberg theory applied to acid-base titration. Using the data produced by Dana Atchley in 1933, regarding electrolytes and blood gas analysis applied to diabetes, we compared the three aforementioned methods in order to highlight their strengths and weaknesses. The results obtained by reprocessing Atchley's data show that Kildeberg's approach, unlike the other two methods, is consistent, rational and complete in describing the organ-physiological behavior of hydrogen ion turnover in the human organism. In contrast, the data obtained using the Stewart approach and the bicarbonate-based classical formulation are misleading and fail to specify which organs or systems are involved in causing or maintaining the diabetic acidosis. Stewart's approach, despite being considered 'quantitative', never proposes the concept of 'an amount of acid' and becomes even more confusing because it is unclear how to distinguish between 'strong' and 'weak' ions. Like Stewart's approach, the classical method makes no distinction between hydrogen ions managed by the intermediate metabolism and hydroxyl ions handled by the kidney, but at least it is based on the concept of titration (base excess) and indirectly defines the concept of 'an amount of acid'. In conclusion, only Kildeberg's approach offers a complete understanding of the causes of, and remedies for, any type of acid-base disturbance.

  8. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    PubMed

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

Business Intelligence approaches such as process mining have recently been applied to the healthcare domain. The goal of process mining is to gain process knowledge, check compliance and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails makes these methods applicable to all IHE-based information systems.

  9. Applying the highway safety manual to Georgia.

    DOT National Transportation Integrated Search

    2015-08-01

This report examines the Highway Safety Manual (HSM) from the perspective of applying its methods and approaches within the state of Georgia. The work presented here focuses specifically on data requirements and methods that may be of particular ...

  10. A Multivariate Methodological Workflow for the Analysis of FTIR Chemical Mapping Applied on Historic Paint Stratigraphies

    PubMed Central

    Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene

    2017-01-01

In the field of applied research in heritage science, the use of multivariate approaches is still quite limited, and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper aims to disseminate the use of suitable multivariate methodologies and proposes a procedural workflow, applied to a representative group of case studies of considerable importance for conservation purposes, as a guideline for the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
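Converting PCA scores into chemical maps amounts to reshaping the per-pixel score vectors back into image form; a minimal sketch, assuming a (rows, cols, wavenumbers) data cube (the cube layout and component count are assumptions for illustration):

```python
import numpy as np

def pca_score_maps(cube, n_pc=3):
    """Turn a (rows, cols, wavenumbers) spectral data cube into PCA
    score maps: one image per principal component (illustrative)."""
    r, c, w = cube.shape
    X = cube.reshape(r * c, w)
    Xc = X - X.mean(axis=0)                 # mean-center the spectra
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_pc].T               # project onto leading PCs
    return scores.reshape(r, c, n_pc)       # back to image geometry
```

Brushing then works in the other direction: selecting a region in a score map highlights the corresponding points, and loadings, in the PC space.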

  11. Artful Interventions for Workplace Bullying: Exploring Forum Theatre

    ERIC Educational Resources Information Center

    Edwards, Margot; Blackwood, Kate Marie

    2017-01-01

    Purpose: This paper aims to explore the phenomenon of workplace bullying in response to recent calls for the development of different approaches and provide an exploration of artful approaches to intervention. Design/methodology/approach: The paper offers a unique conceptualisation of workplace bullying and applies a phenomenological lens to the…

  12. Junior College Faculty Job Satisfaction.

    ERIC Educational Resources Information Center

    Frankel, Joanne

    Some of the research done to date concerning job satisfaction of junior college faculty is reviewed in this "Brief." Part I of the "Brief" describes four frameworks that have been applied to the analysis of job satisfaction: the traditional approach, the two-factor approach, the need hierarchy, and the cognitive dissonance approach. Part II…

  13. Applying the Sport Education Model to Tennis

    ERIC Educational Resources Information Center

    Ayvazo, Shiri

    2009-01-01

    The physical education field abounds with theoretically sound curricular approaches such as fitness education, skill theme approach, tactical approach, and sport education. In an era that emphasizes authentic sport experiences, the Sport Education Model includes unique features that sets it apart from other curricular models and can be a valuable…

  14. Participatory Research: New Approaches to the Research to Practice Dilemma.

    ERIC Educational Resources Information Center

    Meyer, Luanna H.; Park, Hyun-Sook; Grenot-Scheyer, Marquita; Schwartz, Ilene; Harry, Beth

    1998-01-01

    This article presents a rationale for incorporating elements of participatory research approaches into intervention research intended to improve practice. After an overview of the research-to-practice problem, it illustrates how the incorporation of participatory research approaches applied to various decision points can enhance the construction…

  15. A Market Failure Approach to Linguistic Justice

    ERIC Educational Resources Information Center

    Robichaud, David

    2017-01-01

    This paper will consider language management from the perspective of efficiency, and will set the grounds for a new approach to linguistic justice: a market failure approach. The principle of efficiency emphasises the need to satisfy individuals' preferences in an optimal way. Applying this principle with regard to language would justify language…

  16. Neuroevolutional Approach to Cerebral Palsy and Speech.

    ERIC Educational Resources Information Center

    Mysak, Edward D.

    Intended for cerebral palsy specialists, the book emphasizes the contribution that a neuroevolutional approach to therapy can make to habilitation goals of the child with cerebral palsy and applies the basic principles of the Bobath approach to therapy. The first section discusses cerebral palsy as a reflection of disturbed neuro-ontogenisis and…

  17. A soil moisture accounting-procedure with a Richards' equation-based soil texture-dependent parameterization

    USDA-ARS?s Scientific Manuscript database

    Given a time series of potential evapotranspiration and rainfall data, there are at least two approaches for estimating vertical percolation rates. One approach involves solving Richards' equation (RE) with a plant uptake model. An alternative approach involves applying a simple soil moisture accoun...

  18. A comparison of two sampling approaches for assessing the urban forest canopy cover from aerial photography.

    Treesearch

    Ucar Zennure; Pete Bettinger; Krista Merry; Jacek Siry; J.M. Bowker

    2016-01-01

    Two different sampling approaches for estimating urban tree canopy cover were applied to two medium-sized cities in the United States, in conjunction with two freely available remotely sensed imagery products. A random point-based sampling approach, which involved 1000 sample points, was compared against a plot/grid sampling (cluster sampling) approach that involved a...
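Random point-based sampling of canopy cover amounts to a binomial proportion estimate; a minimal sketch on a synthetic raster mask (the mask, seed, and sample size are illustrative, not the study's data):

```python
import numpy as np

def point_sample_cover(canopy_mask, n_points=1000, seed=42):
    """Random point-based sampling: estimate canopy cover as the share
    of random points that land on canopy pixels, with a binomial SE."""
    rng = np.random.default_rng(seed)
    rows = rng.integers(0, canopy_mask.shape[0], n_points)
    cols = rng.integers(0, canopy_mask.shape[1], n_points)
    p = float(canopy_mask[rows, cols].mean())
    se = float(np.sqrt(p * (1 - p) / n_points))  # binomial standard error
    return p, se
```

A plot/grid (cluster) design replaces the independent points with groups of points per plot, which changes the variance estimator but not the basic proportion estimate.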

  19. Singular perturbation and time scale approaches in discrete control systems

    NASA Technical Reports Server (NTRS)

    Naidu, D. S.; Price, D. B.

    1988-01-01

For a singularly perturbed discrete control system, a singular perturbation approach is first used to obtain outer and correction subsystems. A time-scale approach is then applied via block-diagonalization transformations to decouple the system into slow and fast subsystems. To a zeroth-order approximation, the singular perturbation and time-scale approaches are found to yield equivalent results.
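The zeroth-order idea can be illustrated on a toy singularly perturbed discrete system; the coefficients below are invented, and the "slow subsystem" simply sets the perturbation parameter to zero so the fast state vanishes after one step:

```python
def simulate(eps, steps=50, x0=1.0, z0=1.0):
    """Toy singularly perturbed discrete system: the fast state z
    contracts at rate O(eps) (all coefficients are illustrative)."""
    a, b, c, d = 0.9, 0.5, 0.2, 0.3
    x, z = x0, z0
    for _ in range(steps):
        x, z = a * x + b * z, eps * (c * x + d * z)
    return x

def simulate_slow(steps=50, x0=1.0, z0=1.0):
    """Zeroth-order slow subsystem (eps = 0): z vanishes after one
    step, so x evolves autonomously thereafter."""
    a, b = 0.9, 0.5
    x = a * x0 + b * z0          # the first step still feels z0
    for _ in range(steps - 1):
        x = a * x
    return x
```

For small eps the slow subsystem tracks the full trajectory closely, which is the content of the zeroth-order equivalence.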

  20. Systematic Approach for Calculating the Concentrations of Chemical Species in Multiequilibrium Problems: Inclusion of the Ionic Strength Effects

    ERIC Educational Resources Information Center

    Baeza-Baeza, Juan J.; Garcia-Alvarez-Coque, M. Celia

    2012-01-01

    A general systematic approach including ionic strength effects is proposed for the numerical calculation of concentrations of chemical species in multiequilibrium problems. This approach extends the versatility of the approach presented in a previous article and is applied using the Solver option of the Excel spreadsheet to solve real problems…
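The iterative scheme the article implements in an Excel spreadsheet can be sketched in code for the simplest case, a weak monoprotic acid; the Davies equation stands in for the ionic-strength correction, and the constants and convergence scheme are illustrative assumptions, not the article's worked problems:

```python
import numpy as np
from scipy.optimize import brentq

def davies_log_gamma(z, I):
    """Davies equation: log10 activity coefficient of an ion of charge z
    at ionic strength I (25 °C, A = 0.509)."""
    sqrtI = np.sqrt(I)
    return -0.509 * z**2 * (sqrtI / (1 + sqrtI) - 0.3 * I)

def weak_acid_h(Ka, C, tol=1e-10):
    """[H+] of a weak acid HA (analytical concentration C, thermodynamic
    Ka), alternating an equilibrium solve with an ionic-strength update."""
    I = 0.0
    h = 0.0
    for _ in range(100):
        gamma = 10 ** davies_log_gamma(1, I)   # gamma for H+ and A-
        Ka_c = Ka / gamma**2                   # concentration-based Ka
        f = lambda h: h * h / (C - h) - Ka_c   # mass balance: [A-] = [H+]
        h = brentq(f, 1e-14, C - 1e-14)
        I_new = h                              # I = (h + [A-]) / 2 = h
        if abs(I_new - I) < tol:
            break
        I = I_new
    return h
```

For 0.1 M acetic acid (Ka ≈ 1.8e-5) the activity correction raises [H+] slightly above the uncorrected value of about 1.33e-3 M.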

  1. Alternate approach slab reinforcement.

    DOT National Transportation Integrated Search

    2010-06-01

The upper mat of reinforcing steel, in exposed concrete bridge approach slabs, is prone to corrosion damage. Chlorides applied to the highways for winter maintenance can penetrate this concrete layer. Eventually chlorides reach the steel and begin ...

  2. Quantum description of light propagation in generalized media

    NASA Astrophysics Data System (ADS)

    Häyrynen, Teppo; Oksanen, Jani

    2016-02-01

Linear quantum input-output relation based models are widely applied to describe light propagation in a lossy medium. The details of the interaction and the associated added noise depend on whether the device is configured to operate as an amplifier or an attenuator. Using the traveling wave (TW) approach, we generalize the linear material model to account simultaneously for both the emission and absorption processes and to have point-wise defined noise field statistics and intensity-dependent interaction strengths. Thus, our approach describes the quantum input-output relations of linear media with net attenuation, amplification or transparency without pre-selection of the operation point. The TW approach is then applied to investigate materials at thermal equilibrium, inverted materials, the transparency limit where losses are compensated, and saturating amplifiers. We also apply the approach to media in nonuniform states, which can result, for example, from a temperature gradient over the medium or a position-dependent inversion of the amplifier. Furthermore, using the generalized model we investigate devices with intensity-dependent interactions and show how an initial thermal field transforms into a field with coherent statistics due to gain saturation.

  3. Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework

    NASA Astrophysics Data System (ADS)

    Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.

    2016-03-01

A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic label assessment, able to address uncertainty and deal with different levels of precision. The method rests on qualitative reasoning, an artificial intelligence technique, to assess and rank multi-attribute alternatives described by linguistic labels under uncertainty. It is suitable for problems in a social setting, such as energy planning, that require constructing a dialogue process among many social actors under high levels of complexity and uncertainty. The method is compared with an existing approach, an outranking method based on Condorcet's original method, which had previously been applied to the wind farm location problem. The results obtained by both approaches and their aggregation procedures are analysed, and their performance in selecting the wind farm location is compared. Although both methods lead to similar rankings of the alternatives, the study highlights the advantages and drawbacks of each.

  4. A comparison of two closely-related approaches to aerodynamic design optimization

    NASA Technical Reports Server (NTRS)

    Shubin, G. R.; Frank, P. D.

    1991-01-01

    Two related methods for aerodynamic design optimization are compared. The methods, called the implicit gradient approach and the variational (or optimal control) approach, both attempt to obtain gradients necessary for numerical optimization at a cost significantly less than that of the usual black-box approach that employs finite difference gradients. While the two methods are seemingly quite different, they are shown to differ (essentially) in that the order of discretizing the continuous problem, and of applying calculus, is interchanged. Under certain circumstances, the two methods turn out to be identical. We explore the relationship between these methods by applying them to a model problem for duct flow that has many features in common with transonic flow over an airfoil. We find that the gradients computed by the variational method can sometimes be sufficiently inaccurate to cause the optimization to fail.

  5. Approach for Autonomous Control of Unmanned Aerial Vehicle Using Intelligent Agents for Knowledge Creation

    NASA Technical Reports Server (NTRS)

    Dufrene, Warren R., Jr.

    2004-01-01

This paper describes the development of a planned approach for autonomous operation of an Unmanned Aerial Vehicle (UAV). A hybrid approach will seek to provide knowledge generation through the application of Artificial Intelligence (AI) and Intelligent Agents (IA) for UAV control. The application of many different types of AI techniques for flight will be explored during this research effort. The research concentration will be directed to the application of different AI methods within the UAV arena. By evaluating AI approaches, which will include Expert Systems, Neural Networks, Intelligent Agents, Fuzzy Logic, and Complex Adaptive Systems, new insight may be gained into the benefits of AI techniques applied to achieving true autonomous operation of these systems, thus providing new intellectual merit to this research field. The major area of discussion will be limited to the UAV. The systems of interest include small aircraft, insects, and miniature aircraft. Although flight systems will be explored, the benefits should apply to many unmanned vehicles such as rovers, ocean explorers, robots, and autonomous operation systems. The flight system will be broken down into control agents that represent the intelligent agent approach used in AI. After the completion of a successful approach, a framework applying a Security Overseer will be added in an attempt to address errors, emergencies, failures, damage, or an overly dynamic environment. The chosen control problem was the landing phase of UAV operation. The initial results from simulation in FlightGear are presented.

  6. Investigation of smoothness-increasing accuracy-conserving filters for improving streamline integration through discontinuous fields.

    PubMed

    Steffen, Michael; Curtis, Sean; Kirby, Robert M; Ryan, Jennifer K

    2008-01-01

    Streamline integration of fields produced by computational fluid mechanics simulations is a commonly used tool for the investigation and analysis of fluid flow phenomena. Integration is often accomplished through the application of ordinary differential equation (ODE) integrators, whose error characteristics are predicated on the smoothness of the field through which the streamline is being integrated; such smoothness is not available at the inter-element level of finite volume and finite element data. Adaptive error control techniques are often used to ameliorate the challenge posed by inter-element discontinuities. As the root of the difficulties is the discontinuous nature of the data, we present a complementary approach of applying smoothness-increasing accuracy-conserving filters to the data prior to streamline integration. We investigate whether such an approach applied to uniform quadrilateral discontinuous Galerkin (high-order finite volume) data can be used to augment current adaptive error control approaches. We discuss and demonstrate through numerical example the computational trade-offs exhibited when one applies such a strategy.
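    The idea of filtering before integrating can be sketched with a deliberately crude stand-in: a moving average replaces the actual SIAC B-spline convolution kernel, and the field, window, and step sizes below are all invented for illustration:

    ```python
    # Sketch only: a piecewise-constant 1D velocity sampled on elements is
    # smoothed with a simple moving average before being fed to an RK4 step.
    # SIAC filters are B-spline convolution kernels; the moving average here
    # merely illustrates why pre-smoothing helps an ODE integrator whose error
    # analysis assumes a smooth right-hand side.

    def velocity_raw(x):
        # discontinuous, element-wise constant field: jumps at integer faces
        return 1.0 if int(x) % 2 == 0 else 2.0

    def velocity_smoothed(x, width=0.5, n=8):
        # moving-average smoothing over a window of size `width`
        h = width / n
        samples = [velocity_raw(x - width / 2 + i * h) for i in range(n + 1)]
        return sum(samples) / len(samples)

    def rk4_step(v, x, dt):
        # classical fourth-order Runge-Kutta step for dx/dt = v(x)
        k1 = v(x)
        k2 = v(x + 0.5 * dt * k1)
        k3 = v(x + 0.5 * dt * k2)
        k4 = v(x + dt * k3)
        return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

    x = 0.3
    for _ in range(100):
        x = rk4_step(velocity_smoothed, x, 0.05)
    print(x)   # particle advected through several element interfaces
    ```

    With the raw field, RK4's stage evaluations straddle the jumps and its high-order error model is void; the smoothed field restores the premise the integrator relies on.
    
    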

  7. Divergent expansion, Borel summability and three-dimensional Navier-Stokes equation.

    PubMed

    Costin, Ovidiu; Luo, Guo; Tanveer, Saleh

    2008-08-13

    We describe how the Borel summability of a divergent asymptotic expansion can be expanded and applied to nonlinear partial differential equations (PDEs). While Borel summation does not apply for non-analytic initial data, the present approach generates an integral equation (IE) applicable to much more general data. We apply these concepts to the three-dimensional Navier-Stokes (NS) system and show how the IE approach can give rise to local existence proofs. In this approach, the global existence problem in three-dimensional NS systems, for specific initial condition and viscosity, becomes a problem of asymptotics in the variable p (dual to 1/t or some positive power of 1/t). Furthermore, the errors in numerical computations in the associated IE can be controlled rigorously, which is very important for nonlinear PDEs such as NS when solutions are not known to exist globally. Moreover, computation of the solution of the IE over an interval [0, p0] provides sharper control of its p → ∞ behaviour. Preliminary numerical computations give encouraging results.
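    The Borel machinery the abstract builds on can be stated compactly. For a formal series f(t) ~ Σ_k a_k t^{k+1}, the standard (textbook, not paper-specific) definitions are:

    ```latex
    \mathcal{B}f(p) \;=\; \sum_{k=0}^{\infty} \frac{a_k}{k!}\, p^{k}
    \qquad\text{(Borel transform, term by term)},
    \]
    \[
    f(t) \;=\; \int_{0}^{\infty} \mathcal{B}f(p)\, e^{-p/t}\, dp
    \qquad\text{(Laplace resummation),}
    ```

    valid when B f(p) continues analytically along the positive real axis with at most exponential growth; the identity ∫₀^∞ p^k e^{-p/t} dp = k! t^{k+1} shows the two formulas invert each other term by term, and the duality between p and 1/t is exactly the one the abstract invokes.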

  8. Ethics in technological culture: a programmatic proposal for a pragmatist approach.

    PubMed

    Keulartz, Jozef; Schermer, Maartje; Korthals, Michiel; Swierstra, Tsjalling

    2004-01-01

    Neither traditional philosophy nor current applied ethics seem able to cope adequately with the highly dynamic character of our modern technological culture. This is because they have insufficient insight into the moral significance of technological artifacts and systems. Here, much can be learned from recent science and technology studies (STS). They have opened up the black box of technological developments and have revealed the intimate intertwinement of technology and society in minute detail. However, while applied ethics is characterized by a certain "technology blindness," the most influential approaches within STS show a "normative deficit" and display an agnostic or even antagonistic attitude toward ethics. To repair the blind spots of both applied ethics and STS, the authors sketch the contours of a pragmatist approach. They will explore the tasks and tools of a pragmatist ethics and pay special attention to the exploration of future worlds disclosed and shaped by technology and the management of deep value conflicts inherent in a pluralistic society.

  9. Gas Chromatography Data Classification Based on Complex Coefficients of an Autoregressive Model

    DOE PAGES

    Zhao, Weixiang; Morgan, Joshua T.; Davis, Cristina E.

    2008-01-01

    This paper introduces autoregressive (AR) modeling as a novel method to classify outputs from gas chromatography (GC). The inverse Fourier transformation was applied to the original sensor data, and then an AR model was applied to transform data to generate AR model complex coefficients. This series of coefficients effectively contains a compressed version of all of the information in the original GC signal output. We applied this method to chromatograms resulting from proliferating bacteria species grown in culture. Three types of neural networks were used to classify the AR coefficients: backward propagating neural network (BPNN), radial basis function-principal component analysis (RBF-PCA) approach, and radial basis function-partial least squares regression (RBF-PLSR) approach. This exploratory study demonstrates the feasibility of using complex root coefficient patterns to distinguish various classes of experimental data, such as those from the different bacteria species. This cognition approach also proved to be robust and potentially useful for freeing us from time alignment of GC signals.
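    How complex AR roots compress a signal can be shown on synthetic data (a damped sinusoid, not GC output; the pole magnitude and angle below are invented): the fitted AR(2) coefficients yield a conjugate root pair whose magnitude encodes decay and whose angle encodes frequency, the kind of "root pattern" the abstract uses as a classification feature.

    ```python
    # Fit an AR(2) model x[n] = a1*x[n-1] + a2*x[n-2] to a damped sinusoid
    # and recover the complex pole pair. For x[n] = r^n * cos(w*n) the
    # recurrence holds exactly with a1 = 2*r*cos(w), a2 = -r^2.
    import cmath, math

    r, w = 0.9, 0.8                       # assumed pole magnitude and angle
    x = [r**n * math.cos(w * n) for n in range(32)]

    # two exact equations in (a1, a2), solved by Cramer's rule
    det = x[1] * x[1] - x[2] * x[0]
    a1 = (x[2] * x[1] - x[3] * x[0]) / det
    a2 = (x[1] * x[3] - x[2] * x[2]) / det

    # roots of the characteristic polynomial z^2 - a1*z - a2 = 0
    disc = cmath.sqrt(a1 * a1 + 4.0 * a2)
    z1 = (a1 + disc) / 2.0
    print(abs(z1), abs(cmath.phase(z1)))   # ~0.9 and ~0.8
    ```

    Two numbers (|z| and arg z) summarize 32 samples; on real chromatograms the same idea, at higher model order, gives the compressed feature vector fed to the neural networks.
    
    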

  10. Information Retrieval Using Hadoop Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the cognitive operation of probing huge amounts of information in an attempt to uncover unseen patterns. Through big data analytics, organizations in the public and private sectors have made the strategic determination to turn big data into competitive advantage. The primary occupation of extracting value from big data gives rise to a process applied to pull information from multiple different sources; this process is known as extract, transform, and load (ETL). This paper's approach extracts information from log files and research papers, reducing the effort required for pattern finding and for summarizing documents from several locations. The work helps readers better understand basic Hadoop concepts and improves the user experience for research. In this paper, we propose an approach for analyzing log files with Hadoop to find concise information that is useful and saves time. Our proposed approach will be applied to different research papers in a specific domain to obtain summarized content for further improvement and the creation of new content.
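    The map/reduce pattern the paper runs on Hadoop can be illustrated as plain functions over a few fake log lines (the field layout below is an assumption, not the paper's log format):

    ```python
    # Illustrative sketch of the MapReduce contract: a mapper emits key/value
    # pairs per input line, and a reducer aggregates values per key. Hadoop
    # distributes and shuffles these stages; here they run in-process.
    from collections import defaultdict

    logs = [
        "2016-01-01 10:00:01 GET /index.html 200",
        "2016-01-01 10:00:02 GET /paper.pdf 200",
        "2016-01-01 10:00:03 GET /index.html 404",
    ]

    def mapper(line):
        # emit (status_code, 1) pairs, as a Hadoop mapper would
        parts = line.split()
        yield parts[-1], 1

    def reducer(pairs):
        # sum counts per key, as a Hadoop reducer would after the shuffle
        totals = defaultdict(int)
        for key, count in pairs:
            totals[key] += count
        return dict(totals)

    summary = reducer(kv for line in logs for kv in mapper(line))
    print(summary)   # {'200': 2, '404': 1}
    ```

    The same two-stage shape scales from three lines to terabytes because neither function holds more than one key's state at a time.
    
    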

  11. Wave Period and Coastal Bathymetry Estimations from Satellite Images

    NASA Astrophysics Data System (ADS)

    Danilo, Celine; Melgani, Farid

    2016-08-01

    We present an approach for wave period and coastal water depth estimation. The approach, based on wave observations, is entirely independent of ancillary data and can in principle be applied to SAR or optical images. To demonstrate its feasibility, we apply our method to more than 50 Sentinel-1A images of the Hawaiian Islands, well known for their long waves. Six wave buoys are available to compare our results with in-situ measurements. The results on Sentinel-1A images show that half of the images were unsuitable for applying the method (no swell, or a wavelength too small to be captured by the SAR). On the other half, 78% of the estimated wave periods are in accordance with buoy measurements. In addition, we present preliminary results for the estimation of coastal water depth on a Landsat-8 image (whose characteristics are close to Sentinel-2A). With a squared correlation coefficient of 0.7 against ground-truth measurements, this approach shows promising results for monitoring coastal bathymetry.
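    The depth-retrieval step of such methods rests on the linear (Airy) dispersion relation ω² = g·k·tanh(k·h); given a wavelength and period measured from imagery, the relation can be inverted for depth h. The numbers below are illustrative, not values from the paper:

    ```python
    # Invert the linear wave dispersion relation for water depth.
    import math

    g = 9.81  # gravitational acceleration, m/s^2

    def depth_from_wave(wavelength_m, period_s):
        """Depth h (metres) from observed wavelength and period."""
        k = 2.0 * math.pi / wavelength_m       # wavenumber
        omega = 2.0 * math.pi / period_s       # angular frequency
        ratio = omega**2 / (g * k)             # equals tanh(k*h)
        if ratio >= 1.0:
            return float("inf")  # deep water: the wave no longer feels the bottom
        return math.atanh(ratio) / k

    # consistency check: fabricate a wave over a known depth, then invert
    h_true, k = 20.0, 0.05
    omega = math.sqrt(g * k * math.tanh(k * h_true))
    wavelength = 2.0 * math.pi / k
    period = 2.0 * math.pi / omega
    print(depth_from_wave(wavelength, period))   # ~20.0
    ```

    The deep-water branch is why half the images were unusable: once tanh(k·h) saturates near 1, the observed wavelength carries no depth information.
    
    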

  12. Integrating community-based participatory research and informatics approaches to improve the engagement and health of underserved populations

    PubMed Central

    Schaefbauer, Chris L; Campbell, Terrance R; Senteio, Charles; Siek, Katie A; Bakken, Suzanne; Veinot, Tiffany C

    2016-01-01

    Objective We compare 5 health informatics research projects that applied community-based participatory research (CBPR) approaches with the goal of extending existing CBPR principles to address issues specific to health informatics research. Materials and methods We conducted a cross-case analysis of 5 diverse case studies with 1 common element: integration of CBPR approaches into health informatics research. After reviewing publications and other case-related materials, all coauthors engaged in collaborative discussions focused on CBPR. Researchers mapped each case to an existing CBPR framework, examined each case individually for success factors and barriers, and identified common patterns across cases. Results Benefits of applying CBPR approaches to health informatics research across the cases included the following: developing more relevant research with wider impact, greater engagement with diverse populations, improved internal validity, more rapid translation of research into action, and the development of people. Challenges of applying CBPR to health informatics research included requirements to develop strong, sustainable academic-community partnerships and mismatches related to cultural and temporal factors. Several technology-related challenges, including needs to define ownership of technology outputs and to build technical capacity with community partners, also emerged from our analysis. Finally, we created several principles that extended an existing CBPR framework to specifically address health informatics research requirements. Conclusions Our cross-case analysis yielded valuable insights regarding CBPR implementation in health informatics research and identified valuable lessons useful for future CBPR-based research. The benefits of applying CBPR approaches can be significant, particularly in engaging populations that are typically underserved by health care and in designing patient-facing technology. PMID:26228766

  13. A Complex Systems Approach to Causal Discovery in Psychiatry.

    PubMed

    Saxe, Glenn N; Statnikov, Alexander; Fenyo, David; Ren, Jiwen; Li, Zhiguo; Prasad, Meera; Wall, Dennis; Bergman, Nora; Briggs, Ernestine C; Aliferis, Constantin

    2016-01-01

    Conventional research methodologies and data analytic approaches in psychiatric research are unable to reliably infer causal relations without experimental designs, or to make inferences about the functional properties of the complex systems in which psychiatric disorders are embedded. This article describes a series of studies to validate a novel hybrid computational approach, the Complex Systems-Causal Network (CS-CN) method, designed to integrate causal discovery within a complex systems framework for psychiatric research. The CS-CN method was first applied to an existing dataset on psychopathology in 163 children hospitalized with injuries (validation study). Next, it was applied to a much larger dataset of traumatized children (replication study). Finally, the CS-CN method was applied in a controlled experiment using a 'gold standard' dataset for causal discovery and compared with other methods for accurately detecting causal variables (resimulation controlled experiment). The CS-CN method successfully detected a causal network of 111 variables and 167 bivariate relations in the initial validation study. This causal network had well-defined adaptive properties, and a set of variables was found that disproportionately contributed to these properties. Modeling the removal of these variables resulted in significant loss of adaptive properties. The CS-CN method was successfully applied in the replication study and performed better than traditional statistical methods, and similarly to state-of-the-art causal discovery algorithms in the causal detection experiment. The CS-CN method was validated, replicated, and yielded both novel and previously validated findings related to risk factors and potential treatments of psychiatric disorders. The novel approach yields both fine-grain (micro) and high-level (macro) insights and thus represents a promising approach for complex systems-oriented research in psychiatry.

  14. Combining molecular dynamics and an electrodiffusion model to calculate ion channel conductance

    NASA Astrophysics Data System (ADS)

    Wilson, Michael A.; Nguyen, Thuy Hien; Pohorille, Andrew

    2014-12-01

    Establishing the relation between the structures and functions of protein ion channels, which are protein assemblies that facilitate transmembrane ion transport through water-filled pores, is at the forefront of biological and medical sciences. A reliable way to determine whether our understanding of this relation is satisfactory is to reproduce the measured ionic conductance over a broad range of applied voltages. This can be done in molecular dynamics simulations by way of applying an external electric field to the system and counting the number of ions that traverse the channel per unit time. Since this approach is computationally very expensive we develop a markedly more efficient alternative in which molecular dynamics is combined with an electrodiffusion equation. This alternative approach applies if steady-state ion transport through channels can be described with sufficient accuracy by the one-dimensional diffusion equation in the potential given by the free energy profile and applied voltage. The theory refers only to line densities of ions in the channel and, therefore, avoids ambiguities related to determining the surface area of the channel near its endpoints or other procedures connecting the line and bulk ion densities. We apply the theory to a simple, model system based on the trichotoxin channel. We test the assumptions of the electrodiffusion equation, and determine the precision and consistency of the calculated conductance. We demonstrate that it is possible to calculate current/voltage dependence and accurately reconstruct the underlying (equilibrium) free energy profile, all from molecular dynamics simulations at a single voltage. The approach developed here applies to other channels that satisfy the conditions of the electrodiffusion equation.
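    The one-dimensional steady-state electrodiffusion (Smoluchowski) problem the abstract describes has a closed-form flux: J = D·[c(0)e^{U(0)} − c(L)e^{U(L)}] / ∫₀^L e^{U(x)} dx, with U the free energy profile plus applied voltage in units of kT. The Gaussian barrier below is an assumed toy profile, not the trichotoxin result:

    ```python
    # Steady 1D electrodiffusion flux from a free energy profile U(x) (in kT).
    import math

    def flux(U, D, c0, cL, L, n=2000):
        """Constant steady-state flux through the channel [0, L]."""
        h = L / n
        xs = [i * h for i in range(n + 1)]
        ws = [math.exp(U(x)) for x in xs]
        # trapezoid rule for the Boltzmann-weighted resistance integral
        integral = h * (0.5 * ws[0] + sum(ws[1:-1]) + 0.5 * ws[-1])
        return D * (c0 * math.exp(U(0.0)) - cL * math.exp(U(L))) / integral

    # sanity check: with a flat profile the formula reduces to Fick's law
    J_flat = flux(lambda x: 0.0, D=1.0, c0=2.0, cL=1.0, L=5.0)
    print(J_flat)   # D*(c0 - cL)/L = 0.2

    # a toy mid-channel barrier suppresses the flux, as expected
    J_barrier = flux(lambda x: 3.0 * math.exp(-((x - 2.5) ** 2)), D=1.0,
                     c0=2.0, cL=1.0, L=5.0)
    print(J_barrier < J_flat)   # True
    ```

    Because the formula uses only the line density along the pore axis, it sidesteps the bulk-to-channel area ambiguity the abstract highlights; conductance follows by sweeping the voltage term inside U and differentiating the current.
    
    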

  15. Didactical suggestion for a Dynamic Hybrid Intelligent e-Learning Environment (DHILE) applying the PENTHA ID Model

    NASA Astrophysics Data System (ADS)

    dall'Acqua, Luisa

    2011-08-01

    The teleology of our research is to propose a solution to the request of "innovative, creative teaching", proposing a methodology to educate creative Students in a society characterized by multiple reference points and hyper dynamic knowledge, continuously subject to reviews and discussions. We apply a multi-prospective Instructional Design Model (PENTHA ID Model), defined and developed by our research group, which adopts a hybrid pedagogical approach, consisting of elements of didactical connectivism intertwined with aspects of social constructivism and enactivism. The contribution proposes an e-course structure and approach, applying the theoretical design principles of the above mentioned ID Model, describing methods, techniques, technologies and assessment criteria for the definition of lesson modes in an e-course.

  16. Applying Genomic and Bioinformatic Resources to Human Adenovirus Genomes for Use in Vaccine Development and for Applications in Vector Development for Gene Delivery

    PubMed Central

    Seto, Jason; Walsh, Michael P.; Mahadevan, Padmanabhan; Zhang, Qiwei; Seto, Donald

    2010-01-01

    Technological advances and increasingly cost-effective methodologies in DNA sequencing and computational analysis are providing genome and proteome data for human adenovirus research. Applying these tools, data and derived knowledge to the development of vaccines against these pathogens will provide effective prophylactics. The same data and approaches can be applied to vector development for gene delivery in gene therapy and vaccine delivery protocols. Examination of several field strain genomes and their analyses provide examples of data that are available using these approaches. An example of the development of HAdV-B3 both as a vaccine and also as a vector is presented. PMID:21994597

  17. A comparison of two conformal mapping techniques applied to an aerobrake body

    NASA Technical Reports Server (NTRS)

    Hommel, Mark J.

    1987-01-01

    Conformal mapping is a classical technique which has been utilized for solving problems in aerodynamics and hydrodynamics. Conformal mapping has been successfully applied in the construction of grids around airfoils, engine inlets and other aircraft configurations. Conformal mapping techniques were applied to an aerobrake body having an axis of symmetry. Two different approaches were utilized: (1) Karman-Trefftz transformation; and (2) Point Wise Schwarz Christoffel transformation. In both cases, the aerobrake body was mapped onto a near circle, and a grid was generated in the mapped plane. The mapped body and grid were then mapped back into physical space and the properties of the associated grids were examined. Advantages and disadvantages of both approaches are discussed.
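    The first of the two transformations, the Karman-Trefftz map, is compact enough to state directly; the singular points are placed at ±1 and the sample point below is illustrative, not taken from the aerobrake geometry:

    ```python
    # Karman-Trefftz conformal map with singular points at +/-1.
    import cmath

    def karman_trefftz(zeta, n):
        """Map a zeta-plane point to the z-plane; n = 2 - tau/pi for tail angle tau."""
        a = (zeta + 1.0) ** n
        b = (zeta - 1.0) ** n
        return n * (a + b) / (a - b)

    # for n = 2 the map reduces to the classical Joukowski transform z = zeta + 1/zeta
    zeta = 1.3 + 0.4j
    print(karman_trefftz(zeta, 2.0), zeta + 1.0 / zeta)
    ```

    Mapping the body to a near circle, gridding the simple mapped region, and mapping back is exactly the workflow the abstract describes; the Joukowski limit is a convenient correctness check for the implementation.
    
    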

  18. Description of a user-oriented geographic information system - The resource analysis program

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Mokma, D. L.

    1980-01-01

    This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil, and other natural resource data, to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated in reference to four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.

  19. Fractography of ceramic and metal failures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-01-01

    STP 827 is organized into the two broad areas of ceramics and metals. The ceramics section covers fracture analysis techniques, surface analysis techniques, and applied fractography. The metals section covers failure analysis techniques, the latest approaches to fractography, and applied fractography.

  20. Optimal dynamic control of invasions: applying a systematic conservation approach.

    PubMed

    Adams, Vanessa M; Setterfield, Samantha A

    2015-06-01

    The social, economic, and environmental impacts of invasive plants are well recognized. However, these variable impacts are rarely accounted for in the spatial prioritization of funding for weed management. We examine how current spatially explicit prioritization methods can be extended to identify optimal budget allocations to both eradication and control measures of invasive species to minimize the costs and likelihood of invasion. Our framework extends recent approaches to systematic prioritization of weed management to account for multiple values that are threatened by weed invasions with a multi-year dynamic prioritization approach. We apply our method to the northern portion of the Daly catchment in the Northern Territory, which has significant conservation values that are threatened by gamba grass (Andropogon gayanus), a highly invasive species recognized by the Australian government as a Weed of National Significance (WONS). We interface Marxan, a widely applied conservation planning tool, with a dynamic biophysical model of gamba grass to optimally allocate funds to eradication and control programs under two budget scenarios comparing maximizing gain (MaxGain) and minimizing loss (MinLoss) optimization approaches. The prioritizations support previous findings that a MinLoss approach is a better strategy when threats are more spatially variable than conservation values. Over a 10-year simulation period, we find that a MinLoss approach reduces future infestations by ~8% compared to MaxGain in the constrained budget scenarios and ~12% in the unlimited budget scenarios. We find that due to the extensive current invasion and rapid rate of spread, allocating the annual budget to control efforts is more efficient than funding eradication efforts when there is a constrained budget. Under a constrained budget, applying the most efficient optimization scenario (control, MinLoss) reduces spread by ~27% compared to no control.
Conversely, if the budget is unlimited it is more efficient to fund eradication efforts and reduces spread by ~65% compared to no control.
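    The difference between the two ranking rules can be reduced to a toy sketch (site values, threat probabilities, and costs are invented; the real analysis couples Marxan to a spread model):

    ```python
    # MaxGain vs MinLoss prioritisation on three imaginary sites.
    sites = [
        # (name, conservation_value, invasion_probability, cost)
        ("A", 10.0, 0.9, 2.0),
        ("B", 50.0, 0.1, 5.0),
        ("C", 30.0, 0.8, 4.0),
    ]

    def rank(sites, key):
        return [s[0] for s in sorted(sites, key=key, reverse=True)]

    # MaxGain: favour high value per dollar, ignoring how threatened a site is
    max_gain = rank(sites, lambda s: s[1] / s[3])
    # MinLoss: favour value expected to be LOST without action, per dollar
    min_loss = rank(sites, lambda s: s[1] * s[2] / s[3])

    print(max_gain, min_loss)
    ```

    The orderings disagree whenever threat varies more across sites than value does, which is precisely the regime in which the abstract reports MinLoss outperforming MaxGain.
    
    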

  1. A Classical Based Derivation of Time Dilation Providing First Order Accuracy to Schwarzschild's Solution of Einstein's Field Equations

    NASA Astrophysics Data System (ADS)

    Austin, Rickey W.

    In Einstein's theory of Special Relativity (SR), one method to derive relativistic kinetic energy is via applying the classical work-energy theorem to relativistic momentum. This approach starts with a classical work-energy theorem and applies SR's momentum to the derivation. One outcome of this derivation is relativistic kinetic energy. From this derivation, it is rather straightforward to form a kinetic-energy-based time dilation function. In the derivation of General Relativity, a common approach is to bypass classical laws as a starting point. Instead, a rigorous development of differential geometry and Riemannian space is constructed, from which classical laws are derived. This is in contrast to SR's approach of starting with classical laws and applying the consequences of the universal speed of light for all observers. A possible method to derive time dilation due to Newtonian gravitational potential energy (NGPE) is to apply SR's approach to deriving relativistic kinetic energy. It will be shown that this method gives first-order accuracy compared to Schwarzschild's metric. The SR kinetic energy and the newly derived NGPE are combined to form a Riemannian metric based on these two energies. A geodesic is derived and calculations compared to Schwarzschild's geodesic for an orbiting test mass about a central, non-rotating, non-charged massive body. The new metric results in highly accurate calculations when compared to General Relativity's predictions. The new method provides a candidate approach for starting with classical laws and deriving General Relativity effects. This approach mimics SR's method of starting with classical mechanics when deriving relativistic equations. As a complement to introducing General Relativity, it provides a plausible scaffolding method from classical physics when teaching introductory General Relativity. A straightforward path from classical laws to General Relativity is derived, providing a minimum of first-order accuracy to Schwarzschild's solution of Einstein's field equations.
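    The claimed first-order agreement is visible in one standard expansion (a textbook result, not the paper's derivation):

    ```latex
    \left.\frac{d\tau}{dt}\right|_{\text{Schwarzschild}}
    = \sqrt{1 - \frac{2GM}{rc^{2}}}
    \;\approx\; 1 - \frac{GM}{rc^{2}} + O\!\left(\frac{G^{2}M^{2}}{r^{2}c^{4}}\right),
    ```

    and the right-hand side is exactly the dilation factor obtained by inserting the Newtonian potential per unit rest energy, Φ/c² = −GM/(rc²), into an SR-style energy construction, which is why such an approach can reproduce Schwarzschild results to first order but not beyond.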

  2. Measuring and Maximising Research Impact in Applied Social Science Research Settings. Good Practice Guide

    ERIC Educational Resources Information Center

    Stanwick, John; Hargreaves, Jo

    2012-01-01

    This guide describes the National Centre for Vocational Education Research (NCVER) approach to measuring impact using examples from its own case studies, as well as showing how to maximise the impact of applied social science research. Applied social science research needs to demonstrate that it is relevant and useful both to public policy and…

  3. Identification of the Competencies Needed to Apply Social Marketing to Extension Programming: Results of a Delphi Study

    ERIC Educational Resources Information Center

    Warner, Laura A.; Stubbs, Eric; Murphrey, Theresa Pesl; Huynh, Phuong

    2016-01-01

    The purpose of this study was to identify the specific competencies needed to apply social marketing, a promising approach to behavior change, to Extension programming. A modified Delphi study was used to achieve group consensus among a panel of experts on the skills, characteristics, and knowledge needed to successfully apply this behavior change…

  4. Management strategy evaluation of pheromone-baited trapping techniques to improve management of invasive sea lamprey

    USGS Publications Warehouse

    Dawson, Heather; Jones, Michael L.; Irwin, Brian J.; Johnson, Nicholas; Wagner, Michael C.; Szymanski, Melissa

    2016-01-01

    We applied a management strategy evaluation (MSE) model to examine the potential cost-effectiveness of using pheromone-baited trapping along with conventional lampricide treatment to manage invasive sea lamprey. Four pheromone-baited trapping strategies were modeled: (1) stream activation, wherein pheromone was applied to existing traps to achieve a 10^-12 mol/L in-stream concentration; (2) stream activation plus two additional traps downstream with pheromone applied at 2.5 mg/hr (reverse-intercept approach); (3) trap activation, wherein pheromone was applied at 10 mg/hr to existing traps; and (4) trap activation plus the reverse-intercept approach. Each new strategy was applied, with remaining funds applied to conventional lampricide control. Simulating deployment of these hybrid strategies on fourteen Lake Michigan streams resulted in increases of 17 and 11% (strategies 1 and 2) and decreases of 4 and 7% (strategies 3 and 4) in the lakewide mean abundance of adult sea lamprey relative to the status quo. MSE revealed performance targets for trap efficacy to guide additional research because results indicate that combining lampricides and high-efficacy trapping technologies can reduce sea lamprey abundance on average without increasing control costs.

  5. Valuing a long-term care facility.

    PubMed

    Mellen, C M

    1992-10-01

    The business valuation industry generally uses at least one of three basic approaches to value a long-term care facility: the cost approach, sales comparison approach, or income approach. The approach that is chosen and the resulting weight that is applied to it depend largely on the circumstances involved. Because a long-term care facility is a business enterprise, more weight usually is given to the income approach which factors into the estimate of value both the tangible and intangible assets of the facility.
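    The arithmetic behind the income approach and the weighted reconciliation the abstract describes can be sketched with invented figures (the NOI, cap rate, other approach values, and weights below are all hypothetical):

    ```python
    # Income approach: capitalise normalised net operating income (NOI),
    # then reconcile the three approaches with judgment weights.
    noi = 850_000.0          # assumed normalised annual NOI, dollars
    cap_rate = 0.11          # assumed market-derived capitalisation rate

    income_value = noi / cap_rate

    # reconciliation: income approach weighted most heavily, per the abstract
    values  = {"cost": 6_900_000.0, "sales": 7_400_000.0, "income": income_value}
    weights = {"cost": 0.2, "sales": 0.2, "income": 0.6}
    estimate = sum(values[k] * weights[k] for k in values)
    print(round(income_value), round(estimate))
    ```

    The cap rate implicitly prices both the tangible assets and the intangible operating value, which is why the income approach tends to dominate the reconciliation for a going-concern facility.
    
    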

  6. Parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method of ledre profile attributes

    NASA Astrophysics Data System (ADS)

    Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.

    2018-03-01

    This current study is aimed at investigating the use of parametric and non-parametric approaches for the sensory RATA (Rate-All-That-Apply) method. Ledre, a unique local food product of Bojonegoro, was used as the point of interest, and 319 panelists were involved in the study. The results showed that ledre is characterized by an easily crushed texture, stickiness in the mouth, a stinging sensation, and ease of swallowing. It also has a strong banana flavour and brown colour. Compared to eggroll and semprong, ledre shows more variance in taste as well as roll length. As the RATA questionnaire is designed to collect categorical data, a non-parametric approach is the common statistical procedure. However, similar results were also obtained with the parametric approach, despite the non-normally distributed data. This suggests that the parametric approach can be applicable for consumer studies with large numbers of respondents, even though the data may not satisfy the assumptions of ANOVA (analysis of variance).

  7. An integrated model of learning.

    PubMed

    Trigg, A M; Cordova, F D

    1987-01-01

    Worldwide, most educational systems are based on three levels of education that utilize the pedagogical approaches to learning. In the 1960s, scholars formulated another approach to education that has become known as andragogy and has been applied to adult education. Several innovative scholars have seen how andragogy can be applied to teaching children. As a result, both andragogy and pedagogy are viewed as the opposite ends of the educational spectrum. Both of these approaches have a place and function within the modern educational framework. If one assumes that the goal of education is for the acquisition and application of knowledge, then both of these approaches can be used effectively for the attainment of that goal. In order to utilize these approaches effectively, an integrated model of learning has been developed that consists of initial teaching and exploratory learning phases. This model has both the directive and flexible qualities found in the theories of pedagogy and andragogy. With careful consideration and analysis this educational model can be utilized effectively within most educational systems.

  8. Comparison of regression coefficient and GIS-based methodologies for regional estimates of forest soil carbon stocks.

    PubMed

    Campbell, J Elliott; Moen, Jeremie C; Ney, Richard A; Schnoor, Jerald L

    2008-03-01

    Estimates of forest soil organic carbon (SOC) have applications in carbon science, soil quality studies, carbon sequestration technologies, and carbon trading. Forest SOC has been modeled using a regression coefficient methodology that applies mean SOC densities (mass/area) to broad forest regions. A higher resolution model is based on an approach that employs a geographic information system (GIS) with soil databases and satellite-derived landcover images. Despite this advancement, the regression approach remains the basis of current state and federal level greenhouse gas inventories. Both approaches are analyzed in detail for Wisconsin forest soils from 1983 to 2001, applying rigorous error-fixing algorithms to soil databases. Resulting SOC stock estimates are 20% larger when determined using the GIS method rather than the regression approach. Average annual rates of increase in SOC stocks are 3.6 and 1.0 million metric tons of carbon per year for the GIS and regression approaches respectively.
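    The two estimation strategies contrasted in the abstract differ only in where the SOC density comes from, which a made-up three-pixel example makes concrete (areas, densities, and the regional coefficient below are invented):

    ```python
    # Regression-coefficient method: one mean SOC density for the whole region.
    # GIS method: per-pixel densities from soil-map and landcover overlays.
    pixels = [
        # (area_ha, soc_density_tC_per_ha) from an imagined soil/landcover overlay
        (100.0, 60.0),
        (100.0, 90.0),
        (100.0, 120.0),
    ]

    total_area = sum(a for a, _ in pixels)
    regional_mean_density = 80.0          # tC/ha, a regression coefficient

    soc_regression = regional_mean_density * total_area
    soc_gis = sum(a * d for a, d in pixels)
    print(soc_regression, soc_gis)   # 24000.0 27000.0
    ```

    Whenever high-density soils are underrepresented in the regional coefficient, the per-pixel sum comes out larger, which is the direction of the 20% gap the study reports for Wisconsin.
    
    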

  9. A fully automatic three-step liver segmentation method on LDA-based probability maps for multiple contrast MR images.

    PubMed

    Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf

    2010-07-01

    Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. There exist numerous approaches for automatic 3D liver segmentation on computer tomography data sets that have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information of different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region growing approach and a further threshold technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied for normal and fat accumulated liver tissue properties. Copyright 2010 Elsevier Inc. All rights reserved.
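    The core step, a linear discriminant projection turned into a probability map, can be sketched in two dimensions with synthetic "voxel features" (the paper uses multiclass LDA over multi-channel MR intensities; the two-class Fisher variant and all data below are simplifying assumptions):

    ```python
    # Two-class Fisher LDA: project 2-D features onto w = Sw^{-1}(m1 - m0),
    # then squash the projection with a logistic to get a liver-probability score.
    import math

    liver     = [(0.8, 0.7), (0.9, 0.6), (0.7, 0.8), (0.85, 0.75)]
    non_liver = [(0.2, 0.3), (0.1, 0.4), (0.3, 0.2), (0.15, 0.35)]

    def mean(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    def scatter(pts, mu):
        # within-class scatter matrix entries [[sxx, sxy], [sxy, syy]]
        sxx = sum((p[0] - mu[0]) ** 2 for p in pts)
        syy = sum((p[1] - mu[1]) ** 2 for p in pts)
        sxy = sum((p[0] - mu[0]) * (p[1] - mu[1]) for p in pts)
        return sxx, sxy, syy

    m1, m0 = mean(liver), mean(non_liver)
    s1, s0 = scatter(liver, m1), scatter(non_liver, m0)
    sxx, sxy, syy = s1[0] + s0[0], s1[1] + s0[1], s1[2] + s0[2]

    # w = Sw^{-1} (m1 - m0), inverting the pooled 2x2 scatter by hand
    det = sxx * syy - sxy * sxy
    dx, dy = m1[0] - m0[0], m1[1] - m0[1]
    w = ((syy * dx - sxy * dy) / det, (-sxy * dx + sxx * dy) / det)

    # decision threshold at the projection of the class midpoint
    threshold = (w[0] * (m1[0] + m0[0]) + w[1] * (m1[1] + m0[1])) / 2.0

    def liver_probability(p):
        # logistic of the signed distance from the midpoint projection
        return 1.0 / (1.0 + math.exp(-(w[0] * p[0] + w[1] * p[1] - threshold)))

    print(liver_probability((0.8, 0.7)) > 0.5, liver_probability((0.2, 0.3)) < 0.5)
    ```

    Evaluating `liver_probability` at every voxel yields the probability map that the paper's region-growing and thresholding stages then refine into the final 3D segmentation.
    
    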

  10. Innovative Phase Change Approach for Significant Energy Savings

    DTIC Science & Technology

    2016-09-01

    FINAL REPORT: Innovative Phase Change Approach for Significant Energy Savings. ESTCP Project EW-201138, September 2016. Dr. Aly H. Shaaban. Contract number W912HQ-11-C-0011.

  11. Administrative Technology and the School Executive: Applying the Systems Approach to Educational Administration.

    ERIC Educational Resources Information Center

    Knezevich, Stephen J., Ed.

    In this era of rapid social change, educational administrators have discovered that new approaches to problem solving and decision making are needed. Systems analysis could afford a promising approach to administrative problems by providing a number of systematic techniques designed to sharpen administrative decision making, enhance efficiency,…

  12. Revisiting Approaches to Learning in Science and Engineering: A Case Study

    ERIC Educational Resources Information Center

    Gynnild, V.; Myrhaug, D.

    2012-01-01

    Several studies have applied the dichotomy of deep and surface approaches to learning in a range of disciplinary contexts. Existing questionnaires have largely assumed the existence of these constructs; however, in a recent study Case and Marshall (2004) described two additional context-specific approaches to learning in engineering. The current…

  13. Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.

    ERIC Educational Resources Information Center

    Meghabghab, Dania Bilal

    Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field- tested activities. After discussing needs…

  14. Supporting Blended-Learning: Tool Requirements and Solutions with OWLish

    ERIC Educational Resources Information Center

    Álvarez, Ainhoa; Martín, Maite; Fernández-Castro, Isabel; Urretavizcaya, Maite

    2016-01-01

    Currently, most of the educational approaches applied to higher education combine face-to-face (F2F) and computer-mediated instruction in a Blended-Learning (B-Learning) approach. One of the main challenges of these approaches is fully integrating the traditional brick-and-mortar classes with online learning environments in an efficient and…

  15. Developing and Applying Green Building Technology in an Indigenous Community: An Engaged Approach to Sustainability Education

    ERIC Educational Resources Information Center

    Riley, David R.; Thatcher, Corinne E.; Workman, Elizabeth A.

    2006-01-01

    Purpose: This paper aims to disseminate an innovative approach to sustainability education in construction-related fields in which teaching, research, and service are integrated to provide a unique learning experience for undergraduate students, faculty members, and community partners. Design/methodology/approach: The paper identifies the need for…

  16. A Cultural Formulation Approach to Career Assessment and Career Counseling with Asian American Clients

    ERIC Educational Resources Information Center

    Leong, Frederick T. L.; Hardin, Erin E.; Gupta, Arpana

    2010-01-01

    The current article applies the cultural formulation approach to career assessment and career counseling specifically to Asian American clients. The approach is illustrated using the "Diagnostic and Statistical Manual of Mental Disorders," fourth edition ("DSM-IV"), Outline for Cultural Formulation, which consists of the following five…

  17. Nonunitary and unitary approach to Eigenvalue problem of Boson operators and squeezed coherent states

    NASA Technical Reports Server (NTRS)

    Wunsche, A.

    1993-01-01

    The eigenvalue problem of the operator a + ζa⁺ (with a⁺ the boson creation operator) is solved for arbitrary complex ζ by applying a nonunitary operator to the vacuum state. This nonunitary approach is compared with the unitary approach, which leads, for |ζ| < 1, to squeezed coherent states.

  18. Student-Centered Transformative Learning in Leadership Education: An Examination of the Teaching and Learning Process

    ERIC Educational Resources Information Center

    Haber-Curran, Paige; Tillapaugh, Daniel W.

    2015-01-01

    Innovative and learner-centered approaches to teaching and learning are vital for the applied field of leadership education, yet little research exists on such pedagogical approaches within the field. Using a phenomenological approach in analyzing 26 students' reflective narratives, the authors explore students' experiences of and process of…

  19. Team Approach to Staffing the Reference Center: A Speculation.

    ERIC Educational Resources Information Center

    Lawson, Mollie D.; And Others

    This document applies theories of participatory management to a proposal for a model that uses a team approach to staffing university library reference centers. In particular, the Ward Edwards Library at Central Missouri State University is examined in terms of the advantages and disadvantages of its current approach. Special attention is given to…

  20. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    ERIC Educational Resources Information Center

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  1. Strengthening the Focus on Business Results: The Need for Systems Approaches in Organizational Behavior Management

    ERIC Educational Resources Information Center

    Hyten, Cloyd

    2009-01-01

    Current Organizational Behavior Management (OBM) research and practice may be characterized as either behavior focused or results focused. These two approaches stem from different origins and have different characteristics. The behavior-focused approach stems from applied behavior analysis (ABA) methods and emphasizes direct observation of and…

  2. The Power of Behavioural Approaches--We Need a Revival

    ERIC Educational Resources Information Center

    Buckley, Sue

    2008-01-01

    Behavioural approaches can be used effectively to teach new skills and to change behaviours that are challenging and not socially adaptive. The behaviour modification approach--now called applied behaviour analysis--is based on the assumption that all behaviours are learned, both the useful ones (new skills) and the ones that are not so useful…

  3. Understanding the Factors that Characterise School-Community Partnerships: The Case of the Logan Healthy Schools Project

    ERIC Educational Resources Information Center

    Thomas, Melinda; Rowe, Fiona; Harris, Neil

    2010-01-01

    Purpose: The purpose of this study is to examine the factors that characterise effective school-community partnerships that support the sustainability of school health initiatives applied within a health-promoting schools approach. Design/methodology/approach: The study used an explanatory case study approach of five secondary schools…

  4. Applying What Works: A Case for Deliberate Psychological Education in Undergraduate Business Ethics

    ERIC Educational Resources Information Center

    Schmidt, Christopher Drees; Davidson, Kathleen M.; Adkins, Christopher

    2013-01-01

    The teaching of business ethics continues to be a topic of great concern as both businesses and business schools seek to develop effective approaches for fostering ethical behavior. Responses to this objective have been varied, and consistent empirical evidence for a particular approach has not emerged. One approach, deliberate psychological…

  5. Risk and ReORIENTations: An Asianist Approach to Teaching Afro-Haitian Dance

    ERIC Educational Resources Information Center

    Young, Angeline

    2018-01-01

    I present an action research study that enacts an Asianist somatic movement education approach to teaching Afro-Haitian dance at Arizona State University as a response to my Chinese American body's experience of hegemony in postsecondary dance curriculum. My pedagogical framework uses an intercultural teaching approach that applies a somatic…

  6. Modern Approaches to Foreign Language Teaching: World Experience

    ERIC Educational Resources Information Center

    Shumskyi, Oleksandr

    2016-01-01

    The problem of applying the communicative approach to foreign language teaching of students in non-language departments of higher education institutions in a number of countries has been analyzed in the paper. A brief overview of the main historic milestones in the development of the communicative approach has been presented. It has been found out that…

  7. Approaches to Training Teachers of Adults in the UK

    ERIC Educational Resources Information Center

    Chychuk, Vadym

    2015-01-01

    The article deals with the theoretical foundations of teacher training for adult students in the UK. It has been found out that the system of adult education is based on the andragogical approach that reveals patterns, psychological and pedagogical factors of effective learning. In applying the andragogical approach to adult education the…

  8. The Teaching-Learning Environment, an Information-Dynamic Approach

    ERIC Educational Resources Information Center

    De Blasio, Cataldo; Järvinen, Mika

    2014-01-01

    In the present study, a generalized approach is given for the description of acquisition procedures, with a particular focus on the knowledge acquisition process. The learning progression is given as an example, letting the theory be applied to different situations. An analytical approach is performed starting from the generalized fundamental…

  9. Hedonic approaches based on spatial econometrics and spatial statistics: application to evaluation of project benefits

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Morito; Seya, Hajime

    2009-12-01

    This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics and/or spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses by applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are conducted based on both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits are quite different, especially between isotropic and anisotropic SPM and between isotropic SPM and SAEM; the estimated benefits are similar for SAEM and anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.

  10. A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting

    NASA Astrophysics Data System (ADS)

    Kim, T.; Joo, K.; Seo, J.; Heo, J. H.

    2016-12-01

    Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach in time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied for their advantage of discovering relevant features in a nonlinear relation among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, an ensemble of decision trees over multiple predictors, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 for Chungju dam in South Korea were used for modeling and forecasting. To evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied. The root mean squared error and mean absolute error of the two models were compared.
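
    The comparison described above can be sketched compactly. This is an illustrative sketch, not the study's code: the seasonal ARIMA model is simplified here to an AR(12) model fit by least squares, the inflow series is synthetic rather than the Chungju dam record, and scikit-learn's `RandomForestRegressor` stands in for the Random Forest model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Synthetic monthly "inflow": a seasonal cycle plus noise, standing in
# for the real dam record used in the abstract.
t = np.arange(360)
series = 10 + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)

def lagged(series, p):
    """Design matrix of the p previous values for each target point."""
    X = np.column_stack([series[i:i - p] for i in range(p)])
    return X, series[p:]

p, split = 12, 300
X, y = lagged(series, p)
Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]

# "Stochastic" baseline: AR(12) fit by ordinary least squares
# (a simplification of the seasonal ARIMA model in the abstract).
coef, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
ar_pred = Xte @ coef

# Machine learning approach: random forest on the same lagged features.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(Xtr, ytr)
rf_pred = rf.predict(Xte)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

def mae(a, b):
    return float(np.mean(np.abs(a - b)))

ar_rmse, rf_rmse = rmse(ar_pred, yte), rmse(rf_pred, yte)
```

    The same RMSE/MAE comparison over one-step and multi-step horizons is what the abstract reports for the real record.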

  11. Integrated Proteomic Approaches for Understanding Toxicity of Environmental Chemicals

    EPA Science Inventory

    To apply quantitative proteomic analysis to the evaluation of toxicity of environmental chemicals, we have developed an integrated proteomic technology platform. This platform has been applied to the analysis of the toxic effects and pathways of many important environmental chemi...

  12. The Case Method for Teaching College Economics

    ERIC Educational Resources Information Center

    Griffith, John R., Jr.

    1971-01-01

    The author explains the use of the case method of applying the principles of economics to current economic policy issues. This approach allows the classroom economics teacher to teach the student how to apply economic principles to real life problems. (Author)

  13. A multicriteria decision making approach applied to improving maintenance policies in healthcare organizations.

    PubMed

    Carnero, María Carmen; Gómez, Andrés

    2016-04-23

    Healthcare organizations have far greater maintenance needs for their medical equipment than other organizations, as much of it is used directly with patients. However, the literature on asset management in healthcare organizations is very limited. The aim of this research is to provide a more rational application of maintenance policies, leading to an increase in quality of care. This article describes a multicriteria decision-making approach which integrates Markov chains with the multicriteria Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) to facilitate the best choice of a combination of maintenance policies using the judgements of a multi-disciplinary decision group. The proposed approach takes into account the level of acceptance that a given alternative would have among professionals, as well as criteria related to cost, quality of care, and impact on care cover. This multicriteria approach is applied to four dialysis subsystems: patients infected with hepatitis C, patients infected with hepatitis B, acute, and chronic; in all cases, the maintenance strategy obtained consists of applying corrective and preventive maintenance plus two reserve machines. The added value in decision-making practices from this research comes from: (i) integrating the use of Markov chains to obtain the alternatives to be assessed by a multicriteria methodology; (ii) proposing the use of MACBETH to make rational decisions on asset management in healthcare organizations; and (iii) applying the multicriteria approach to select a set or combination of maintenance policies in four dialysis subsystems of a healthcare organization.
    The proposed multicriteria decision-making approach uses economic criteria, criteria related to the quality of care desired for patients (availability), and the acceptance each alternative would have given the maintenance and healthcare resources existing in the organization, with the inclusion of a decision-making group. This approach is better suited to actual healthcare organization practice and, depending on the subsystem analysed, introduces improvements not included in normal maintenance policies; in this way, not only have different maintenance policies been suggested, but also alternatives that, in each case and according to viability, provide a more complete decision tool for the maintenance manager.
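
    The Markov-chain ingredient of such an approach can be illustrated with a minimal availability computation. The three states and transition probabilities below are hypothetical, not taken from the paper; the sketch only shows how a stationary distribution yields an availability figure that could feed a multicriteria comparison such as MACBETH.

```python
import numpy as np

# Hypothetical 3-state chain for one dialysis machine:
# 0 = operational, 1 = degraded, 2 = failed (under repair).
# Row-stochastic transition matrix per inspection interval; the
# probabilities are illustrative, not the paper's data.
P = np.array([
    [0.90, 0.08, 0.02],   # operational: mostly stays operational
    [0.30, 0.60, 0.10],   # preventive maintenance restores degraded units
    [0.70, 0.00, 0.30],   # corrective maintenance returns failed units
])

# The stationary distribution pi solves pi P = pi with sum(pi) = 1:
# take the eigenvector of P transposed for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# The machine is usable unless it is in the failed state.
availability = pi[0] + pi[1]
```

    Availability figures like this, one per candidate combination of maintenance policies, are the kind of quantitative input a decision group can weigh against cost and acceptance criteria.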

  14. Audio frequency in vivo optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Adie, Steven G.; Kennedy, Brendan F.; Armstrong, Julian J.; Alexandrov, Sergey A.; Sampson, David D.

    2009-05-01

    We present a new approach to optical coherence elastography (OCE), which probes the local elastic properties of tissue by using optical coherence tomography to measure the effect of an applied stimulus in the audio frequency range. We describe the approach, based on analysis of the Bessel frequency spectrum of the interferometric signal detected from scatterers undergoing periodic motion in response to an applied stimulus. We present quantitative results of sub-micron excitation at 820 Hz in a layered phantom and the first such measurements in human skin in vivo.

  15. Putting engineering back into protein engineering: bioinformatic approaches to catalyst design.

    PubMed

    Gustafsson, Claes; Govindarajan, Sridhar; Minshull, Jeremy

    2003-08-01

    Complex multivariate engineering problems are commonplace and not unique to protein engineering. Mathematical and data-mining tools developed in other fields of engineering have now been applied to analyze sequence-activity relationships of peptides and proteins and to assist in the design of proteins and peptides with specified properties. Decreasing costs of DNA sequencing in conjunction with methods to quickly synthesize statistically representative sets of proteins allow modern heuristic statistics to be applied to protein engineering. This provides an alternative approach to expensive assays or unreliable high-throughput surrogate screens.

  16. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1991-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, more fault-tolerant, and self-managing. Six years of research on ISIS are reviewed, describing the model, the types of applications to which ISIS was applied, and some of the reasoning that underlies a recent effort to redesign and reimplement ISIS as a much smaller, lightweight system.

  17. Recent Analytical Techniques Advances in the Carotenoids and Their Derivatives Determination in Various Matrixes.

    PubMed

    Giuffrida, Daniele; Donato, Paola; Dugo, Paola; Mondello, Luigi

    2018-04-04

    In the present perspective, different approaches to carotenoid analysis are discussed, providing a brief overview of the most advanced one-dimensional and two-dimensional liquid chromatographic methodologies applied to carotenoid analysis, followed by a discussion of the recently advanced two-dimensional supercritical fluid chromatography × liquid chromatography approach with photodiode-array and mass spectrometry detection. Moreover, a discussion of online supercritical fluid extraction-supercritical fluid chromatography with tandem mass spectrometry detection applied to the determination of carotenoids and apocarotenoids is also provided.

  18. Calculation of short-wave signal amplitude on the basis of the waveguide approach and the method of characteristics

    NASA Astrophysics Data System (ADS)

    Mikhailov, S. Ia.; Tumatov, K. I.

    The paper compares the results obtained using two methods to calculate the amplitude of a short-wave signal field incident on or reflected from a perfectly conducting earth. A technique is presented for calculating the geometric characteristics of the field based on the waveguide approach. It is shown that applying an extended system of characteristic equations to calculate the field amplitude is inadmissible in models that include discontinuities in the second derivatives of the permittivity, unless a suitable treatment of the discontinuity points is applied.

  19. Strip Yield Model Numerical Application to Different Geometries and Loading Conditions

    NASA Technical Reports Server (NTRS)

    Hatamleh, Omar; Forman, Royce; Shivakumar, Venkataraman; Lyons, Jed

    2006-01-01

    A new numerical method based on the strip-yield analysis approach was developed for calculating the Crack Tip Opening Displacement (CTOD). This approach can be applied for different crack configurations having infinite and finite geometries, and arbitrary applied loading conditions. The new technique adapts the boundary element / dislocation density method to obtain crack-face opening displacements at any point on a crack, and succeeds by obtaining requisite values as a series of definite integrals, the functional parts of each being evaluated exactly in a closed form.

  20. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict a flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.

  1. A Service Design Thinking Approach for Stakeholder-Centred eHealth.

    PubMed

    Lee, Eunji

    2016-01-01

    Studies have described the opportunities and challenges of applying service design techniques to health services, but empirical evidence on how such techniques can be implemented in the context of eHealth services is still lacking. This paper presents how a service design thinking approach can be applied for specification of an existing and new eHealth service by supporting evaluation of the current service and facilitating suggestions for the future service. We propose Service Journey Modelling Language and Service Journey Cards to engage stakeholders in the design of eHealth services.

  2. Lagrange multiplier for perishable inventory model considering warehouse capacity planning

    NASA Astrophysics Data System (ADS)

    Amran, Tiena Gustina; Fatima, Zenny

    2017-06-01

    This paper presents a Lagrange multiplier approach for solving perishable raw material inventory planning considering warehouse capacity. A food company faced the issue of managing perishable raw materials and marinades with limited shelf life. Another constraint to be considered was the capacity of the warehouse. Therefore, an inventory model considering shelf life and raw material warehouse capacity is needed in order to minimize the company's inventory cost. The inventory model implemented in this study is an adapted economic order quantity (EOQ) model, optimized using a Lagrange multiplier. The model and solution approach were applied to a case industry at a food manufacturer. The result showed that the total inventory cost decreased by 2.42% after applying the proposed approach.
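
    The constrained-EOQ computation can be sketched as follows, under invented data: the demands, costs, and warehouse coefficients below are illustrative, not the company's figures. The Lagrange multiplier prices the warehouse constraint into each item's holding cost, and a bisection search finds the multiplier that makes the constraint binding.

```python
import math

# Illustrative data for three perishable raw materials: annual demand D,
# order cost S, holding cost h, and warehouse space f consumed per unit.
items = [  # (D, S, h, f)
    (1200.0, 50.0, 2.0, 0.5),
    (800.0,  40.0, 1.5, 0.8),
    (500.0,  60.0, 2.5, 0.3),
]
capacity = 120.0   # warehouse space available

def order_qty(lam):
    """EOQ per item with the capacity constraint priced in: the
    Lagrange multiplier lam inflates the holding cost by 2*lam*f."""
    return [math.sqrt(2 * D * S / (h + 2 * lam * f))
            for D, S, h, f in items]

def space_used(lam):
    return sum(f * q for (D, S, h, f), q in zip(items, order_qty(lam)))

# Unconstrained EOQ first; only if it violates capacity do we search
# for the multiplier (space_used decreases monotonically in lam).
if space_used(0.0) <= capacity:
    lam = 0.0
else:
    lo, hi = 0.0, 1.0
    while space_used(hi) > capacity:
        hi *= 2
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if space_used(mid) > capacity else (lo, mid)
    lam = hi

Q = order_qty(lam)
```

    With these numbers the unconstrained order quantities overflow the warehouse, so a positive multiplier shrinks each lot size until the capacity constraint binds.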

  3. A new decision sciences for complex systems.

    PubMed

    Lempert, Robert J

    2002-05-14

    Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.

  4. A Novel Ontology Approach to Support Design for Reliability considering Environmental Effects

    PubMed Central

    Sun, Bo; Li, Yu; Ye, Tianyuan

    2015-01-01

    Environmental effects are not considered sufficiently in product design. Reliability problems caused by environmental effects are very prominent. This paper proposes a method to apply an ontology approach in product design, achieving the reuse of environmental effects knowledge during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology is designed to describe the environmental effects domain knowledge. Related concepts of environmental effects are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge for different environments. Finally, rubber seals used in a subhumid acid rain environment are taken as an example to illustrate the ontological model's application to reliability design and analysis. PMID:25821857

  5. A novel ontology approach to support design for reliability considering environmental effects.

    PubMed

    Sun, Bo; Li, Yu; Ye, Tianyuan; Ren, Yi

    2015-01-01

    Environmental effects are not considered sufficiently in product design. Reliability problems caused by environmental effects are very prominent. This paper proposes a method to apply an ontology approach in product design, achieving the reuse of environmental effects knowledge during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology is designed to describe the environmental effects domain knowledge. Related concepts of environmental effects are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge for different environments. Finally, rubber seals used in a subhumid acid rain environment are taken as an example to illustrate the ontological model's application to reliability design and analysis.

  6. A novel approach to describing and detecting performance anti-patterns

    NASA Astrophysics Data System (ADS)

    Sheng, Jinfang; Wang, Yihan; Hu, Peipei; Wang, Bin

    2017-08-01

    Anti-pattern, as an extension of pattern, describes a widely used poor solution that can bring negative influence to application systems. Aiming at the shortcomings of existing anti-pattern descriptions, an anti-pattern description method based on first-order predicates is proposed. This method synthesizes anti-pattern forms and symptoms, which makes the description more accurate while retaining good scalability and versatility. In order to improve the accuracy of anti-pattern detection, a Bayesian classification method is applied to validate the detection results, which can reduce the false negatives and false positives of anti-pattern detection. Finally, the proposed approach is applied to a small e-commerce system, and its feasibility and effectiveness are further demonstrated through experiments.
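
    The validation step can be illustrated with a minimal Bayesian classifier. This is a hedged sketch, not the paper's implementation: the binary symptom features and labels are invented, and a hand-rolled Bernoulli naive Bayes with Laplace smoothing stands in for whatever Bayesian classification method the authors applied.

```python
import math

# Candidate detections described by binary symptom features, e.g.
# (high_latency, high_cpu, many_db_calls); labels mark whether a
# reviewed detection was a true anti-pattern instance. All values
# are illustrative, not from the paper's e-commerce system.
X = [(1, 1, 1), (1, 0, 1), (1, 1, 0), (0, 1, 1),
     (0, 0, 1), (0, 1, 0), (0, 0, 0), (1, 0, 0)]
y = [1, 1, 1, 1, 0, 0, 0, 0]

def train(X, y):
    """Bernoulli naive Bayes with Laplace smoothing."""
    model = {}
    for c in (0, 1):
        rows = [x for x, lab in zip(X, y) if lab == c]
        prior = len(rows) / len(X)
        probs = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2)
                 for j in range(len(X[0]))]
        model[c] = (prior, probs)
    return model

def posterior(model, x):
    """P(true anti-pattern | symptoms), by Bayes' rule."""
    score = {}
    for c, (prior, probs) in model.items():
        ll = math.log(prior)
        for xj, pj in zip(x, probs):
            ll += math.log(pj if xj else 1 - pj)
        score[c] = ll
    m = max(score.values())
    z = sum(math.exp(s - m) for s in score.values())
    return math.exp(score[1] - m) / z

model = train(X, y)
```

    Detections whose posterior falls below a chosen threshold would be discarded as likely false positives, which is the filtering role the abstract describes.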

  7. An effective model for ergonomic optimization applied to a new automotive assembly line

    NASA Astrophysics Data System (ADS)

    Duraccio, Vincenzo; Elia, Valerio; Forcina, Antonio

    2016-06-01

    Efficient ergonomic optimization can lead to a significant improvement in production performance and a considerable reduction of costs. In the present paper, a new model for ergonomic optimization is proposed. The new approach is based on the criteria defined by the National Institute for Occupational Safety and Health, adapted to Italian legislation. The proposed model provides ergonomic optimization by analyzing the ergonomics of manual work performed under correct conditions. The model includes a schematic and systematic method for analyzing the operations, and identifies all possible ergonomic aspects to be evaluated. The proposed approach has been applied to an automotive assembly line, where the repeatability of operations makes optimization fundamental. The application clearly demonstrates the effectiveness of the new approach.

  8. A Structured Approach to Teaching Applied Problem Solving through Technology Assessment.

    ERIC Educational Resources Information Center

    Fischbach, Fritz A.; Sell, Nancy J.

    1986-01-01

    Describes an approach to problem solving based on real-world problems. Discusses problem analysis and definitions, preparation of briefing documents, solution finding techniques (brainstorming and synectics), solution evaluation and judgment, and implementation. (JM)

  9. Linear regression crash prediction models : issues and proposed solutions.

    DOT National Transportation Integrated Search

    2010-05-01

    The paper develops a linear regression model approach that can be applied to crash data to predict vehicle crashes. The proposed approach involves novel data aggregation to satisfy linear regression assumptions; namely error structure normality ...
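
    One way to picture the aggregation idea is the sketch below. It is illustrative only: the segment-level exposures and crash counts are synthetic, and binning similar road segments so that the averaged responses better satisfy the normality assumption is one plausible reading of the aggregation the abstract mentions, before an ordinary least squares fit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic road segments: exposure (million vehicle-miles) and a
# crash count that grows with exposure. Values are illustrative.
n = 400
exposure = rng.uniform(0.5, 5.0, n)
crashes = rng.poisson(2.0 * exposure)

# Aggregate similar segments into exposure bins; the bin means are
# closer to normally distributed than the raw Poisson counts.
bins = np.linspace(0.5, 5.0, 21)
idx = np.digitize(exposure, bins)
kept = [k for k in range(1, 21) if (idx == k).any()]
xs = np.array([exposure[idx == k].mean() for k in kept])
ys = np.array([crashes[idx == k].mean() for k in kept])

# Ordinary least squares on the aggregated means.
A = np.column_stack([np.ones_like(xs), xs])
(b0, b1), *_ = np.linalg.lstsq(A, ys, rcond=None)
```

    On this synthetic data the fitted slope recovers the underlying crash rate per unit of exposure.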

  10. Evaluating the Similarity of Complex Drinking-Water Disinfection By-Product Mixtures: Overview of the Issues

    EPA Science Inventory

    The presentation describes the advantages and challenges of working with whole mixtures, discusses an exploratory approach for evaluating sufficient similarity, and considers the challenges of applying such approaches to other environmental mixtures.

  11. An Analytical Approach to Salary Evaluation for Educational Personnel

    ERIC Educational Resources Information Center

    Bruno, James Edward

    1969-01-01

    "In this study a linear programming model for determining an 'optimal' salary schedule was derived then applied to an educational salary structure. The validity of the model and the effectiveness of the approach were established. (Author)

  12. Visualising crystal packing interactions in solid-state NMR: Concepts and applications

    NASA Astrophysics Data System (ADS)

    Zilka, Miri; Sturniolo, Simone; Brown, Steven P.; Yates, Jonathan R.

    2017-10-01

    In this article, we introduce and apply a methodology, based on density functional theory and the gauge-including projector augmented wave approach, to explore the effects of packing interactions on solid-state nuclear magnetic resonance (NMR) parameters. A visual map derived from a so-termed "magnetic shielding contribution field" can be made of the contributions to the magnetic shielding of a specific site, partitioning the chemical shift into specific interactions. The relation to the established approaches of examining the molecule-to-crystal change in the chemical shift and the nucleus-independent chemical shift is established. The results are applied to a large sample of 71 molecular crystals and three further specific examples from supramolecular chemistry and pharmaceuticals. This approach extends the NMR crystallography toolkit and provides insight into the development of both cluster-based approaches to the prediction of chemical shifts and empirical predictions of chemical shifts in solids.

  13. Implementation of 7E learning cycle model using technology based constructivist teaching (TBCT) approach to improve students' understanding achievement in mechanical wave material

    NASA Astrophysics Data System (ADS)

    Warliani, Resti; Muslim, Setiawan, Wawan

    2017-05-01

    This study aims to determine the increase in understanding achievement of senior high school students through the 7E Learning Cycle with a technology-based constructivist teaching (TBCT) approach. The study uses a pretest-posttest control group design. The participants were 67 eleventh-grade high school students in Garut city, divided into a control class and an experiment class. The experiment class applied the 7E Learning Cycle through the TBCT approach, and the control class applied the 7E Learning Cycle through a constructivist teaching (CT) approach. Data were collected using a mechanical wave concept test of 22 questions with a reliability coefficient of 0.86. The findings show that the increase in understanding achievement of the experiment class (0.51) was higher than that of the control class (0.33).

  14. Fault detection and diagnosis using neural network approaches

    NASA Technical Reports Server (NTRS)

    Kramer, Mark A.

    1992-01-01

    Neural networks can be used to detect and identify abnormalities in real-time process data. Two basic approaches can be used: the first trains networks on data representing both normal and abnormal modes of process behavior; the second is based on statistical characterization of the normal mode only. Given data representative of process faults, radial basis function networks can effectively identify failures. This approach is often limited by the lack of fault data, but can be facilitated by process simulation. The second approach employs elliptical and radial basis function neural networks and other models to learn the statistical distributions of process observables under normal conditions. Analytical models of failure modes can then be applied in combination with the neural network models to identify faults. Special methods can be applied to compensate for sensor failures, producing real-time estimates of missing or failed sensor values based on the correlations codified in the neural network.
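
    The first, data-driven route can be illustrated with a minimal sketch: a Gaussian radial-basis classifier that assigns a process reading to the trained fault mode whose center activates most strongly. The centers, labels, and readings below are hypothetical, not from the cited work.

```python
import math

def rbf_features(x, centers, width=1.0):
    """Gaussian radial-basis activations of sample x w.r.t. each center."""
    return [math.exp(-sum((a - b) ** 2 for a, b in zip(x, c)) / (2 * width ** 2))
            for c in centers]

def classify(x, centers, labels, width=1.0):
    """Assign the label of the center with the strongest activation
    (a nearest-center simplification of a trained RBF network)."""
    acts = rbf_features(x, centers, width)
    return labels[acts.index(max(acts))]

# Hypothetical 2-D process readings: (temperature deviation, pressure deviation).
centers = [(0.0, 0.0), (2.5, 0.1), (0.1, 3.0)]
labels = ["normal", "heater fault", "valve fault"]

print(classify((0.2, -0.1), centers, labels))  # near the normal center
print(classify((2.4, 0.3), centers, labels))   # near the heater-fault center
```

    In practice the centers would come from clustering labeled fault data, and the output weights from least-squares training rather than a nearest-center rule.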

  15. Improving Psychological Measurement: Does It Make a Difference? A Comment on Nesselroade and Molenaar (2016).

    PubMed

    Maydeu-Olivares, Alberto

    2016-01-01

    Nesselroade and Molenaar advocate the use of an idiographic filter approach. This is a fixed-effects approach, which may limit the number of individuals that can be simultaneously modeled, and it is not clear how to model the presence of subpopulations. Most important, Nesselroade and Molenaar's proposal appears to be best suited for modeling long time series on a few variables for a few individuals. Long time series are not common in psychological applications. Can it be applied to the usual longitudinal data we face? These are characterized by short time series (four to five points in time), hundreds of individuals, and dozens of variables. If so, what do we gain? Applied settings most often involve between-individual decisions. I conjecture that their approach will not outperform common, simpler, methods. However, when intraindividual decisions are involved, their approach may have an edge.

  16. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants.

  17. Fingerprinting microbiomes towards screening for microbial antibiotic resistance.

    PubMed

    Jin, Naifu; Zhang, Dayi; Martin, Francis L

    2017-05-22

    There is an increasing need to investigate microbiomes in their entirety in a variety of contexts, ranging from environmental to human health scenarios. This requirement is becoming increasingly important with the emergence of antibiotic resistance. In general, more conventional approaches are too expensive and/or time-consuming and are often predicated on prior knowledge of the microorganisms one wishes to study. Herein, we propose the use of biospectroscopy tools as relatively high-throughput, non-destructive approaches to profile microbiomes under study. Fourier-transform infrared (FTIR) and Raman spectroscopy both generate fingerprint spectra of biological material, and such spectra can subsequently be classified according to biochemical changes in the microbiota, such as the emergence of antibiotic resistance. FTIR spectroscopy techniques generally can only be applied to desiccated material, whereas Raman approaches can be applied to more hydrated samples. The ability to readily fingerprint microbiomes could lend itself to new approaches in determining microbial behaviours and the emergence of antibiotic resistance.

  18. Complete Hamiltonian analysis of cosmological perturbations at all orders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nandi, Debottam; Shankaranarayanan, S., E-mail: debottam@iisertvm.ac.in, E-mail: shanki@iisertvm.ac.in

    2016-06-01

    In this work, we present a consistent Hamiltonian analysis of cosmological perturbations at all orders. To make the procedure transparent, we consider a simple model to resolve the 'gauge-fixing' issues, extend the analysis to scalar field models, and show that our approach can be applied at any order of perturbation for any first-order derivative fields. In the case of Galilean scalar fields, our procedure can extract constraint relations at all orders in perturbations, showing that there are no extra degrees of freedom due to the presence of higher time derivatives of the field in the Lagrangian. We compare and contrast our approach with the Lagrangian approach (Chen et al. [2006]) for extracting higher-order correlations and show that our approach is efficient and robust and can be applied to any model of gravity and matter fields without invoking the slow-roll approximation.

  19. Spatial modeling in ecology: the flexibility of eigenfunction spatial analyses.

    PubMed

    Griffith, Daniel A; Peres-Neto, Pedro R

    2006-10-01

    Recently, analytical approaches based on the eigenfunctions of spatial configuration matrices have been proposed in order to consider explicitly spatial predictors. The present study demonstrates the usefulness of eigenfunctions in spatial modeling applied to ecological problems and shows equivalencies of and differences between the two current implementations of this methodology. The two approaches in this category are the distance-based (DB) eigenvector maps proposed by P. Legendre and his colleagues, and spatial filtering based upon geographic connectivity matrices (i.e., topology-based; CB) developed by D. A. Griffith and his colleagues. In both cases, the goal is to create spatial predictors that can be easily incorporated into conventional regression models. One important advantage of these two approaches over any other spatial approach is that they provide a flexible tool that allows the full range of general and generalized linear modeling theory to be applied to ecological and geographical problems in the presence of nonzero spatial autocorrelation.
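
    The topology-based variant can be sketched in a few lines: the eigenvectors of a doubly-centred connectivity matrix yield mutually orthogonal spatial patterns that enter a regression as ordinary predictors. The four-site connectivity matrix below is a made-up toy, assuming numpy is available.

```python
import numpy as np

# Hypothetical 4 sites on a line; binary connectivity (adjacent neighbours).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

n = W.shape[0]
# Double-centre the connectivity matrix, as in topology-based spatial filtering.
C = np.eye(n) - np.ones((n, n)) / n
M = C @ W @ C

# Eigenvectors of M are orthogonal spatial patterns; those with large positive
# eigenvalues capture positive spatial autocorrelation.
eigvals, eigvecs = np.linalg.eigh(M)
order = np.argsort(eigvals)[::-1]
spatial_predictors = eigvecs[:, order]

# These columns can enter a general(ized) linear model as extra covariates.
print(np.round(eigvals[order], 3))
```

    Distance-based eigenvector maps follow the same recipe with a truncated geographic-distance matrix in place of the binary connectivity matrix W.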

  20. Definition of sampling units begets conclusions in ecology: the case of habitats for plant communities.

    PubMed

    Mörsdorf, Martin A; Ravolainen, Virve T; Støvern, Leif Einar; Yoccoz, Nigel G; Jónsdóttir, Ingibjörg Svala; Bråthen, Kari Anne

    2015-01-01

    In ecology, expert knowledge of habitat characteristics is often used to define sampling units such as study sites, especially when prior sampling frames are not accessible. Here we ask: to what extent can different approaches to the definition of sampling units influence the conclusions drawn from an ecological study? We address this by comparing a formal versus a subjective definition of sampling units within a study design based on well-articulated objectives and proper methodology. Both approaches are applied to tundra plant communities in mesic and snowbed habitats. For the formal approach, sampling units were first defined for each habitat in concave terrain of suitable slope using GIS; in the field, these units were accepted as the targeted habitats only if additional criteria for vegetation cover were fulfilled. For the subjective approach, sampling units were defined visually in the field, based on typical plant communities of mesic and snowbed habitats. For each approach, we collected information on plant community characteristics within a total of 11 mesic and seven snowbed units distributed between two herding districts of contrasting reindeer density. Results from the two approaches differed significantly in several plant community characteristics in both mesic and snowbed habitats. Furthermore, differences between the two approaches were not consistent, because their magnitude and direction differed both between the two habitats and between the two reindeer herding districts. Consequently, we could draw different conclusions on how plant diversity and the relative abundance of functional groups are differentiated between the two habitats, depending on the approach used. We therefore challenge ecologists to formalize the expert knowledge applied to define sampling units through a set of well-articulated rules, rather than applying it subjectively. We see this as instrumental for progress in ecology, as only rules based on expert knowledge are transparent and lead to results reproducible by other ecologists.

  1. Estimating Soil Hydraulic Parameters using Gradient Based Approach

    NASA Astrophysics Data System (ADS)

    Rai, P. K.; Tripathi, S.

    2017-12-01

    The conventional way of estimating the parameters of a differential equation is to minimize the error between observations and their estimates, where the estimates are produced from a forward solution (numerical or analytical) of the differential equation under an assumed set of parameters. Parameter estimation using the conventional approach requires high computational cost, setting up initial and boundary conditions, and forming difference equations when the forward solution is obtained numerically. Gaussian-process based approaches like Gaussian Process Ordinary Differential Equation (GPODE) and Adaptive Gradient Matching (AGM) have been developed to estimate the parameters of ordinary differential equations without explicitly solving them. Claims have been made that these approaches can be straightforwardly extended to partial differential equations (PDEs); however, this has never been demonstrated. This study extends the AGM approach to PDEs and applies it to estimating the parameters of the Richards equation. Unlike the conventional approach, the AGM approach does not require explicitly setting up initial and boundary conditions, which is often difficult in real-world applications of the Richards equation. The developed methodology was applied to synthetic soil moisture data. It was seen that the proposed methodology can estimate the soil hydraulic parameters correctly and can be a potential alternative to the conventional method.
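
    The gradient-matching idea, stripped of the Gaussian-process machinery and applied to a toy ODE rather than the Richards equation, can be sketched as follows. Finite differences stand in for the GP interpolant, and the data are synthetic; this is an illustration of the principle, not the paper's method.

```python
import math

# Toy problem: dy/dt = -theta * y, true theta = 1.5. Estimate theta by
# matching interpolated gradients to the model's right-hand side, without
# ever integrating the equation.
t = [0.1 * k for k in range(11)]
y = [math.exp(-1.5 * tk) for tk in t]          # synthetic observations

# Central-difference gradient estimates at interior points.
dy = [(y[k + 1] - y[k - 1]) / (t[k + 1] - t[k - 1]) for k in range(1, 10)]

# Least-squares match of dy ~ -theta * y  =>  theta = -sum(dy*y) / sum(y*y).
yi = y[1:10]
theta = -sum(d * v for d, v in zip(dy, yi)) / sum(v * v for v in yi)
print(round(theta, 2))   # close to the true value 1.5
```

    The AGM approach replaces the finite differences with a Gaussian-process posterior over the derivative, which also propagates the interpolation uncertainty into the parameter estimate.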

  2. A Computer Vision Approach to Identify Einstein Rings and Arcs

    NASA Astrophysics Data System (ADS)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at all position angles, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or from visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating its versatility.
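
    A minimal, pure-Python sketch of the circle Hough transform at a known radius: each edge pixel votes for every candidate centre lying at that radius from it, and the accumulator peak marks the detected ring centre. The synthetic ring below is illustrative, not survey data; production pipelines would scan over radii as well.

```python
import math

def circle_points(cx, cy, r, n=60):
    """Rasterised edge pixels of a circle, standing in for an edge map."""
    return {(round(cx + r * math.cos(2 * math.pi * k / n)),
             round(cy + r * math.sin(2 * math.pi * k / n))) for k in range(n)}

def hough_center(edges, radius, size):
    """Each edge pixel votes for all centres at `radius` from it;
    the accumulator peak is the detected centre."""
    acc = {}
    for (x, y) in edges:
        for k in range(120):
            a = round(x - radius * math.cos(2 * math.pi * k / 120))
            b = round(y - radius * math.sin(2 * math.pi * k / 120))
            if 0 <= a < size and 0 <= b < size:
                acc[(a, b)] = acc.get((a, b), 0) + 1
    return max(acc, key=acc.get)

edges = circle_points(8, 8, 5)        # synthetic ring image
print(hough_center(edges, 5, 17))     # expected near (8, 8)
```

    Arcs (partial rings) still vote coherently for the true centre, which is why the same transform finds lensed arcs as well as complete rings.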

  3. A DMAIC approach for process capability improvement in an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-stage scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in the manufacture of crankshafts. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stage. The next stage constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement stages, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, standard deviation was reduced from 0.003 to 0.002, the process potential capability index (Cp) improved from 1.29 to 2.02, and the process performance capability index (Cpk) improved from 0.32 to 1.45.
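
    The two capability indices quoted above follow directly from the specification limits and the process mean and standard deviation: Cp = (USL - LSL)/(6σ) and Cpk = min(USL - μ, μ - LSL)/(3σ). A sketch with illustrative values (not the paper's data):

```python
def process_capability(mean, stdev, lsl, usl):
    """Potential capability Cp and performance capability Cpk."""
    cp = (usl - lsl) / (6 * stdev)
    cpk = min(usl - mean, mean - lsl) / (3 * stdev)
    return cp, cpk

# Hypothetical tolerance band for a bored hole (units: mm); a perfectly
# centred process makes Cpk equal Cp.
print(process_capability(mean=10.000, stdev=0.002, lsl=9.988, usl=10.012))
```

    Cp measures how many six-sigma spreads fit in the tolerance band, while Cpk penalises an off-centre mean; a Cpk well below Cp is a signal to re-centre the process before attacking variation.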

  4. Identifying Interacting Genetic Variations by Fish-Swarm Logic Regression

    PubMed Central

    Yang, Aiyuan; Yan, Chunxia; Zhu, Feng; Zhao, Zhongmeng; Cao, Zhi

    2013-01-01

    Understanding associations between genotypes and complex traits is a fundamental problem in human genetics. A major open problem in mapping phenotypes is identifying the set of interacting genetic variants that might contribute to complex traits. Logic regression (LR) is a powerful multivariate association tool, and several LR-based approaches have been successfully applied to different datasets. However, these approaches are not adequate with regard to accuracy and efficiency. In this paper, we propose a new LR-based approach, called fish-swarm logic regression (FSLR), which improves the logic regression process by incorporating swarm optimization. In our approach, a school of fish agents runs in parallel. Each fish agent holds a regression model, while the school searches for better models through various preset behaviors. The swarm algorithm improves accuracy and efficiency by speeding up convergence and preventing it from dropping into local optima. We apply our approach to a real screening dataset and a series of simulation scenarios. Compared to three existing LR-based approaches, our approach outperforms them by having lower type I and type II error rates, identifying more preset causal sites, and running at faster speeds. PMID:23984382

  5. AucPR: an AUC-based approach using penalized regression for disease prediction with high-dimensional omics data.

    PubMed

    Yu, Wenbao; Park, Taesung

    2014-01-01

    It is common to seek an optimal combination of markers for disease classification and prediction when multiple markers are available, and many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing works on AUC in a high-dimensional context depend mainly on a non-parametric, smooth approximation of the AUC, with no work using a parametric AUC-based approach for high-dimensional data. We propose an AUC-based approach using penalized regression (AucPR), a parametric method for obtaining a linear combination that maximizes the AUC. To obtain the AUC maximizer in a high-dimensional context, we transform a classical parametric AUC maximizer, used in low-dimensional contexts, into a regression framework and thus apply the penalized regression approach directly. Two kinds of penalization, lasso and elastic net, are considered. The parametric approach avoids some of the difficulties of a conventional non-parametric AUC-based approach, such as the lack of an appropriate concave objective function and the need for a prudent choice of the smoothing parameter. We apply the proposed AucPR to gene selection and classification using four real microarray datasets and synthetic data. Through numerical studies, AucPR is shown to perform better than penalized logistic regression and the non-parametric AUC-based method, in the sense of AUC and sensitivity for a given specificity, particularly when there are many correlated genes. We propose a powerful, parametric, and easily implementable linear classifier, AucPR, for gene selection and disease prediction with high-dimensional data. AucPR is recommended for its good prediction performance. Besides gene expression microarray data, AucPR can be applied to other types of high-dimensional omics data, such as miRNA and protein data.
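
    The quantity being maximized is the empirical AUC, which for a scored sample reduces to the fraction of correctly ranked (positive, negative) pairs, i.e. the normalised Mann-Whitney U statistic. A minimal sketch with made-up scores:

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC: fraction of (positive, negative) pairs ranked
    correctly, counting ties as half (normalised Mann-Whitney U)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Scores from a hypothetical linear marker combination w.x.
print(auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2]))   # 8 of 9 pairs correct
```

    This pairwise indicator is non-concave and non-smooth in the weights w, which is exactly the difficulty the parametric AucPR formulation sidesteps.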

  6. Integrated Approaches On Archaeo-Geophysical Data

    NASA Astrophysics Data System (ADS)

    Kucukdemirci, M.; Piro, S.; Zamuner, D.; Ozer, E.

    2015-12-01

    Key words: Ground Penetrating Radar (GPR), Magnetometry, Geophysical Data Integration, Principal Component Analysis (PCA), Aizanoi Archaeological Site. Geophysical data integration methods are commonly divided into two classes: qualitative and quantitative approaches. This work focused on the application of quantitative integration approaches, which involve mathematical and statistical integration techniques, to the archaeo-geophysical data obtained at the Aizanoi Archaeological Site, Turkey. Two geophysical methods, Ground Penetrating Radar (GPR) and magnetometry, were applied for archaeological prospection at the selected site. After basic data processing of each geophysical method, the mathematical approaches of sums and products and the statistical approach of Principal Component Analysis (PCA) were applied for the integration. These integration approaches were first tested on synthetic digital images before application to field data. Then the same approaches were applied to 2D magnetic maps and 2D GPR time slices obtained on the same unit grids at the archaeological site. Initially, the geophysical data were examined individually, referenced against archaeological maps and information obtained from archaeologists, and some important structures such as possible walls, roads and relics were determined. The results of all integration approaches provided very important and distinct details about the anomalies related to archaeological features. By using all these applications, integrated images can provide complementary information about the archaeological relics under the ground. 
Acknowledgements: The authors would like to thank the Scientific and Technological Research Council of Turkey (TUBITAK) Fellowship for Visiting Scientists Programme for its support, the Istanbul University Scientific Research Project Fund (Project No: 12302), and the archaeologist team of the Aizanoi Archaeological Site for their support during the field work.
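
    The PCA-based integration step can be sketched as follows: two co-registered survey grids are stacked pixel-wise into a two-variable dataset, and the first principal component concentrates the variance common to both surveys. The grids below are synthetic stand-ins (assuming numpy); real magnetic maps and GPR time slices would replace them.

```python
import numpy as np

# Two hypothetical co-registered 2-D survey grids (e.g. a magnetic map and
# a GPR time slice) sharing one buried feature, plus independent noise.
rng = np.random.default_rng(0)
anomaly = np.zeros((20, 20))
anomaly[8:12, 5:15] = 1.0                      # shared "buried wall"
map_a = anomaly + 0.1 * rng.standard_normal((20, 20))
map_b = 0.7 * anomaly + 0.1 * rng.standard_normal((20, 20))

# Stack pixel-wise, centre each variable, and take the first PC via SVD.
X = np.column_stack([map_a.ravel(), map_b.ravel()])
X = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = (X @ Vt[0]).reshape(20, 20)              # first principal-component image

# pc1 concentrates the variance common to both surveys: the shared anomaly.
print(pc1.shape)
```

    Later components collect the method-specific residue, so inspecting PC1 against PC2 separates features seen by both instruments from artefacts of either one.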

  7. First-order logic theory for manipulating clinical practice guidelines applied to comorbid patients: a case study.

    PubMed

    Michalowski, Martin; Wilk, Szymon; Tan, Xing; Michalowski, Wojtek

    2014-01-01

    Clinical practice guidelines (CPGs) implement evidence-based medicine designed to help generate a therapy for a patient suffering from a single disease. When applied to a comorbid patient, the concurrent combination of treatment steps from multiple CPGs is susceptible to adverse interactions in the resulting combined therapy (i.e., a therapy established according to all considered CPGs). This inability to concurrently apply CPGs has been shown to be one of the key shortcomings of CPG uptake in a clinical setting [1]. Several research efforts are underway to address this issue, such as the K4CARE [2] and GuideLine INteraction Detection Assistant (GLINDA) [3] projects and our previous research on applying constraint logic programming to developing a consistent combined therapy for a comorbid patient [4]. However, there is no generalized framework for mitigation that effectively captures general characteristics of the problem while handling nuances such as the time and ordering requirements imposed by specific CPGs. In this paper we propose a first-order logic (FOL) based approach for developing a generalized framework of mitigation. This approach uses a meta-algorithm and entailment properties to mitigate (i.e., identify and address) adverse interactions introduced by concurrently applied CPGs. We use an illustrative case study of a patient suffering from type 2 diabetes being treated for an onset of severe rheumatoid arthritis to show the expressiveness and robustness of our proposed FOL-based approach, and we discuss its appropriateness as the basis for the generalized theory.

  8. Mapping edge-based traffic measurements onto the internal links in MPLS network

    NASA Astrophysics Data System (ADS)

    Zhao, Guofeng; Tang, Hong; Zhang, Yi

    2004-09-01

    Applying multi-protocol label switching (MPLS) techniques to IP-based backbones for traffic engineering goals has proven advantageous. Obtaining the volume of load on each internal link of the network is crucial for applying traffic engineering. Though measurements can be collected for each link, for example with the traditional SNMP scheme, this approach may cause heavy processing load and sharply degrade the throughput of the core routers. Monitoring merely at the edge of the network and mapping the measurements onto the core therefore provides a good alternative. In this paper, we explore a scheme for traffic mapping with edge-based measurements in an MPLS network, in which the volume of traffic on each internal link over the domain is inferred from measurements available only at ingress nodes. We apply path-based measurements at ingress nodes without enabling measurements in the core of the network. We propose a method that can infer a path from the ingress to the egress node using the label distribution protocol, without collecting routing data from core routers. Based on flow theory and queuing theory, we prove that our approach is effective and present the algorithm for traffic mapping. We also show performance simulation results that indicate the potential of our approach.
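
    The core bookkeeping of the mapping, setting aside route inference and queuing effects, can be sketched as follows: per-LSP volumes measured at the ingress are summed onto every link of each LSP's route. All names and numbers below are hypothetical.

```python
# Hypothetical MPLS domain: per-LSP volumes measured at ingress only,
# mapped onto internal links via the label-switched paths' routes.
paths = {
    "lsp1": ["A-B", "B-C"],          # routes inferred from label bindings
    "lsp2": ["A-B", "B-D"],
    "lsp3": ["B-C"],
}
ingress_volume = {"lsp1": 120, "lsp2": 80, "lsp3": 50}   # Mbit/s at ingress

# Each LSP's ingress volume is added to every link along its route.
link_load = {}
for lsp, route in paths.items():
    for link in route:
        link_load[link] = link_load.get(link, 0) + ingress_volume[lsp]

print(link_load)   # {'A-B': 200, 'B-C': 170, 'B-D': 80}
```

    In matrix terms this is the routing matrix applied to the vector of path volumes; the paper's contribution lies in obtaining the routes and volumes from the edge alone.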

  9. Making Science Come Alive.

    ERIC Educational Resources Information Center

    Whitford, Dennis J.; Eisman, Greg A.

    1997-01-01

    The U.S. Naval Academy oceanography major is bucking the nationwide trend toward declining enrollments in science majors study tracks. Nontraditional approaches used include interdisciplinary and applied science, significant instructor experience in applying the major outside academia, hands-on laboratories in all classes, and an oceanography…

  10. Applying the Ecosystem Services Concept to Public Land Management

    EPA Science Inventory

    We examine the challenges and opportunities involved in applying ecosystem services to public lands management, with an emphasis on the work of the USDA Forest Service. We review the history of economic approaches to landscape management, outline a conceptual framework defining the ...

  11. Advanced imaging techniques in brain tumors

    PubMed Central

    2009-01-01

    Perfusion, permeability and magnetic resonance spectroscopy (MRS) are now widely used in research and clinical settings. In the clinical setting, qualitative, semi-quantitative and quantitative approaches, ranging from review of color-coded maps to region-of-interest analysis and analysis of signal intensity curves, are being applied in practice. There are several pitfalls with all of these approaches. Some of these shortcomings are reviewed, such as the relatively low sensitivity of metabolite ratios from MRS and the effect of leakage on the appearance of color-coded maps from dynamic susceptibility contrast (DSC) magnetic resonance (MR) perfusion imaging, together with the correction and normalization methods that can be applied. Combining and applying these different imaging techniques in a multi-parametric, algorithmic fashion in the clinical setting can be shown to increase diagnostic specificity and confidence. PMID:19965287

  12. A New Approach of Measuring Hospital Performance for Low- and Middle-income Countries

    PubMed Central

    Sapkota, Vishnu Prasad; Supakankunti, Siripen

    2015-01-01

    Efficiency of hospitals affects the price of health services, and health care payments have equity implications. Evidence on hospital performance can support policy design; however, the recent literature on hospital efficiency has produced conflicting results, making policy decisions uncertain. Moreover, most of the evidence was produced using data from high-income countries. Conflicting results arose particularly from differences in methods of measuring performance. Recently, a management approach has been developed to measure hospital performance. From a policy perspective, this approach is very useful for improving health systems in a cost-effective way in low- and middle-income countries. Measuring hospital performance through the management approach has some basic characteristics, such as scoring management practices through a double-blind survey, measuring hospital outputs using various indicators, and estimating the relationship between management practices and hospital outputs. This approach has been successfully applied in developed countries; however, some revisions that do not violate its fundamental principles are required to replicate it in low- and middle-income countries. The process has been clearly defined and applied to Nepal and, as a result, the approach produced the expected results. The paper contributes to improving the approach to measuring hospital performance. PMID:26617448

  13. Biotechnological approaches for improvement and conservation of prunus species

    USDA-ARS?s Scientific Manuscript database

    Biotechnology has contributed to improvement and conservation of Prunus species. Biotechnological approaches involving in vitro tissue culture, genetic transformation, molecular marker development and cryopreservation were applied to various Prunus species. This report provides an overview of biotec...

  14. Positivists, Postmodernists, Aristotelians, and the Challenger Disaster.

    ERIC Educational Resources Information Center

    Walzer, Arthur E.; Gross, Alan

    1994-01-01

    Examines the deliberations prior to the Challenger disaster from the perspective of three major approaches in recent scholarship in rhetoric as applied to technical communications: positivism, postmodernistic social constructionism, and classical Aristotelianism. Champions an approach based on Aristotle's "Rhetoric." (HB)

  15. Agricultural Conservation Planning Toolbox User's Manual

    USDA-ARS?s Scientific Manuscript database

    Agricultural Conservation Planning Framework (ACPF) comprises an approach for applying concepts of precision conservation to watershed planning in agricultural landscapes. To enable application of this approach, USDA/ARS has developed a set of Geographic Information System (GIS) based software tools...

  16. A Multimodal Approach to Counselor Supervision.

    ERIC Educational Resources Information Center

    Ponterotto, Joseph G.; Zander, Toni A.

    1984-01-01

    Represents an initial effort to apply Lazarus's multimodal approach to a model of counselor supervision. Includes continuously monitoring the trainee's behavior, affect, sensations, images, cognitions, interpersonal functioning, and when appropriate, biological functioning (diet and drugs) in the supervisory process. (LLL)

  17. A Bayesian Hierarchical Modeling Approach to Predicting Flow in Ungauged Basins

    EPA Science Inventory

    Recent innovative approaches to identifying and applying regression-based relationships between land use patterns (such as increasing impervious surface area and decreasing vegetative cover) and rainfall-runoff model parameters represent novel and promising improvements to predic...

  18. Experimental College Physics Course Based on Ausubel's Learning Theory.

    ERIC Educational Resources Information Center

    Moreira, Marco Antonio

    1978-01-01

    Compares the Ausubelian approach and the traditional one to the content organization of an introductory course in electromagnetism. States the differences between these approaches in terms of the student's ability to apply, relate, and differentiate electromagnetic concepts. (GA)

  19. Preventing Bullying through Positive Behavioral Interventions and Supports (PBIS): A Multitiered Approach to Prevention and Integration

    ERIC Educational Resources Information Center

    Bradshaw, Catherine P.

    2013-01-01

    Although bullying continues to be a growing public health concern in schools across the United States, there are considerable gaps in the American understanding of effective prevention approaches for addressing this seemingly intractable issue. This article applies a public health approach to addressing bullying through the multitiered Positive…

  20. Modeling Latent Interactions at Level 2 in Multilevel Structural Equation Models: An Evaluation of Mean-Centered and Residual-Centered Unconstrained Approaches

    ERIC Educational Resources Information Center

    Leite, Walter L.; Zuo, Youzhen

    2011-01-01

    Among the many methods currently available for estimating latent variable interactions, the unconstrained approach is attractive to applied researchers because of its relatively easy implementation with any structural equation modeling (SEM) software. Using a Monte Carlo simulation study, we extended and evaluated the unconstrained approach to…

  1. Strategy Revitalization in Academe: A Balanced Scorecard Approach

    ERIC Educational Resources Information Center

    McDevitt, Roselie; Giapponi, Catherine; Solomon, Norman

    2008-01-01

    Purpose: The purpose of this paper is to present a unique version of the balanced scorecard developed and applied by the faculty of a university division. Design/methodology/approach: The paper uses a case study approach and uses the experiences of the faculty of a business school to describe the process and benefits of developing a custom…

  2. Do Learning Approaches of Medical Students Affect Their Satisfaction with Problem-Based Learning?

    ERIC Educational Resources Information Center

    Gurpinar, Erol; Kulac, Esin; Tetik, Cihat; Akdogan, Ilgaz; Mamakli, Sumer

    2013-01-01

    The aim of this research was to determine the satisfaction of medical students with problem-based learning (PBL) and their approaches to learning to investigate the effect of learning approaches on their levels of satisfaction. The study group was composed of medical students from three different universities, which apply PBL at different levels…

  3. Gender Approach at Physical Culture Lessons at the Second Stage of Basic High Education

    ERIC Educational Resources Information Center

    Vorotilkin?, Irina M.; Anokhina, Olga V.; Galitsyn, Sergey V.; Byankina, Larisa V.; Chiligin, Dmitriy V.

    2016-01-01

    Gender approach in education is a specific impact on the development of boys and girls by the set of factors of education and training process. The objective of this research is the reasoning of applying gender approach at physical culture lessons and creating comfortable environment taking into account the psychophysiological differences of the…

  4. Costs of fire suppression forces based on cost-aggregation approach

    Treesearch

    González-Cabán, Armando; Charles W. McKetta; Thomas J. Mills

    1984-01-01

    A cost-aggregation approach has been developed for determining the cost of Fire Management Inputs (FMls)-the direct fireline production units (personnel and equipment) used in initial attack and large-fire suppression activities. All components contributing to an FMI are identified, computed, and summed to estimate hourly costs. This approach can be applied to any FMI...

  5. Coyote Papers: The University of Arizona Working Papers in Linguistics, Volume 11. Special Volume on Native American Languages.

    ERIC Educational Resources Information Center

    Weinberg, Jessica P., Ed.; O'Bryan, Erin L., Ed.; Moll, Laura A., Ed.; Haugan, Jason D., Ed.

    The five papers included in this volume approach the study of American Indian languages from a diverse array of methodological and theoretical approaches to linguistics. Two papers focus on approaches that come from the applied linguistics tradition, emphasizing ethnolinguistics and discourse analysis: Sonya Bird's paper "A Cross Cultural…

  6. Physics Learning with a Computer Algebra System: Towards a Learning Environment That Promotes Enhanced Problem Representations.

    ERIC Educational Resources Information Center

    Savelsbergh, Elwin R.; Ferguson-Hessler, Monica G. M.; de Jong, Ton

    An approach to teaching problem-solving based on using the computer software Mathematica is applied to the study of electrostatics and is compared with the normal approach to the module. Learning outcomes for both approaches were not significantly different. The experimental course successfully addressed a number of misconceptions. Students in the…

  7. 78 FR 12764 - Draft Office of Health Assessment and Translation Approach for Systematic Review and Evidence...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-25

    ... Approach--February 2013 might be needed, OHAT plans to apply it to two case-study evaluations. One case... Availability: Draft OHAT Approach--February 2013 will be available by February 26, 2013, and case-study... framework, describe the contents in the case-study protocols, and respond to questions from the public on...

  8. 12 CFR 3.134 - Guarantees and credit derivatives: PD substitution and LGD adjustment approaches.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Guarantees and credit derivatives: PD... derivatives: PD substitution and LGD adjustment approaches. (a) Scope. (1) This section applies to wholesale... exposure described in paragraph (a)(1) of this section by using the PD substitution approach or the LGD...

  9. Teaching Diversity through Service-Learning: An Integrative Praxis Pedagogical Approach

    ERIC Educational Resources Information Center

    Rice, Julie Steinkopf; Horn, Terri

    2014-01-01

    Service-learning has been shown to be an effective technique for teaching diversity; however, the literature is scant concerning theoretically informed approaches. This study fills that void by drawing upon the work of Freire, Rendón, and others. After describing how an integrative praxis approach is applied in a sociology course, the authors…

  10. Structured-Exercise-Program (SEP): An Effective Training Approach to Key Healthcare Professionals

    ERIC Educational Resources Information Center

    Miazi, Mosharaf H.; Hossain, Taleb; Tiroyakgosi, C.

    2014-01-01

    A structured exercise program is an effective approach to professional training in technology-dependent, resource-limited healthcare settings; the results of a recently conducted data analysis revealed this. The aim of the study is to determine the effectiveness of the applied approach, which was designed to observe the level of adherence to newly adopted…

  11. An Approach to Biased Item Identification Using Latent Trait Measurement Theory.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.

    Because it is a true score model employing item parameters which are independent of the examined sample, item characteristic curve theory (ICC) offers several advantages over classical measurement theory. In this paper an approach to biased item identification using ICC theory is described and applied. The ICC theory approach is attractive in that…

  12. HPLC-MS/MS method for dexmedetomidine quantification with Design of Experiments approach: application to pediatric pharmacokinetic study.

    PubMed

    Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta

    2017-02-01

    The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic approach, Design of Experiments, was applied to optimize ESI source parameters and to evaluate method robustness; as a result, a rapid, stable and cost-effective assay was developed. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml; R2 > 0.98). The accuracies and intra- and interday precisions were less than 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation, as well as for reduced usage of the chemicals necessary for regular method optimization. The proposed technique was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.

  13. Adjusting for publication biases across similar interventions performed well when compared with gold standard data.

    PubMed

    Moreno, Santiago G; Sutton, Alex J; Ades, A E; Cooper, Nicola J; Abrams, Keith R

    2011-11-01

    To extend, apply, and evaluate a regression-based approach to adjusting meta-analysis for publication and related biases. The approach uses related meta-analyses to improve estimation by borrowing strength on the degree of bias. The proposed adjustment approach is described. Adjustments are applied both independently and by borrowing strength across journal-extracted data on the effectiveness of 12 antidepressant drugs from placebo-controlled trials. The methods are also applied to Food and Drug Administration (FDA) data obtained on the same 12 drugs. Results are compared, viewing the FDA observed data as gold standard. Estimates adjusted for publication biases made independently for each drug were very uncertain using both the journal and FDA data. Adjusted estimates were much more precise when borrowing strength across meta-analyses. Reassuringly, adjustments in this way made to the journal data agreed closely with the observed estimates from the FDA data, while the adjusted FDA results changed only minimally from those observed from the FDA data. The method worked well in the case study considered and therefore further evaluation is encouraged. It is suggested that this approach may be especially useful when adjusting several meta-analyses on similar interventions and outcomes, particularly when there are small numbers of studies. Copyright © 2011 Elsevier Inc. All rights reserved.
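    A regression-based adjustment of this kind can be illustrated with a minimal sketch: regress effect sizes on their standard errors and extrapolate to a standard error of zero. All numbers below are made up for illustration, and this unweighted fit is only a simplification; the paper's actual method uses weighted regression and borrows strength across related meta-analyses.

    ```python
    def fit_line(xs, ys):
        """Ordinary least squares for y = a + b*x."""
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx
        a = my - b * mx
        return a, b

    # Hypothetical trial results: (standard error, observed effect size).
    # Small trials (large SE) show inflated effects, the usual signature
    # of publication bias.
    trials = [(0.40, 0.62), (0.30, 0.55), (0.20, 0.47), (0.10, 0.41), (0.05, 0.38)]
    ses = [se for se, _ in trials]
    effects = [eff for _, eff in trials]
    intercept, slope = fit_line(ses, effects)
    # Extrapolating to SE = 0 gives the bias-adjusted pooled effect
    print(round(intercept, 3), round(slope, 3))
    ```

    A positive slope indicates small-study effects; the intercept is the estimate an (hypothetically) infinitely precise trial would report.
    
    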

  14. An innovations approach to decoupling of multibody dynamics and control

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.

    1989-01-01

    The problem of hinged multibody dynamics is solved using an extension of the innovations approach of linear filtering and prediction theory to the problem of mechanical system modeling and control. This approach has been used quite effectively to diagonalize the equations for filtering and prediction for linear state-space systems. It has similar advantages in the study of dynamics and control of multibody systems. The innovations approach advanced here consists of expressing the equations of motion in terms of two closely related processes: (1) the innovations process e, a sequence of moments, obtained from the applied moments T by means of a spatially recursive Kalman filter that goes from the tip of the manipulator to its base; (2) a residual process, a sequence of velocities, obtained from the joint-angle velocities by means of an outward smoothing operation. The innovations e and the applied moments T are related by e = (I - L)T and T = (I + K)e. The operator (I - L) is a causal lower-triangular matrix generated by a spatially recursive Kalman filter and the corresponding discrete-step Riccati equation. Hence, the innovations and the applied moments can be obtained from each other by means of a causal operation which is itself causally invertible.
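    The causal factorization above can be illustrated numerically: for a strictly lower-triangular L, the operator I - L is invertible by forward substitution, and its inverse I + K has K strictly lower triangular as well, so both maps are causal. A minimal sketch with an arbitrary 3x3 example (not the paper's actual manipulator model):

    ```python
    def mat_mul(A, B):
        n = len(A)
        return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    def identity(n):
        return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

    # Strictly lower-triangular L: each output depends only on strictly
    # earlier inputs (a causal operator)
    L = [[0.0, 0.0, 0.0],
         [0.5, 0.0, 0.0],
         [0.2, 0.3, 0.0]]
    n = 3
    I = identity(n)
    ImL = [[I[i][j] - L[i][j] for j in range(n)] for i in range(n)]

    # Invert I - L by forward substitution, column by column; this is
    # possible precisely because I - L is lower triangular (causality)
    inv = [[0.0] * n for _ in range(n)]
    for col in range(n):
        for row in range(n):
            s = sum(ImL[row][k] * inv[k][col] for k in range(row))
            inv[row][col] = (I[row][col] - s) / ImL[row][row]

    # K = (I - L)^{-1} - I is again strictly lower triangular,
    # so T = (I + K)e is causal too
    K = [[inv[i][j] - I[i][j] for j in range(n)] for i in range(n)]
    check = mat_mul(ImL, inv)  # should be the identity
    print(K)
    ```
    
    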

  15. Reevaluating the conceptual framework for applied research on host-plant resistance.

    PubMed

    Stout, Michael J

    2013-06-01

    Applied research on host-plant resistance to arthropod pests has been guided over the past 60 years by a framework originally developed by Reginald Painter in his 1951 book, Insect Resistance in Crop Plants. Painter divided the "phenomena" of resistance into three "mechanisms," nonpreference (later renamed antixenosis), antibiosis, and tolerance. The weaknesses of this framework are discussed. In particular, this trichotomous framework does not encompass all known mechanisms of resistance, and the antixenosis and antibiosis categories are ambiguous and inseparable in practice. These features have perhaps led to a simplistic approach to understanding arthropod resistance in crop plants. A dichotomous scheme is proposed as a replacement, with a major division between resistance (plant traits that limit injury to the plant) and tolerance (plant traits that reduce amount of yield loss per unit injury), and the resistance category subdivided into constitutive/inducible and direct/indirect subcategories. The most important benefits of adopting this dichotomous scheme are to more closely align the basic and applied literatures on plant resistance and to encourage a more mechanistic approach to studying plant resistance in crop plants. A more mechanistic approach will be needed to develop novel approaches for integrating plant resistance into pest management programs. © 2012 Institute of Zoology, Chinese Academy of Sciences.

  16. Simulating ensembles of source water quality using a K-nearest neighbor resampling approach.

    PubMed

    Towler, Erin; Rajagopalan, Balaji; Seidel, Chad; Summers, R Scott

    2009-03-01

    Climatological, geological, and water management factors can cause significant variability in surface water quality. As drinking water quality standards become more stringent, the ability to quantify the variability of source water quality becomes more important for decision-making and planning in water treatment for regulatory compliance. However, the paucity of long-term water quality data makes it challenging to apply traditional simulation techniques. To overcome this limitation, we have developed and applied a robust nonparametric K-nearest neighbor (K-nn) bootstrap approach utilizing the United States Environmental Protection Agency's Information Collection Rule (ICR) data. In this technique, an appropriate "feature vector" is first formed from the best available explanatory variables. The nearest neighbors to the feature vector are identified from the ICR data and are resampled using a weight function. Repeating this process yields water quality ensembles and, consequently, the distribution and quantification of the variability. The main strengths of the approach are its flexibility, simplicity, and the ability to use a large amount of spatial data with limited temporal extent to provide water quality ensembles for any given location. We demonstrate this approach by applying it to simulate monthly ensembles of total organic carbon for two utilities in the U.S. with very different watersheds and to alkalinity and bromide at two other U.S. utilities.
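    The resampling scheme described above can be sketched in a few lines. This is a minimal illustration with a one-dimensional feature vector and made-up records, not the authors' ICR implementation; the 1/rank weight function shown is the kernel commonly used with the K-nn bootstrap.

    ```python
    import random

    def knn_resample(feature, data, k, n_draws, rng):
        """Draw n_draws values by the K-nearest-neighbor bootstrap.

        data: list of (feature_value, observed_value) records.
        The k records nearest to `feature` are found, and their observed
        values are resampled with probability proportional to 1/rank.
        """
        ranked = sorted(data, key=lambda rec: abs(rec[0] - feature))[:k]
        weights = [1.0 / (j + 1) for j in range(k)]  # 1/rank kernel
        values = [rec[1] for rec in ranked]
        return rng.choices(values, weights=weights, k=n_draws)

    rng = random.Random(42)
    # Hypothetical records: (explanatory feature, observed concentration)
    records = [(1.0, 2.1), (1.2, 2.4), (3.5, 5.0), (3.7, 5.3), (0.9, 2.0)]
    ensemble = knn_resample(feature=1.1, data=records, k=3, n_draws=1000, rng=rng)
    print(min(ensemble), max(ensemble))
    ```

    Summarizing the ensemble (quantiles, exceedance frequencies) then quantifies the variability at the chosen location.
    
    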

  17. Mandibular molar uprighting using mini-implants: different approaches for different clinical cases--two case reports.

    PubMed

    Derton, Nicola; Perini, Alessandro; Mutinelli, Sabrina; Gracco, Antonio

    2012-01-01

    To detail two different clinical protocols and case studies using mini-implant anchorage developed to respond to certain clinical conditions. Two clinical protocols are described to upright mesially tilted mandibular molars. In the first protocol, a single mini-implant is inserted distally to the molar to be uprighted, and an elastic traction chain is applied to the tooth. In the second clinical approach, two mini-implants are inserted mesially. A screw-suspended TMA sectional archwire is applied (Derton-Perini technique). Two cases, descriptive of the two different treatment protocols, are described. In the first case, the mandibular right second premolar was missing and the adjacent first molar needed to be uprighted. A single screw was inserted distally to the first molar, and an elastic chain was applied. In the second case, the mandibular left second molar was missing and the third molar needed to be uprighted. Two mini-implants were inserted mesially and a fully screw-supported sectional archwire was used to upright and bodily mesialize the third molar. Both uprighting approaches uprighted the molar axis without loss of anchorage. The two approaches to mandibular molar uprighting, developed as rational responses to different clinical cases, were both found to be effective.

  18. Agreement in functional assessment: graphic approaches to displaying respondent effects.

    PubMed

    Haley, Stephen M; Ni, Pengsheng; Coster, Wendy J; Black-Schaffer, Randie; Siebens, Hilary; Tao, Wei

    2006-09-01

    The objective of this study was to examine the agreement between respondents of summary scores from items representing three functional content areas (physical and mobility, personal care and instrumental, applied cognition) within the Activity Measure for Postacute Care (AM-PAC). We compare proxy vs. patient report in both hospital and community settings as represented by intraclass correlation coefficients and two graphic approaches. The authors conducted a prospective, cohort study of a convenience sample of adults (n = 47) receiving rehabilitation services either in hospital (n = 31) or community (n = 16) settings. In addition to using intraclass correlation coefficients (ICC) as indices of agreement, we applied two graphic approaches to serve as complements to help interpret the direction and magnitude of respondent disagreements. We created a "mountain plot" based on a cumulative distribution curve and a "survival-agreement plot" with step functions used in the analysis of survival data. ICCs on summary scores between patient and proxy report were physical and mobility ICC = 0.92, personal care and instrumental ICC = 0.93, and applied cognition ICC = 0.77. Although combined respondent agreement was acceptable, graphic approaches helped interpret differences in separate analyses of clinician and family agreement. Graphic analyses allow for a simple interpretation of agreement data and may be useful in determining the meaningfulness of the amount and direction of interrespondent variation.
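    The "mountain plot" mentioned above can be sketched as a folded empirical cumulative distribution of the patient-minus-proxy differences: percentiles above 50 are reflected downward, so the curve rises to a peak at the median and falls back. The differences below are hypothetical.

    ```python
    def mountain_plot_points(diffs):
        """Folded empirical CDF of respondent differences: percentile p
        is plotted as p for p <= 50 and as 100 - p above, peaking at
        the median difference."""
        xs = sorted(diffs)
        n = len(xs)
        pts = []
        for i, d in enumerate(xs, start=1):
            p = 100.0 * i / n
            pts.append((d, p if p <= 50 else 100 - p))
        return pts

    # Hypothetical patient-minus-proxy score differences
    diffs = [-3, -1, 0, 0, 1, 2, 2, 4]
    for d, y in mountain_plot_points(diffs):
        print(d, y)
    ```

    A peak centered at zero indicates no systematic respondent bias; a shifted peak shows the direction of disagreement, which is the information an ICC alone does not convey.
    
    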

  19. Importance of joint efforts for balanced process of designing and education

    NASA Astrophysics Data System (ADS)

    Mayorova, V. I.; Bannova, O. K.; Kristiansen, T.-H.; Igritsky, V. A.

    2015-06-01

    This paper discusses the importance of a strategic planning and design process when developing long-term space exploration missions, both robotic and manned. The discussion begins by reviewing current and/or traditional international perspectives on space development at the American, Russian and European space agencies. Some analogies and comparisons are drawn from the analysis of several international student collaborative programs: the Summer International workshops at the Bauman Moscow State Technical University, the International European Summer Space School "Future Space Technologies and Experiments in Space", and the Summer school at Stuttgart University in Germany. The paper focuses on the optimization of design and planning processes for successful space exploration missions and highlights the importance of the following: understanding the connectivity between different levels of human beings and machinery; a simultaneous mission planning approach; reflections and correlations between the disciplines involved in planning and executing space exploration missions; and knowledge gained from different disciplines and through cross-applying and re-applying design approaches between various space-related fields of study and research. The conclusions summarize the benefits and complications of applying a balanced design approach at all levels of the design process. Analysis of successes and failures of organizational efforts in space endeavors is used as a methodological approach to identify the key questions to be researched, as they often cause many planning and design processing problems.

  20. Unique migration of a dental needle into the parapharyngeal space: successful removal by an intraoral approach and simulation for tracking visibility in X-ray fluoroscopy.

    PubMed

    Okumura, Yuri; Hidaka, Hiroshi; Seiji, Kazumasa; Nomura, Kazuhiro; Takata, Yusuke; Suzuki, Takahiro; Katori, Yukio

    2015-02-01

    The first objective was to describe a novel case of migration of a broken dental needle into the parapharyngeal space. The second was to address the importance of simulation elucidating visualization of such a thin needle under X-ray fluoroscopy. Clinical case records (including computed tomography [CT] and surgical approaches) were reviewed, and a simulation experiment using a head phantom was conducted using the same settings applied intraoperatively. A 36-year-old man was referred after failure to locate a broken 31-G dental needle. Computed tomography revealed migration of the needle into the parapharyngeal space. Intraoperative X-ray fluoroscopy failed to identify the needle, so a steel wire was applied as a reference during X-ray to locate the foreign body. The needle was successfully removed using an intraoral approach with tonsillectomy under surgical microscopy. The simulation showed that the dental needle was able to be identified only after applying an appropriate compensating filter, contrasting with the steel wire. Meticulous preoperative simulation regarding visual identification of dental needle foreign bodies is mandatory. Intraoperative radiography and an intraoral approach with tonsillectomy under surgical microscopy offer benefits for accessing the parapharyngeal space, specifically for cases medial to the great vessels. © The Author(s) 2014.

  1. Learning About Dying and Living: An Applied Approach to End-of-Life Communication.

    PubMed

    Pagano, Michael P

    2016-08-01

    The purpose of this article is to expand on prior research in end-of-life communication and death and dying communication apprehension, by developing a unique course that utilizes a hospice setting and an applied, service-learning approach. Therefore, this essay describes and discusses both the students' and my experiences over a 7-year period from 2008 through 2014. The courses taught during this time frame provided an opportunity to analyze students' responses, experiences, and discoveries across semesters/years and cocultures. This unique, 3-credit, 14-week, service-learning, end-of-life communication course was developed to provide an opportunity for students to learn the theories related to this field of study and to apply that knowledge through volunteer experiences via interactions with dying patients and their families. The author's notes from all 7 years, plus the three reflection essays each of the 91 students submitted electronically (273 documents in total) across four courses/years, served as the data for this study. According to the students, verbally in class discussions and in numerous writing assignments, this course helped lower their death and dying communication apprehension and increased their willingness to interact with hospice patients and their families. Furthermore, the students' final research papers clearly demonstrated how utilizing a service-learning approach allowed them to apply classroom learnings and interactions with dying patients and their families at the hospice, to their analyses of end-of-life communication theories and behaviors. The results of these classes suggest that other difficult-topic courses (e.g., domestic violence, addiction, etc.) might benefit from a similar pedagogical approach.

  2. The Case Study Approach: Some Theoretical, Methodological and Applied Considerations

    DTIC Science & Technology

    2013-06-01

    For example, a variety of programs/software are available such as: NUDIST, ATLAS/ti, HyperRESEARCH, AQUAD etc (Kelle 1997; Barry 1998... Nudist compared." Sociological Research Online 3(3). Baruch, Y. & Labert, R. (2007). "Organizational anxiety: Applying psychological concepts into

  3. Toward an Applied Administrative Science.

    ERIC Educational Resources Information Center

    Dunbar, Roger L. M.

    1983-01-01

    A study of 65 articles from the 1981 volumes of "Administrative Science Quarterly" and "Harvard Business Review," using smallest space analysis, found that the few studies adopting subjective (instead of objective) approaches to analyzing organizational change were most likely to provide a basis for an applied administrative…

  4. 78 FR 58154 - Importation of Litchi Fruit From Australia

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-23

    ... treated with irradiation and subject to inspection. If irradiation is applied outside the United States... required irradiation treatment. If irradiation is to be applied upon arrival in the United States, the... systems approach that includes requirements for monitoring and oversight, irradiation treatment of the...

  5. A Novel Approach to Apply Gait Synchronized External Forces on the Pelvis using A-TPAD to Reduce Walking Effort

    PubMed Central

    Vashista, Vineet; Khan, Moiz; Agrawal, Sunil K.

    2017-01-01

    In this paper, we develop an intervention to apply external gait synchronized forces on the pelvis to reduce the user’s effort during walking. A cable-driven robot was used to apply the external forces and an adaptive frequency oscillator scheme was developed to adapt the timing of force actuation to the gait frequency during walking. The external forces were directed in the sagittal plane to assist the trailing leg during the forward propulsion and vertical deceleration of the pelvis during the gait cycle. A pilot experiment with five healthy subjects was conducted. The results showed that the subjects applied lower ground reaction forces in the vertical and anterior-posterior directions during the late stance phase. In summary, the current work provides a novel approach to study the role of external pelvic forces in altering the walking effort. These studies can provide better understanding for designing exoskeletons and prosthetic devices to reduce the overall walking effort. PMID:29623294

  6. Local regression type methods applied to the study of geophysics and high frequency financial data

    NASA Astrophysics Data System (ADS)

    Mariani, M. C.; Basu, K.

    2014-09-01

    In this work we applied locally weighted scatterplot smoothing techniques (Lowess/Loess) to geophysical and high-frequency financial data. We first analyze and apply this technique to the California earthquake geological data. A spatial analysis was performed to show that the estimation of the earthquake magnitude at a fixed location is very accurate, up to a relative error of 0.01%. We also applied the same method to a high-frequency data set arising in the financial sector and obtained similarly satisfactory results. The application of this approach to the two different data sets demonstrates that the overall method is accurate and efficient, and that the Lowess approach is much more desirable than the Loess method. Previous works studied time series analysis; in this paper, our local regression models perform a spatial analysis of the geophysics data, providing different information. For the high-frequency data, our models estimate the curve of best fit where the data are dependent on time.
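    A single Lowess evaluation of the kind applied above can be sketched as a tricube-weighted local linear fit. This is a minimal illustration without the robustness iterations of the full algorithm, and the data are made up (a noiseless line, which the fit should recover exactly).

    ```python
    def tricube(u):
        u = abs(u)
        return (1 - u ** 3) ** 3 if u < 1 else 0.0

    def lowess_at(x0, xs, ys, frac=0.5):
        """Locally weighted linear fit evaluated at x0 (one Lowess step)."""
        n = len(xs)
        k = max(2, int(frac * n))
        # bandwidth = distance to the k-th nearest neighbor of x0
        dists = sorted(abs(x - x0) for x in xs)
        h = dists[k - 1] or 1e-12
        w = [tricube((x - x0) / h) for x in xs]
        # weighted least squares for y = a + b*x
        sw = sum(w)
        swx = sum(wi * x for wi, x in zip(w, xs))
        swy = sum(wi * y for wi, y in zip(w, ys))
        swxx = sum(wi * x * x for wi, x in zip(w, xs))
        swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        denom = sw * swxx - swx ** 2
        b = (sw * swxy - swx * swy) / denom
        a = (swy - b * swx) / sw
        return a + b * x0

    xs = [float(i) for i in range(10)]
    ys = [2.0 * x + 1.0 for x in xs]  # noiseless line y = 2x + 1
    print(lowess_at(5.0, xs, ys))     # ~ 11.0
    ```

    Sweeping x0 over a grid of locations produces the smoothed surface; for spatial data the same scheme applies with a two-dimensional distance.
    
    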

  7. [A juvenile nasopharyngeal angiofibroma: our 10-year experience in a tertiary centre].

    PubMed

    Şahin, Bayram; Çomoğlu, Şenol; Sönmez, Said; Polat, Beldan; Değer, Kemal

    2016-01-01

    This study aims to evaluate the demographic characteristics, tumor stage, surgical treatment and recurrence rate among patients operated on for a juvenile nasopharyngeal angiofibroma. This retrospective study included 45 patients (44 males, 1 female; mean age 21 years, range 9 to 55 years) who underwent surgery at Istanbul University, Istanbul Medical Faculty, Department of Otorhinolaryngology clinic between March 2006 and July 2015. The patients were classified according to age, sex, presenting symptom, tumor stage, surgical procedure applied, preoperative embolization, perioperative blood transfusion, complications, and the presence of recurrence. The most common presenting symptoms were epistaxis (78%) and nasal obstruction (73%). Preoperative angiography was performed on all patients and embolization was applied in eligible patients (69%). A transnasal endoscopic approach was applied in 31 patients, midfacial degloving in six patients, and a lateral rhinotomy approach in three patients. The overall recurrence rate was 31% (n=14). The most important factor in determining the risk of postoperative recurrence is the preoperative tumor stage. Preoperative embolization reduces the amount of perioperative bleeding. The endoscopic transnasal approach decreases the rate of complications and the length of hospitalization.

  8. Single-molecule Force Spectroscopy Approach to Enzyme Catalysis*

    PubMed Central

    Alegre-Cebollada, Jorge; Perez-Jimenez, Raul; Kosuri, Pallav; Fernandez, Julio M.

    2010-01-01

    Enzyme catalysis has been traditionally studied using a diverse set of techniques such as bulk biochemistry, x-ray crystallography, and NMR. Recently, single-molecule force spectroscopy by atomic force microscopy has been used as a new tool to study the catalytic properties of an enzyme. In this approach, a mechanical force ranging up to hundreds of piconewtons is applied to the substrate of an enzymatic reaction, altering the conformational energy of the substrate-enzyme interactions during catalysis. From these measurements, the force dependence of an enzymatic reaction can be determined. The force dependence provides valuable new information about the dynamics of enzyme catalysis with sub-angstrom resolution, a feat unmatched by any other current technique. To date, single-molecule force spectroscopy has been applied to gain insight into the reduction of disulfide bonds by different enzymes of the thioredoxin family. This minireview aims to present a perspective on this new approach to study enzyme catalysis and to summarize the results that have already been obtained from it. Finally, the specific requirements that must be fulfilled to apply this new methodology to any other enzyme will be discussed. PMID:20382731

  9. Single-molecule force spectroscopy approach to enzyme catalysis.

    PubMed

    Alegre-Cebollada, Jorge; Perez-Jimenez, Raul; Kosuri, Pallav; Fernandez, Julio M

    2010-06-18

    Enzyme catalysis has been traditionally studied using a diverse set of techniques such as bulk biochemistry, x-ray crystallography, and NMR. Recently, single-molecule force spectroscopy by atomic force microscopy has been used as a new tool to study the catalytic properties of an enzyme. In this approach, a mechanical force ranging up to hundreds of piconewtons is applied to the substrate of an enzymatic reaction, altering the conformational energy of the substrate-enzyme interactions during catalysis. From these measurements, the force dependence of an enzymatic reaction can be determined. The force dependence provides valuable new information about the dynamics of enzyme catalysis with sub-angstrom resolution, a feat unmatched by any other current technique. To date, single-molecule force spectroscopy has been applied to gain insight into the reduction of disulfide bonds by different enzymes of the thioredoxin family. This minireview aims to present a perspective on this new approach to study enzyme catalysis and to summarize the results that have already been obtained from it. Finally, the specific requirements that must be fulfilled to apply this new methodology to any other enzyme will be discussed.

  10. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Credille, Jennifer; Owens, Elizabeth

    This capstone offers the introduction of Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally Lean has been associated with process improvements as applied to an industrial atmosphere. However, this paper will demonstrate that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System. This innovative concept was designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals most Lean techniques within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates: system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, which ensures the process is thoroughly dissected and can be achieved for any process in any work environment.

  11. Recent advancements in GRACE mascon regularization and uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Loomis, B. D.; Luthcke, S. B.

    2017-12-01

    The latest release of the NASA Goddard Space Flight Center (GSFC) global time-variable gravity mascon product applies a new regularization strategy along with new methods for estimating noise and leakage uncertainties. The critical design component of mascon estimation is the construction of the applied regularization matrices, and different strategies exist between the different centers that produce mascon solutions. The new approach from GSFC directly applies the pre-fit Level 1B inter-satellite range-acceleration residuals in the design of time-dependent regularization matrices, which are recomputed at each step of our iterative solution method. We summarize this new approach, demonstrating the simultaneous increase in recovered time-variable gravity signal and reduction in the post-fit inter-satellite residual magnitudes, until solution convergence occurs. We also present our new approach for estimating mascon noise uncertainties, which are calibrated to the post-fit inter-satellite residuals. Lastly, we present a new technique for end users to quickly estimate the signal leakage errors for any selected grouping of mascons, and we test the viability of this leakage assessment procedure on the mascon solutions produced by other processing centers.

  12. Multidomain approach for calculating compressible flows

    NASA Technical Reports Server (NTRS)

    Cambier, L.; Chazzi, W.; Veuillot, J. P.; Viviand, H.

    1982-01-01

    A multidomain approach for calculating compressible flows by using unsteady or pseudo-unsteady methods is presented. This approach is based on a general technique of connecting together two domains in which hyperbolic systems (that may differ) are solved with the aid of compatibility relations associated with these systems. Some examples of this approach's application to calculating transonic flows in ideal fluids are shown, particularly the adjustment of shock waves. The approach is then applied to treating a shock/boundary layer interaction problem in a transonic channel.

  13. Validity test of the IPD-Work consortium approach for creating comparable job strain groups between Job Content Questionnaire and Demand-Control Questionnaire.

    PubMed

    Choi, Bongkyoo; Ko, Sangbaek; Ostergren, Per-Olof

    2015-01-01

    This study aims to test the validity of the IPD-Work Consortium approach for creating comparable job strain groups between the Job Content Questionnaire (JCQ) and the Demand-Control Questionnaire (DCQ). A random population sample (N = 682) of all middle-aged Malmö males and females was given a questionnaire with the 14-item JCQ and the 11-item DCQ for job control and job demands. The JCQ job control and job demands scores were calculated in three different ways: using the 14-item JCQ standard scale formulas (method 1); dropping 3 job control items and using the 11-item JCQ standard scale formulas with additional scale weights (method 2); and the approach of the IPD Group (method 3), dropping 3 job control items but using simple 11-item summation-based scale formulas. High job strain was defined as a combination of high demands and low control. Between the two questionnaires, false negatives for high job strain were much more common than false positives (37-49% vs. 7-13%). When method 3 was applied, the sensitivity of the JCQ for high job strain against the DCQ was lowest (0.51 vs. 0.60-0.63 for methods 1 and 2), although the specificity was highest (0.93 vs. 0.87-0.89 for methods 1 and 2). The prevalence of high job strain with the JCQ under method 3 was considerably lower (4-7%) than with the JCQ under methods 1 and 2 or with the DCQ. The number of congruent cases of high job strain between the two questionnaires was smallest when method 3 was applied. Compared to the standard JCQ methods, the IPD-Work Consortium approach showed two major weaknesses for epidemiological studies on high job strain and health outcomes: greater misclassification of high job strain and lower prevalence of high job strain. This work is available in Open Access and licensed under a CC BY-NC 3.0 PL license.
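
    The sensitivity, specificity, and prevalence comparisons above reduce to simple ratios over a 2×2 agreement table; a minimal sketch (the counts in the test are illustrative, not the study's data):

```python
def diagnostic_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity, and prevalence of a binary classification
    (e.g. 'high job strain' by one questionnaire against another)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    prevalence = (tp + fp) / (tp + fp + fn + tn)
    return sensitivity, specificity, prevalence
```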

  14. A Comprehensive Planning Model

    ERIC Educational Resources Information Center

    Temkin, Sanford

    1972-01-01

    Combines elements of the problem solving approach inherent in methods of applied economics and operations research and the structural-functional analysis common in social science modeling to develop an approach for economic planning and resource allocation for schools and other public sector organizations. (Author)

  15. How Children Choose among Serial Recall Strategies.

    ERIC Educational Resources Information Center

    McGilly, Kate; Siegler, Robert S.

    1989-01-01

    Investigated the serial recall strategies of 96 children aged 5-8 years by applying a theoretical and methodological approach originally developed to investigate preschoolers' arithmetic strategies. Results indicated the use of multiple approaches for serial recall and adaptive strategy choices. (RJC)

  16. 75 FR 49491 - Telecommunications Relay Services and Speech-to-Speech Services for Individuals With Hearing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-13

    ... costs or any other approach that deviates from the incentive-based (or projected-`cost') approach... providing VRS-- should, in theory, apply equally to reliance on projected cost data in VRS rate setting...

  17. Presidential Address: Empowerment Evaluation.

    ERIC Educational Resources Information Center

    Fetterman, David

    1994-01-01

    Empowerment evaluation is the use of evaluation concepts and techniques to foster self-determination, focusing on helping people help themselves. This collaborative evaluation approach requires both qualitative and quantitative methodologies. It is a multifaceted approach that can be applied to evaluation in any area. (SLD)

  18. Possibilities of inversion of satellite third-order gravitational tensor onto gravity anomalies: a case study for central Europe

    NASA Astrophysics Data System (ADS)

    Pitoňák, Martin; Šprlák, Michal; Tenzer, Robert

    2017-05-01

    We investigate the numerical performance of four different schemes applied to a regional recovery of gravity anomalies from the third-order gravitational tensor components (assumed to be observable in the future) synthesized at the satellite altitude of 200 km above the mean sphere. The first approach is based on applying a regional inversion without modelling the far-zone contribution or long-wavelength support. In the second approach we separate the integral formulas into two parts, that is, the effects of the third-order disturbing tensor data within the near and far zones. Whereas the far-zone contribution is evaluated by using an existing global geopotential model (GGM) with spectral weights given by truncation error coefficients, the near-zone contribution is solved by applying a regional inversion. We then extend this approach with a smoothing procedure, in which we remove the gravitational contributions of the topographic-isostatic and atmospheric masses. Finally, we apply the remove-compute-restore (r-c-r) scheme in order to reduce the far-zone contribution by subtracting the reference (long-wavelength) gravity field, which is computed up to maximum degree 80. We apply these four numerical schemes to a regional recovery of gravity anomalies from individual components of the third-order gravitational tensor as well as from their combinations, while applying two different levels of white noise. We validated our results with respect to gravity anomalies evaluated at the mean sphere from EGM2008 up to degree 250. Not surprisingly, a better fit in terms of standard deviation (STD) was attained using the lower noise level.
The worst results were obtained with the classical approach: the STD values of our solution from Tzzz are 1.705 mGal (noise with a standard deviation of 0.01 × 10⁻¹⁵ m⁻¹s⁻²) and 2.005 mGal (noise with a standard deviation of 0.05 × 10⁻¹⁵ m⁻¹s⁻²). The best results were obtained with the r-c-r scheme up to degree 80: the STD fit of gravity anomalies from Tzzz with respect to the same counterpart from EGM2008 is 0.510 mGal (noise with a standard deviation of 0.01 × 10⁻¹⁵ m⁻¹s⁻²) and 1.190 mGal (noise with a standard deviation of 0.05 × 10⁻¹⁵ m⁻¹s⁻²).
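
    The remove-compute-restore scheme is independent of the specific inversion used for the residual signal; a minimal sketch, with `invert` standing in for any regional inversion operator (all names here are illustrative):

```python
import numpy as np

def remove_compute_restore(obs, reference_obs, reference_model, invert):
    """Remove-compute-restore: subtract the reference field's predicted
    observations, invert only the residual signal, then restore the
    reference model values at the output points."""
    residual_obs = obs - reference_obs        # remove (long-wavelength part)
    residual_model = invert(residual_obs)     # compute (regional inversion)
    return residual_model + reference_model   # restore
```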

  19. Valuing national effects of digital health investments: an applied method.

    PubMed

    Hagens, Simon; Zelmer, Jennifer; Frazer, Cassandra; Gheorghiu, Bobby; Leaver, Chad

    2015-01-01

    This paper describes an approach which has been applied to value national outcomes of investments in digital health by federal, provincial and territorial governments, clinicians and healthcare organizations. Hypotheses are used to develop a model, which is revised and populated based upon the available evidence. Quantitative national estimates and qualitative findings are produced and validated through structured peer-review processes. This methodology has been applied in four studies since 2008.

  20. Developing a plan for primary health care facilities in Soweto, South Africa. Part II: Applying locational criteria.

    PubMed

    Doherty, J; Rispel, L; Webb, N

    1996-12-01

    This article is the second of a two-part series describing the development of a ten-year plan for primary health care facility development in Soweto. The first article concentrated on the political problems and general methodological approach of the project. This second article describes how the technical problem of planning in the context of scanty information was overcome. The reasoning behind the various assumptions and criteria which were used to assist the planning of the location of facilities is explained, as well as the process by which they were applied. The merits and limitations of this planning approach are discussed, and it is suggested that the approach may be useful to other facility planners, particularly in the developing world.

  1. An operational approach to high resolution agro-ecological zoning in West-Africa.

    PubMed

    Le Page, Y; Vasconcelos, Maria; Palminha, A; Melo, I Q; Pereira, J M C

    2017-01-01

    The objective of this work is to develop a simple methodology for high resolution crop suitability analysis under current and future climate, easily applicable and useful in Least Developed Countries. The approach addresses both regional planning in the context of climate change projections and pre-emptive short-term rural extension interventions based on same-year agricultural season forecasts, while implemented with off-the-shelf resources. The developed tools are applied operationally in a case-study developed in three regions of Guinea-Bissau and the obtained results, as well as the advantages and limitations of methods applied, are discussed. In this paper we show how a simple approach can easily generate information on climate vulnerability and how it can be operationally used in rural extension services.

  2. An effective model for ergonomic optimization applied to a new automotive assembly line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duraccio, Vincenzo; Elia, Valerio; Forcina, Antonio

    2016-06-08

    An efficient ergonomic optimization can lead to a significant improvement in production performance and a considerable reduction of costs. In the present paper a new model for ergonomic optimization is proposed. The new approach is based on the criteria defined by the National Institute for Occupational Safety and Health, adapted to Italian legislation. The proposed model provides an ergonomic optimization by analyzing the ergonomic relations of manual work under correct conditions. The model includes a schematic and systematic analysis method for the operations, and identifies all possible ergonomic aspects to be evaluated. The proposed approach has been applied to an automotive assembly line, where the repeatability of operations makes optimization fundamental. The application clearly demonstrates the effectiveness of the new approach.

  3. Bayesian evidence computation for model selection in non-linear geoacoustic inference problems.

    PubMed

    Dettmer, Jan; Dosso, Stan E; Osler, John C

    2010-12-01

    This paper applies a general Bayesian inference approach, based on Bayesian evidence computation, to geoacoustic inversion of interface-wave dispersion data. Quantitative model selection is carried out by computing the evidence (normalizing constants) for several model parameterizations using annealed importance sampling. The resulting posterior probability density estimate is compared to estimates obtained from Metropolis-Hastings sampling to ensure consistent results. The approach is applied to invert interface-wave dispersion data collected on the Scotian Shelf, off the east coast of Canada for the sediment shear-wave velocity profile. Results are consistent with previous work on these data but extend the analysis to a rigorous approach including model selection and uncertainty analysis. The results are also consistent with core samples and seismic reflection measurements carried out in the area.
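
    Bayesian evidence (the normalizing constant) can be illustrated with plain prior-based importance sampling, a crude simplification of the paper's annealed importance sampling (the conjugate Gaussian example in the test is an assumption chosen so the true evidence is known in closed form):

```python
import numpy as np

def evidence_importance(log_like, prior_sample, n=20000, seed=0):
    """Monte-Carlo estimate of log Z, Z = integral of L(theta) p(theta),
    by sampling theta from the prior. Uses the log-sum-exp trick for
    numerical stability. A crude cousin of annealed importance sampling."""
    rng = np.random.default_rng(seed)
    thetas = prior_sample(rng, n)
    ll = np.array([log_like(t) for t in thetas])
    m = ll.max()
    return m + np.log(np.exp(ll - m).mean())
```

    Comparing such evidence estimates across parameterizations is what drives the quantitative model selection described above.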

  4. Deconvolution When Classifying Noisy Data Involving Transformations.

    PubMed

    Carroll, Raymond; Delaigle, Aurore; Hall, Peter

    2012-09-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.

  5. Segmentation of bone pixels from EROI Image using clustering method for bone age assessment

    NASA Astrophysics Data System (ADS)

    Bakthula, Rajitha; Agarwal, Suneeta

    2016-03-01

    The bone age of a human can be identified from the ossification of the carpal and epiphyseal bones, which is limited to the teen years. Accurate age estimation depends on the best separation of bone pixels from soft-tissue pixels in the ROI image. Traditional approaches like Canny, Sobel, clustering, region growing and watershed can be applied, but these methods require proper pre-processing and accurate initial seed-point estimation to provide accurate results. Therefore this paper proposes a new approach to segment bone from soft-tissue and background pixels. First, pixels are enhanced using BPE and the edges are identified by HIPI. Later, K-Means clustering is applied for segmentation. The performance of the proposed approach has been evaluated and compared with existing methods.
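
    Intensity-based K-means segmentation, as in the final step above, can be sketched in one dimension on pixel values (deterministic quantile initialization; an illustrative simplification, not the authors' pipeline):

```python
import numpy as np

def kmeans_1d(values, k=3, n_iter=20):
    """Plain K-means on pixel intensities, the idea behind separating
    background, soft tissue and bone by brightness."""
    # Quantile initialization keeps the sketch deterministic
    centers = np.quantile(values, np.linspace(0.0, 1.0, k))
    labels = np.zeros(len(values), dtype=int)
    for _ in range(n_iter):
        # Assign each pixel to its nearest center, then update centers
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers
```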

  6. Development of a neural network technique for KSTAR Thomson scattering diagnostics.

    PubMed

    Lee, Seung Hun; Lee, J H; Yamada, I; Park, Jae Sun

    2016-11-01

    Neural networks provide powerful approaches for dealing with nonlinear data and have been successfully applied to fusion plasma diagnostics and control systems. Controlling tokamak plasmas in real time makes it essential to measure the plasma parameters in situ. However, the χ² method traditionally used in Thomson scattering diagnostics hampers real-time measurement due to the complexity of the calculations involved. In this study, we applied a neural network approach to Thomson scattering diagnostics in order to calculate the electron temperature, comparing the results to those obtained with the χ² method. The best results were obtained for 10³ training cycles and eight nodes in the hidden layer. Our neural network approach shows good agreement with the χ² method and performs the calculation twenty times faster.
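
    The architecture described (one hidden layer of eight nodes) can be sketched as a toy regression in pure NumPy; the sine-fitting target below is an assumption standing in for the actual mapping from scattering signals to electron temperature:

```python
import numpy as np

def train_tiny_net(X, y, hidden=8, lr=0.05, steps=3000, seed=1):
    """One-hidden-layer tanh network trained by plain gradient descent.
    Returns (initial_mse, final_mse). Once trained, a single forward
    pass replaces an iterative fit, which is the speed advantage."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 1, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    mse0 = None
    for step in range(steps):
        h = np.tanh(X @ W1 + b1)          # hidden layer (8 nodes, as in the paper)
        pred = h @ W2 + b2                # linear output
        err = pred - y
        if step == 0:
            mse0 = float((err ** 2).mean())
        # Backpropagation
        gW2 = h.T @ err / len(X); gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return mse0, float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
```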

  7. Applying a Student Curriculum Discourse in Higher Education Teaching and Learning

    ERIC Educational Resources Information Center

    Mndzebele, S. L.; Mckenna, S.

    2013-01-01

    Indications of poor quality in students' written work necessitated the need for deeper investigations aimed at designing and applying appropriate teaching/learning and assessment innovations in the course curriculum. The project-exercise engaged a conceptual-explorative approach through: reviews/investigations; educational diagnosis;…

  8. Defining and Applying a Functionality Approach to Intellectual Disability

    ERIC Educational Resources Information Center

    Luckasson, R.; Schalock, R. L.

    2013-01-01

    Background: The current functional models of disability do not adequately incorporate significant changes of the last three decades in our understanding of human functioning, and how the human functioning construct can be applied to clinical functions, professional practices and outcomes evaluation. Methods: The authors synthesise current…

  9. Semiotic Work: Applied Linguistics and a Social Semiotic Account of Multimodality

    ERIC Educational Resources Information Center

    Kress, Gunther

    2015-01-01

    This article imagines a tussle between Multimodality, focused on "modes," and Applied Linguistics (AL), based on "language." A Social Semiotic approach to MM treats "speech" and "writing" as modes with distinct affordances, and, as all modes, treats them as "partial" means of communication. The…

  10. Integrating Opportunities: Applied Interdisciplinary Research in Undergraduate Geography and Geology Education

    ERIC Educational Resources Information Center

    Viertel, David C.; Burns, Diane M.

    2012-01-01

    Unique integrative learning approaches represent a fundamental opportunity for undergraduate students and faculty alike to combine interdisciplinary methods with applied spatial research. Geography and geoscience-related disciplines are particularly well-suited to adapt multiple methods within a holistic and reflective mentored research paradigm.…

  11. Populations, Natural Selection, and Applied Organizational Science.

    ERIC Educational Resources Information Center

    McKelvey, Bill; Aldrich, Howard

    1983-01-01

    Deficiencies in existing models in organizational science may be remedied by applying the population approach, with its concepts of taxonomy, classification, evolution, and population ecology; and natural selection theory, with its principles of variation, natural selection, heredity, and struggle for existence, to the idea of organizational forms…

  12. Applied Computational Chemistry for the Blind and Visually Impaired

    ERIC Educational Resources Information Center

    Wedler, Henry B.; Cohen, Sarah R.; Davis, Rebecca L.; Harrison, Jason G.; Siebert, Matthew R.; Willenbring, Dan; Hamann, Christian S.; Shaw, Jared T.; Tantillo, Dean J.

    2012-01-01

    We describe accommodations that we have made to our applied computational-theoretical chemistry laboratory to provide access for blind and visually impaired students interested in independent investigation of structure-function relationships. Our approach utilizes tactile drawings, molecular model kits, existing software, Bash and Perl scripts…

  13. Joyful Learning in Kindergarten. Revised Edition.

    ERIC Educational Resources Information Center

    Fisher, Bobbi

    Applying the conditions of natural learning to create caring kindergarten classroom environments may support students as lifelong learners. This book presents a natural learning classroom model for implementing a whole-language approach in kindergarten. The chapters are as follows: (1) "My Beliefs about How Children Learn"; (2) "Applying Whole…

  14. Reproducibility of objectively measured physical activity and sedentary time over two seasons in children; Comparing a day-by-day and a week-by-week approach

    PubMed Central

    Andersen, Lars Bo; Skrede, Turid; Ekelund, Ulf; Anderssen, Sigmund Alfred; Resaland, Geir Kåre

    2017-01-01

    Introduction Knowledge of the reproducibility of accelerometer-determined physical activity (PA) and sedentary time (SED) estimates is a prerequisite for conducting high-quality epidemiological studies. Yet, estimates of reproducibility might differ depending on the approach used to analyze the data. The aim of the present study was to determine the reproducibility of objectively measured PA and SED in children by directly comparing a day-by-day and a week-by-week approach to data collected over two weeks during two different seasons 3–4 months apart. Methods 676 11-year-old children from the Active Smarter Kids study conducted in Sogn og Fjordane county, Norway, performed 7 days of accelerometer monitoring (ActiGraph GT3X+) during January-February and April-May 2015. Reproducibility was calculated using a day-by-day and a week-by-week approach, applying mixed-effects modelling and the Spearman Brown prophecy formula, and reported using intra-class correlation (ICC), Bland-Altman plots and 95% limits of agreement (LoA). Results Applying a week-by-week approach, no variables provided ICC estimates ≥ 0.70 for one week of measurement in any model (ICC = 0.29–0.66 not controlling for season; ICC = 0.49–0.67 when controlling for season). LoA for these models approximated a factor of 1.3–1.7 of the sample PA level standard deviations. Compared to the week-by-week approach, the day-by-day approach resulted in overly optimistic reliability estimates (ICC = 0.62–0.77 not controlling for season; ICC = 0.64–0.77 when controlling for season). Conclusions Reliability is lower when analyzed over different seasons and when using a week-by-week approach than when applying a day-by-day approach and the Spearman Brown prophecy formula to estimate reliability over a short monitoring period. We suggest that the day-by-day approach and the Spearman Brown prophecy formula be used with caution when determining reliability.
Trial Registration The study is registered in Clinicaltrials.gov 7th April 2014 with identification number NCT02132494. PMID:29216318
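
    The Spearman Brown prophecy formula used in this design predicts the reliability of k repeated measurements from the single-measurement reliability, r_k = k·r₁ / (1 + (k − 1)·r₁); a minimal sketch (the `days_needed` helper and the example values are illustrative, not the study's numbers):

```python
def spearman_brown(r_single, k):
    """Predicted reliability when a measurement is repeated k times,
    given single-measurement reliability r_single."""
    return k * r_single / (1 + (k - 1) * r_single)

def days_needed(r_single, target=0.70):
    """Smallest number of monitoring days whose averaged estimate
    reaches the target reliability (hypothetical helper)."""
    k = 1
    while spearman_brown(r_single, k) < target:
        k += 1
    return k
```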

  15. The friction cost method: a comment.

    PubMed

    Johannesson, M; Karlsson, G

    1997-04-01

    The friction cost method has been proposed as an alternative to the human-capital approach for estimating indirect costs. We argue that the friction cost method is based on implausible assumptions not supported by neoclassical economic theory. Furthermore, consistently applying the friction cost method would mean that it should also be applied in the estimation of direct costs, which would substantially decrease the estimated costs of health care programmes. It is concluded that the friction cost method does not seem to be a useful alternative to the human-capital approach for the estimation of indirect costs.

  16. Evaluating the performance of distributed approaches for modal identification

    NASA Astrophysics Data System (ADS)

    Krishnan, Sriram S.; Sun, Zhuoxiong; Irfanoglu, Ayhan; Dyke, Shirley J.; Yan, Guirong

    2011-04-01

    In this paper, two modal identification approaches appropriate for use in a distributed computing environment are applied to a full-scale, complex structure: the natural excitation technique (NExT) used in conjunction with a condensed eigensystem realization algorithm (ERA), and frequency domain decomposition with peak-picking (FDD-PP). Both are applied to sensor data acquired from a 57.5-ft, 10-bay highway sign truss structure. Monte Carlo simulations are performed on a numerical example to investigate the statistical properties and noise sensitivity of the two distributed algorithms. Experimental results are provided and discussed.
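
    Peak-picking on a response spectrum, the core idea of FDD-PP, can be sketched in a few lines (a single-channel power-spectrum version; the full method takes the SVD of the cross-spectral matrix across sensors):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Peak-picking on the power spectrum: a bare-bones cousin of FDD,
    returning the most energetic frequency of a response record."""
    spec = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return freqs[spec.argmax()]
```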

  17. Empirically Examining the Performance of Approaches to Multi-Level Matching to Study the Effect of School-Level Interventions

    ERIC Educational Resources Information Center

    Hallberg, Kelly; Cook, Thomas D.; Figlio, David

    2013-01-01

    The goal of this paper is to provide guidance for applied education researchers in using multi-level data to study the effects of interventions implemented at the school level. Two primary approaches are currently employed in observational studies of the effect of school-level interventions. One approach employs intact school matching: matching…

  18. Application of the algebraic difference approach for developing self-referencing specific gravity and biomass equations

    Treesearch

    Lewis Jordan; Ray Souter; Bernard Parresol; Richard F. Daniels

    2006-01-01

    Biomass estimation is critical for looking at ecosystem processes and as a measure of stand yield. The density-integral approach allows for coincident estimation of stem profile and biomass. The algebraic difference approach (ADA) permits the derivation of dynamic or nonstatic functions. In this study we applied the ADA to develop a self-referencing specific gravity...

  19. A Project-Based Digital Storytelling Approach for Improving Students' Learning Motivation, Problem-Solving Competence and Learning Achievement

    ERIC Educational Resources Information Center

    Hung, Chun-Ming; Hwang, Gwo-Jen; Huang, Iwen

    2012-01-01

    Although project-based learning is a well-known and widely used instructional strategy, it remains a challenging issue to effectively apply this approach to practical settings for improving the learning performance of students. In this study, a project-based digital storytelling approach is proposed to cope with this problem. With a…

  20. A generalized least-squares framework for rare-variant analysis in family data.

    PubMed

    Li, Dalin; Rotter, Jerome I; Guo, Xiuqing

    2014-01-01

    Rare variants may, in part, explain some of the heritability missing in current genome-wide association studies. Many gene-based rare-variant analysis approaches proposed in recent years are aimed at population-based samples, although analysis strategies for family-based samples are clearly warranted, since the family-based design has the potential to enhance our ability to enrich for rare causal variants. We have recently developed the generalized least squares, sequence kernel association test, or GLS-SKAT, approach for rare-variant analyses in family samples, in which the kinship matrix computed from the high-dimensional genetic data is used to decorrelate the family structure. We then applied the SKAT-O approach for gene-/region-based inference in the decorrelated data. In this study, we applied this GLS-SKAT method to the systolic blood pressure data in the simulated family sample distributed by the Genetic Analysis Workshop 18. We compared the GLS-SKAT approach to the rare-variant analysis approach implemented in family-based association test-v1 and demonstrated that the GLS-SKAT approach provides superior power and good control of the type I error rate.
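
    The decorrelation step can be sketched as whitening with the Cholesky factor of a kinship-based covariance. The variance model Sigma = h²·2K + (1 − h²)·I and the h² value below are illustrative assumptions, not the paper's exact specification:

```python
import numpy as np

def gls_decorrelate(y, X, kinship, h2=0.5):
    """Whiten phenotype y and covariates X for family structure:
    Sigma = h2 * 2*kinship + (1 - h2) * I, then apply the inverse of
    its Cholesky factor so tests assuming independence can be used."""
    n = len(y)
    sigma = h2 * 2 * kinship + (1 - h2) * np.eye(n)
    L = np.linalg.cholesky(sigma)
    y_star = np.linalg.solve(L, y)
    X_star = np.linalg.solve(L, X)
    return y_star, X_star
```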

  1. An Alternative Approach to the Operation of Multinational Reservoir Systems: Application to the Amistad & Falcon System (Lower Rio Grande/Rí-o Bravo)

    NASA Astrophysics Data System (ADS)

    Serrat-Capdevila, A.; Valdes, J. B.

    2005-12-01

    An optimization approach for the operation of international multi-reservoir systems is presented. The approach uses Stochastic Dynamic Programming (SDP) algorithms, both steady-state and real-time, to develop two models. In the first model, the reservoirs and flows of the system are aggregated to yield an equivalent reservoir, and the obtained operating policies are disaggregated using a non-linear optimization procedure for each reservoir and for each nation's water balance. In the second model a multi-reservoir approach is applied, disaggregating the releases for each country's water share in each reservoir. The non-linear disaggregation algorithm uses SDP-derived operating policies as boundary conditions for a local time-step optimization. Finally, the performance of the different approaches and methods is compared. These models are applied to the Amistad-Falcon International Reservoir System as part of a binational dynamic modeling effort to develop a decision support system tool for better management of the water resources in the Lower Rio Grande Basin, which is currently enduring a severe drought.
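
    The backward SDP recursion for a single (aggregated) reservoir can be sketched as follows. The grid sizes, two-point inflow distribution, and square-root release benefit are toy assumptions, not the paper's model:

```python
import numpy as np

# Toy SDP for one reservoir: storage and release on a small integer grid,
# two equally likely inflows, benefit = sqrt(release) (diminishing returns).
S = np.arange(0, 5)            # storage levels
R = np.arange(0, 5)            # candidate releases
inflows, probs = np.array([1, 2]), np.array([0.5, 0.5])
T, cap = 12, 4                 # horizon and storage capacity

V = np.zeros(len(S))           # terminal value function
policy = np.zeros((T, len(S)), dtype=int)
for t in reversed(range(T)):
    V_new = np.full(len(S), -np.inf)
    for i, s in enumerate(S):
        for r in R:
            if r > s:
                continue       # cannot release more than stored
            # expected immediate benefit + value of resulting storage
            val = 0.0
            for q, p in zip(inflows, probs):
                s_next = min(s - r + q, cap)   # spill above capacity
                val += p * (np.sqrt(r) + V[s_next])
            if val > V_new[i]:
                V_new[i], policy[t, i] = val, r
    V = V_new
```

    The resulting `policy[t, s]` table is the steady-state-style operating rule that the disaggregation step would then split between reservoirs and nations.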

  2. An Accurate and Generic Testing Approach to Vehicle Stability Parameters Based on GPS and INS.

    PubMed

    Miao, Zhibin; Zhang, Hongtian; Zhang, Jinzhu

    2015-12-04

    With the development of the vehicle industry, controlling stability has become more and more important, and techniques for evaluating vehicle stability are in high demand. Commonly, GPS and INS sensors are used to measure vehicle stability parameters by fusing the data from the two sensor systems. Although a Kalman filter requires prior knowledge of the model parameters, it is the usual tool for fusing data from multiple sensors. In this paper, a robust, intelligent and precise method for the measurement of vehicle stability is proposed. First, a fuzzy interpolation method is proposed, along with a four-wheel vehicle dynamic model. Second, a two-stage Kalman filter, which fuses the data from GPS and INS, is established. Next, this approach is applied to a case-study vehicle to measure yaw rate and sideslip angle. Finally, simulations and a real experiment are performed to verify the advantages of this approach. The experimental results show the merits of this method for measuring vehicle stability, and the approach can meet the design requirements of a vehicle stability controller.
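
    A one-dimensional Kalman filter fusing a GPS-like position measurement with an INS-like acceleration input illustrates the fusion idea (a toy single-stage filter with assumed noise levels, not the paper's two-stage design):

```python
import numpy as np

def kalman_fuse(gps_pos, ins_acc, dt=0.1, q=0.05, r_gps=1.0):
    """Minimal 1-D Kalman filter: INS acceleration drives the prediction,
    GPS position corrects it (a toy stand-in for GPS/INS fusion)."""
    x = np.zeros(2)                      # state: [position, velocity]
    P = np.eye(2)
    F = np.array([[1, dt], [0, 1]])      # constant-velocity transition
    B = np.array([dt ** 2 / 2, dt])      # acceleration input matrix
    H = np.array([[1.0, 0.0]])           # GPS observes position only
    Q = q * np.eye(2)
    out = []
    for z, a in zip(gps_pos, ins_acc):
        x = F @ x + B * a                # predict with INS input
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r_gps          # innovation covariance
        K = P @ H.T / S                  # Kalman gain (2x1)
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)
```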

  3. The application of an industry level participatory ergonomics approach in developing MSD interventions.

    PubMed

    Tappin, D C; Vitalis, A; Bentley, T A

    2016-01-01

    Participatory ergonomics projects are traditionally applied within one organisation. In this study, a participative approach was applied across the New Zealand meat processing industry, involving multiple organisations and geographical regions. The purpose was to develop interventions to reduce musculoskeletal disorder (MSD) risk. This paper considers the value of an industry level participatory ergonomics approach in achieving this. The main rationale for a participative approach included the need for industry credibility, and to generate MSD interventions that address industry level MSD risk factors. An industry key stakeholder group became the primary vehicle for formal participation. The study resulted in an intervention plan that included the wider work system and industry practices. These interventions were championed across the industry by the key stakeholder group and have extended beyond the life of the study. While this approach helped to meet the study aim, the existence of an industry-supported key stakeholder group and a mandate for the initiative are important prerequisites for success. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Robust Spatial Approximation of Laser Scanner Point Clouds by Means of Free-form Curve Approaches in Deformation Analysis

    NASA Astrophysics Data System (ADS)

    Bureick, Johannes; Alkhatib, Hamza; Neumann, Ingo

    2016-03-01

    In many geodetic engineering applications it is necessary to describe a measured point cloud, acquired e.g. by laser scanner, by means of free-form curves or surfaces, e.g. with B-splines as basis functions. State-of-the-art approaches to determining B-splines yield results that are seriously affected by data gaps and outliers. Optimal and robust B-spline fitting depends, however, on an optimal selection of the knot vector. Hence, in our approach we combine Monte-Carlo methods with the location and curvature of the measured data in order to determine the knot vector of the B-spline in such a way that no oscillating effects occur at the edges of data gaps. We introduce an optimized approach based on weights computed by means of resampling techniques. In order to minimize the effect of outliers, we apply robust M-estimators for the estimation of the control points. The approach is applied to a multi-sensor system based on kinematic terrestrial laser scanning in the field of rail-track inspection.
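
    Robust M-estimation of curve coefficients can be sketched with Huber-weighted iteratively reweighted least squares. A polynomial basis stands in for B-splines here, and the tuning constant 1.345 is the conventional Huber choice; this is an illustration of the M-estimator idea, not the authors' algorithm:

```python
import numpy as np

def huber_weights(res, c=1.345):
    """Huber weights: 1 for small residuals, c/|r| beyond the cutoff."""
    a = np.abs(res)
    w = np.ones_like(a)
    mask = a > c
    w[mask] = c / a[mask]
    return w

def robust_polyfit(x, y, deg=3, n_iter=10):
    """Iteratively reweighted least squares with a Huber M-estimator,
    a toy stand-in for robust estimation of B-spline control points."""
    A = np.vander(x, deg + 1)
    w = np.ones_like(y)
    for _ in range(n_iter):
        W = np.sqrt(w)[:, None]
        coef, *_ = np.linalg.lstsq(A * W, y * W.ravel(), rcond=None)
        res = y - A @ coef
        # robust scale via the normalized median absolute deviation
        scale = np.median(np.abs(res)) / 0.6745 + 1e-12
        w = huber_weights(res / scale)
    return coef
```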

  5. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational models, noninvasive optical techniques, and fringe-prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate its viability as an effective engineering tool for analysis and optimization.

  6. CHAMP: a locally adaptive unmixing-based hyperspectral anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Crist, Eric P.; Thelen, Brian J.; Carrara, David A.

    1998-10-01

    Anomaly detection offers a means by which to identify potentially important objects in a scene without prior knowledge of their spectral signatures. As such, this approach is less sensitive to variations in target class composition, atmospheric and illumination conditions, and sensor gain settings than would be a spectral matched filter or similar algorithm. The best existing anomaly detectors generally fall into one of two categories: those based on local Gaussian statistics, and those based on linear mixing models. Unmixing-based approaches better represent the real distribution of data in a scene, but are typically derived and applied on a global or scene-wide basis. Locally adaptive approaches allow detection of more subtle anomalies by accommodating the spatial non-homogeneity of background classes in a typical scene, but provide a poorer representation of the true underlying background distribution. The CHAMP algorithm combines the best attributes of both approaches, applying a linear-mixing model approach in a spatially adaptive manner. The algorithm itself, and test results on simulated and actual hyperspectral image data, are presented in this paper.
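    The locally adaptive idea can be illustrated with a schematic detector: for each pixel, a background subspace is estimated from a local window (excluding the pixel itself), and the anomaly score is the residual left after projecting the pixel's spectrum onto that subspace. This is a simplified stand-in for the spatially adaptive mixing-model step, not the actual CHAMP algorithm; the window size and subspace dimension below are arbitrary.

```python
import numpy as np

def local_subspace_anomaly(cube, win=7, n_bg=3):
    """Score each pixel of an (rows, cols, bands) cube by the norm of the
    residual after projecting its spectrum onto a background subspace
    estimated from the surrounding window (center pixel excluded)."""
    rows, cols, bands = cube.shape
    half = win // 2
    score = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            r0, r1 = max(0, r - half), min(rows, r + half + 1)
            c0, c1 = max(0, c - half), min(cols, c + half + 1)
            X = cube[r0:r1, c0:c1].reshape(-1, bands)
            center = (r - r0) * (c1 - c0) + (c - c0)
            X = np.delete(X, center, axis=0)          # leave the pixel out
            mu = X.mean(axis=0)
            _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
            B = Vt[:n_bg].T                           # local background basis
            d = cube[r, c] - mu
            resid = d - B @ (B.T @ d)                 # part off the subspace
            score[r, c] = np.linalg.norm(resid)
    return score
```

Background pixels lie close to the local subspace and score near zero; a spectrally distinct pixel retains a large residual.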

  7. Hierarchical organization of functional connectivity in the mouse brain: a complex network approach.

    PubMed

    Bardella, Giampiero; Bifone, Angelo; Gabrielli, Andrea; Gozzi, Alessandro; Squartini, Tiziano

    2016-08-18

    This paper represents a contribution to the study of brain functional connectivity from the perspective of complex network theory. More specifically, we apply graph theoretical analyses to provide evidence of the modular structure of the mouse brain and to shed light on its hierarchical organization. We propose a novel percolation analysis and we apply our approach to the analysis of a resting-state functional MRI data set from 41 mice. This approach reveals a robust hierarchical structure of modules persistent across different subjects. Importantly, we test this approach against a statistical benchmark (or null model) which constrains only the distributions of empirical correlations. Our results unambiguously show that the hierarchical character of the mouse brain modular structure is not trivially encoded into this lower-order constraint. Finally, we investigate the modular structure of the mouse brain by computing the Minimal Spanning Forest, a technique that identifies subnetworks characterized by the strongest internal correlations. This approach represents a faster alternative to other community detection methods and provides a means to rank modules on the basis of the strength of their internal edges.

  8. Hierarchical organization of functional connectivity in the mouse brain: a complex network approach

    NASA Astrophysics Data System (ADS)

    Bardella, Giampiero; Bifone, Angelo; Gabrielli, Andrea; Gozzi, Alessandro; Squartini, Tiziano

    2016-08-01

    This paper represents a contribution to the study of brain functional connectivity from the perspective of complex network theory. More specifically, we apply graph theoretical analyses to provide evidence of the modular structure of the mouse brain and to shed light on its hierarchical organization. We propose a novel percolation analysis and we apply our approach to the analysis of a resting-state functional MRI data set from 41 mice. This approach reveals a robust hierarchical structure of modules persistent across different subjects. Importantly, we test this approach against a statistical benchmark (or null model) which constrains only the distributions of empirical correlations. Our results unambiguously show that the hierarchical character of the mouse brain modular structure is not trivially encoded into this lower-order constraint. Finally, we investigate the modular structure of the mouse brain by computing the Minimal Spanning Forest, a technique that identifies subnetworks characterized by the strongest internal correlations. This approach represents a faster alternative to other community detection methods and provides a means to rank modules on the basis of the strength of their internal edges.
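    The spanning-forest step can be sketched as follows: build a maximum spanning tree over the correlation matrix (equivalently, a minimum spanning tree on 1 − |r|), then cut the tree's weakest edges so that the surviving components form modules held together by their strongest internal correlations. The correlation cut-off below is an illustrative parameter, not a value from the paper.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def spanning_forest_modules(C, corr_cut=0.3):
    """Modules from a correlation matrix C: take the maximum spanning tree
    of |C| (via a minimum spanning tree on 1 - |C|), drop tree edges whose
    correlation falls below corr_cut, and return the resulting components."""
    W = 1.0 - np.abs(C)                   # strong correlation -> short edge
    np.fill_diagonal(W, 0.0)
    mst = minimum_spanning_tree(csr_matrix(W)).toarray()
    mst[mst > 1.0 - corr_cut] = 0.0       # cut weak tree edges
    k, labels = connected_components(csr_matrix(mst), directed=False)
    return k, labels
```

Because only n − 1 tree edges are ever considered, this is much cheaper than optimizing a full community-detection objective over all edges.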

  9. A new hybrid case-based reasoning approach for medical diagnosis systems.

    PubMed

    Sharaf-El-Deen, Dina A; Moawad, Ibrahim F; Khalifa, M E

    2014-02-01

    Case-Based Reasoning (CBR) has been applied in many different medical applications. Due to the complexities and the diversities of this domain, most medical CBR systems become hybrid. Besides, the case adaptation process in CBR is often a challenging issue, as it is traditionally carried out manually by domain experts. In this paper, a new hybrid case-based reasoning approach for medical diagnosis systems is proposed to improve the accuracy of retrieval-only CBR systems. The approach integrates case-based reasoning and rule-based reasoning, and also applies the adaptation process automatically by exploiting adaptation rules. Both adaptation rules and reasoning rules are generated from the case-base. After solving a new case, the case-base is expanded, and both adaptation and reasoning rules are updated. To evaluate the proposed approach, a prototype was implemented and tested on diagnosing breast cancer and thyroid diseases. The final results show that the proposed approach increases the diagnostic accuracy of retrieval-only CBR systems, and provides reliable accuracy compared to current breast cancer and thyroid diagnosis systems.
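    The retrieve / adapt / retain cycle with rule-based adaptation can be sketched in a few lines. Everything here is illustrative (toy cases, feature names, and a single hypothetical adaptation rule), not the paper's actual system or rule-generation method.

```python
import numpy as np

# Toy case base: (feature vector, diagnosis). All values are illustrative.
case_base = [
    (np.array([1.0, 0.2, 0.1]), "benign"),
    (np.array([0.2, 0.9, 0.8]), "malignant"),
]

# Hypothetical adaptation rule: if the new case's third feature exceeds the
# retrieved case's by more than 0.5, flag the suggested diagnosis.
adaptation_rules = [
    (lambda new, old: new[2] - old[2] > 0.5, lambda dx: "suspicious-" + dx),
]

def diagnose(new_case):
    # Retrieve: nearest neighbour in feature space.
    feats, dx = min(case_base, key=lambda cb: np.linalg.norm(cb[0] - new_case))
    # Adapt: fire any matching adaptation rule on the retrieved solution.
    for cond, fix in adaptation_rules:
        if cond(new_case, feats):
            dx = fix(dx)
    # Retain: expand the case base with the newly solved case.
    case_base.append((new_case, dx))
    return dx
```

The retain step is what lets the rule base be regenerated from a growing case base, as the abstract describes.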

  10. Innovative Method in Improving Communication Issues by Applying Interdisciplinary Approach. Psycholinguistic Perspective to Mitigate Communication Troubles During Cislunar Travel.

    NASA Astrophysics Data System (ADS)

    Anikushina, V.; Taratukhin, V.; Stutterheim, C. v.; Gushin, V.

    2018-02-01

    A new psycholinguistic view on the crew communication, combined with biochemical and psychological data, contributes to noninvasive methods for stress appraisal and proposes alternative approaches to improve in-group communication and cohesion.

  11. AI AND SAR APPROACHES FOR PREDICTING CHEMICAL CARCINOGENICITY: SURVEY AND STATUS REPORT

    EPA Science Inventory

    A wide variety of artificial intelligence (AI) and structure-activity relationship (SAR) approaches have been applied to tackling the general problem of predicting rodent chemical carcinogenicity. Given the diversity of chemical structures and mechanisms relative to this endpoin...

  12. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    DOT National Transportation Integrated Search

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  13. Participation motives in physical education: an expectancy-value approach.

    PubMed

    Goudas, Marios; Dermitzaki, Irini

    2004-12-01

    This study applied an expectancy-value approach in examining participation motives of students in physical education. As predicted, outcome expectancy, a variable formed by the combination of outcome value and outcome likelihood, correlated significantly more highly with motivational indices than did these two factors alone.

  14. Applied approach slab settlement research, design/construction : final report.

    DOT National Transportation Integrated Search

    2013-08-01

    Approach embankment settlement is a pervasive problem in Oklahoma and many other states. The bump and/or abrupt slope change poses a danger to traffic and can cause increased dynamic loads on the bridge. Frequent and costly maintenance may be needed ...

  15. THE FUTURE OF TOXICOLOGY-PREDICTIVE TOXICOLOGY: AN EXPANDED VIEW OF CHEMICAL TOXICITY

    EPA Science Inventory

    A chemistry approach to predictive toxicology relies on structure-activity relationship (SAR) modeling to predict biological activity from chemical structure. Such approaches have proven capabilities when applied to well-defined toxicity end points or regions of chemical space. T...

  16. Mapping and monitoring changes in vegetation communities of Jasper Ridge, CA, using spectral fractions derived from AVIRIS images

    NASA Technical Reports Server (NTRS)

    Sabol, Donald E., Jr.; Roberts, Dar A.; Adams, John B.; Smith, Milton O.

    1993-01-01

    An important application of remote sensing is to map and monitor changes over large areas of the land surface. This is particularly significant with the current interest in monitoring vegetation communities. Most traditional methods for mapping different types of plant communities are based upon statistical classification techniques (i.e., parallelepiped, nearest-neighbor, etc.) applied to uncalibrated multispectral data. Classes from these techniques are typically difficult to interpret (particularly to a field ecologist/botanist). Also, classes derived for one image can be very different from those derived from another image of the same area, making interpretation of observed temporal changes nearly impossible. More recently, neural networks have been applied to classification. Neural network classification, based upon spectral matching, is weak in dealing with spectral mixtures (a condition prevalent in images of natural surfaces). Another approach to mapping vegetation communities is based on spectral mixture analysis, which can provide a consistent framework for image interpretation. Roberts et al. (1990) mapped vegetation using the band residuals from a simple mixing model (the same spectral endmembers applied to all image pixels). Sabol et al. (1992b) and Roberts et al. (1992) used different methods to apply the most appropriate spectral endmembers to each image pixel, thereby allowing mapping of vegetation based upon the different endmember spectra. In this paper, we describe a new approach to classification of vegetation communities based upon the spectral fractions derived from spectral mixture analysis. This approach was applied to three 1992 AVIRIS images of Jasper Ridge, California to observe seasonal changes in surface composition.
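    The core unmixing step that produces the spectral fractions can be sketched as a least-squares problem per pixel: solve for endmember fractions that reproduce the pixel spectrum, with the sum-to-one constraint enforced softly by appending a heavily weighted row (a common textbook device; the weight and dimensions below are illustrative).

```python
import numpy as np

def unmix(pixel, E, weight=1e3):
    """Estimate endmember fractions for one pixel spectrum.
    E is (bands, n_endmembers); the sum-to-one constraint is imposed by
    augmenting the system with a heavily weighted row of ones."""
    A = np.vstack([E, weight * np.ones(E.shape[1])])
    b = np.append(pixel, weight)        # ... so that sum(f) is pulled to 1
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f
```

Classifying on these fractions, rather than on raw radiance, is what gives the interpretable, image-to-image consistent classes the abstract argues for.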

  17. Predicting performance of polymer-bonded Terfenol-D composites under different magnetic fields

    NASA Astrophysics Data System (ADS)

    Guan, Xinchun; Dong, Xufeng; Ou, Jinping

    2009-09-01

    Considering the demagnetization effect, a model to calculate the magnetostriction of a single particle under an applied field is first created. Based on the Eshelby equivalent-inclusion and Mori-Tanaka methods, the approach to calculate the average magnetostriction of the composites under any applied field, as well as at saturation, is studied by treating the magnetostrictive particulate as an eigenstrain. The results calculated by the approach indicate that the saturation magnetostriction of magnetostrictive composites increases with an increase of particle aspect ratio and particle volume fraction, and a decrease of Young's modulus of the matrix. The influence of an applied field on magnetostriction of the composites becomes more significant with larger particle volume fraction or particle aspect ratio. Experiments were done to verify the effectiveness of the model; the results indicate that the model can provide only approximate results.

  18. Effects of High-Pressure Treatment on the Muscle Proteome of Hake by Bottom-Up Proteomics.

    PubMed

    Carrera, Mónica; Fidalgo, Liliana G; Saraiva, Jorge A; Aubourg, Santiago P

    2018-05-02

    A bottom-up proteomics approach was applied for the study of the effects of high-pressure (HP) treatment on the muscle proteome of fish. The performance of the approach was established for a prior HP treatment (150-450 MPa for 2 min) on frozen (up to 5 months at -10 °C) European hake (Merluccius merluccius). Concerning possible protein biomarkers of quality changes, a significant degradation after applying a pressure ≥430 MPa could be observed for phosphoglycerate mutase-1, enolase, creatine kinase, fructose bisphosphate aldolase, triosephosphate isomerase, and nucleoside diphosphate kinase; conversely, electrophoretic bands assigned to tropomyosin, glyceraldehyde-3-phosphate dehydrogenase, and beta parvalbumin increased their intensity after applying a pressure ≥430 MPa. This repository of potential protein biomarkers may be very useful for further HP investigations related to fish quality.

  19. Self-regulatory Behaviors and Approaches to Learning of Arts Students: A Comparison Between Professional Training and English Learning.

    PubMed

    Tseng, Min-Chen; Chen, Chia-Cheng

    2017-06-01

    This study investigated the self-regulatory behaviors of arts students, namely memory strategy, goal-setting, self-evaluation, seeking assistance, environmental structuring, learning responsibility, and planning and organizing. We also explored approaches to learning, including deep approach (DA) and surface approach (SA), in a comparison between students' professional training and English learning. The participants consisted of 344 arts majors. The Academic Self-Regulation Questionnaire and the Revised Learning Process Questionnaire were adopted to examine students' self-regulatory behaviors and their approaches to learning. The results show that a positive and significant correlation was found in students' self-regulatory behaviors between professional training and English learning. The results indicated that increases in using self-regulatory behaviors in professional training were associated with increases in applying self-regulatory behaviors in learning English. Seeking assistance, self-evaluation, and planning and organizing were significant predictors for learning English. In addition, arts students used the deep approach more often than the surface approach in both their professional training and English learning. A positive correlation was found in DA, whereas a negative correlation was shown in SA, between students' self-regulatory behaviors and their approaches to learning. Students with high self-regulation adopted a deep approach, and they applied the surface approach less in professional training and English learning. In addition, an SEM model confirmed that DA had a positive influence on self-regulatory behaviors, whereas SA had a negative influence.

  20. An Ensemble Approach for Drug Side Effect Prediction

    PubMed Central

    Jahid, Md Jamiul; Ruan, Jianhua

    2014-01-01

    In silico prediction of drug side-effects in the early stages of drug development is becoming more popular nowadays, as it not only reduces the time for drug design but also reduces drug development costs. In this article we propose an ensemble approach to predict side-effects of drug molecules based on their chemical structure. Our idea originates from the observation that similar drugs have similar side-effects. Based on this observation we design an ensemble approach that combines the results from different classification models, where each model is generated by a different set of similar drugs. We applied our approach to 1385 side-effects in the SIDER database for 888 drugs. Results show that our approach outperformed previously published approaches and standard classifiers. Furthermore, we applied our method to a number of uncharacterized drug molecules in the DrugBank database and predicted their side-effect profiles for future usage. Results from various sources confirm that our method is able to predict the side-effects of uncharacterized drugs and, more importantly, to predict rare side-effects which are often ignored by other approaches. The method described in this article can be useful for predicting side-effects at an early stage of drug design, reducing experimental cost and time. PMID:25327524
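    The "similar drugs have similar side-effects" observation the ensemble is built on can be illustrated with a minimal neighbour-based predictor: Tanimoto similarity over binary structure fingerprints, with the predicted side-effect profile a similarity-weighted average of the most similar drugs' known profiles. This is a simplified stand-in for the paper's ensemble of classifiers, with toy fingerprints and an arbitrary k.

```python
import numpy as np

def tanimoto(a, B):
    """Tanimoto similarity between binary fingerprint a and each row of B."""
    inter = B @ a
    union = a.sum() + B.sum(axis=1) - inter
    return inter / np.maximum(union, 1)

def predict_side_effects(query_fp, train_fps, train_labels, k=3):
    """Similarity-weighted average of the k most similar drugs'
    side-effect profiles (rows of train_labels, one column per effect)."""
    sim = tanimoto(query_fp, train_fps)
    top = np.argsort(sim)[::-1][:k]
    w = sim[top]
    if w.sum() == 0:
        return np.zeros(train_labels.shape[1])
    return (w[:, None] * train_labels[top]).sum(axis=0) / w.sum()
```

Each returned value can be read as a score in [0, 1] for that side-effect; thresholding it yields a binary prediction.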

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marquez, Andres; Manzano Franco, Joseph B.; Song, Shuaiwen

    With Exascale performance and its challenges in mind, one ubiquitous concern among architects is energy efficiency. Petascale systems projected to Exascale are unsustainable at current power consumption rates. One major contributor to system-wide power consumption is the number of memory operations, which drives the data movement and management techniques applied by the runtime system. To address this problem, we present the concept of the Architected Composite Data Types (ACDT) framework. The framework is made aware of data composites, assigning them a specific layout, transformations, and operators. Data manipulation overhead is amortized over a larger number of elements, and program performance and power efficiency can be significantly improved. We developed the fundamentals of an ACDT framework on a massively multithreaded adaptive runtime system geared towards Exascale clusters. Showcasing the capability of ACDT, we exercised the framework with two representative processing kernels, Matrix Vector Multiply and Cholesky Decomposition, applied to sparse matrices. As transformation modules, we applied optimized compress/decompress engines and configured invariant operators for maximum energy/performance efficiency. Additionally, we explored two different approaches based on transformation opaqueness in relation to the application. Under the first approach, the application is agnostic to compression and decompression activity. Such an approach entails minimal changes to the original application code, but leaves out potential application-specific optimizations. The second approach exposes the decompression process to the application, thereby exposing optimization opportunities that can only be exploited with application knowledge. The experimental results show that the two approaches have their strengths in HW and SW respectively, where the SW approach can yield performance and power improvements that are an order of magnitude better than ACDT-oblivious, hand-optimized implementations. We consider the ACDT runtime framework an important component of compute nodes that will lead towards power-efficient Exascale clusters.
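    The payoff of operating directly on a compressed composite, rather than decompressing to a dense layout first, can be illustrated with a sparse matrix-vector multiply over a CSR (compressed sparse row) representation. This is a generic sketch of the idea, not ACDT's actual transformation engines or operators.

```python
import numpy as np

def csr_spmv(indptr, indices, data, x):
    """y = A @ x computed directly on the CSR-compressed matrix: only the
    stored nonzeros are read, so the dense zeros are never materialized
    or moved through memory."""
    y = np.zeros(len(indptr) - 1)
    for row in range(len(y)):
        start, end = indptr[row], indptr[row + 1]
        y[row] = np.dot(data[start:end], x[indices[start:end]])
    return y
```

For a matrix that is mostly zeros, the memory traffic (and thus the energy cost of data movement) scales with the number of nonzeros instead of the dense size, which is the kind of saving the compressed-operator approach targets.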

  2. Final Technical Report: Distributed Controls for High Penetrations of Renewables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byrne, Raymond H.; Neely, Jason C.; Rashkin, Lee J.

    2015-12-01

    The goal of this effort was to apply four potential control analysis/design approaches to the design of distributed grid control systems to address the impact of latency and communications uncertainty with high penetrations of photovoltaic (PV) generation. The four techniques considered were: optimal fixed structure control; the Nyquist stability criterion; vector Lyapunov analysis; and Hamiltonian design methods. A reduced order model of the Western Electricity Coordinating Council (WECC) developed for the Matlab Power Systems Toolbox (PST) was employed for the study, as well as representative smaller systems (e.g., two-area, three-area, and four-area power systems). Excellent results were obtained with the optimal fixed structure approach, and the methodology we developed was published in a journal article. This approach is promising because it offers a method for designing optimal control systems with the feedback signals available from Phasor Measurement Unit (PMU) data, as opposed to full state feedback or the design of an observer. The Nyquist approach inherently handles time delay and incorporates performance guarantees (e.g., gain and phase margin). We developed a technique that works for moderate sized systems, but the approach does not scale well to extremely large systems because of computational complexity. The vector Lyapunov approach was applied to a two area model to demonstrate the utility for modeling communications uncertainty. Application to large power systems requires a method to automatically expand/contract the state space and partition the system so that communications uncertainty can be considered. The Hamiltonian Surface Shaping and Power Flow Control (HSSPFC) design methodology was selected to investigate grid systems for energy storage requirements to support high penetration of variable or stochastic generation (such as wind and PV) and loads. This method was applied to several small system models.

  3. How can machine-learning methods assist in virtual screening for hyperuricemia? A healthcare machine-learning approach.

    PubMed

    Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi

    2016-12-01

    Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied to build the prediction models to deal with the imbalanced class dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflected the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between pairs of each approach. Small changes occurred in the AUCs after applying undersampling to build the models. We developed a virtual health check-up that predicted the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability. Undersampling did not remarkably improve predictive power. Copyright © 2016 Elsevier Inc. All rights reserved.
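    Two ingredients of the evaluation above, undersampling the majority class and scoring with the area under the ROC curve, are compact enough to sketch directly. The AUC here is computed via the Mann-Whitney U statistic, which is equivalent to the ranking definition of AUC; the data shapes are illustrative.

```python
import numpy as np

def undersample(X, y, rng):
    """Randomly drop majority-class rows to balance a binary dataset."""
    pos = np.where(y == 1)[0]
    neg = np.where(y == 0)[0]
    maj, mino = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    keep = np.concatenate([mino, rng.choice(maj, size=len(mino), replace=False)])
    rng.shuffle(keep)
    return X[keep], y[keep]

def auc(scores, y):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive outscores a random negative."""
    pos, neg = scores[y == 1], scores[y == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

Note that AUC is insensitive to class balance, which is why the abstract can report only small AUC changes after undersampling even though the training distribution changed substantially.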

  4. A Thermodynamically-consistent FBA-based Approach to Biogeochemical Reaction Modeling

    NASA Astrophysics Data System (ADS)

    Shapiro, B.; Jin, Q.

    2015-12-01

    Microbial rates are critical to understanding biogeochemical processes in natural environments. Recently, flux balance analysis (FBA) has been applied to predict microbial rates in aquifers and other settings. FBA is a genome-scale constraint-based modeling approach that computes metabolic rates and other phenotypes of microorganisms. This approach requires prior knowledge of substrate uptake rates, which is not available for most natural microbes. Here we propose to constrain substrate uptake rates on the basis of microbial kinetics. Specifically, we calculate rates of respiration (and fermentation) using a revised Monod equation; this equation accounts for both the kinetics and thermodynamics of microbial catabolism. Substrate uptake rates are then computed from the rates of respiration, and applied to FBA to predict rates of microbial growth. We implemented this method by linking two software tools, PHREEQC and COBRA Toolbox. We applied this method to acetotrophic methanogenesis by Methanosarcina barkeri, and compared the simulation results to previous laboratory observations. The new method constrains acetate uptake by accounting for the kinetics and thermodynamics of methanogenesis, and predicted the observations of previous experiments well. In comparison, traditional methods of dynamic-FBA constrain acetate uptake on the basis of enzyme kinetics, and failed to reproduce the experimental results. These results show that microbial rate laws may provide a better constraint than enzyme kinetics for applying FBA to biogeochemical reaction modeling.
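    A revised Monod rate law of the kind described, combining a kinetic (Monod) factor with a thermodynamic factor, can be sketched in one function. The form below follows the widely used Jin-and-Bethke style expression, where the rate vanishes as the reaction energy approaches the ATP synthesis cost; all parameter values are illustrative, not the paper's calibrated ones.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def thermo_monod_rate(k, X, S, Ks, dG_rxn, m=1.0, dG_atp=45e3, chi=2.0, T=298.15):
    """Respiration rate = rate constant k x biomass X
    x Monod kinetic factor S/(Ks + S)
    x thermodynamic factor 1 - exp((dG_rxn + m*dG_atp)/(chi*R*T)),
    clipped at zero so the rate stops at thermodynamic equilibrium."""
    kinetic = S / (Ks + S)
    ft = 1.0 - math.exp((dG_rxn + m * dG_atp) / (chi * R * T))
    return k * X * kinetic * max(0.0, ft)
```

Far from equilibrium (strongly negative dG_rxn) the thermodynamic factor approaches 1 and the classic Monod law is recovered; when the available energy just covers ATP synthesis, the factor (and hence the rate) goes to zero.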

  5. A nonlinear interface model applied to masonry structures

    NASA Astrophysics Data System (ADS)

    Lebon, Frédéric; Raffa, Maria Letizia; Rizzoni, Raffaella

    2015-12-01

    In this paper, a new imperfect interface model is presented. The model includes finite strains, micro-cracks and smooth roughness. The model is consistently derived by coupling a homogenization approach for micro-cracked media and arguments of asymptotic analysis. The model is applied to brick/mortar interfaces. Numerical results are presented.

  6. Applying Hope Theory to Support Middle School Transitions

    ERIC Educational Resources Information Center

    Akos, Patrick; Kurz, Maureen Shields

    2016-01-01

    Middle grades transitions pose challenges to many students who meet these tasks with varying levels of success. Contemporary developmental and strengths-based literature offers Hope Theory (Snyder, 2002), a research supported approach that can mitigate risks in school transitions. This article describes how middle grades educators can apply the…

  7. Applying Organizational Commitment and Human Capital Theories to Emigration Research

    ERIC Educational Resources Information Center

    Verkhohlyad, Olga; McLean, Gary N.

    2012-01-01

    Purpose: This study aims to bring some additional insight into the issue of emigration by establishing a relationship between emigration and psychic return of citizens to their human capital investment in the country. Design/methodology/approach: The article adopts a quantitative research strategy. It applies organizational commitment and human…

  8. Applying An Aptitude-Treatment Interaction Approach to Competency Based Teacher Education.

    ERIC Educational Resources Information Center

    McNergney, Robert

    Aptitude treatment interaction (ATI), as applied to education, measures the interaction of personality factors and experimentally manipulated teaching strategies. ATI research has had disappointingly inconclusive results so far, but proponents argue that this has been due to imprecise methods, which can be rectified. They believe that…

  9. An Applied Project-Driven Approach to Undergraduate Research Experiences

    ERIC Educational Resources Information Center

    Karls, Michael A.

    2017-01-01

    In this paper I will outline the process I have developed for conducting applied mathematics research with undergraduates and give some examples of the projects we have worked on. Several of these projects have led to refereed publications that could be used to illustrate topics taught in the undergraduate curriculum.

  10. Classroom Assistants for Foreign Language Pedagogy

    ERIC Educational Resources Information Center

    O'Brien, Trudy

    2017-01-01

    This paper describes how Applied Linguistics (AL) seminar students that are proficient in languages other than English intern as classroom assistants (CAs) in foreign language classrooms. The CAs gain useful insights from the seminar on theoretical and pedagogical approaches, methods and techniques in L2 pedagogy and then apply this knowledge…

  11. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  12. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  13. Applying APA's Learner-Centered Principles to School-Based Group Counseling.

    ERIC Educational Resources Information Center

    Stroh, Heather R.; Sink, Christopher A.

    2002-01-01

    This article introduces the American Psychological Association's learner-centered principles and provides a brief rationale for infusing them into comprehensive guidance and counseling programs. Using small group counseling as an illustration, explains how counselors can apply a learner-centered approach to their work. (Contains 43 references.)…

  14. Understanding the Conceptual Development Phase of Applied Theory-Building Research: A Grounded Approach

    ERIC Educational Resources Information Center

    Storberg-Walker, Julia

    2007-01-01

    This article presents a provisional grounded theory of conceptual development for applied theory-building research. The theory described here extends the understanding of the components of conceptual development and provides generalized relations among the components. The conceptual development phase of theory-building research has been widely…

  15. A Vision for Systems Engineering Applied to Wind Energy (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felker, F.; Dykes, K.

    2015-01-01

    This presentation was given at the Third Wind Energy Systems Engineering Workshop on January 14, 2015. Topics covered include the importance of systems engineering, a vision for systems engineering as applied to wind energy, and application of systems engineering approaches to wind energy research and development.

  16. Artificial Neural Networks: A New Approach to Predicting Application Behavior.

    ERIC Educational Resources Information Center

    Gonzalez, Julie M. Byers; DesJardins, Stephen L.

    2002-01-01

    Applied the technique of artificial neural networks to predict which students were likely to apply to one research university. Compared the results to the traditional analysis tool, logistic regression modeling. Found that the addition of artificial intelligence models was a useful new tool for predicting student application behavior. (EV)

  17. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  18. MASS BALANCE MODELLING OF PCBS IN THE FOX RIVER/GREEN BAY COMPLEX

    EPA Science Inventory

    The USEPA Office of Research and Development developed and applied a multimedia, mass balance modeling approach to the Fox River/Green Bay complex to aid managers with remedial decision-making. The suite of models was applied to PCBs due to the long history of contamination and ...

  19. Underachievers' Cognitive and Behavioral Strategies--Self-Handicapping at School.

    ERIC Educational Resources Information Center

    Nurmi, Jari-Erik; And Others

    1995-01-01

    Two studies with a total of 153 junior and senior high-school students and vocational students in Finland investigated whether underachievers applied a self-handicapping or learned-helplessness strategy in achievement contexts. Underachievers seemed to apply a self-handicapping strategy rather than a learned-helplessness approach. (SLD)

  20. Using Student Managed Businesses to Integrate the Business Curriculum

    ERIC Educational Resources Information Center

    Massad, Victor J.; Tucker, Joanne M.

    2009-01-01

    To teach business today requires that we go beyond classroom learning and encourage real world, cross-functional experiences and applied management decision-making. This paper describes an innovative approach that requires students to apply their function-specific knowledge of business, integrated with other functional areas, to an authentic…

  1. Teaching Social Science Research: An Applied Approach Using Community Resources.

    ERIC Educational Resources Information Center

    Gilliland, M. Janice; And Others

    A four-week summer project for 100 rural tenth graders in the University of Alabama's Biomedical Sciences Preparation Program (BioPrep) enabled students to acquire and apply social sciences research skills. The students investigated drinking water quality in three rural Alabama counties by interviewing local officials, health workers, and…

  2. A Case Study in Applying Lean Sustainability Concepts to Universities

    ERIC Educational Resources Information Center

    Comm, Clare L.; Mathaisel, Dennis F. X.

    2005-01-01

    Purpose: To apply the concepts of lean and sustainability to higher education. Design/methodology/approach: A questionnaire was developed, administered to 18 public and private universities and analyzed. Findings: The focus in higher education is now on cost reduction or budget containment initiatives. Although these initiatives were not…

  3. Brief Experimental Analyses of Academic Performance: Introduction to the Special Series

    ERIC Educational Resources Information Center

    McComas, Jennifer J.; Burns, Matthew K.

    2009-01-01

    Academic skills are frequent concerns in K-12 schools that could benefit from the application of applied behavior analysis (ABA). Brief experimental analysis (BEA) of academic performance is perhaps the most promising approach to apply ABA to student learning. Although research has consistently demonstrated the effectiveness of academic…

  4. Electronic-projecting Moire method applying CBR-technology

    NASA Astrophysics Data System (ADS)

    Kuzyakov, O. N.; Lapteva, U. V.; Andreeva, M. A.

    2018-01-01

    An electronic-projecting method based on the Moire effect for examining surface topology is suggested. The conditions for forming Moire fringes, and the dependence of the fringe parameters on the reference parameters of the object and virtual grids, are analyzed. The control system structure and decision-making subsystem are elaborated. The subsystem execution includes CBR-technology, based on applying a case base. An approach that analyses and forms a decision for each separate local area, with subsequent formation of a common topology map, is applied.

  5. Determination of aerodynamic sensitivity coefficients in the transonic and supersonic regimes

    NASA Technical Reports Server (NTRS)

    Elbanna, Hesham M.; Carlson, Leland A.

    1989-01-01

    The quasi-analytical approach is developed to compute airfoil aerodynamic sensitivity coefficients in the transonic and supersonic flight regimes. Initial investigation verifies the feasibility of this approach as applied to the transonic small perturbation residual expression. Results are compared to those obtained by the direct (finite difference) approach and both methods are evaluated to determine their computational accuracies and efficiencies. The quasi-analytical approach is shown to be superior and worth further investigation.

  6. Towards an Optimal Noise Versus Resolution Trade-Off in Wind Scatterometry

    NASA Technical Reports Server (NTRS)

    Williams, Brent A.

    2011-01-01

    This paper approaches the noise versus resolution trade-off in wind scatterometry from a field-wise retrieval perspective. Theoretical considerations are discussed, and a practical implementation using a MAP estimator is applied to the SeaWinds scatterometer. The approach is compared to conventional approaches as well as to numerical weather predictions. The new approach incorporates knowledge of the wind spectrum to reduce the impact of components of the wind signal that are expected to be noisy.

  7. Systems metabolic engineering strategies for the production of amino acids.

    PubMed

    Ma, Qian; Zhang, Quanwei; Xu, Qingyang; Zhang, Chenglin; Li, Yanjun; Fan, Xiaoguang; Xie, Xixian; Chen, Ning

    2017-06-01

    Systems metabolic engineering is a multidisciplinary area that integrates systems biology, synthetic biology and evolutionary engineering. It is an efficient approach for strain improvement and process optimization, and has been successfully applied in the microbial production of various chemicals including amino acids. In this review, systems metabolic engineering strategies including pathway-focused approaches, systems biology-based approaches, evolutionary approaches and their applications in two major amino acid producing microorganisms: Corynebacterium glutamicum and Escherichia coli, are summarized.

  8. Development of a time-dependent hurricane evacuation model for the New Orleans area : research project capsule.

    DOT National Transportation Integrated Search

    2008-08-01

    Current hurricane evacuation transportation modeling uses an approach fashioned after the : traditional four-step procedure applied in urban transportation planning. One of the limiting : features of this approach is that it models traffic in a stati...

  9. A comparison of operational remote sensing-based models for estimating crop evapotranspiration

    USDA-ARS?s Scientific Manuscript database

    The integration of remotely sensed data into models of actual evapotranspiration has allowed for the estimation of water consumption across agricultural regions. Two modeling approaches have been successfully applied. The first approach computes a surface energy balance using the radiometric surface...

  10. Self-Defeating Behavior Workshops: Systems Approach for Hard-to-Serve Veterans.

    ERIC Educational Resources Information Center

    Banks, Jerry; And Others

    1979-01-01

    The self-defeating behavior (SDB) theory was applied to veterans who had problems in job training, vocational placement, and academic settings. A group-therapy structure meeting eight times during a four-week period was the basic approach of the SDB seminar. (Author)

  11. Screening native botanicals for bioactivity: an interdisciplinary approach

    USGS Publications Warehouse

    Boudreau, Anik; Cheng, Diana M.; Ruiz, Carmen; Ribnicky, David; Allain, Larry K.; Brassieur, C. Ray; Turnipseed, D. Phil; Cefalu, William T.; Floyd, Z. Elizabeth

    2014-01-01

    Conclusion: An interdisciplinary approach to screening botanical sources of therapeutic agents can be successfully applied to identify native plants used in folk medicine as potential sources of therapeutic agents in treating insulin resistance in skeletal muscle or inflammatory processes associated with obesity-related insulin resistance.

  12. Using Natural Approach Teaching Techniques.

    ERIC Educational Resources Information Center

    Whitman, Charles

    1986-01-01

    Describes a beginning foreign language class applying the principles of Stephen Krashen's "Natural Approach" and James Asher's "Total Physical Response" method. Initially students carry out the instructor's commands in the form of actions rather than being required to speak. In later stages role play and simple discussions are…

  13. Teaching Motivational Interviewing to Undergraduates: Evaluation of Three Approaches

    ERIC Educational Resources Information Center

    Madson, Michael B.; Schumacher, Julie A.; Noble, Jeremy J.; Bonnell, Melissa A.

    2013-01-01

    Many undergraduate psychology students assume positions as mental health paraprofessionals during or after college. The present study was a quasi-experimental evaluation of the effectiveness of teaching motivational interviewing (MI), a counseling approach that applies to many paraprofessional occupations. Results from 83 undergraduates indicated…

  14. Strategies for Effective Learning.

    ERIC Educational Resources Information Center

    Frierson, Henry T., Jr.

    Suggestions are offered for applying learning techniques in a variety of learning situations. The approaches are applicable to learning medical school content as well as other advanced educational content. Ways to control external distractors are suggested, including a systematic approach to completing large tasks, such as writing a research…

  15. Manpower Targets and Educational Investments

    ERIC Educational Resources Information Center

    Ritzen, Jo M.

    1976-01-01

    Discusses the use of quadratic programming to calculate the optimal distribution of educational investments required to closely approach manpower targets when financial resources are insufficient to meet manpower targets completely. Demonstrates use of the quadratic programming approach by applying it to the training of supervisory technicians in…

  16. A Practical Approach to Programmatic Assessment Design

    ERIC Educational Resources Information Center

    Timmerman, A. A.; Dijkstra, J.

    2017-01-01

    Assessment of complex tasks integrating several competencies calls for a programmatic design approach. As single instruments do not provide the information required to reach a robust judgment of integral performance, 73 guidelines for programmatic assessment design were developed. When simultaneously applying these interrelated guidelines, it is…

  17. Controlled growth of silica-titania hybrid functional nanoparticles through a multistep microfluidic approach.

    PubMed

    Shiba, K; Sugiyama, T; Takei, T; Yoshikawa, G

    2015-11-11

    Silica/titania-based functional nanoparticles were prepared through controlled nucleation of titania and subsequent encapsulation by silica through a multistep microfluidic approach, which was successfully applied to obtaining aminopropyl-functionalized silica/titania nanoparticles for a highly sensitive humidity sensor.

  18. The Flipped Classroom in Counselor Education

    ERIC Educational Resources Information Center

    Moran, Kristen; Milsom, Amy

    2015-01-01

    The flipped classroom is proposed as an effective instructional approach in counselor education. An overview of the flipped-classroom approach, including advantages and disadvantages, is provided. A case example illustrates how the flipped classroom can be applied in counselor education. Recommendations for implementing or researching flipped…

  19. The "Magic Cure" A Review of the Current Controversial Approaches for Treating Learning Disabilities.

    ERIC Educational Resources Information Center

    Silver, Larry B.

    1987-01-01

    In view of popular press coverage of controversial approaches to treating learning disabilities, the article briefly reviews evidence concerning the following: neurophysiological retraining (patterning, optometric visual training, cerebellar-vestibular remediation, and applied kinesiology); and orthomolecular medicine (concerning megavitamins,…

  20. The human genome as common heritage: common sense or legal nonsense?

    PubMed

    Ossorio, Pilar N

    2007-01-01

    This essay identifies two legal lineages underlying the common heritage concept, and applies each to the human genome. The essay notes some advantages and disadvantages of each approach, and argues that patenting of human genes would be allowable under either approach.

  1. A SYNOPTIC APPROACH FOR ASSESSING CUMULATIVE IMPACTS TO WETLANDS

    EPA Science Inventory

    The US Environmental Protection Agency's Wetlands Research Program has developed the synoptic approach as a proposed method for assessing cumulative impacts to wetlands by providing both a general and a comprehensive view of the environment. It can also be applied more broadly to...

  2. Bayesian-information-gap decision theory with an application to CO2 sequestration

    DOE PAGES

    O'Malley, D.; Vesselinov, V. V.

    2015-09-04

    Decisions related to subsurface engineering problems such as groundwater management, fossil fuel production, and geologic carbon sequestration are frequently challenging because of an overabundance of uncertainties (related to conceptualizations, parameters, observations, etc.). Because of the importance of these problems to agriculture, energy, and the climate (respectively), good decisions that are scientifically defensible must be made despite the uncertainties. We describe a general approach to making decisions for challenging problems such as these in the presence of severe uncertainties that combines probabilistic and non-probabilistic methods. The approach uses Bayesian sampling to assess parametric uncertainty and Information-Gap Decision Theory (IGDT) to address model inadequacy. The combined approach also resolves an issue that frequently arises when applying Bayesian methods to real-world engineering problems related to the enumeration of possible outcomes. In the case of zero non-probabilistic uncertainty, the method reduces to a Bayesian method. Lastly, to illustrate the approach, we apply it to a site-selection decision for geologic CO2 sequestration.
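
    The info-gap side of this combination can be illustrated with a deliberately simple sketch. Everything below is hypothetical (the performance model f(q, u) = q*(1 - u), the requirement r_c, and the decision set are invented for illustration); it shows only the core IGDT idea of ranking decisions by the largest uncertainty horizon h under which the worst case still meets the requirement.

```python
# Toy info-gap robustness calculation (illustrative only, not the paper's model).
# For decision q, performance f(q, u) = q * (1 - u) degrades with an unknown
# error u bounded by |u| <= h.  The robustness of q is the largest horizon h
# for which the worst case still satisfies f(q, u) >= r_c.
def robustness(q, r_c):
    # Worst case over |u| <= h is u = h, so q * (1 - h) >= r_c  <=>  h <= 1 - r_c/q.
    return max(0.0, 1.0 - r_c / q)

decisions = [1.0, 2.0, 4.0]   # hypothetical candidate decisions
r_c = 1.0                     # hypothetical performance requirement
best = max(decisions, key=lambda q: robustness(q, r_c))
print(best, robustness(best, r_c))  # 4.0 0.75
```

    The decision with the largest robustness is preferred even though all three candidates meet the requirement nominally (at u = 0); that preference for immunity to surprise, rather than nominal optimality, is the info-gap ingredient the abstract combines with Bayesian sampling.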

  3. Adapting environmental management to uncertain but inevitable change.

    PubMed

    Nicol, Sam; Fuller, Richard A; Iwamura, Takuya; Chadès, Iadine

    2015-06-07

    Implementation of adaptation actions to protect biodiversity is limited by uncertainty about the future. One reason for this is the fear of making the wrong decisions caused by the myriad future scenarios presented to decision-makers. We propose an adaptive management (AM) method for optimally managing a population under uncertain and changing habitat conditions. Our approach incorporates multiple future scenarios and continually learns the best management strategy from observations, even as conditions change. We demonstrate the performance of our AM approach by applying it to the spatial management of migratory shorebird habitats on the East Asian-Australasian flyway, predicted to be severely impacted by future sea-level rise. By accounting for non-stationary dynamics, our solution protects 25,000 more birds per year than the current best stationary approach. Our approach can be applied to many ecological systems that require efficient adaptation strategies for an uncertain future. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  4. The consequences of physical post-treatments (microwave and electron-beam) on food/packaging interactions: A physicochemical and toxicological approach.

    PubMed

    Riquet, A M; Breysse, C; Dahbi, L; Loriot, C; Severin, I; Chagnon, M C

    2016-05-15

    The safety of microwave and electron-beam treatments has been demonstrated with regard to the formation of reaction products that could endanger human health. An integrated approach was used, combining the chemical characterization of all the substances likely to migrate with an assessment of their potential toxicity. This approach was applied to polypropylene (PP) films prepared with a selection of additives. Components were identified by liquid and gas chromatography using a mass selective detector system. Their potential toxicity was assessed using three in vitro short-term bioassays, and their migrations were carried out using a standards-based approach. After the electron-beam treatment some additives decomposed and there was a significant increase in the concentration of polyolefin oligomeric saturated hydrocarbons. PP prepared with Irgafos 168 led to a significantly strong cytotoxic effect, and PP prepared with Irganox 1076 induced a dose-dependent estrogenic effect in vitro. Migration values were low and below the detection limit of the analytical method applied. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. A high-throughput screening approach for the optoelectronic properties of conjugated polymers.

    PubMed

    Wilbraham, Liam; Berardo, Enrico; Turcani, Lukas; Jelfs, Kim E; Zwijnenburg, Martijn A

    2018-06-25

    We propose a general high-throughput virtual screening approach for the optical and electronic properties of conjugated polymers. This approach makes use of the recently developed xTB family of low-computational-cost density functional tight-binding methods from Grimme and co-workers, calibrated here to (TD-)DFT data computed for a representative diverse set of (co-)polymers. Parameters drawn from the resulting calibration using a linear model can then be applied to the xTB derived results for new polymers, thus generating near DFT-quality data with orders of magnitude reduction in computational cost. As a result, after an initial computational investment for calibration, this approach can be used to quickly and accurately screen on the order of thousands of polymers for target applications. We also demonstrate that the (opto)electronic properties of the conjugated polymers show only a very minor variation when considering different conformers and that the results of high-throughput screening are therefore expected to be relatively insensitive with respect to the conformer search methodology applied.
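
    The calibration step described above (fitting a linear model that maps low-cost-method outputs onto reference values, then applying the learned coefficients to new predictions) can be sketched in a few lines. The data and coefficients below are synthetic stand-ins, not values from the paper; the point is only the fit-once, apply-everywhere mechanics.

```python
# Sketch of a linear calibration model: fit y ~ a*x + b on a reference set by
# least squares, then apply the learned (a, b) to new cheap-method predictions.
def fit_calibration(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return a, b

def apply_calibration(x, a, b):
    return [a * xi + b for xi in x]

# Synthetic "cheap" values with reference values lying exactly on y = 0.8*x + 0.5
cheap = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
reference = [0.8 * x + 0.5 for x in cheap]

a, b = fit_calibration(cheap, reference)      # recovers a ~ 0.8, b ~ 0.5
calibrated = apply_calibration([2.2, 3.3], a, b)
```

    After the one-off cost of generating the reference set, each new polymer only needs the cheap calculation plus this two-parameter correction, which is what makes screening thousands of candidates tractable.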

  6. A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale

    PubMed Central

    Pérez Sánchez, Carlos Javier

    2014-01-01

    Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but providing more information on the measures of agreement. For the informative case, some guidelines are presented to elicit the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
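
    A minimal version of the idea, Monte Carlo sampling from a posterior over the table of joint rating probabilities and then summarizing an agreement measure, can be sketched as follows. The 2x2 counts, the uniform Dirichlet prior, and the choice of Cohen's kappa are illustrative assumptions, not the paper's full framework.

```python
import random

# Monte Carlo Bayesian sketch: put a Dirichlet posterior on the 2x2 table of
# joint rating probabilities (Dirichlet(1,1,1,1) prior + observed counts) and
# push samples through Cohen's kappa to get its posterior distribution.
def sample_dirichlet(alphas, rng):
    draws = [rng.gammavariate(a, 1.0) for a in alphas]  # Gamma trick for Dirichlet
    s = sum(draws)
    return [d / s for d in draws]

def kappa(p11, p12, p21, p22):
    po = p11 + p22                                           # observed agreement
    pe = (p11 + p12) * (p11 + p21) + (p21 + p22) * (p12 + p22)  # chance agreement
    return (po - pe) / (1 - pe)

counts = [40, 5, 5, 50]   # hypothetical table: two raters who mostly agree
rng = random.Random(42)
post = [kappa(*sample_dirichlet([c + 1 for c in counts], rng))
        for _ in range(5000)]
post_mean = sum(post) / len(post)
print(round(post_mean, 2))
```

    Unlike a single classical point estimate, the full set of posterior samples also yields credible intervals and tail probabilities for the agreement measure at no extra conceptual cost, which is the practical appeal the abstract describes.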

  7. Implementing the Biopharmaceutics Classification System in Drug Development: Reconciling Similarities, Differences, and Shared Challenges in the EMA and US-FDA-Recommended Approaches.

    PubMed

    Cardot, J-M; Garcia Arieta, A; Paixao, P; Tasevska, I; Davit, B

    2016-07-01

    The US-FDA recently posted a draft guideline for industry recommending procedures necessary to obtain a biowaiver for immediate-release oral dosage forms based on the Biopharmaceutics Classification System (BCS). This review compares the present FDA BCS biowaiver approach, with the existing European Medicines Agency (EMA) approach, with an emphasis on similarities, difficulties, and shared challenges. Some specifics of the current EMA BCS guideline are compared with those in the recently published draft US-FDA BCS guideline. In particular, similarities and differences in the EMA versus US-FDA approaches to establishing drug solubility, permeability, dissolution, and formulation suitability for BCS biowaiver are critically reviewed. Several case studies are presented to illustrate the (i) challenges of applying for BCS biowaivers for global registration in the face of differences in the EMA and US-FDA BCS biowaiver criteria, as well as (ii) challenges inherent in applying for BCS class I or III designation and common to both jurisdictions.

  8. Defining an integrative approach for health promotion and disease prevention: A population health equity framework

    PubMed Central

    Trinh-Shevrin, Chau; Nadkarni, Smiti; Park, Rebecca; Islam, Nadia; Kwon, Simona C.

    2015-01-01

    Background: Eliminating health disparities in racial ethnic minority and underserved populations requires a paradigm shift from disease-focused biomedical approaches to a health equity framework that aims to achieve optimal health for all by targeting social and structural determinants of health. Methods: We describe the concepts and parallel approaches that underpin an integrative population health equity framework. Using a case study approach we present the experience of the NYU Center for the Study of Asian American Health (CSAAH) in applying the framework to guide its work. Results: This framework is central to CSAAH’s efforts moving towards a population health equity vision for Asian Americans. Discussion: Advancing the health of underserved populations requires community engagement and an understanding of the multilevel contextual factors that influence health. Applying an integrative framework has allowed us to advance health equity for Asian American communities and may serve as a useful framework for other underserved populations. PMID:25981095

  9. Boolean network inference from time series data incorporating prior biological knowledge.

    PubMed

    Haider, Saad; Pal, Ranadip

    2012-01-01

    Numerous approaches exist for modeling of genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on attractor structure and robust design. We applied our inference approach to 6 time point transcriptomic data on Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs and BNs generated through the proposed approach constructed from transitions of various paths of the synthetic BNs. We have also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed the rarity of arriving at a BN with plausible biological structure from limited time series data using random connectivity and absence of structure in data. The framework, when applied to experimental data and data generated from synthetic BNs, was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when the prior biological knowledge on regulators is limited or not unique.
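
    The Boolean network formalism underlying this abstract can be made concrete with a toy model. The three-gene wiring below is invented for illustration; it shows the two ingredients such inference rests on: a synchronous Boolean update rule, and a consistency check of candidate functions against an observed state sequence.

```python
# Hypothetical three-gene Boolean network (wiring invented for illustration):
# each gene's next state is a Boolean function of the current states of its
# regulators, and all genes update synchronously.
def step(state, functions):
    return tuple(int(bool(f(state))) for f in functions)

fns = [
    lambda s: s[1],               # gene 0 is activated by gene 1
    lambda s: s[0] and not s[2],  # gene 1 needs gene 0, is repressed by gene 2
    lambda s: s[0],               # gene 2 is activated by gene 0
]

# Consistency check used during inference: keep a candidate network only if it
# reproduces every observed transition in the time series.
def consistent(functions, series):
    return all(step(a, functions) == b for a, b in zip(series, series[1:]))

series = [(1, 1, 0)]
for _ in range(3):
    series.append(step(series[-1], fns))
print(series)
```

    Real inference searches over many candidate function sets and keeps those consistent with the (few) measured time points, which is why prior knowledge of connectivity and attractor constraints is needed to narrow the search.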

  10. Modeling of multi-band drift in nanowires using a full band Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Hathwar, Raghuraj; Saraniti, Marco; Goodnick, Stephen M.

    2016-07-01

    We report on a new numerical approach for multi-band drift within the context of full band Monte Carlo (FBMC) simulation and apply this to Si and InAs nanowires. The approach is based on the solution of the Krieger and Iafrate (KI) equations [J. B. Krieger and G. J. Iafrate, Phys. Rev. B 33, 5494 (1986)], which gives the probability of carriers undergoing interband transitions subject to an applied electric field. The KI equations are based on the solution of the time-dependent Schrödinger equation, and previous solutions of these equations have used Runge-Kutta (RK) methods to numerically solve the KI equations. This approach made the solution of the KI equations numerically expensive and was therefore only applied to a small part of the Brillouin zone (BZ). Here we discuss an alternate approach to the solution of the KI equations using the Magnus expansion (also known as "exponential perturbation theory"). This method is more accurate than the RK method as the solution lies on the exponential map and shares important qualitative properties with the exact solution such as the preservation of the unitary character of the time evolution operator. The solution of the KI equations is then incorporated through a modified FBMC free-flight drift routine and applied throughout the nanowire BZ. The importance of the multi-band drift model is then demonstrated for the case of Si and InAs nanowires by performing uniform-field FBMC simulations and analyzing the average carrier energies and carrier populations under high electric fields. Numerical simulations show that the average energy of the carriers under high electric field is significantly higher when multi-band drift is taken into consideration, due to the interband transitions allowing carriers to achieve higher energies.
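
    The qualitative point about the Magnus expansion, that propagating with a matrix exponential keeps the time evolution unitary while a naive explicit step does not, can be seen in a toy two-level system. The Hamiltonian H = theta*sigma_x and the step sizes below are illustrative assumptions, unrelated to the paper's nanowire calculations.

```python
import math

# Toy two-level system under H = theta * sigma_x.  A first-order Magnus step
# applies the exact matrix exponential exp(-i*H*dt) (written out for 2x2),
# so the wavefunction norm is preserved; an explicit Euler step is not unitary
# and inflates the norm at every step.
def magnus_step(psi, theta, dt):
    c, s = math.cos(theta * dt), math.sin(theta * dt)
    a, b = psi
    return (c * a - 1j * s * b, c * b - 1j * s * a)

def euler_step(psi, theta, dt):
    # psi - i*dt*H*psi, the naive explicit update
    a, b = psi
    return (a - 1j * dt * theta * b, b - 1j * dt * theta * a)

def norm(psi):
    return math.sqrt(sum(abs(z) ** 2 for z in psi))

psi_m = psi_e = (1.0 + 0j, 0.0 + 0j)   # start in the first basis state
for _ in range(1000):
    psi_m = magnus_step(psi_m, theta=1.0, dt=0.01)
    psi_e = euler_step(psi_e, theta=1.0, dt=0.01)
print(round(norm(psi_m), 6), round(norm(psi_e), 6))
```

    Because the Magnus solution lies on the exponential map, the norm stays at 1 to machine precision over arbitrarily many steps, while the Euler norm grows by a factor of sqrt(1 + (theta*dt)^2) per step; this is the structural advantage the abstract attributes to the Magnus approach.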

  11. Reducing the worst case running times of a family of RNA and CFG problems, using Valiant's approach.

    PubMed

    Zakov, Shay; Tsur, Dekel; Ziv-Ukelson, Michal

    2011-08-18

    RNA secondary structure prediction is a mainstream bioinformatic domain, and is key to computational analysis of functional RNA. In more than 30 years, much research has been devoted to defining different variants of RNA structure prediction problems, and to developing techniques for improving prediction quality. Nevertheless, most of the algorithms in this field follow a similar dynamic programming approach as that presented by Nussinov and Jacobson in the late 70's, which typically yields cubic worst case running time algorithms. Recently, some algorithmic approaches were applied to improve the complexity of these algorithms, motivated by new discoveries in the RNA domain and by the need to efficiently analyze the increasing amount of accumulated genome-wide data. We study Valiant's classical algorithm for Context Free Grammar recognition in sub-cubic time, and extract features that are common to problems on which Valiant's approach can be applied. Based on this, we describe several problem templates, and formulate generic algorithms that use Valiant's technique and can be applied to all problems which abide by these templates, including many problems within the world of RNA Secondary Structures and Context Free Grammars. The algorithms presented in this paper improve the theoretical asymptotic worst case running time bounds for a large family of important problems. It is also possible that the suggested techniques could be applied to yield a practical speedup for these problems. For some of the problems (such as computing the RNA partition function and base-pair binding probabilities), the presented techniques are the only ones which are currently known for reducing the asymptotic running time bounds of the standard algorithms.
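
    The cubic dynamic program the abstract refers to (in the Nussinov style) is short enough to sketch. The version below maximizes the number of complementary base pairs with no minimum loop length, which keeps the illustration minimal; it is the O(n^3) baseline that Valiant-style techniques speed up, not the sped-up algorithm itself.

```python
# Classical Nussinov-style dynamic program: dp[i][j] is the maximum number of
# nested base pairs in seq[i..j].  Two O(n) inner dimensions and an O(n) split
# loop give the cubic worst case discussed in the abstract.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G")}

def max_pairs(seq):
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                         # seq[i] left unpaired
            if (seq[i], seq[j]) in PAIRS:
                best = max(best, dp[i + 1][j - 1] + 1)  # seq[i] pairs with seq[j]
            for k in range(i + 1, j):                   # bifurcation into two halves
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(max_pairs("GGGAAAUCC"))  # 3: two G-C pairs plus one A-U pair, all nested
```

    Valiant's observation is that recurrences of this shape can be recast as (min,+) or Boolean matrix products, inheriting sub-cubic matrix multiplication bounds; the templates in the paper characterize which RNA and CFG problems admit that recasting.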

  12. From resilience thinking to Resilience Planning: Lessons from practice.

    PubMed

    Sellberg, M M; Ryan, P; Borgström, S T; Norström, A V; Peterson, G D

    2018-07-01

    Resilience thinking has frequently been proposed as an alternative to conventional natural resource management, but there are few studies of its applications in real-world settings. To address this gap, we synthesized experiences from practitioners who have applied a resilience thinking approach to strategic planning, called Resilience Planning, in regional natural resource management organizations in Australia. This case represents one of the most extensive and long-term applications of resilience thinking in the world today. We conducted semi-structured interviews with Resilience Planning practitioners from nine organizations and reviewed strategic planning documents to investigate: 1) the key contributions of the approach to their existing strategic planning, and 2) what enabled and hindered the practitioners in applying and embedding the new approach in their organizations. Our results reveal that Resilience Planning contributed to developing a social-ecological systems perspective, more adaptive and collaborative approaches to planning, and that it clarified management goals of desirable resource conditions. Applying Resilience Planning required translating resilience thinking to practice in each unique circumstance, while simultaneously creating support among staff and engaging external actors. Embedding Resilience Planning within organizations implied starting and maintaining longer-term change processes that required sustained multi-level organizational support. We conclude by identifying four lessons for successfully applying and embedding resilience practice in an organization: 1) to connect internal "entrepreneurs" to "interpreters" and "networkers" who work across organizations, 2) to assess the opportunity context for resilience practice, 3) to ensure that resilience practice is a learning process that engages internal and external actors, and 4) to develop reflective strategies for managing complexity and uncertainty. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. 
These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in hydraulic conductivity as well. PMID:25248170

  14. Reducing the worst case running times of a family of RNA and CFG problems, using Valiant's approach

    PubMed Central

    2011-01-01

    Background: RNA secondary structure prediction is a mainstream bioinformatic domain, and is key to computational analysis of functional RNA. In more than 30 years, much research has been devoted to defining different variants of RNA structure prediction problems, and to developing techniques for improving prediction quality. Nevertheless, most of the algorithms in this field follow a similar dynamic programming approach as that presented by Nussinov and Jacobson in the late 70's, which typically yields cubic worst case running time algorithms. Recently, some algorithmic approaches were applied to improve the complexity of these algorithms, motivated by new discoveries in the RNA domain and by the need to efficiently analyze the increasing amount of accumulated genome-wide data. Results: We study Valiant's classical algorithm for Context Free Grammar recognition in sub-cubic time, and extract features that are common to problems on which Valiant's approach can be applied. Based on this, we describe several problem templates, and formulate generic algorithms that use Valiant's technique and can be applied to all problems which abide by these templates, including many problems within the world of RNA Secondary Structures and Context Free Grammars. Conclusions: The algorithms presented in this paper improve the theoretical asymptotic worst case running time bounds for a large family of important problems. It is also possible that the suggested techniques could be applied to yield a practical speedup for these problems. For some of the problems (such as computing the RNA partition function and base-pair binding probabilities), the presented techniques are the only ones which are currently known for reducing the asymptotic running time bounds of the standard algorithms. PMID:21851589

  15. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm.

    PubMed

    Heidari, Morteza; Khuzani, Abolfazl Zargari; Hollingsworth, Alan B; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-01-30

In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate the advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset of negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes: 250 high-risk cases in which cancer was detected in the next subsequent mammography screening, and 250 low-risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between the left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increasing trend of adjusted odds ratios was also detected, in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.
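LPP itself is a standard dimensionality-reduction technique. The following is a minimal illustrative sketch, not the authors' published pipeline: the k-nearest-neighbour graph, heat-kernel weights and the regularisation term are assumed design choices.

```python
import numpy as np

def lpp(X, n_components=4, n_neighbors=5, t=1.0):
    """Minimal Locality Preserving Projection (LPP).

    X: (n_samples, n_features) data matrix. Returns a projection
    matrix A of shape (n_features, n_components); new samples are
    embedded as X_new @ A.
    """
    n = X.shape[0]
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbour adjacency with heat-kernel weights
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]
        W[i, idx] = np.exp(-d2[i, idx] / t)
    W = np.maximum(W, W.T)            # symmetrise the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                         # graph Laplacian
    # generalized eigenproblem X^T L X a = lambda X^T D X a
    A1 = X.T @ L @ X
    A2 = X.T @ D @ X + 1e-8 * np.eye(X.shape[1])  # regularise
    vals, vecs = np.linalg.eig(np.linalg.solve(A2, A1))
    order = np.argsort(vals.real)     # smallest eigenvalues preserve locality
    return vecs.real[:, order[:n_components]]
```

In a LOCO setting, the projection would be refitted on each training fold before embedding the held-out case.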

  16. Prediction of breast cancer risk using a machine learning approach embedded with a locality preserving projection algorithm

    NASA Astrophysics Data System (ADS)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Hollingsworth, Alan B.; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qiu, Yuchen; Liu, Hong; Zheng, Bin

    2018-02-01

In order to automatically identify a set of effective mammographic image features and build an optimal breast cancer risk stratification model, this study aims to investigate the advantages of applying a machine learning approach embedded with a locality preserving projection (LPP) based feature combination and regeneration algorithm to predict short-term breast cancer risk. A dataset of negative mammograms acquired from 500 women was assembled. This dataset was divided into two age-matched classes: 250 high-risk cases in which cancer was detected in the next subsequent mammography screening, and 250 low-risk cases, which remained negative. First, a computer-aided image processing scheme was applied to segment fibro-glandular tissue depicted on mammograms and initially compute 44 features related to the bilateral asymmetry of mammographic tissue density distribution between the left and right breasts. Next, a multi-feature fusion based machine learning classifier was built to predict the risk of cancer detection in the next mammography screening. A leave-one-case-out (LOCO) cross-validation method was applied to train and test the machine learning classifier embedded with the LPP algorithm, which generated a new operational vector with 4 features using a maximal variance approach in each LOCO process. Results showed a 9.7% increase in risk prediction accuracy when using this LPP-embedded machine learning approach. An increasing trend of adjusted odds ratios was also detected, in which odds ratios increased from 1.0 to 11.2. This study demonstrated that applying the LPP algorithm effectively reduced feature dimensionality, and yielded higher and potentially more robust performance in predicting short-term breast cancer risk.

  17. A comparison of two approaches to modelling snow cover dynamics at the Polish Polar Station at Hornsund

    NASA Astrophysics Data System (ADS)

    Luks, B.; Osuch, M.; Romanowicz, R. J.

    2012-04-01

We compare two approaches to modelling snow cover dynamics at the Polish Polar Station at Hornsund. In the first approach we apply the physically-based Utah Energy Balance Snow Accumulation and Melt Model (UEB) (Tarboton et al., 1995; Tarboton and Luce, 1996). The model uses a lumped representation of the snowpack with two primary state variables: snow water equivalence and energy. Its main driving inputs are air temperature, precipitation, wind speed, humidity and radiation (estimated from the diurnal temperature range). These variables are used for physically-based calculations of radiative, sensible, latent and advective heat exchanges at a 3-hour time step. The second method is an application of a statistically efficient lumped-parameter time series approach to modelling the dynamics of snow cover, based on daily meteorological measurements from the same area. A dynamic Stochastic Transfer Function model is developed that follows the Data Based Mechanistic approach, in which a stochastic data-based identification of the model structure and an estimation of its parameters are followed by a physical interpretation. We focus on the analysis of the uncertainty of both model outputs. In the time series approach, the applied techniques also provide estimates of the modelling errors and the uncertainty of the model parameters. In the first, physically-based approach, the applied UEB model is deterministic: it assumes that the observations are without errors and that the model structure perfectly describes the processes within the snowpack. To take the model and observation errors into account, we applied a version of the Generalized Likelihood Uncertainty Estimation (GLUE) technique, which also provides estimates of the modelling errors and the uncertainty of the model parameters. The observed snowpack water equivalent values are compared with those simulated with 95% confidence bounds. This work was supported by the National Science Centre of Poland (grant no. 7879/B/P01/2011/40). Tarboton, D. G., T. G. Chowdhury and T. H. Jackson, 1995. A Spatially Distributed Energy Balance Snowmelt Model. In K. A. Tonnessen, M. W. Williams and M. Tranter (Ed.), Proceedings of a Boulder Symposium, July 3-14, IAHS Publ. no. 228, pp. 141-155. Tarboton, D. G. and C. H. Luce, 1996. Utah Energy Balance Snow Accumulation and Melt Model (UEB). Computer model technical description and users guide, Utah Water Research Laboratory and USDA Forest Service Intermountain Research Station (http://www.engineering.usu.edu/dtarb/). 64 pp.
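Neither model's code is given in the abstract; as a point of reference, the simplest possible lumped snow model is a degree-day scheme. The sketch below is illustrative only (the melt factor and temperature threshold are assumed values), and is far simpler than either UEB or the stochastic transfer-function model:

```python
def degree_day_snowpack(temp_c, precip_mm, melt_factor=3.0, t_thresh=0.0):
    """Minimal lumped degree-day snow model (illustrative only).

    temp_c, precip_mm: daily air temperature (deg C) and precipitation
    (mm) series. Precipitation falls as snow when T <= t_thresh; melt
    proceeds at melt_factor * (T - t_thresh) mm/day when T > t_thresh.
    Returns the daily snow water equivalent (SWE, mm).
    """
    swe, out = 0.0, []
    for t, p in zip(temp_c, precip_mm):
        if t <= t_thresh:
            swe += p                                  # accumulate as snow
        else:
            swe = max(0.0, swe - melt_factor * (t - t_thresh))  # melt
        out.append(swe)
    return out
```

A GLUE-style analysis would run this model many times with `melt_factor` and `t_thresh` sampled from prior ranges, retaining behavioural parameter sets to form confidence bounds.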

  18. A review and meta-analysis of the enemy release hypothesis in plant–herbivorous insect systems

    PubMed Central

    Meijer, Kim; Schilthuizen, Menno; Beukeboom, Leo

    2016-01-01

A suggested mechanism for the success of introduced non-native species is the enemy release hypothesis (ERH). Many studies have tested the predictions of the ERH using the community approach (native and non-native species studied in the same habitat) or the biogeographical approach (species studied in their native and non-native ranges), but results are highly variable, possibly due to the large variety of study systems incorporated. We therefore focused on one specific system: plants and their herbivorous insects. We performed a systematic review and compiled a large number (68) of datasets from studies comparing herbivorous insects on native and non-native plants using the community or biogeographical approach. We performed a meta-analysis to test the predictions of the ERH for insect diversity (number of species), insect load (number of individuals) and level of herbivory, for both the community and the biogeographical approach. Under both approaches, insect diversity was significantly higher on native than on non-native plants. Insect load tended to be higher on native than on non-native plants under the community approach only. Herbivory did not differ between native and non-native plants under the community approach, while too few data were available to test the biogeographical approach. Our meta-analysis generally supports the predictions of the ERH for both the community and the biogeographical approach, but also shows that the outcome is strongly determined by the response measured and the approach applied. So far, very few studies have applied both approaches simultaneously in a reciprocal manner, although this is arguably the best way to test the ERH. PMID:28028463
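The abstract does not specify which meta-analytic estimator was used; a common choice for pooling effects under between-study heterogeneity is the DerSimonian-Laird random-effects model, sketched here for illustration (the effect sizes and variances passed in are hypothetical inputs, e.g. log response ratios of insect diversity on native versus non-native plants):

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird).

    effects: per-study effect sizes; variances: their sampling
    variances. Returns (pooled_effect, tau2), where tau2 is the
    estimated between-study variance.
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # re-weight each study by total (within + between) variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2
```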

  19. General Theory of Absorption in Porous Materials: Restricted Multilayer Theory.

    PubMed

    Aduenko, Alexander A; Murray, Andy; Mendoza-Cortes, Jose L

    2018-04-18

In this article, we present an approach for the generalization of adsorption of light gases in porous materials. This new theory goes beyond the Langmuir and Brunauer-Emmett-Teller theories, the standard approaches, whose application to crystalline porous materials is limited by their unphysical assumptions about the number of possible adsorption layers. The derivation of a more general equation for any crystalline porous framework is presented: the restricted multilayer theory. Our approach allows the determination of gas uptake considering only the geometrical constraints of the porous framework and the interaction energy of the guest molecule with the framework. On the basis of this theory, we calculated optimal values for the adsorption enthalpy at different temperatures and pressures. We also present the use of this theory to determine the optimal linker length for a topologically equivalent framework series. We validate this theoretical approach by applying it to metal-organic frameworks (MOFs) and show that it reproduces the experimental results for seven different reported materials. We obtained the universal equation for the optimal linker length, given the topology of a porous framework. This work applied the general equation to MOFs and H2 to create energy-storage materials; however, the theory can be applied to other crystalline porous materials and light gases, which opens the possibility of designing the next generations of energy-storage materials by first considering only the geometrical constraints of the porous materials.
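The restricted multilayer equation itself is not given in the abstract; for reference, the two standard isotherms it generalizes can be written as short functions (the symbols K, c and x = p/p0 follow the usual textbook conventions):

```python
def langmuir(p, k):
    """Langmuir fractional coverage: theta = K p / (1 + K p).

    Assumes a single adsorption layer -- one of the unphysical
    limits the restricted multilayer theory relaxes.
    """
    return k * p / (1.0 + k * p)

def bet(x, c):
    """BET loading in monolayer units versus relative pressure x = p/p0:
    v/v_m = c x / ((1 - x)(1 + (c - 1) x)), valid for 0 <= x < 1.

    Assumes unlimited adsorption layers -- the opposite unphysical
    limit, removed by geometric pore constraints in the new theory.
    """
    return c * x / ((1.0 - x) * (1.0 + (c - 1.0) * x))
```

The BET form diverges as x approaches 1, which is precisely why a restricted number of layers is needed inside a finite crystalline pore.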

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wykes, M., E-mail: mikewykes@gmail.com; Parambil, R.; Gierschner, J.

Here, we present a general approach to treating vibronic coupling in molecular crystals based on atomistic simulations of large clusters. Such clusters comprise model aggregates treated at the quantum chemical level embedded within a realistic environment treated at the molecular mechanics level. As we calculate ground and excited state equilibrium geometries and vibrational modes of model aggregates, our approach is able to capture effects arising from coupling to intermolecular degrees of freedom, absent from existing models that rely on geometries and normal modes of single molecules. Using the geometries and vibrational modes of clusters, we are able to simulate the fluorescence spectra of aggregates for which the lowest excited state bears negligible oscillator strength (as is the case, e.g., for ideal H-aggregates) by including both Franck-Condon (FC) and Herzberg-Teller (HT) vibronic transitions. The latter terms allow the adiabatic excited state of the cluster to couple with vibrations in a perturbative fashion via derivatives of the transition dipole moment along nuclear coordinates. While vibronic coupling simulations employing FC and HT terms are well established for single molecules, to our knowledge this is the first time they have been applied to molecular aggregates. Here, we apply this approach to the simulation of the low-temperature fluorescence spectrum of para-distyrylbenzene single-crystal H-aggregates and draw comparisons with the coarse-grained Frenkel-Holstein approaches previously extensively applied to such systems.
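For an allowed transition with a single dominant vibrational mode, the FC part of such a vibronic progression reduces to a Poisson distribution in the Huang-Rhys factor. This is a minimal sketch under the displaced-harmonic-oscillator approximation, not the cluster-based FC/HT calculation described above:

```python
import math

def franck_condon_progression(s, n_max=6):
    """0-n Franck-Condon factors for a displaced harmonic oscillator:
    |<0|n>|^2 = exp(-S) * S^n / n!, with Huang-Rhys factor S.

    Gives relative vibronic peak intensities for an allowed (FC)
    transition. Herzberg-Teller intensity borrowing, which restores
    intensity when the 0-0 band is forbidden (as in ideal
    H-aggregates), is not included in this simple formula.
    """
    return [math.exp(-s) * s ** n / math.factorial(n)
            for n in range(n_max + 1)]
```

For S = 1 the 0-0 and 0-1 peaks are equally intense; larger S shifts intensity to higher vibronic replicas.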
