Towards a Methodology for the Design of Multimedia Public Access Interfaces.
ERIC Educational Resources Information Center
Rowley, Jennifer
1998-01-01
Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…
Bayesian outcome-based strategy classification.
Lee, Michael D
2016-03-01
Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely-used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principles that balances both the goodness-of-fit and complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provide a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
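The authors implement their analysis as a JAGS graphical model; the fragment below is only a minimal Python sketch of the underlying idea, computing posterior probabilities over candidate strategies for one simulated participant. The strategy predictions, error rate, and participant data are synthetic assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch (not the authors' JAGS model): each candidate strategy predicts,
# for every trial, which of two options it favours; observed choices follow those
# predictions except for a fixed execution-error rate. The posterior over
# strategies then follows from Bayes' rule under a uniform prior.

rng = np.random.default_rng(1)
n_trials = 50
strategies = {                                    # hypothetical trial-wise predictions
    "TTB": rng.integers(0, 2, n_trials),
    "WADD": rng.integers(0, 2, n_trials),
}
error_rate = 0.1                                  # assumed execution-error probability
prior = {name: 1.0 / len(strategies) for name in strategies}

# Simulate one participant who actually follows the TTB predictions.
choices = np.where(rng.random(n_trials) < error_rate,
                   1 - strategies["TTB"], strategies["TTB"])

def log_likelihood(predictions, observed, eps):
    """Bernoulli log-likelihood of observed choices given a strategy's predictions."""
    agree = predictions == observed
    return np.sum(np.where(agree, np.log(1.0 - eps), np.log(eps)))

log_post = {name: np.log(prior[name]) + log_likelihood(pred, choices, error_rate)
            for name, pred in strategies.items()}
norm = np.logaddexp.reduce(list(log_post.values()))
posterior = {name: float(np.exp(lp - norm)) for name, lp in log_post.items()}
print(posterior)    # posterior probability that the participant used each strategy
```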
Methodology for nonwork travel analysis in suburban communities.
DOT National Transportation Integrated Search
1994-01-01
The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...
Integral Methodological Pluralism in Science Education Research: Valuing Multiple Perspectives
ERIC Educational Resources Information Center
Davis, Nancy T.; Callihan, Laurie P.
2013-01-01
This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in "A Theory of Everything," Shambhala, Boston, 2000), values all forms of research as…
A Synthesis of a Quality Management Model for Education in Universities
ERIC Educational Resources Information Center
Srikanthan, G.; Dalrymple, John
2004-01-01
The paper attempts to synthesise the features of the model for quality management in education based on the approaches spelt out in four well-articulated methodologies for the practice of quality in higher education. Each methodology contributes to different views of education from the learners' and the institution's perspectives, providing…
Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience
ERIC Educational Resources Information Center
Zanotti, Francesco
2012-01-01
Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…
Brown, Matt A; Bishnoi, Ram J; Dholakia, Sara; Velligan, Dawn I
2016-01-20
Recent failures to detect efficacy in clinical trials investigating pharmacological treatments for schizophrenia raise concerns regarding the potential contribution of methodological shortcomings to this research. This review provides an examination of two key methodological issues currently suspected of playing a role in hampering schizophrenia drug development: (1) limitations on the translational utility of preclinical development models, and (2) methodological challenges posed by increased placebo effects. Recommendations for strategies to address these methodological issues are provided.
Radiation Effects: Overview for Space Environment Specialists
NASA Technical Reports Server (NTRS)
Ladbury, Ray
2017-01-01
Radiation Hardness Assurance (RHA) methodologies need to evolve to capitalize on the increased flexibility introduced by new models of space radiation environments. This presentation examines the characteristics of various radiation threats, the sources of error that RHA methodologies seek to control and the contributions of environment models to those errors. The influence of trends in microelectronic device technology is also considered.
Latent Trait Model Contributions to Criterion-Referenced Testing Technology.
1982-02-01
levels of ability (ranging from very low to very high). The steps in the research were as follows: 1. Specify the characteristics of a "typical" pool...conventional testing methodologies displayed good fit to both of the latent trait models. The one-parameter model compared favorably with the three-parameter... Methodological developments: New directions for testing and measurement (No. 4). San Francisco: Jossey-Bass, 1979. Hambleton, R. K. Advances in
Facility Energy Performance Benchmarking in a Data-Scarce Environment
2017-08-01
environment, and analyze occupant-, system-, and component-level faults contributing to energy inefficiency. A methodology for developing DoD-specific...Research, Development, Test, and Evaluation (RDTE) Program to develop an intelligent framework, encompassing methodology and modeling, that...energy performers by installation, climate zone, and other criteria. A methodology for creating the DoD-specific EUIs would be an important part of a
On the upper ocean turbulent dissipation rate due to microscale breakers and small whitecaps
NASA Astrophysics Data System (ADS)
Banner, Michael L.; Morison, Russel P.
2018-06-01
In ocean wave modelling, accurately computing the evolution of the wind-wave spectrum depends on the source terms and the spectral bandwidth used. The wave dissipation rate source term, which spectrally quantifies wave breaking and other dissipative processes, remains poorly understood, including the spectral bandwidth needed to capture the essential model physics. The observational study of Sutherland and Melville (2015a) investigated the relative dissipation rate contributions of breaking waves, from large-scale whitecaps to microbreakers. They concluded that a large fraction of wave energy was dissipated by microbreakers. However, in strong contrast with their findings, our analysis of their data and other recent data sets shows that for young seas, microbreakers and small whitecaps contribute only a small fraction of the total breaking wave dissipation rate. For older seas, we find microbreakers and small whitecaps contribute a large fraction of the breaking wave dissipation rate, but this is only a small fraction of the total dissipation rate, which is now dominated by non-breaking contributions. Hence, for all the wave age conditions observed, microbreakers make an insignificant contribution to the total wave dissipation rate in the wave boundary layer. We tested the sensitivity of the results to the SM15a whitecap analysis methodology by transforming the SM15a breaking data using our breaking crest processing methodology. This resulted in the small-scale breaking waves making an even smaller contribution to the total wave dissipation rate, and so the result is independent of the breaker processing methodology. Comparison with other near-surface total turbulent kinetic energy (TKE) dissipation rate observations also supports this conclusion. These contributions to the spectral dissipation rate in ocean wave models are small and need not be explicitly resolved.
ERIC Educational Resources Information Center
Gweon, Gahgene; Jain, Mahaveer; McDonough, John; Raj, Bhiksha; Rose, Carolyn P.
2013-01-01
This paper contributes to a theory-grounded methodological foundation for automatic collaborative learning process analysis. It does this by illustrating how insights from the social psychology and sociolinguistics of speech style provide a theoretical framework to inform the design of a computational model. The purpose of that model is to detect…
Garcia, Raquel A; Burgess, Neil D; Cabeza, Mar; Rahbek, Carsten; Araújo, Miguel B
2012-01-01
Africa is predicted to be highly vulnerable to 21st century climatic changes. Assessing the impacts of these changes on Africa's biodiversity is, however, plagued by uncertainties, and markedly different results can be obtained from alternative bioclimatic envelope models or future climate projections. Using an ensemble forecasting framework, we examine projections of future shifts in climatic suitability, and their methodological uncertainties, for over 2500 species of mammals, birds, amphibians and snakes in sub-Saharan Africa. To summarize a priori the variability in the ensemble of 17 general circulation models, we introduce a consensus methodology that combines co-varying models. Thus, we quantify and map the relative contribution to uncertainty of seven bioclimatic envelope models, three multi-model climate projections and three emissions scenarios, and explore the resulting variability in species turnover estimates. We show that bioclimatic envelope models contribute most to variability, particularly in projected novel climatic conditions over Sahelian and southern Saharan Africa. To summarize agreements among projections from the bioclimatic envelope models we compare five consensus methodologies, which generally increase or retain projection accuracy and provide consistent estimates of species turnover. Variability from emissions scenarios increases towards late-century and affects southern regions of high species turnover centred in arid Namibia. Twofold differences in median species turnover across the study area emerge among alternative climate projections and emissions scenarios. Our ensemble of projections underscores the potential bias when using a single algorithm or climate projection for Africa, and provides a cautious first approximation of the potential exposure of sub-Saharan African vertebrates to climatic changes. The future use and further development of bioclimatic envelope modelling will hinge on the interpretation of results in the light of methodological as well as biological uncertainties. Here, we provide a framework to address methodological uncertainties and contextualize results.
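As a rough illustration of the kind of uncertainty partitioning described above, the sketch below builds a synthetic full-factorial ensemble and attributes variance to each factor from its level means. The factor labels, effect sizes, and the crude main-effects decomposition are assumptions for illustration only, not the authors' analysis.

```python
import numpy as np
import pandas as pd

# Illustrative sketch (synthetic data, not the authors' analysis): build a full
# factorial ensemble of species-turnover estimates and attribute variability to
# each factor by the variance of its factor-level means.

rng = np.random.default_rng(0)
bioclim_models = [f"BEM{i}" for i in range(1, 8)]    # 7 bioclimatic envelope models
climate_projs  = [f"GCM{i}" for i in range(1, 4)]    # 3 multi-model climate projections
scenarios      = ["S1", "S2", "S3"]                  # 3 emissions scenarios (labels assumed)

# Assign a fixed effect to every level of each factor; the bioclimatic envelope
# models are given the largest spread, mimicking the paper's main finding.
bem_eff  = {b: 0.15 * rng.standard_normal() for b in bioclim_models}
proj_eff = {c: 0.05 * rng.standard_normal() for c in climate_projs}
scen_eff = {s: 0.02 * rng.standard_normal() for s in scenarios}

rows = [{"bem": b, "proj": c, "scen": s,
         "turnover": 0.4 + bem_eff[b] + proj_eff[c] + scen_eff[s]
                     + 0.01 * rng.standard_normal()}
        for b in bioclim_models for c in climate_projs for s in scenarios]
df = pd.DataFrame(rows)

total_var = df["turnover"].var()
for factor in ["bem", "proj", "scen"]:
    share = df.groupby(factor)["turnover"].mean().var() / total_var
    print(f"{factor}: factor-level means explain ~{share:.0%} of total variance")
```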
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Czocher, Jennifer A.
2016-01-01
This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…
Critical success factors for achieving superior m-health success.
Dwivedi, A; Wickramasinghe, N; Bali, R K; Naguib, R N G
2007-01-01
Recent healthcare trends clearly show significant investment by healthcare institutions into various types of wired and wireless technologies to facilitate and support superior healthcare delivery. This trend has been spurred by the shift in the concept and growing importance of the role of health information and the influence of fields such as bio-informatics, biomedical and genetic engineering. The demand is currently for integrated healthcare information systems; however, for such initiatives to be successful, it is necessary to adopt a macro model and appropriate methodology with respect to wireless initiatives. The key contribution of this paper is the presentation of one such integrative model for mobile health (m-health) known as the Wi-INET Business Model, along with a detailed Adaptive Mapping to Realisation (AMR) methodology. The AMR methodology details how the Wi-INET Business Model can be implemented. Further validation of the concepts detailed in the Wi-INET Business Model and the AMR methodology is offered via a short vignette on a toolkit based on a leading UK-based healthcare information technology solution.
Women, Motivation, and Achievement.
ERIC Educational Resources Information Center
Hyde, Janet Shibley; Kling, Kristen C.
2001-01-01
Reviews psychological research on motivation and educational achievement, discussing gender and contributions by feminist researchers. Feminist psychologists note sex bias and methodological flaws in traditional research on achievement motivation, proposing improved models (Eccles' expectancy x value model of achievement behavior). Contrary to…
ERIC Educational Resources Information Center
Wikeley, Felicity; Stoll, Louise; Murillo, Javier; De Jong, Rob
2005-01-01
This article describes the empirical research that contributed to the development of the model of "effective school improvement". The focus is mainly on the findings of that research but the problematic nature of designing a methodology that is applicable in 8 very different education systems is also discussed. The 4 key factors to emerge from the…
The Contribution of Human Factors in Military System Development: Methodological Considerations
1980-07-01
Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis); 2. Computer Simulation...effectiveness of mathematical models for R&D project selection. Management Science, April 1973, 18. Souder, W.E. A scoring methodology for...
Maxwell's contrived analogy: An early version of the methodology of modeling
NASA Astrophysics Data System (ADS)
Hon, Giora; Goldstein, Bernard R.
2012-11-01
The term "analogy" stands for a variety of methodological practices all related in one way or another to the idea of proportionality. We claim that in his first substantial contribution to electromagnetism James Clerk Maxwell developed a methodology of analogy which was completely new at the time or, to borrow John North's expression, Maxwell's methodology was a "newly contrived analogue". In his initial response to Michael Faraday's experimental researches in electromagnetism, Maxwell did not seek an analogy with some physical system in a domain different from electromagnetism as advocated by William Thomson; rather, he constructed an entirely artificial one to suit his needs. Following North, we claim that the modification which Maxwell introduced to the methodology of analogy has not been properly appreciated. In view of our examination of the evidence, we argue that Maxwell gave a new meaning to analogy; in fact, it comes close to modeling in current usage.
ERIC Educational Resources Information Center
Moutinho, Sara; Moura, Rui; Vasconcelos, Clara
2017-01-01
Model-Based learning is a methodology that facilitates students' construction of scientific knowledge, which, sometimes, includes restructuring their mental models. Taking into consideration students' learning process, its aim is to promote a deeper understanding of phenomena's dynamics through the manipulation of models. Our aim was to ascertain…
Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis
2005-04-01
Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for its use in seemingly disparate areas-and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.
Evaluating Text-Based Information on the World Wide Web
ERIC Educational Resources Information Center
Wopereis, Iwan G. J. H.; van Merrienboer, Jeroen J. G.
2011-01-01
This special section contributes to an inclusive cognitive model of information problem solving (IPS) activity, briefly touches on IPS learning, and draws attention to methodological pitfalls related to uncovering IPS processes. Instead of focusing on the IPS process as a whole, the contributing articles turn their attention to what is regarded the…
Lerner, Richard M
2015-06-01
The bold claim that developmental science can contribute to both enhancing positive development among diverse individuals across the life span and promoting social justice in their communities, nations and regions is supported by decades of theoretical, methodological and research contributions. To explain the basis of this claim, I describe the relational developmental systems (RDS) metamodel that frames contemporary developmental science, and I present an example of a programme of research within the adolescent portion of the life span that is associated with this metamodel and is pertinent to promoting positive human development. I then discuss methodological issues associated with using RDS-based models as frames for research and application. Finally, I explain how the theoretical and methodological ideas associated with RDS thinking may provide the scholarly tools needed by developmental scientists seeking to contribute to human thriving and to advance social justice in the Global South. © 2015 International Union of Psychological Science.
Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems
NASA Technical Reports Server (NTRS)
Song, Lixia; Kuchar, James K.
2003-01-01
Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.
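A minimal Monte Carlo sketch of the sensor-error mechanism described above is given below: two alerting systems share the same threshold logic but observe the hazard state through independently noisy sensors, and dissonance is counted whenever exactly one of them alerts. The threshold, noise levels, and state distribution are assumed values, not parameters from the report.

```python
import numpy as np

# Illustrative Monte Carlo sketch (not the report's exact formulation): two alerting
# systems apply the same threshold to independently noisy measurements of the same
# state variable, so any disagreement here is attributable purely to sensor error.

rng = np.random.default_rng(42)
n = 100_000
true_state = rng.uniform(0.0, 10.0, n)     # e.g. range to a hazard (assumed units)
threshold = 5.0                            # both systems alert when measurement < threshold
sigma_a, sigma_b = 0.3, 0.6                # assumed sensor noise levels

meas_a = true_state + rng.normal(0.0, sigma_a, n)
meas_b = true_state + rng.normal(0.0, sigma_b, n)

alert_a = meas_a < threshold
alert_b = meas_b < threshold

p_dissonance = np.mean(alert_a != alert_b)
print(f"Estimated probability of dissonance due to sensor error: {p_dissonance:.3f}")
```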
Effect of Body Composition Methodology on Heritability Estimation of Body Fatness
Elder, Sonya J.; Roberts, Susan B.; McCrory, Megan A.; Das, Sai Krupa; Fuss, Paul J.; Pittas, Anastassios G.; Greenberg, Andrew S.; Heymsfield, Steven B.; Dawson-Hughes, Bess; Bouchard, Thomas J.; Saltzman, Edward; Neale, Michael C.
2014-01-01
Heritability estimates of human body fatness vary widely and the contribution of body composition methodology to this variability is unknown. The effect of body composition methodology on estimations of genetic and environmental contributions to body fatness variation was examined in 78 adult male and female monozygotic twin pairs reared apart or together. Body composition was assessed by six methods – body mass index (BMI), dual energy x-ray absorptiometry (DXA), underwater weighing (UWW), total body water (TBW), bioelectric impedance (BIA), and skinfold thickness. Body fatness was expressed as percent body fat, fat mass, and fat mass/height² to assess the effect of body fatness expression on heritability estimates. Model-fitting multivariate analyses were used to assess the genetic and environmental components of variance. Mean BMI was 24.5 kg/m² (range of 17.8–43.4 kg/m²). There was a significant effect of body composition methodology (p<0.001) on heritability estimates, with UWW giving the highest estimate (69%) and BIA giving the lowest estimate (47%) for fat mass/height². Expression of body fatness as percent body fat resulted in significantly higher heritability estimates (on average 10.3% higher) compared to expression as fat mass/height² (p=0.015). DXA and TBW methods expressing body fatness as fat mass/height² gave the least biased heritability assessments, based on the small contribution of specific genetic factors to their genetic variance. A model combining DXA and TBW methods resulted in a relatively low FM/ht² heritability estimate of 60%, and significant contributions of common and unique environmental factors (22% and 18%, respectively). The body fatness heritability estimate of 60% indicates a smaller contribution of genetic variance to total variance than many previous studies using less powerful research designs have indicated. The results also highlight the importance of environmental factors and possibly genotype by environmental interactions in the etiology of weight gain and the obesity epidemic. PMID:25067962
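The study itself relies on model-fitting multivariate analyses; the sketch below only illustrates a much simpler classical approximation for monozygotic twins reared apart (MZA) and together (MZT), in which the MZA correlation approximates heritability and the MZT-MZA difference approximates the shared-environment contribution. The simulated data and variance components are assumptions, not the paper's estimates.

```python
import numpy as np

# Simplified illustration (not the paper's model-fitting analysis): with MZ twins
# reared apart (MZA) and together (MZT), a classical approximation takes the MZA
# correlation as the heritability estimate and the excess MZT correlation as the
# shared-environment estimate. The data below are synthetic.

rng = np.random.default_rng(7)

def simulate_pairs(n_pairs, h2, c2):
    """Generate twin-pair phenotypes from additive genetic, shared and unique factors."""
    e2 = 1.0 - h2 - c2
    a = rng.standard_normal(n_pairs)            # genetic factor, identical within MZ pairs
    c = rng.standard_normal(n_pairs)            # shared environment
    t1 = np.sqrt(h2) * a + np.sqrt(c2) * c + np.sqrt(e2) * rng.standard_normal(n_pairs)
    t2 = np.sqrt(h2) * a + np.sqrt(c2) * c + np.sqrt(e2) * rng.standard_normal(n_pairs)
    return t1, t2

mza = simulate_pairs(2000, h2=0.6, c2=0.0)       # reared apart: no shared environment
mzt = simulate_pairs(2000, h2=0.6, c2=0.2)       # reared together

r_mza = np.corrcoef(*mza)[0, 1]
r_mzt = np.corrcoef(*mzt)[0, 1]
print(f"heritability estimate r(MZA) = {r_mza:.2f}")
print(f"shared environment estimate r(MZT) - r(MZA) = {r_mzt - r_mza:.2f}")
```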
NASA Astrophysics Data System (ADS)
Richardson, Chris T.; Kannappan, Sheila; Bittner, Ashley; Isaac, Rohan; RESOLVE
2017-01-01
We present a novel methodology for modeling emission line galaxy samples that span the entire BPT diagram. Our methodology has several advantages over current modeling schemes: the free variables in the model are identical for both AGN and SF galaxies; these free variables are more closely linked to observable galaxy properties; and the ionizing spectra including an AGN and starlight are handled self-consistently rather than empirically. We show that our methodology is capable of fitting the vast majority of SDSS galaxies that fall within the traditional regions of galaxy classification on the BPT diagram. We also present current results for relaxing classification boundaries and extending our galaxies into the dwarf regime, using the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey and the Environmental COntext (ECO) catalog, with special attention to compact blue E/S0s. We compare this methodology to PCA decomposition of the spectra. This work is supported by National Science Foundation awards AST-0955368 and CISE/ACI-1156614.
Impact of road traffic emissions on ambient air quality in an industrialized area.
Garcia, Sílvia M; Domingues, Gonçalo; Gomes, Carla; Silva, Alexandra V; Almeida, S Marta
2013-01-01
Several epidemiological studies showed a correlation between airborne particulate matter (PM) and the incidence of several diseases in exposed populations. Consequently, the European Commission reinforced the need and obligation of member states to monitor exposure levels of PM and adopt measures to reduce this exposure. However, in order to plan appropriate actions, it is necessary to understand the main sources of air pollution and their relative contributions to the formation of the ambient aerosol. The aim of this study was to develop a methodology to assess the contribution of vehicles to the atmospheric aerosol, which may constitute a useful tool to assess the effectiveness of planned mitigation actions. This methodology is based on three main steps: (1) estimation of traffic emissions from vehicle exhaust and resuspension; (2) use of the dispersion model TAPM (“The Air Pollution Model”) to estimate the contribution of traffic to the atmospheric aerosol; and (3) use of geographic information system (GIS) tools to map the PM10 concentrations from traffic in the surroundings of a target area. The methodology was applied to an industrial area, and results showed that the highest contribution of traffic to the PM10 concentrations resulted from dust resuspension and that heavy vehicles were the type that contributed most to the PM10 concentration.
NASA Astrophysics Data System (ADS)
Szatmári, Gábor; Laborczi, Annamária; Takács, Katalin; Pásztor, László
2017-04-01
The knowledge about soil organic carbon (SOC) baselines and changes, and the detection of vulnerable hot spots for SOC losses and gains under climate change and changed land management, is still fairly limited. Thus the Global Soil Partnership (GSP) has been requested to develop a global SOC mapping campaign by 2017. GSP's concept builds on official national data sets; therefore, a bottom-up (country-driven) approach is pursued. The elaborated Hungarian methodology suits the general specifications of GSOC17 provided by GSP. The input data for the GSOC17@HU mapping approach involved legacy soil data bases, as well as proper environmental covariates related to the main soil forming factors, such as climate, organisms, relief and parent material. Nowadays, digital soil mapping (DSM) relies heavily on the assumption that soil properties of interest can be modelled as a sum of a deterministic and a stochastic component, which can be treated and modelled separately. We also adopted this assumption in our methodology. In practice, multiple regression techniques are commonly used to model the deterministic part. However, these global (and usually linear) models commonly oversimplify the often complex and non-linear relationship, which has a crucial effect on the resulting soil maps. Thus, we integrated machine learning algorithms (namely random forest and quantile regression forest) in the elaborated methodology, supposing them to be more suitable for the problem at hand. This approach has enabled us to model the GSOC17 soil properties in forms as complex and non-linear as the soil itself. Furthermore, it has enabled us to model and assess the uncertainty of the results, which is highly relevant in decision making. The applied methodology used a geostatistical approach to model the stochastic part of the spatial variability of the soil properties of interest. We created the GSOC17@HU map with 1 km grid resolution according to the GSP's specifications. The map contributes to the GSP's GSOC17 proposals, as well as to the development of a global soil information system under GSP Pillar 4 on soil data and information. However, we elaborated our adherent code (created in the R software environment) in such a way that it can be improved, specified and applied for further uses. Hence, it opens the door to creating countrywide maps with higher grid resolution for SOC (or other soil-related properties) using the advanced methodology, as well as to contributing to and supporting SOC (or other soil) related country-level decision making. Our paper will present the soil mapping methodology itself, the resulting GSOC17@HU map, and some of our conclusions drawn from the experiences and their effects on further uses. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
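The authors implement their workflow in R; the fragment below is a hedged Python sketch of the trend-plus-residual decomposition described above, with a random forest standing in for the deterministic part and a Gaussian process standing in for the geostatistical treatment of residuals. The covariates, coordinates, and SOC values are synthetic, and the residual model is a simplification of the kriging-type approach used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Minimal sketch of the trend + residual idea used in digital soil mapping
# (assumptions: synthetic data; a Gaussian process stands in for kriging of the
# residuals; covariate names are hypothetical, not the Hungarian data set).

rng = np.random.default_rng(3)
n = 500
coords = rng.uniform(0, 100, size=(n, 2))                 # point locations (km)
covariates = rng.uniform(0, 1, size=(n, 4))               # e.g. climate, NDVI, slope, parent material
true_soc = (20 * covariates[:, 0] + 10 * covariates[:, 1] ** 2
            + 5 * np.sin(coords[:, 0] / 10) + rng.normal(0, 1, n))   # synthetic SOC (t/ha)

# 1) Deterministic part: non-linear trend from covariates via random forest.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(covariates, true_soc)
residuals = true_soc - rf.predict(covariates)

# 2) Stochastic part: spatial structure of the residuals.
gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(1.0), normalize_y=True)
gp.fit(coords, residuals)

# Prediction at a new location = trend + interpolated residual.
new_coords = np.array([[50.0, 50.0]])
new_covs = np.array([[0.5, 0.5, 0.5, 0.5]])
pred = rf.predict(new_covs) + gp.predict(new_coords)
print(f"Predicted SOC: {pred[0]:.1f} t/ha")
```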
Tractenberg, Saulo G; Levandowski, Mateus L; de Azeredo, Lucas Araújo; Orso, Rodrigo; Roithmann, Laura G; Hoffmann, Emerson S; Brenhouse, Heather; Grassi-Oliveira, Rodrigo
2016-09-01
Early life stress (ELS) developmental effects have been widely studied by preclinical researchers. Despite the growing body of evidence from ELS models, such as the maternal separation paradigm, the reported results have marked inconsistencies. The maternal separation model has several methodological pitfalls that could influence the reliability of its results. Here, we critically review 94 mice studies that addressed the effects of maternal separation on behavioural outcomes. We also discuss methodological issues related to the heterogeneity of separation protocols and the quality of reporting methods. Our findings indicate a lack of consistency in maternal separation effects: major studies of behavioural and biological phenotypes failed to find significant deleterious effects. Furthermore, we identified several specific variations in separation methodological procedures. These methodological variations could contribute to the inconsistency of maternal separation effects by producing different degrees of stress exposure in maternal separation-reared pups. These methodological problems, together with insufficient reporting, might lead to inaccurate and unreliable effect estimates in maternal separation studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Dash, S. M.; York, B. J.; Sinha, N.; Dvorak, F. A.
1987-01-01
An overview of parabolic and PNS (Parabolized Navier-Stokes) methodology developed to treat highly curved sub- and supersonic wall jets is presented. The fundamental data base to which these models were applied is discussed in detail. The analysis of strong curvature effects was found to require a semi-elliptic extension of the parabolic modeling to account for turbulent contributions to the normal pressure variations, as well as an extension to the turbulence models utilized, to account for the highly enhanced mixing rates observed in situations with large convex curvature. A noniterative, pressure split procedure is shown to extend parabolic models to account for such normal pressure variations in an efficient manner, requiring minimal additional run time over a standard parabolic approach. A new PNS methodology is presented to solve this problem which extends parabolic methodology via the addition of a characteristic base wave solver. Applications of this approach to analyze the interaction of wave and turbulence processes in wall jets are presented.
NASA Astrophysics Data System (ADS)
Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul
2018-04-01
In this paper, a methodology for using finite element (FE) models to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate signal contributions from different interactions in the FE results so that each individual component can be compared more easily with the ray-based model results. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed technique allows the interactions from a large number of different ray paths to be isolated in the FE results and compared directly to the results from a ray-based forward model.
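A toy sketch of the subtraction idea is shown below: because two models share the same mesh, the difference of their time traces isolates the contribution of the geometrical feature present in only one of them. The signals here are synthetic tone bursts, not FE output, and the feature (a small defect echo) is an assumed example.

```python
import numpy as np

# Toy illustration (synthetic signals, not FE output): because the models share the
# same mesh, subtracting the time traces of two models that differ by one geometric
# feature isolates the signal contribution associated with that feature.

t = np.linspace(0.0, 20e-6, 2000)                       # time axis, 20 microseconds

def pulse(centre_t, amplitude):
    """A simple windowed tone burst standing in for an ultrasonic echo."""
    return amplitude * np.exp(-((t - centre_t) / 1e-6) ** 2) * np.sin(2 * np.pi * 5e6 * t)

backwall_only = pulse(12e-6, 1.0)                            # model A: plate back wall only
backwall_and_defect = pulse(12e-6, 1.0) + pulse(7e-6, 0.2)   # model B: plate plus defect

defect_contribution = backwall_and_defect - backwall_only    # isolated defect signal
print("Peak of isolated defect signal:", np.abs(defect_contribution).max())
```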
Impact of computational structure-based methods on drug discovery.
Reynolds, Charles H
2014-01-01
Structure-based drug design has become an indispensable tool in drug discovery. The emergence of structure-based design is due to gains in structural biology that have provided exponential growth in the number of protein crystal structures, new computational algorithms and approaches for modeling protein-ligand interactions, and the tremendous growth of raw computer power in the last 30 years. Computer modeling and simulation have made major contributions to the discovery of many groundbreaking drugs in recent years. Examples are presented that highlight the evolution of computational structure-based design methodology, and the impact of that methodology on drug discovery.
2017-09-29
Report: The Military-Industrial-Scientific Complex and the Rise of New Powers: Conceptual, Theoretical and Methodological Contributions and the Brazilian Case
Rethinking Teacher Evaluation: A Conversation about Statistical Inferences and Value-Added Models
ERIC Educational Resources Information Center
Callister Everson, Kimberlee; Feinauer, Erika; Sudweeks, Richard R.
2013-01-01
In this article, the authors provide a methodological critique of the current standard of value-added modeling forwarded in educational policy contexts as a means of measuring teacher effectiveness. Conventional value-added estimates of teacher quality are attempts to determine to what degree a teacher would theoretically contribute, on average,…
A methodology for the assessment of inhalation exposure to aluminium from antiperspirant sprays.
Schwarz, Katharina; Pappa, Gerlinde; Miertsch, Heike; Scheel, Julia; Koch, Wolfgang
2018-04-01
Inhalation exposure can occur accidentally when using cosmetic spray products. Usually, a tiered approach is applied for exposure assessment, starting with rather conservative, simplistic calculation models that may be improved with measured data and more refined modelling. Here we report on an advanced methodology to mimic in-use conditions for antiperspirant spray products to provide a more accurate estimate of the amount of aluminium possibly inhaled and taken up systemically, thus contributing to the overall body burden. Four typical products were sprayed onto a skin surrogate in defined rooms. For aluminium, size-related aerosol release fractions, i.e. inhalable, thoracic and respirable, were determined by a mass balance method taking droplet maturation into account. These data were included in a simple two-box exposure model, allowing calculation of the inhaled aluminium dose over 12 min. Systemic exposure doses were calculated for exposure of the deep lung and the upper respiratory tract using the Multiple Path Particle Deposition (MPPD) model. The total systemically available dose of aluminium was in all cases found to be less than 0.5 µg per application. With this study it could be demonstrated that refining the input data of the two-box exposure model with measured data on released airborne aluminium is a valuable approach for analysing the contribution of antiperspirant spray inhalation to total aluminium exposure as part of the overall risk assessment. We suggest that the methodology can also be applied to other exposure modelling approaches for spray products and adapted to other similar use scenarios.
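The sketch below illustrates a generic two-box (near-field/far-field) inhalation calculation of the type referred to above; all parameters (box volumes, exchange and ventilation rates, breathing rate, emission rate) are assumed round numbers, not the measured, size-resolved release fractions reported in the study.

```python
# Illustrative two-box (near-field/far-field) sketch with assumed parameters; the
# study's measured, size-resolved release fractions are not reproduced here.

dt = 1.0                         # s, integration step
duration = 12 * 60               # 12-minute exposure window, as in the abstract
V_nf, V_ff = 1.0, 20.0           # m^3, near-field and far-field volumes (assumed)
beta = 5.0 / 60                  # m^3/s air exchange between the boxes (assumed)
Q = 30.0 / 3600                  # m^3/s room ventilation (assumed)
breathing_rate = 1.2 / 3600      # m^3/s, light activity (assumed)

spray_time = 10.0                # s of spraying (assumed)
emission_rate = 5.0e-6 / 10.0    # g/s of airborne aluminium while spraying (assumed)

c_nf = c_ff = 0.0                # g/m^3, box concentrations
inhaled = 0.0                    # g, cumulative inhaled mass
for step in range(int(duration / dt)):
    t = step * dt
    e = emission_rate if t < spray_time else 0.0
    # Mass balance for each box (simple forward-Euler integration).
    dc_nf = (e + beta * (c_ff - c_nf)) / V_nf
    dc_ff = (beta * (c_nf - c_ff) - Q * c_ff) / V_ff
    c_nf += dc_nf * dt
    c_ff += dc_ff * dt
    inhaled += c_nf * breathing_rate * dt

print(f"Inhaled airborne aluminium over 12 min: {inhaled * 1e6:.3f} µg")
```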
ERIC Educational Resources Information Center
Itoh, Masayuki; Suemoto, Makoto; Matsuoka, Koji; Ito, Atsushi; Yui, Kiyomitsu; Matsuda, Tsuyoshi; Ishikawa, Masanobu
2008-01-01
Purpose: The purpose of this paper is to introduce the Regional Centre of Expertise (RCE) on education for sustainable development (ESD) Hyogo-Kobe, and the contribution of Kobe University as a model case. An attempt to develop and implement a new ESD programme in higher education is also reported. Design/methodology/approach: A brief description…
Preparing for budget-based payment methodologies: global payment and episode-based payment.
Hudson, Mark E
2015-10-01
Use of budget-based payment methodologies (capitation and episode-based bundled payment) has been demonstrated to drive value in healthcare delivery. With a focus on high-volume, high-cost surgical procedures, inclusion of anaesthesiology services in these methodologies is likely. This review provides a summary of budget-based payment methodologies and practical information necessary for anaesthesiologists to prepare for participation in these programmes. Although few examples of anaesthesiologists' participation in these models exist, an understanding of the structure of these programmes and opportunities for participation are available. Prospective preparation in developing anaesthesiology-specific bundled payment profiles and early participation in pathway development associated with selected episodes of care are essential for successful participation as a gainsharing partner. With significant opportunity to contribute to care coordination and cost management, anaesthesiology can play an important role in budget-based payment programmes and should expect to participate as full gainsharing partners. Precise costing methodologies and accurate economic modelling, along with identification of quality management and cost control opportunities, will help identify participation opportunities and appropriate payment and gainsharing agreements. Anaesthesiology-specific examples with budget-based payment models are needed to help guide increased participation in these programmes.
Canis familiaris As a Model for Non-Invasive Comparative Neuroscience.
Bunford, Nóra; Andics, Attila; Kis, Anna; Miklósi, Ádám; Gácsi, Márta
2017-07-01
There is an ongoing need to improve animal models for investigating human behavior and its biological underpinnings. The domestic dog (Canis familiaris) is a promising model in cognitive neuroscience. However, before it can contribute to advances in this field in a comparative, reliable, and valid manner, several methodological issues warrant attention. We review recent non-invasive canine neuroscience studies, primarily focusing on (i) variability among dogs and between dogs and humans in cranial characteristics, and (ii) generalizability across dog and dog-human studies. We argue not for methodological uniformity but for functional comparability between methods, experimental designs, and neural responses. We conclude that the dog may become an innovative and unique model in comparative neuroscience, complementing more traditional models. Copyright © 2017 Elsevier Ltd. All rights reserved.
Lonsdorf, Tina B; Merz, Christian J
2017-09-01
Why do only some individuals develop pathological anxiety following adverse events? Fear acquisition, extinction and return of fear paradigms serve as experimental learning models for the development, treatment and relapse of anxiety. Individual differences in experimental performance were however mostly regarded as 'noise' by researchers interested in basic associative learning principles. Our work for the first time presents a comprehensive literature overview and methodological discussion on inter-individual differences in fear acquisition, extinction and return of fear. We tell a story from noise that steadily develops into a meaningful tune and converges to a model of mechanisms contributing to individual risk/resilience with respect to fear and anxiety-related behavior. Furthermore, in light of the present 'replicability crisis' we identify methodological pitfalls and provide suggestions for study design and analyses tailored to individual difference research in fear conditioning. Ultimately, synergistic transdisciplinary and collaborative efforts hold promise to not only improve our mechanistic understanding but can also be expected to contribute to the development of specifically tailored ('individualized') intervention and targeted prevention programs in the future. Copyright © 2017 Elsevier Ltd. All rights reserved.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
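The sketch below shows how a loss function maps a process deviation to an economic loss, using the classical quadratic (Taguchi-type) form and a saturating inverted-normal form; the parameters are assumed, and the paper's "revised Taguchi" and "modified inverted normal" variants are not reproduced exactly.

```python
import numpy as np

# Sketch of mapping a process deviation to economic loss (assumed parameters; only
# the classical shapes that the paper's revised/modified loss functions build on).

def taguchi_loss(x, target, k):
    """Classical quadratic (Taguchi-type) loss: grows without bound with deviation."""
    return k * (x - target) ** 2

def inverted_normal_loss(x, target, max_loss, scale):
    """Inverted-normal loss: saturates at max_loss for large deviations."""
    return max_loss * (1.0 - np.exp(-((x - target) ** 2) / (2.0 * scale ** 2)))

pressure = np.linspace(8.0, 16.0, 5)        # observed process variable (e.g. bar, assumed)
target = 12.0                               # design operating point (assumed)
for p in pressure:
    print(f"x={p:5.1f}  quadratic loss={taguchi_loss(p, target, 1e3):8.0f} $"
          f"   inverted-normal loss={inverted_normal_loss(p, target, 5e4, 1.5):8.0f} $")
```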
NASA Astrophysics Data System (ADS)
Gobbi, Gian Paolo; Barnaba, Francesca; Bolignano, Andrea; Costabile, Francesca; Di Liberto, Luca; Dionisi, Davide; Drewnick, Frank; Lucarelli, Franco; Manigrasso, Maurizio; Nava, Silvia; Sauvage, Laurent; Sozzi, Roberto; Struckmeier, Caroline; Wille, Holger
2015-04-01
The EC LIFE+2010 DIAPASON Project (Desert dust Impact on Air quality through model-Predictions and Advanced Sensors ObservatioNs, www.diapason-life.eu) intends to contribute new methodologies to assess the contribution of Saharan dust advections to the local PM loads recorded in Europe. To this goal, automated Polarization Lidar-Ceilometers (PLCs) were prototyped within DIAPASON to certify the presence of Saharan dust plumes and support evaluating their mass loadings in the lowermost atmosphere. The whole process also involves operational dust forecasts, as well as satellite and in-situ observations. Demonstration of the Project is implemented in the pilot region of Rome (Central Italy), where three networked DIAPASON PLCs started, in October 2013, a year-round, 24 h/day monitoring of the altitude-resolved aerosol backscatter and depolarization profiles. Two intensive observational periods (IOPs) involving chemical analysis and detailed physical characterization of aerosol samples have also been carried out in this year-long campaign, namely in Fall 2013 and Spring 2014. These allowed for an extensive interpretation of the PLC observations, highlighting important synergies between the PLC and the in situ data. The presentation will address the capabilities of the employed PLCs, the agreement of observations with model forecasts of dust advections, retrievals of aerosol properties, and methodologies developed to detect Saharan advections and to evaluate the relevant mass contribution to PM10. This latter task is intended to provide suggestions on possible improvements to the current EC Guidelines (2011) on this matter. In fact, specific Guidelines are delivered by the European Commission to provide the Member States with a common method to assess the Saharan dust contribution to the currently legislated PM-related Air Quality metrics. The DIAPASON experience shows that improvements can be proposed to make the current EC Methodology more robust and flexible. The methodology DIAPASON recommends has been designed and validated taking advantage of the PLC observations and highlights the benefits of the operational use of such systems in routine Air Quality applications. Concurrently, PLC activities are contributing to the COST Action "TOPROF", a European effort aiming at the setup and operational use of Lidar-Ceilometer networks for meteorological and safety purposes.
Tsushima, Yoko; Brient, Florent; Klein, Stephen A.; ...
2017-11-27
The CFMIP Diagnostic Codes Catalogue assembles cloud metrics, diagnostics and methodologies, together with programs to diagnose them from general circulation model (GCM) outputs written by various members of the CFMIP community. This aims to facilitate use of the diagnostics by the wider community studying climate and climate change. Here, this paper describes the diagnostics and metrics which are currently in the catalogue, together with examples of their application to model evaluation studies and a summary of some of the insights these diagnostics have provided into the main shortcomings in current GCMs. Analysis of outputs from CFMIP and CMIP6 experiments will also be facilitated by the sharing of diagnostic codes via this catalogue. Any code which implements diagnostics relevant to analysing clouds – including cloud–circulation interactions and the contribution of clouds to estimates of climate sensitivity in models – and which is documented in peer-reviewed studies, can be included in the catalogue. We very much welcome additional contributions to further support community analysis of CMIP6 outputs.
ERIC Educational Resources Information Center
Blanc, Ann K.; Bruce, Judith
2013-01-01
This special issue addresses an ambitious set of concerns around the experience of adolescents in the majority world: expanded models of development, successful models of intervention, and the impact of globalization. The papers, which vary widely in both substance and methodology, make a substantial contribution to pushing forward the boundaries…
Dennis P. Dykstra; Robert A. Monserud
2009-01-01
The purpose of the international conference from which these proceedings are drawn was to explore relationships between forest management activities and timber quality. Sessions were organized to explore models and simulation methodologies that contribute to an understanding of tree development over time and the ways that management and harvesting activities can...
Saharan dust contribution to PM levels: The EC LIFE+ DIAPASON project
NASA Astrophysics Data System (ADS)
Gobbi, G. P.; Wille, H.; Sozzi, R.; Angelini, F.; Barnaba, F.; Costabile, F.; Frey, S.; Bolignano, A.; Di Giosa, A.
2012-04-01
The contribution of Saharan-dust advections to both daily and annual PM average values can be significant all over Southern Europe. The largest effects of dust on the number of PM exceedances are observed in polluted areas and large cities. While a wide literature exists documenting episodes of Saharan dust transport towards the Euro-Mediterranean region and Europe in general, only a limited number of studies provide statistically significant results on the impact of Saharan dust on the particulate matter loads over the continent. A four-year (2001-2004) study performed in Rome (Italy) found these events to contribute about 15±10 µg/m3 to the average ground-level PM10 on about 17% of the days in a year. Since the PM10 yearly average of many traffic stations in Rome is close to 40 μg/m3, these events can cause the PM10 concentration to exceed air quality limit values (50 μg/m3 as daily average) set by the EU Air Quality Directive 2008/50/EC. Although the European legislation allows Member States to subtract the contribution of natural sources before counting PM10 exceedances, definition of an optimal methodology to quantitatively assess such contribution is still in progress. On the basis of the current European Guidelines on the assessment of natural contributions to PM, the DIAPASON project ("Desert-dust Impact on Air quality through model-Predictions and Advanced Sensors ObservatioNs", recently funded under the EC LIFE+ program) has been formulated to provide a robust, user-oriented methodology to assess the presence of desert dust and its contribution to PM levels. To this end, in addition to satellite-based data and model forecasts, the DIAPASON methodology will employ innovative and affordable technologies, partly prototyped within the project itself, such as an operational Polarization Lidar-Ceilometer (laser radar) capable of detecting and profiling dust clouds from the ground up to 10 km altitude. The DIAPASON Project (2011-2014) will be first implemented as a network of three stations in the Rome metropolitan area. However, the DIAPASON methodology to detect/quantify the Saharan dust contribution to PM will be designed to be easily applicable by air-quality and meteorological agencies. In fact, the possibility of manufacturing cheap, operational polarization lidar-ceilometers and deploying them across the territory will also represent a breakthrough in the detection and quantification of other atmospheric aerosol layers, such as volcanic or wild-fire plumes, with further benefits in terms of meteo forecasts, flight security and air quality assessments.
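As a purely illustrative sketch of the baseline-subtraction idea behind such assessments, the fragment below estimates a dust contribution on flagged dust days as the excess of regional-background PM10 over a moving-percentile baseline. The data are synthetic, the 40th-percentile/30-day window is an assumed choice, and the actual EC and DIAPASON procedures involve additional refinements not shown here.

```python
import numpy as np
import pandas as pd

# Illustrative sketch (synthetic data; not the exact EC or DIAPASON procedure):
# estimate the dust contribution on flagged dust days as the excess of measured
# regional-background PM10 over a moving-percentile baseline of the same series.

rng = np.random.default_rng(5)
days = pd.date_range("2013-10-01", periods=120, freq="D")
pm10 = rng.gamma(shape=6, scale=3, size=len(days))            # background PM10 (µg/m3)
dust_days = rng.random(len(days)) < 0.15                      # externally flagged dust days
pm10[dust_days] += rng.uniform(5, 30, dust_days.sum())        # synthetic dust excess

series = pd.Series(pm10, index=days)
baseline = series.rolling(window=30, center=True, min_periods=15).quantile(0.40)

dust_contribution = (series - baseline).clip(lower=0).where(dust_days, 0.0)
print(dust_contribution[dust_days].describe())                # µg/m3 on dust days
```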
Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller
2018-02-01
Given the complexity of factors contributing to alcohol misuse, appropriate epistemologies and methodologies are needed to understand and intervene meaningfully. We aimed to (1) provide an overview of computational modeling methodologies, with an emphasis on system dynamics modeling; (2) explain how community-based system dynamics modeling can forge new directions in alcohol prevention research; and (3) present a primer on how to build alcohol misuse simulation models using system dynamics modeling, with an emphasis on stakeholder involvement, data sources and model validation. Throughout, we use alcohol misuse among college students in the United States as a heuristic example for demonstrating these methodologies. System dynamics modeling employs a top-down aggregate approach to understanding dynamically complex problems. Its three foundational properties (stocks, flows and feedbacks) capture non-linearity, time-delayed effects and other system characteristics. As a methodological choice, system dynamics modeling is amenable to participatory approaches; in particular, community-based system dynamics modeling has been used to build impactful models for addressing dynamically complex problems. The process of community-based system dynamics modeling consists of numerous stages: (1) creating model boundary charts, behavior-over-time-graphs and preliminary system dynamics models using group model-building techniques; (2) model formulation; (3) model calibration; (4) model testing and validation; and (5) model simulation using learning-laboratory techniques. Community-based system dynamics modeling can provide powerful tools for policy and intervention decisions that can result ultimately in sustainable changes in research and action in alcohol misuse prevention. © 2017 Society for the Study of Addiction.
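A minimal stock-and-flow sketch of the kind of structure described above is given below, with one stock, a reinforcing social-influence feedback on the inflow, and a balancing outflow, integrated by Euler's method. The population, rates, and feedback strength are assumed values for illustration and do not come from any model in the paper.

```python
# Minimal stock-and-flow sketch (illustrative only, not the authors' model): one
# stock of "heavy-drinking students" with an inflow driven by a social-influence
# feedback and an outflow from cessation/graduation, integrated by Euler's method.

dt = 0.1                     # years, integration step
t_end = 8.0                  # simulated horizon in years
n_students = 10_000.0        # assumed campus population
heavy = 1_000.0              # initial stock of heavy drinkers (assumed)

uptake_base = 0.05           # per year, baseline uptake rate (assumed)
social_gain = 0.40           # strength of peer-influence feedback (assumed)
quit_rate = 0.25             # per year, outflow rate (assumed)

history = []
for step in range(int(t_end / dt)):
    prevalence = heavy / n_students
    inflow = (n_students - heavy) * (uptake_base + social_gain * prevalence)  # reinforcing feedback
    outflow = heavy * quit_rate                                               # balancing feedback
    heavy += (inflow - outflow) * dt
    history.append(heavy)

print(f"Heavy-drinking prevalence after {t_end:.0f} years: {history[-1] / n_students:.1%}")
```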
NASA Astrophysics Data System (ADS)
Acri, Antonio; Offner, Guenter; Nijman, Eugene; Rejlek, Jan
2016-10-01
Noise legislation and increasing customer demands determine the Noise, Vibration and Harshness (NVH) development of modern commercial vehicles. In order to meet the stringent legislative requirements for vehicle noise emission, exact knowledge of all vehicle noise sources and their acoustic behavior is required. Transfer path analysis (TPA) is a fairly well established technique for estimating and ranking individual low-frequency noise or vibration contributions via the different transmission paths. Transmission paths from different sources to target points of interest and their contributions can be analyzed by applying TPA. This technique is applied to test measurements, which only become available on prototypes at the end of the design process. In order to overcome the limits of TPA, a numerical transfer path analysis methodology based on the substructuring of a multibody system is proposed in this paper. Being based on numerical simulation, this methodology can be performed starting from the first steps of the design process. The main target of the proposed methodology is to obtain information on the noise source contributions of a dynamic system, considering the possibility of multiple forces acting on the system simultaneously. The contributions of these forces are investigated with particular focus on distributed or moving forces. In this paper, the mathematical basics of the proposed methodology and its advantages in comparison with TPA will be discussed. Then, a dynamic system is investigated with a combination of two methods. Being based on the dynamic substructuring (DS) of the investigated model, the proposed methodology requires the evaluation of the contact forces at interfaces, which are computed with a flexible multi-body dynamic (FMBD) simulation. Then, the structure-borne noise paths are computed with the wave based method (WBM). As an example application, a 4-cylinder engine is investigated and the proposed methodology is applied to the engine block. The aim is to obtain accurate and clear relationships between the excitations and responses of the simulated dynamic system, analyzing the noise and vibration sources inside a car engine and showing the main advantages of a numerical methodology.
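The fragment below sketches only the classical transfer-path bookkeeping that underlies any TPA-style contribution ranking (the response is synthesised as the sum over paths of transfer function times interface force); the FRFs, forces, and path names are synthetic assumptions, and the paper's FMBD/WBM substructuring chain is not reproduced.

```python
import numpy as np

# Basic transfer-path bookkeeping (synthetic data): the response at a target point
# is synthesised as the sum over paths of transfer function times interface force,
# so each path's share of the total can be ranked. This is the classical TPA
# synthesis step, not the FMBD/WBM substructuring chain of the paper.

rng = np.random.default_rng(11)
freqs = np.linspace(20, 2000, 200)                 # Hz
paths = ["mount_front", "mount_rear", "bracket"]   # hypothetical structure-borne paths

# Synthetic frequency response functions (target response per unit force) and forces.
H = {p: (1.0 / (1.0 + 1j * freqs / rng.uniform(200, 800))) * rng.uniform(0.5, 2.0)
     for p in paths}
F = {p: rng.uniform(5, 20) * np.ones_like(freqs) for p in paths}

partial = {p: H[p] * F[p] for p in paths}          # per-path contribution spectra
total = sum(partial.values())                      # synthesised total response

mag = {p: np.sum(np.abs(partial[p])) for p in paths}
total_mag = sum(mag.values())
for p in paths:
    print(f"{p:12s} contributes {mag[p] / total_mag:.0%} of the summed path magnitude")
```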
NASA Astrophysics Data System (ADS)
Belis, Claudio A.; Pernigotti, Denise; Pirovano, Guido
2017-04-01
Source Apportionment (SA) is the identification of ambient air pollution sources and the quantification of their contribution to pollution levels. This task can be accomplished using different approaches: chemical transport models and receptor models. Receptor models are derived from measurements and are therefore considered a reference for primary source contributions at urban background levels. Chemical transport models provide better estimates of secondary (inorganic) pollutants and can deliver gridded results with high time resolution. Assessing the performance of SA model results is essential to guarantee reliable information on source contributions to be used for reporting to the Commission and in the development of pollution abatement strategies. This is the first intercomparison ever designed to test both receptor-oriented models (or receptor models) and chemical transport models (or source-oriented models) using a comprehensive method based on model quality indicators and pre-established criteria. The target pollutant of this exercise, organised in the frame of FAIRMODE WG 3, is PM10. Both receptor models and chemical transport models present good performances when evaluated against their respective references. Both types of models demonstrate quite satisfactory capabilities to estimate the yearly source contributions, while the estimation of the source contributions at the daily level (time series) is more critical. Chemical transport models showed a tendency to underestimate the contribution of some single sources when compared to receptor models. For receptor models the most critical source category is industry, probably because of the variety of single sources with different characteristics that belong to this category. Dust is the most problematic source for chemical transport models, likely due to the poor information about this kind of source in the emission inventories, particularly concerning road dust re-suspension, and consequently the little detail about the chemical components of this source used in the models. The sensitivity tests show that chemical transport models perform better when resolving a detailed set of sources (14) than when using a simplified one (only 8). It was also observed that enhanced vertical profiling can improve the estimation of specific sources, such as industry, under complex meteorological conditions, and that insufficient spatial resolution in urban areas can limit the ability of models to estimate the contribution of diffuse primary sources (e.g. traffic). Both families of models identify traffic and biomass burning as the first and second most contributing categories, respectively, to elemental carbon. The results of this study demonstrate that the source apportionment assessment methodology developed by the JRC is applicable to any kind of SA model. The same methodology is implemented in the on-line DeltaSA tool to support source apportionment model evaluation (http://source-apportionment.jrc.ec.europa.eu/).
Biobehavioral Outcomes Following Psychological Interventions for Cancer Patients
Andersen, Barbara L.
2007-01-01
Psychological interventions for adult cancer patients have primarily focused on reducing stress and enhancing quality of life. However, there has been expanded focus on biobehavioral outcomes—health behaviors, compliance, biologic responses, and disease outcomes—consistent with the Biobehavioral Model of cancer stress and disease course. The author reviewed this expanded focus in quasi-experimental and experimental studies of psychological interventions, provided methodologic detail, summarized findings, and highlighted novel contributions. A final section discussed methodologic issues, research directions, and challenges for the coming decade. PMID:12090371
NASA Astrophysics Data System (ADS)
Tabibzadeh, Maryam
According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. This is despite the fact that a comprehensive study of more than 600 well-documented major failures in offshore structures between 1988 and 2005 showed that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT conducted by its crew is discussed. The risk analysis methodology consists of three different approaches whose integration constitutes the overall framework. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the conceptual framework developed in the previous step and to analyze the impact of different decision making biases on negative pressure test results. In line with the findings of previous studies, the analysis of the developed conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. It is noteworthy that the organizational factors captured in the introduced conceptual framework are not specific only to the NPT. Most of them have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations as well as high-risk operations in other industries. In addition, the proposed rational decision making model introduces a quantitative structure for the analysis of the results of a conducted NPT. This model provides a structure and some parametrically derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test.
Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases. In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis of any high-risk operations, in not only the oil and gas industry but also in other industries such as nuclear power plants, aviation industry, and transportation sector.
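The dissertation's cut-off-point formulas are not reproduced in this abstract, so the sketch below only illustrates the general shape of such a rule: a Bayesian update on an observed pressure build-up followed by an expected-cost comparison. The likelihood distributions, prior and cost figures are invented for illustration and are not the values derived in the dissertation.

```python
import math

# Hypothetical illustration of a cut-off rule for interpreting a negative pressure
# test: accept the test only if the expected cost of accepting (a missed leak)
# is lower than the expected cost of rejecting (remediation and delay).
# All distributions, priors and costs below are assumptions.

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_integrity(observed_psi, prior_integrity=0.9):
    like_sealed = gaussian_pdf(observed_psi, mu=0.0, sigma=50.0)      # assumed: well sealed
    like_leaking = gaussian_pdf(observed_psi, mu=400.0, sigma=150.0)  # assumed: well leaking
    num = like_sealed * prior_integrity
    return num / (num + like_leaking * (1.0 - prior_integrity))

def accept_test(observed_psi, cost_blowout=1e9, cost_false_alarm=1e6):
    p = posterior_integrity(observed_psi)
    expected_cost_accept = (1.0 - p) * cost_blowout   # risk carried by accepting the test
    expected_cost_reject = cost_false_alarm           # cost of rejecting and remediating
    return expected_cost_accept < expected_cost_reject, p

decision, p = accept_test(observed_psi=250.0)
print(f"P(integrity) = {p:.4f}, accept test: {decision}")
```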
ERIC Educational Resources Information Center
Ferrão, Maria Eugénia; Couto, Alcino Pinto
2014-01-01
This article focuses on the use of a value-added approach for promoting school improvement. It presents yearly value-added estimates, analyses their stability over time, and discusses the contribution of this methodological approach for promoting school improvement programmes in the Portuguese system of evaluation. The value-added model is applied…
Source apportionment and sensitivity analysis: two methodologies with two different purposes
NASA Astrophysics Data System (ADS)
Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe
2017-11-01
This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used interchangeably for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
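A toy numerical example makes the linear-versus-nonlinear point concrete: when the concentration responds nonlinearly to two emission sources, brute-force impacts no longer sum to the total concentration and therefore cannot be read as contributions. The response function and emission values below are hypothetical, not taken from the paper.

```python
# Toy illustration (not the paper's chemistry): concentration responds nonlinearly
# to two emission sources via an interaction term, so brute-force "impacts"
# (zeroing out one source at a time) do not add up to the total concentration.

def concentration(e1, e2):
    return 2.0 * e1 + 1.0 * e2 + 0.8 * e1 * e2   # assumed nonlinear response

e1, e2 = 3.0, 4.0
total = concentration(e1, e2)

impact_1 = total - concentration(0.0, e2)   # brute-force impact of source 1
impact_2 = total - concentration(e1, 0.0)   # brute-force impact of source 2

print(f"total concentration : {total:.1f}")
print(f"impact of source 1  : {impact_1:.1f}")
print(f"impact of source 2  : {impact_2:.1f}")
print(f"sum of impacts      : {impact_1 + impact_2:.1f}  (!= total under nonlinearity)")
```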
Jiang, Jheng Jie; Lee, Chon Lin; Fang, Meng Der; Boyd, Kenneth G.; Gibb, Stuart W.
2015-01-01
This paper presents a methodology based on multivariate data analysis for characterizing potential source contributions of emerging contaminants (ECs) detected in 26 river water samples across multi-scape regions during dry and wet seasons. Based on this methodology, we unveil an approach toward potential source contributions of ECs, a concept we refer to as the “Pharmaco-signature.” Exploratory analysis of data points has been carried out by unsupervised pattern recognition (hierarchical cluster analysis, HCA) and receptor model (principal component analysis-multiple linear regression, PCA-MLR) in an attempt to demonstrate significant source contributions of ECs in different land-use zones. Robust cluster solutions grouped the database according to different EC profiles. PCA-MLR identified that 58.9% of the mean summed ECs were contributed by domestic impact, 9.7% by antibiotics application, and 31.4% by drug abuse. Diclofenac, ibuprofen, codeine, ampicillin, tetracycline, and erythromycin-H2O have significant pollution risk quotients (RQ>1), indicating potentially high risk to aquatic organisms in Taiwan. PMID:25874375
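The following sketch shows the general PCA-MLR (absolute principal component score) mechanics on synthetic data rather than the Taiwan EC dataset; the number of sources, the profiles, and the use of scikit-learn are assumptions made for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Sketch of the PCA-MLR receptor-model idea on synthetic data: extract factors from
# the sample-by-compound matrix, rescale scores to a hypothetical zero-concentration
# sample (absolute principal component scores), then regress the summed EC
# concentration on those scores so regression terms read as mean source contributions.

rng = np.random.default_rng(0)
n_samples, n_compounds, n_sources = 26, 10, 3
profiles = rng.uniform(0.0, 1.0, size=(n_sources, n_compounds))   # assumed source profiles
strengths = rng.gamma(2.0, 1.0, size=(n_samples, n_sources))      # assumed source strengths
X = strengths @ profiles + rng.normal(0.0, 0.05, size=(n_samples, n_compounds))

mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd                                                  # standardized concentrations

pca = PCA(n_components=n_sources).fit(Z)
scores = pca.transform(Z)
z0 = ((np.zeros(n_compounds) - mu) / sd).reshape(1, -1)            # hypothetical zero sample
apcs = scores - pca.transform(z0)                                  # absolute component scores

total_ec = X.sum(axis=1)
mlr = LinearRegression().fit(apcs, total_ec)
mean_contrib = mlr.coef_ * apcs.mean(axis=0)
print("mean % contribution per factor:",
      np.round(100.0 * mean_contrib / total_ec.mean(), 1))
```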
Effect of body composition methodology on heritability estimation of body fatness
USDA-ARS?s Scientific Manuscript database
Heritability estimates of human body fatness vary widely and the contribution of body composition methodology to this variability is unknown. The effect of body composition methodology on estimations of genetic and environmental contributions to body fatness variation was examined in 78 adult male ...
The Zone of Inertia: Absorptive Capacity and Organizational Change
ERIC Educational Resources Information Center
Godkin, Lynn
2010-01-01
Purpose: The purpose of this paper is to describe how interruptions in organizational learning affect institutional absorptive capacity and contribute to organizational inertia. Design/methodology/approach: An exploratory model is presented as a heuristic to describe how interruptions in organizational learning affect absorptive capacity.…
Pathways towards Sustainability through Higher Education
ERIC Educational Resources Information Center
Sibbel, Anne
2009-01-01
Purpose: The aim of this paper is to contribute to aligning higher education towards meeting the challenge of global sustainability. Design/methodology/approach: The barriers to sustainability are juxtaposed against the resources, responsibilities and potential of higher education. Ideas from several models and from within several disciplines are…
Anstey, Kaarin J; Bielak, Allison AM; Birrell, Carole L; Browning, Colette J; Burns, Richard A; Byles, Julie; Kiley, Kim M; Nepal, Binod; Ross, Lesley A; Steel, David; Windsor, Timothy D
2014-01-01
Aim: To describe the Dynamic Analyses to Optimise Ageing (DYNOPTA) project and illustrate its contributions to understanding ageing through innovative methodology, and investigations on outcomes based on the project themes. DYNOPTA provides a platform and technical expertise that may be used to combine other national and international datasets. Method: The DYNOPTA project has pooled and harmonized data from nine Australian longitudinal studies to create the largest available longitudinal dataset (N=50652) on ageing in Australia. Results: A range of findings have resulted from the study to date, including methodological advances, prevalence rates of disease and disability, and mapping trajectories of ageing with and without increasing morbidity. DYNOPTA also forms the basis of a microsimulation model that will provide projections of future costs of disease and disability for the baby boomer cohort. Conclusion: DYNOPTA contributes significantly to the Australian evidence-base on ageing to inform key social and health policy domains. PMID:22032767
Major challenges for correlational ecological niche model projections to future climate conditions.
Peterson, A Townsend; Cobos, Marlon E; Jiménez-García, Daniel
2018-06-20
Species-level forecasts of distributional potential and likely distributional shifts, in the face of changing climates, have become popular in the literature in the past 20 years. Many refinements have been made to the methodology over the years, and the result has been an approach that considers multiple sources of variation in geographic predictions, and how that variation translates into both specific predictions and uncertainty in those predictions. Although numerous previous reviews and overviews of this field have pointed out a series of assumptions and caveats associated with the methodology, three aspects of the methodology have important impacts but have not been treated previously in detail. Here, we assess those three aspects: (1) effects of niche truncation on model transfers to future climate conditions, (2) effects of model selection procedures on future-climate transfers of ecological niche models, and (3) relative contributions of several factors (replicate samples of point data, general circulation models, representative concentration pathways, and alternative model parameterizations) to overall variance in model outcomes. Overall, the view is one of caution: although resulting predictions are fascinating and attractive, this paradigm has pitfalls that may bias and limit confidence in niche model outputs as regards the implications of climate change for species' geographic distributions. © 2018 New York Academy of Sciences.
Bringing Action Reflection Learning into Action Learning
ERIC Educational Resources Information Center
Rimanoczy, Isabel; Brown, Carole
2008-01-01
This paper introduces Action Reflection Learning (ARL) as a learning methodology that can contribute to, and enrich, the practice of action learning programs. It describes the Swedish constructivist origins of the model, its evolution and the coded responses that resulted from researching the practice. The paper presents the resulting sixteen ARL…
Notes about COOL: Analysis and Highlights of Complex View in Education
ERIC Educational Resources Information Center
de Oliveira, C. A.
2012-01-01
Purpose: The purpose of this paper is to present principles from the complex approach in education and describe some practical pedagogic experiences, highlighting how "real world" perspectives have influenced and contributed to curriculum development. Design/methodology/approach: Necessity of integration in terms of knowledge modeling is an…
Factors Contributing to Institutions Achieving Environmental Sustainability
ERIC Educational Resources Information Center
James, Matthew; Card, Karen
2012-01-01
Purpose: The purpose of this paper is to determine what factors contributed to three universities achieving environmental sustainability. Design/methodology/approach: A case study methodology was used to determine how each factor contributed to the institutions' sustainability. Site visits, fieldwork, document reviews, and interviews with…
Formulating accident occurrence as a survival process.
Chang, H L; Jovanis, P P
1990-10-01
A conceptual framework for accident occurrence is developed based on the principle of the driver as an information processor. The framework underlies the development of a modeling approach that is consistent with the definition of exposure to risk as a repeated trial. Survival theory is proposed as a statistical technique that is consistent with the conceptual structure and allows the exploration of a wide range of factors that contribute to highway operating risk. This survival model of accident occurrence is developed at a disaggregate level, allowing safety researchers to broaden the scope of studies which may be limited by the use of traditional aggregate approaches. An application of the approach to motor carrier safety is discussed as are potential applications to a variety of transportation industries. Lastly, a typology of highway safety research methodologies is developed to compare the properties of four safety methodologies: laboratory experiments, on-the-road studies, multidisciplinary accident investigations, and correlational studies. The survival theory formulation has a mathematical structure that is compatible with each safety methodology, so it may facilitate the integration of findings across methodologies.
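As a minimal illustration of the survival-theory formulation (treating exposure until an accident as a time-to-event variable), the sketch below fits a Weibull hazard to simulated exposure data; the sample sizes and parameters are hypothetical, and the disaggregate covariate structure of the actual motor-carrier application is not reproduced here.

```python
import numpy as np
from scipy.stats import weibull_min

# Illustrative survival-theory sketch on synthetic data (not the motor-carrier study):
# treat driving exposure until an accident as a "time to failure" and fit a Weibull
# hazard, whose shape parameter tells whether risk rises or falls with exposure.

rng = np.random.default_rng(1)
exposure_hours = 900.0 * rng.weibull(1.4, 500)     # assumed accident-free exposure times

shape, loc, scale = weibull_min.fit(exposure_hours, floc=0.0)
trend = "increases" if shape > 1.0 else "decreases"
print(f"fitted shape = {shape:.2f} -> hazard {trend} with exposure")
print(f"fitted characteristic exposure = {scale:.0f} hours")

# Probability of completing a 500-hour duty cycle without an accident
print(f"P(no accident by 500 h) = {weibull_min.sf(500.0, shape, loc, scale):.2f}")
```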
A lava flow simulation model for the development of volcanic hazard maps for Mount Etna (Italy)
NASA Astrophysics Data System (ADS)
Damiani, M. L.; Groppelli, G.; Norini, G.; Bertino, E.; Gigliuto, A.; Nucita, A.
2006-05-01
Volcanic hazard assessment is of paramount importance for the safeguard of the resources exposed to volcanic hazards. In the paper we present ELFM, a lava flow simulation model for the evaluation of the lava flow hazard on Mount Etna (Sicily, Italy), the most important active volcano in Europe. The major contributions of the paper are: (a) a detailed specification of the lava flow simulation model and the specification of an algorithm implementing it; (b) the definition of a methodological framework for applying the model to the specific volcano. Regarding the former, we propose an extended version of an existing stochastic model that has so far been applied only to the assessment of volcanic hazard on Lanzarote and Tenerife (Canary Islands). Concerning the methodological framework, we argue that model validation is needed to assess the effectiveness of the lava flow simulation model. To that end, a strategy has been devised for the generation of simulation experiments and the evaluation of their outcomes.
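ELFM itself is not specified in this abstract, so the sketch below only illustrates the general class of stochastic downslope propagation models on a gridded DEM, with Monte Carlo runs accumulated into an inundation-frequency map; the terrain, vent location and transition probabilities are invented.

```python
import numpy as np

# Toy stochastic lava-flow propagation on a synthetic DEM (an illustration of
# probabilistic maximum-slope models in general, not the ELFM code): from each cell
# the flow moves to a neighbour with probability proportional to the elevation drop;
# repeating many runs yields a per-cell inundation frequency.

rng = np.random.default_rng(42)
n = 60
y, x = np.mgrid[0:n, 0:n]
dem = 1000.0 - 8.0 * y + 5.0 * rng.standard_normal((n, n))   # assumed sloping terrain

def single_flow(vent, max_steps=200):
    path = [vent]
    r, c = vent
    for _ in range(max_steps):
        neigh = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                 if (dr, dc) != (0, 0) and 0 <= r + dr < n and 0 <= c + dc < n]
        drops = np.array([max(dem[r, c] - dem[rr, cc], 0.0) for rr, cc in neigh])
        if drops.sum() == 0.0:            # local pit: the flow stops
            break
        r, c = neigh[rng.choice(len(neigh), p=drops / drops.sum())]
        path.append((r, c))
    return path

runs = 300
hazard = np.zeros((n, n))
for _ in range(runs):                      # Monte Carlo runs from a single vent
    for r, c in single_flow(vent=(2, n // 2)):
        hazard[r, c] += 1.0
hazard /= runs                             # inundation frequency per cell
print("cells touched in >10% of runs:", int((hazard > 0.1).sum()))
```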
Symposium Introduction: Papers on 'Modeling National Health Expenditures'.
Getzen, Thomas E; Okunade, Albert A
2017-07-01
Significant contributions have been made since the World Health Organization published Brian Abel-Smith's pioneering comparative study of national health expenditures more than 50 years ago. There have been major advances in theories, model specifications, methodological approaches, and data structures. This introductory essay provides a historical context for this line of work, highlights four newly published studies that move health economics research forward, and indicates several important areas of challenging but potentially fruitful research to strengthen future contributions to the literature and make empirical findings more useful for evaluating health policy decisions. Copyright © 2016 John Wiley & Sons, Ltd.
Methodology for the nuclear design validation of an Alternate Emergency Management Centre (CAGE)
NASA Astrophysics Data System (ADS)
Hueso, César; Fabbri, Marco; de la Fuente, Cristina; Janés, Albert; Massuet, Joan; Zamora, Imanol; Gasca, Cristina; Hernández, Héctor; Vega, J. Ángel
2017-09-01
The methodology is devised by coupling different codes. The study of weather conditions, as part of the site data, determines the relative concentrations of radionuclides in the air using ARCON96. The activity in the air is characterized, depending on the source and release sequence specified in NUREG-1465, by the RADTRAD code, which provides the inner cloud source term contribution. Once the activities are known, energy spectra are inferred using ORIGEN-S and used as input for the models of the outer cloud, filters and containment generated with MCNP5. The sum of the different contributions must meet the habitability conditions specified by the CSN (Spanish Nuclear Regulatory Body) (TEDE <50 mSv and equivalent dose to the thyroid <500 mSv within 30 days following the accident), and the dose is optimized by varying parameters such as the CAGE location, the filtering flow and need for recirculation, and the thicknesses and compositions of the walls. The results for the most penalizing area meet the established criteria, and therefore the CAGE building design based on the methodology presented is radiologically validated.
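The final habitability check reduces to summing the dose contributions from the coupled codes and comparing them with the CSN limits quoted above; the sketch below shows that bookkeeping with entirely hypothetical dose values (the real terms come from ARCON96, RADTRAD, ORIGEN-S and MCNP5 runs).

```python
# Simple bookkeeping sketch of the habitability check, with hypothetical
# 30-day dose contributions in mSv (not values from the actual analysis).

contributions_tede = {
    "inner cloud (inleakage)": 12.0,
    "outer cloud (walls)":      8.5,
    "filters":                  4.0,
    "containment shine":        6.0,
}
contributions_thyroid = {
    "inner cloud (inleakage)": 180.0,
    "outer cloud (walls)":      60.0,
    "filters":                  35.0,
    "containment shine":        40.0,
}

tede = sum(contributions_tede.values())
thyroid = sum(contributions_thyroid.values())
ok = tede < 50.0 and thyroid < 500.0          # CSN habitability criteria cited above
status = "validated" if ok else "not validated"
print(f"TEDE = {tede:.1f} mSv, thyroid = {thyroid:.1f} mSv -> design {status}")
```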
Altszyler, Edgar; Ventura, Alejandra C; Colman-Lerner, Alejandro; Chernomoretz, Ariel
2017-01-01
Ultrasensitive response motifs, capable of converting graded stimuli into binary responses, are well-conserved in signal transduction networks. Although it has been shown that a cascade arrangement of multiple ultrasensitive modules can enhance the system's ultrasensitivity, how a given combination of layers affects a cascade's ultrasensitivity remains an open question for the general case. Here, we introduce a methodology that allows us to determine the presence of sequestration effects and to quantify the relative contribution of each module to the overall cascade's ultrasensitivity. The proposed analysis framework provides a natural link between global and local ultrasensitivity descriptors and it is particularly well-suited to characterize and understand mathematical models used to study real biological systems. As a case study, we have considered three mathematical models introduced by O'Shaughnessy et al. to study a tunable synthetic MAPK cascade, and we show how our methodology can help modelers better understand alternative models.
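One common way to express a cascade's global ultrasensitivity, which the sketch below illustrates on toy Hill modules (not the models of O'Shaughnessy et al.), is the apparent Hill coefficient derived from the stimulus levels that give 10% and 90% of the maximal response; the layer parameters here are assumptions.

```python
import numpy as np

# Apparent-Hill-coefficient sketch for a two-layer cascade of toy Hill modules:
# the effective Hill number is log(81) / log(EC90 / EC10), computed from the
# stimulus values giving 10% and 90% of the maximal output.

def hill(x, n, k):
    return x**n / (k**n + x**n)

def cascade(x):
    layer1 = hill(x, n=2.0, k=1.0)        # assumed upstream module
    return hill(layer1, n=3.0, k=0.3)     # assumed downstream module

def apparent_hill(response, x):
    out = response(x)
    out = out / out.max()                 # monotone increasing, normalized to 1
    x10 = x[np.searchsorted(out, 0.1)]
    x90 = x[np.searchsorted(out, 0.9)]
    return np.log(81.0) / np.log(x90 / x10)

x = np.logspace(-3, 2, 20000)
print(f"effective Hill coefficient of the cascade: {apparent_hill(cascade, x):.2f}")
print(f"layer-1 module alone                     : {apparent_hill(lambda s: hill(s, 2.0, 1.0), x):.2f}")
```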
Reference Model 6 (RM6): Oscillating Wave Energy Converter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Diana L; Smith, Chris; Jenne, Dale Scott
This report is an addendum to SAND2013-9040: Methodology for Design and Economic Analysis of Marine Energy Conversion (MEC) Technologies. This report describes an Oscillating Water Column Wave Energy Converter reference model design in a complementary manner to Reference Models 1-4 contained in the above report. In this report, a conceptual design for an Oscillating Water Column Wave Energy Converter (WEC) device appropriate for the modeled reference resource site was identified, and a detailed backward bent duct buoy (BBDB) device design was developed using a combination of numerical modeling tools and scaled physical models. Our team used the methodology in SAND2013-9040 for the economic analysis that included costs for designing, manufacturing, deploying, and operating commercial-scale MEC arrays, up to 100 devices. The methodology was applied to identify key cost drivers and to estimate levelized cost of energy (LCOE) for this RM6 Oscillating Water Column device in dollars per kilowatt-hour ($/kWh). Although many costs were difficult to estimate at this time due to the lack of operational experience, the main contribution of this work was to disseminate a detailed set of methodologies and models that allow for an initial cost analysis of this emerging technology. This project is sponsored by the U.S. Department of Energy's (DOE) Wind and Water Power Technologies Program Office (WWPTO), within the Office of Energy Efficiency & Renewable Energy (EERE). Sandia National Laboratories, the lead in this effort, collaborated with partners from National Laboratories, industry, and universities to design and test this reference model.
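The report's cost tables are not reproduced here; the sketch below shows only the generic levelized-cost-of-energy calculation that such an analysis rests on (discounted lifetime cost divided by discounted lifetime energy), with placeholder inputs rather than the RM6 estimates.

```python
# Back-of-the-envelope LCOE sketch (all inputs are placeholder assumptions):
# levelized cost = discounted lifetime costs / discounted lifetime energy.

def lcoe(capex, opex_per_year, annual_mwh, lifetime_years=20, discount_rate=0.07):
    disc_costs = capex
    disc_energy = 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1.0 + discount_rate) ** -year
        disc_costs += opex_per_year * factor
        disc_energy += annual_mwh * factor
    return disc_costs / disc_energy          # $ per MWh

# Hypothetical 100-device array: $400M capex, $12M/yr opex, 3500 MWh per device-year
print(f"LCOE ~ {lcoe(capex=400e6, opex_per_year=12e6, annual_mwh=100 * 3500):.0f} $/MWh")
```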
Cierkens, Katrijn; Plano, Salvatore; Benedetti, Lorenzo; Weijers, Stefan; de Jonge, Jarno; Nopens, Ingmar
2012-01-01
Application of activated sludge models (ASMs) to full-scale wastewater treatment plants (WWTPs) is still hampered by the problem of model calibration of these over-parameterised models. This either requires expert knowledge or global methods that explore a large parameter space. However, a better balance in structure between the submodels (ASM, hydraulic, aeration, etc.) and improved quality of influent data result in much smaller calibration efforts. In this contribution, a methodology is proposed that links data frequency and model structure to calibration quality and output uncertainty. It is composed of defining the model structure, the input data, an automated calibration, confidence interval computation and uncertainty propagation to the model output. Apart from the last step, the methodology is applied to an existing WWTP using three models differing only in the aeration submodel. A sensitivity analysis was performed on all models, allowing the ranking of the most important parameters to select in the subsequent calibration step. The aeration submodel proved very important to get good NH(4) predictions. Finally, the impact of data frequency was explored. Lowering the frequency resulted in larger deviations of parameter estimates from their default values and larger confidence intervals. Autocorrelation due to high frequency calibration data has an opposite effect on the confidence intervals. The proposed methodology opens doors to facilitate and improve calibration efforts and to design measurement campaigns.
Multirate flutter suppression system design for the Benchmark Active Controls Technology Wing
NASA Technical Reports Server (NTRS)
Berg, Martin C.; Mason, Gregory S.
1994-01-01
To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies will be applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing (also called the PAPA wing). Eventually, the designs will be implemented in hardware and tested on the BACT wing in a wind tunnel. This report describes a project at the University of Washington to design a multirate flutter suppression system for the BACT wing. The objective of the project was twofold. First, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing. The contributions of this project are (1) development of an algorithm for synthesizing robust low order multirate control laws (the algorithm is capable of synthesizing a single compensator which stabilizes both the nominal plant and multiple plant perturbations); (2) development of a multirate design methodology, and supporting software, for modeling, analyzing and synthesizing multirate compensators; and (3) design of a multirate flutter suppression system for NASA's BACT wing which satisfies the specified design criteria. This report describes each of these contributions in detail. Section 2.0 discusses our design methodology. Section 3.0 details the results of our multirate flutter suppression system design for the BACT wing. Finally, Section 4.0 presents our conclusions and suggestions for future research. The body of the report focuses primarily on the results. The associated theoretical background appears in the three technical papers that are included as Attachments 1-3. Attachment 4 is a user's manual for the software that is key to our design methodology.
[Artificial neural networks for decision making in urologic oncology].
Remzi, M; Djavan, B
2007-06-01
This chapter presents a detailed introduction regarding Artificial Neural Networks (ANNs) and their contribution to modern Urologic Oncology. It includes a description of ANN methodology and points out the differences between Artificial Intelligence and traditional statistical models in terms of usefulness for patients and clinicians, and its advantages over current statistical analysis.
Supply Chain Development: Insights from Strategic Niche Management
ERIC Educational Resources Information Center
Caniels, Marjolein C. J.; Romijn, Henny A.
2008-01-01
Purpose: The purpose of this paper is to contribute to the study of supply chain design from the perspective of complex dynamic systems. Unlike extant studies that use formal simulation modelling and associated methodologies rooted in the physical sciences, it adopts a framework rooted in the social sciences, strategic niche management, which…
ERIC Educational Resources Information Center
Develaki, Maria
2017-01-01
Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and…
Madlambayan, Gerard J.; Butler, Jason M.; Hosaka, Koji; Jorgensen, Marda; Fu, Dongtao; Guthrie, Steven M.; Shenoy, Anitha K.; Brank, Adam; Russell, Kathryn J.; Otero, Jaclyn; Siemann, Dietmar W.
2009-01-01
Adult bone marrow (BM) contributes to neovascularization in some but not all settings, and reasons for these discordant results have remained unexplored. We conducted novel comparative studies in which multiple neovascularization models were established in single mice to reduce variations in experimental methodology. In different combinations, BM contribution was detected in ischemic retinas and, to a lesser extent, Lewis lung carcinoma cells, whereas B16 melanomas showed little to no BM contribution. Using this spectrum of BM contribution, we demonstrate the necessity for site-specific expression of stromal-derived factor-1α (SDF-1α) and its mobilizing effects on BM. Blocking SDF-1α activity with neutralizing antibodies abrogated BM-derived neovascularization in lung cancer and retinopathy. Furthermore, secondary transplantation of single hematopoietic stem cells (HSCs) showed that HSCs are a long-term source of neovasculogenesis and that CD133+CXCR4+ myeloid progenitor cells directly participate in new blood vessel formation in response to SDF-1α. The varied BM contribution seen in different model systems is suggestive of redundant mechanisms governing postnatal neovasculogenesis and provides an explanation for contradictory results observed in the field. PMID:19717647
Adaptation of Mesoscale Weather Models to Local Forecasting
NASA Technical Reports Server (NTRS)
Manobianco, John T.; Taylor, Gregory E.; Case, Jonathan L.; Dianic, Allan V.; Wheeler, Mark W.; Zack, John W.; Nutter, Paul A.
2003-01-01
Methodologies have been developed for (1) configuring mesoscale numerical weather-prediction models for execution on high-performance computer workstations to make short-range weather forecasts for the vicinity of the Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) and (2) evaluating the performances of the models as configured. These methodologies have been implemented as part of a continuing effort to improve weather forecasting in support of operations of the U.S. space program. The models, methodologies, and results of the evaluations also have potential value for commercial users who could benefit from tailoring their operations and/or marketing strategies based on accurate predictions of local weather. More specifically, the purpose of developing the methodologies for configuring the models to run on computers at KSC and CCAFS is to provide accurate forecasts of winds, temperature, and such specific thunderstorm-related phenomena as lightning and precipitation. The purpose of developing the evaluation methodologies is to maximize the utility of the models by providing users with assessments of the capabilities and limitations of the models. The models used in this effort thus far include the Mesoscale Atmospheric Simulation System (MASS), the Regional Atmospheric Modeling System (RAMS), and the National Centers for Environmental Prediction Eta Model ( Eta for short). The configuration of the MASS and RAMS is designed to run the models at very high spatial resolution and incorporate local data to resolve fine-scale weather features. Model preprocessors were modified to incorporate surface, ship, buoy, and rawinsonde data as well as data from local wind towers, wind profilers, and conventional or Doppler radars. The overall evaluation of the MASS, Eta, and RAMS was designed to assess the utility of these mesoscale models for satisfying the weather-forecasting needs of the U.S. space program. The evaluation methodology includes objective and subjective verification methodologies. Objective (e.g., statistical) verification of point forecasts is a stringent measure of model performance, but when used alone, it is not usually sufficient for quantifying the value of the overall contribution of the model to the weather-forecasting process. This is especially true for mesoscale models with enhanced spatial and temporal resolution that may be capable of predicting meteorologically consistent, though not necessarily accurate, fine-scale weather phenomena. Therefore, subjective (phenomenological) evaluation, focusing on selected case studies and specific weather features, such as sea breezes and precipitation, has been performed to help quantify the added value that cannot be inferred solely from objective evaluation.
de la Estrella, Manuel; Mateo, Rubén G.; Wieringa, Jan J.; Mackinder, Barbara; Muñoz, Jesús
2012-01-01
Objectives: Species Distribution Models (SDMs) are used to produce predictions of potential Leguminosae diversity in West Central Africa. Those predictions are evaluated subsequently using expert opinion. The established methodology of combining all SDMs is refined to assess species diversity within five defined vegetation types. Potential species diversity is thus predicted for each vegetation type respectively. The primary aim of the new methodology is to define, in more detail, areas of species richness for conservation planning. Methodology: Using Maxent, SDMs based on a suite of 14 environmental predictors were generated for 185 West Central African Leguminosae species, each categorised according to one of five vegetation types: Afromontane, coastal, non-flooded forest, open formations, or riverine forest. The relative contribution of each environmental variable was compared between different vegetation types using a nonparametric Kruskal-Wallis analysis followed by a post-hoc Kruskal-Wallis Paired Comparison contrast. Legume species diversity patterns were explored initially using the typical method of stacking all SDMs. Subsequently, five different ensemble models were generated by partitioning SDMs according to vegetation category. Ecological modelers worked with legume specialists to improve data integrity and integrate expert opinion in the interpretation of individual species models and potential species richness predictions for different vegetation types. Results/Conclusions: Of the 14 environmental predictors used, five showed no difference in their relative contribution to the different vegetation models. Of the nine discriminating variables, the majority were related to temperature variation. The set of variables that played a major role in the Afromontane species diversity model differed significantly from the sets of variables of greatest relative importance in other vegetation categories. The traditional approach of stacking all SDMs indicated overall centers of diversity in the region but the maps indicating potential species richness by vegetation type offered more detailed information on which conservation efforts can be focused. PMID:22911808
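The partitioned-stacking idea can be sketched in a few lines: binary SDM rasters are summed overall and then summed within each vegetation category. The rasters and vegetation labels below are synthetic stand-ins, not the 185 Maxent outputs.

```python
import numpy as np

# Minimal sketch of stacking binary SDM outputs into potential-richness maps,
# both overall and partitioned by vegetation category (synthetic rasters).

rng = np.random.default_rng(7)
vegetation_types = ["Afromontane", "coastal", "non-flooded forest",
                    "open formations", "riverine forest"]
n_species, height, width = 185, 50, 50

# One binary presence/absence raster per species, with an assumed vegetation label
sdms = rng.random((n_species, height, width)) > 0.8
labels = rng.choice(vegetation_types, size=n_species)

overall_richness = sdms.sum(axis=0)                       # classic "stack everything"
richness_by_type = {veg: sdms[labels == veg].sum(axis=0) for veg in vegetation_types}

print("max overall potential richness:", int(overall_richness.max()))
for veg, grid in richness_by_type.items():
    print(f"  {veg:20s} max richness = {int(grid.max())}")
```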
Estimating Household Travel Energy Consumption in Conjunction with a Travel Demand Forecasting Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garikapati, Venu M.; You, Daehyun; Zhang, Wenwen
This paper presents a methodology for the calculation of the consumption of household travel energy at the level of the traffic analysis zone (TAZ) in conjunction with information that is readily available from a standard four-step travel demand model system. This methodology embeds two algorithms. The first provides a means of allocating non-home-based trips to residential zones that are the source of such trips, whereas the second provides a mechanism for incorporating the effects of household vehicle fleet composition on fuel consumption. The methodology is applied to the greater Atlanta, Georgia, metropolitan region in the United States and is found to offer a robust mechanism for calculating the footprint of household travel energy at the level of the individual TAZ; this mechanism makes possible the study of variations in the energy footprint across space. The travel energy footprint is strongly correlated with the density of the built environment, although socioeconomic differences across TAZs also likely contribute to differences in travel energy footprints. The TAZ-level calculator of the footprint of household travel energy can be used to analyze alternative futures and relate differences in the energy footprint to differences in a number of contributing factors and thus enables the design of urban form, formulation of policy interventions, and implementation of awareness campaigns that may produce more-sustainable patterns of energy consumption.
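In outline, the TAZ-level calculation multiplies zone-allocated trips by trip length and a fleet-weighted fuel intensity; the sketch below uses hypothetical zones, fleet shares and fuel-intensity figures, not the Atlanta model inputs.

```python
# Sketch of per-zone travel-energy bookkeeping with hypothetical numbers:
# trips allocated to a residential zone x mean trip length x fleet-weighted
# fuel intensity = daily fuel footprint of that zone.

fuel_intensity_l_per_km = {"car": 0.09, "suv": 0.12, "pickup": 0.14}   # assumed

taz_data = {
    # taz_id: (daily trips incl. allocated non-home-based, mean trip km, fleet shares)
    101: (12000,  9.5, {"car": 0.6, "suv": 0.3, "pickup": 0.1}),
    102: ( 4500, 14.0, {"car": 0.4, "suv": 0.4, "pickup": 0.2}),
}

def zone_energy_litres(trips, mean_km, fleet_shares):
    blended = sum(share * fuel_intensity_l_per_km[veh] for veh, share in fleet_shares.items())
    return trips * mean_km * blended

for taz, (trips, km, shares) in taz_data.items():
    print(f"TAZ {taz}: {zone_energy_litres(trips, km, shares):,.0f} L fuel per day")
```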
A methodology to assess the economic impact of power storage technologies.
El-Ghandour, Laila; Johnson, Timothy C
2017-08-13
We present a methodology for assessing the economic impact of power storage technologies. The methodology is founded on classical approaches to the optimal stopping of stochastic processes but involves an innovation that circumvents the need to, ex ante, identify the form of a driving process and works directly on observed data, avoiding model risks. Power storage is regarded as a complement to the intermittent output of renewable energy generators and is therefore important in contributing to the reduction of carbon-intensive power generation. Our aim is to present a methodology suitable for use by policy makers that is simple to maintain, adaptable to different technologies and easy to interpret. The methodology has benefits over current techniques and is able to value, by identifying a viable optimal operational strategy, a conceived storage facility based on compressed air technology operating in the UK. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, John R.; Brooks, Dusty Marie
In pressurized water reactors, the prevention, detection, and repair of cracks within dissimilar metal welds is essential to ensure proper plant functionality and safety. Weld residual stresses, which are difficult to model and cannot be directly measured, contribute to the formation and growth of cracks due to primary water stress corrosion cracking. Additionally, the uncertainty in weld residual stress measurements and modeling predictions is not well understood, further complicating the prediction of crack evolution. The purpose of this document is to develop methodology to quantify the uncertainty associated with weld residual stress that can be applied to modeling predictions and experimental measurements. Ultimately, the results can be used to assess the current state of uncertainty and to build confidence in both modeling and experimental procedures. The methodology consists of statistically modeling the variation in the weld residual stress profiles using functional data analysis techniques. Uncertainty is quantified using statistical bounds (e.g. confidence and tolerance bounds) constructed with a semi-parametric bootstrap procedure. Such bounds describe the range in which quantities of interest, such as means, are expected to lie as evidenced by the data. The methodology is extended to provide direct comparisons between experimental measurements and modeling predictions by constructing statistical confidence bounds for the average difference between the two quantities. The statistical bounds on the average difference can be used to assess the level of agreement between measurements and predictions. The methodology is applied to experimental measurements of residual stress obtained using two strain relief measurement methods and predictions from seven finite element models developed by different organizations during a round robin study.
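A plain bootstrap confidence band for a mean stress profile conveys the flavor of the statistical bounds described above, although the report's semi-parametric, functional-data procedure is richer; the depth profiles below are simulated, not round-robin measurements.

```python
import numpy as np

# Bootstrap confidence band for a mean weld-residual-stress profile (synthetic data):
# resample the set of measured profiles with replacement, recompute the mean profile,
# and take pointwise percentiles of the bootstrap means.

rng = np.random.default_rng(3)
depth = np.linspace(0.0, 1.0, 50)                      # normalized through-wall depth
true_mean = 300.0 * np.cos(2.0 * np.pi * depth)        # assumed MPa profile
profiles = true_mean + rng.normal(0.0, 60.0, size=(12, depth.size))   # 12 "measurements"

n_boot = 2000
boot_means = np.empty((n_boot, depth.size))
for b in range(n_boot):
    resample = profiles[rng.integers(0, profiles.shape[0], profiles.shape[0])]
    boot_means[b] = resample.mean(axis=0)

lower, upper = np.percentile(boot_means, [2.5, 97.5], axis=0)
print("95% band half-width at mid-wall (MPa):", round((upper - lower)[25] / 2.0, 1))
```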
Sturtz, Timothy M; Schichtel, Bret A; Larson, Timothy V
2014-10-07
Source contributions to total fine particle carbon predicted by a chemical transport model (CTM) were incorporated into the positive matrix factorization (PMF) receptor model to form a receptor-oriented hybrid model. The level of influence of the CTM versus traditional PMF was varied using a weighting parameter applied to an objective function as implemented in the Multilinear Engine (ME-2). The methodology provides the ability to separate features that would not be identified using PMF alone, without sacrificing fit to observations. The hybrid model was applied to IMPROVE data taken from 2006 through 2008 at Monture and Sula Peak, Montana. It was able to separately identify major contributions of total carbon (TC) from wildfires and minor contributions from biogenic sources. The predictions of TC had a lower cross-validated RMSE than those from either PMF or CTM alone. Two unconstrained, minor features were identified at each site: a soil derived feature with elevated summer impacts and a feature enriched in sulfate and nitrate with significant, but sporadic contributions across the sampling period. The respective mean TC contributions from wildfires, biogenic emissions, and other sources were 1.18, 0.12, and 0.12 µgC/m³ at Monture and 1.60, 0.44, and 0.06 µgC/m³ at Sula Peak.
Sherry, Simon B; Sabourin, Brigitte C; Hall, Peter A; Hewitt, Paul L; Flett, Gordon L; Gralnick, Tara M
2014-12-01
The perfectionism model of binge eating (PMOBE) is an integrative model explaining the link between perfectionism and binge eating. This model proposes socially prescribed perfectionism confers risk for binge eating by generating exposure to 4 putative binge triggers: interpersonal discrepancies, low interpersonal esteem, depressive affect, and dietary restraint. The present study addresses important gaps in knowledge by testing if these 4 binge triggers uniquely predict changes in binge eating on a daily basis and if daily variations in each binge trigger mediate the link between socially prescribed perfectionism and daily binge eating. Analyses also tested if proposed mediational models generalized across Asian and European Canadians. The PMOBE was tested in 566 undergraduate women using a 7-day daily diary methodology. Depressive affect predicted binge eating, whereas anxious affect did not. Each binge trigger uniquely contributed to binge eating on a daily basis. All binge triggers except for dietary restraint mediated the relationship between socially prescribed perfectionism and change in daily binge eating. Results suggested cross-cultural similarities, with the PMOBE applying to both Asian and European Canadian women. The present study advances understanding of the personality traits and the contextual conditions accompanying binge eating and provides an important step toward improving treatments for people suffering from eating binges and associated negative consequences.
a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation
NASA Astrophysics Data System (ADS)
Kıvılcım, C. Ö.; Duran, Z.
2016-06-01
The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation purposes has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open source software environment, using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.
Constrained Stochastic Extended Redundancy Analysis.
DeSarbo, Wayne S; Hwang, Heungsun; Stadler Blank, Ashley; Kappe, Eelco
2015-06-01
We devise a new statistical methodology called constrained stochastic extended redundancy analysis (CSERA) to examine the comparative impact of various conceptual factors, or drivers, as well as the specific predictor variables that contribute to each driver on designated dependent variable(s). The technical details of the proposed methodology, the maximum likelihood estimation algorithm, and model selection heuristics are discussed. A sports marketing consumer psychology application is provided in a Major League Baseball (MLB) context where the effects of six conceptual drivers of game attendance and their defining predictor variables are estimated. Results compare favorably to those obtained using traditional extended redundancy analysis (ERA).
Bernal, Guillermo; Adames, Cristina
2017-08-01
Major advancements have been achieved in research on the cultural adaptation of prevention and treatment interventions that are conducted with diverse ethnocultural groups. This commentary addresses conceptual, ethical, contextual, and methodological issues related to cultural adaptations. The articles in this special issue represent a major contribution to the study of cultural adaptations in prevention science. We frame our analysis of fidelity to core intervention components using a conceptual approach that examines (a) the propositional model (theory of change), (b) the procedural model (theory of action, methods), and (c) the philosophical assumptions that undergird these models. Regarding ethics, we caution against imposing the norms, values, and world views of the Western dominant society onto vulnerable populations such as ethnocultural groups. Given that the assumption of universality in behavioral science has been questioned, and as randomized clinical trials (RCTs) seldom examine the ecological validity of evidence-based interventions and treatments (EBI/T), imposing such interventions onto ethnocultural groups is problematic since these interventions contain values, norms, beliefs, and worldviews that may be contrary to those held by many ethnocultural groups. Regarding methods, several innovative designs are discussed that serve as alternatives to the RCT and represent an important contribution to prevention science. Also, we discuss guidelines for conducting cultural adaptations. Finally, the articles in this special issue make a major contribution to the growing field of cultural adaptation of preventive interventions with ethnocultural groups and majority-world populations.
Estimating stream discharge from a Himalayan Glacier using coupled satellite sensor data
NASA Astrophysics Data System (ADS)
Child, S. F.; Stearns, L. A.; van der Veen, C. J.; Haritashya, U. K.; Tarpanelli, A.
2015-12-01
The 4th IPCC report highlighted our limited understanding of Himalayan glacier behavior and contribution to the region's hydrology. Seasonal snow and glacier melt in the Himalayas are important sources of water, but estimates greatly differ about the actual contribution of melted glacier ice to stream discharge. A more comprehensive understanding of the contribution of glaciers to stream discharge is needed because streams being fed by glaciers affect the livelihoods of a large part of the world's population. Most of the streams in the Himalayas are unmonitored because in situ measurements are logistically difficult and costly. This necessitates the use of remote sensing platforms to obtain estimates of river discharge for validating hydrological models. In this study, we estimate stream discharge using cost-effective methods via repeat satellite imagery from Landsat-8 and SENTINEL-1A sensors. The methodology is based on previous studies, which show that ratio values from optical satellite bands correlate well with measured stream discharge. While similar, our methodology relies on significantly higher resolution imagery (30 m) and utilizes bands that are in the blue and near-infrared spectrum as opposed to previous studies using 250 m resolution imagery and spectral bands only in the near-infrared. Higher resolution imagery is necessary for streams where the source is a glacier's terminus because the width of the stream is often only 10s of meters. We validate our methodology using two rivers in the state of Kansas, where stream gauges are plentiful. We then apply our method to the Bhagirathi River, in the North-Central Himalayas, which is fed by the Gangotri Glacier and has a well monitored stream gauge. The analysis will later be used to couple river discharge and glacier flow and mass balance through an integrated hydrologic model in the Bhagirathi Basin.
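A band-ratio retrieval of discharge is, at its core, a regression of gauged flow on a reflectance ratio that is then applied to new scenes; the sketch below uses synthetic values rather than actual Landsat-8 or SENTINEL-1A retrievals, and the ratio-response relationship is an assumption for illustration.

```python
import numpy as np

# Band-ratio discharge sketch on synthetic data: regress gauged discharge on a
# reflectance ratio between a stable "land" calibration pixel and a water-sensitive
# "measurement" pixel, then apply the fitted relation to an ungauged scene.

rng = np.random.default_rng(5)
discharge = rng.uniform(20.0, 400.0, size=40)                   # m^3/s at a gauged reach
ratio = 0.004 * discharge + 0.3 + rng.normal(0.0, 0.05, 40)     # assumed ratio response

slope, intercept = np.polyfit(ratio, discharge, deg=1)
predicted = slope * ratio + intercept
rmse = np.sqrt(np.mean((predicted - discharge) ** 2))
print(f"Q ~ {slope:.0f} * ratio + {intercept:.0f}, RMSE = {rmse:.0f} m^3/s")

new_ratio = 1.1                                                  # ratio from a new image
print(f"estimated discharge for new scene: {slope * new_ratio + intercept:.0f} m^3/s")
```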
A methodology for identification and control of electro-mechanical actuators
Tutunji, Tarek A.; Saleem, Ashraf
2015-01-01
Mechatronic systems are fully-integrated engineering systems that are composed of mechanical, electronic, and computer control sub-systems. These integrated systems use electro-mechanical actuators to cause the required motion. Therefore, the design of appropriate controllers for these actuators are an essential step in mechatronic system design. In this paper, a three-stage methodology for real-time identification and control of electro-mechanical actuator plants is presented, tested, and validated. First, identification models are constructed from experimental data to approximate the plants’ response. Second, the identified model is used in a simulation environment for the purpose of designing a suitable controller. Finally, the designed controller is applied and tested on the real plant through Hardware-in-the-Loop (HIL) environment. The described three-stage methodology provides the following practical contributions: • Establishes an easy-to-follow methodology for controller design of electro-mechanical actuators. • Combines off-line and on-line controller design for practical performance. • Modifies the HIL concept by using physical plants with computer control (rather than virtual plants with physical controllers). Simulated and experimental results for two case studies, induction motor and vehicle drive system, are presented in order to validate the proposed methodology. These results showed that electromechanical actuators can be identified and controlled using an easy-to-duplicate and flexible procedure. PMID:26150992
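A heavily condensed sketch of the first two stages (identification from data, then controller design against the identified model) is given below for a hypothetical first-order actuator with assumed PI gains; the Hardware-in-the-Loop deployment stage is not reproduced, and none of the numbers come from the case studies.

```python
import numpy as np

# Stage 1: fit a first-order ARX model of a hypothetical actuator by least squares
# from step-response data.  Stage 2: tune a PI loop against the identified model.

rng = np.random.default_rng(11)

# --- Stage 1: identification from (assumed) experimental data ----------------
a_true, b_true = 0.92, 0.35
u = np.ones(200); u[:20] = 0.0                     # step input applied at sample 20
y = np.zeros(200)
for k in range(1, 200):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + rng.normal(0.0, 0.01)

phi = np.column_stack([y[:-1], u[:-1]])            # regressors [y(k-1), u(k-1)]
a_hat, b_hat = np.linalg.lstsq(phi, y[1:], rcond=None)[0]

# --- Stage 2: controller design in simulation against the identified model ---
kp, ki = 0.8, 0.15                                 # assumed PI gains
setpoint, y_sim, integ = 1.0, 0.0, 0.0
for _ in range(150):
    err = setpoint - y_sim
    integ += err
    u_cmd = kp * err + ki * integ
    y_sim = a_hat * y_sim + b_hat * u_cmd

print(f"identified model: y(k) = {a_hat:.2f} y(k-1) + {b_hat:.2f} u(k-1)")
print(f"closed-loop output after 150 steps: {y_sim:.3f} (setpoint 1.0)")
```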
Air quality assessment of benzo(a)pyrene from asphalt plant operation.
Gibson, Nigel; Stewart, Robert; Rankin, Erika
2012-01-01
A study has been carried out to assess the contribution of Polycyclic Aromatic Hydrocarbons (PAHs) from asphalt plant operation, utilising Benzo(a)pyrene (BaP) as a marker for PAHs, to the background air concentration around asphalt plants in the UK. The purpose behind this assessment was to determine whether the use of published BaP emission factors based on the US Environmental Protection Agency (EPA) methodology is appropriate in the context of the UK, especially as the EPA methodology does not give BaP emission factors for all activities. The study also aimed to improve the overall understanding of BaP emissions from asphalt plants in the UK, and determine whether site location and operation is likely to influence the contribution of PAHs to ambient air quality. In order to establish whether the use of US EPA emissions factors is appropriate, the study has compared the BaP emissions measured and calculated emissions rates from two UK sites with those estimated using US EPA emission factors. A dispersion modelling exercise was carried out to show the BaP contribution to ambient air around each site. This study showed that, as the US EPA methodology does not provide factors for all emission sources on asphalt plants, their use may give rise to over- or under-estimations, particularly where sources of BaP are temperature dependent. However, the contribution of both the estimated and measured BaP concentrations to environmental concentration were low, averaging about 0.05 ng m(-3) at the boundary of the sites, which is well below the UK BaP assessment threshold of 0.25 ng m(-3). Therefore, BaP concentrations, and hence PAH concentrations, from similar asphalt plant operations are unlikely to contribute negatively to ambient air quality.
Neuroanthropology: a humanistic science for the study of the culture–brain nexus
Turner, Robert; Lewis, E. Douglas; Egan, Gary
2010-01-01
In this article, we argue that a combined anthropology/neuroscience field of enquiry can make a significant and distinctive contribution to the study of the relationship between culture and the brain. This field, which can appropriately be termed neuroanthropology, is conceived of as being complementary to and mutually informative with social and cultural neuroscience. We start by providing an introduction to the culture concept in anthropology. We then present a detailed characterization of neuroanthropology and its methods and how they relate to the anthropological understanding of culture. The field is described as a humanistic science, that is, a field of enquiry founded on the perceived epistemological and methodological interdependence of science and the humanities. We also provide examples that illustrate the proposed methodological model for neuroanthropology. We conclude with a discussion about specific contributions the field can make to the study of the culture–brain nexus. PMID:19654141
Xiang, Yang; Delbarre, Hervé; Sauvage, Stéphane; Léonardis, Thierry; Fourmentin, Marc; Augustin, Patrick; Locoge, Nadine
2012-03-01
During summer 2009, online measurements of 25 Volatile Organic Compounds (VOCs) from C6 to C10 as well as micro-meteorological parameters were simultaneously performed in the industrial city of Dunkerque. With the obtained data set, we developed a methodology to examine how the contributions of different source categories depend on atmospheric turbulence, and the results provided identification of emission modes. Eight factors were resolved by using the Positive Matrix Factorization (PMF) model, and three of them were associated with mixed sources. The observed behaviour of the contributions with turbulence led to attributing some factors to sources at ground level, and other factors to sources in the upper part of the surface layer. The impact of vertical turbulence on pollutant dispersion is also affected by the distance between the sources and the receptor site. Copyright © 2011 Elsevier Ltd. All rights reserved.
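PMF is usually run with dedicated software (e.g., EPA PMF), but the factor-analytic idea can be illustrated loosely with a non-negative matrix factorization; the sketch below decomposes a synthetic VOC concentration matrix into factor contributions and chemical profiles with scikit-learn's NMF, which is related to but not identical with PMF (PMF additionally weights each observation by its measurement uncertainty). All species counts and values are invented.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# Synthetic data: 200 hourly samples x 25 VOC species (all values invented)
true_profiles = rng.random((8, 25))              # 8 source chemical profiles
true_contrib = rng.gamma(2.0, 1.0, (200, 8))     # time-varying source contributions
X = true_contrib @ true_profiles + rng.normal(0, 0.05, (200, 25)).clip(min=0)

model = NMF(n_components=8, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # factor contributions per sample (analogue of PMF's G)
F = model.components_        # factor chemical profiles (analogue of PMF's F)

# Mean contribution of each factor, the quantity one would later relate to turbulence
print(G.mean(axis=0))
```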
Neuroanthropology: a humanistic science for the study of the culture-brain nexus.
Domínguez Duque, Juan F; Turner, Robert; Lewis, E Douglas; Egan, Gary
2010-06-01
In this article, we argue that a combined anthropology/neuroscience field of enquiry can make a significant and distinctive contribution to the study of the relationship between culture and the brain. This field, which can appropriately be termed neuroanthropology, is conceived of as being complementary to and mutually informative with social and cultural neuroscience. We start by providing an introduction to the culture concept in anthropology. We then present a detailed characterization of neuroanthropology and its methods and how they relate to the anthropological understanding of culture. The field is described as a humanistic science, that is, a field of enquiry founded on the perceived epistemological and methodological interdependence of science and the humanities. We also provide examples that illustrate the proposed methodological model for neuroanthropology. We conclude with a discussion about specific contributions the field can make to the study of the culture-brain nexus.
Ellsworth C. Dougherty: A Pioneer in the Selection of Caenorhabditis elegans as a Model Organism
Ferris, Howard
2015-01-01
Ellsworth Dougherty (1921–1965) was a man of impressive intellectual dimensions and interests; in a relatively short career he contributed enormously as researcher and scholar to the biological knowledge base for selection of Caenorhabditis elegans as a model organism in neurobiology, genetics, and molecular biology. He helped guide the choice of strains that were eventually used, and, in particular, he developed the methodology and understanding for the nutrition and axenic culture of nematodes and other organisms. Dougherty insisted upon a concise terminology for culture techniques and coined descriptive neologisms that were justified by their linguistic roots. Among other contributions, he refined the classification system for the Protista. PMID:26272995
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste-generating industries. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously for waste treatment and disposal and hinder the further development of the industry. It is therefore an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) systematically use available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
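The dissertation's process models are not reproduced in the abstract; as a minimal sketch of the kind of first-principles rinse-tank mass balance such models build on, the following integrates a single well-mixed rinse tank receiving drag-out from plated parts. All parameter values are hypothetical.

```python
# Single well-mixed rinse tank: V dC/dt = D*(C_drag - C) + Q*(C_feed - C)
V = 400.0       # tank volume, L (hypothetical)
D = 0.5         # drag-out flow carried in on the parts, L/min (hypothetical)
C_drag = 50.0   # contaminant concentration of the drag-out, g/L (hypothetical)
Q = 8.0         # fresh-water feed and overflow, L/min (hypothetical)
C_feed = 0.0    # clean feed water
dt, t_end = 0.5, 480.0
C = 5.0         # initial rinse-water concentration, g/L
for _ in range(int(t_end / dt)):
    dC = (D * (C_drag - C) + Q * (C_feed - C)) / V
    C += dt * dC
print(f"rinse concentration after {t_end:.0f} min: {C:.2f} g/L")
# Steady state D*C_drag/(D + Q) ≈ 2.94 g/L; raising Q dilutes more but generates more wastewater,
# which is exactly the kind of trade-off a quantitative WM analysis would weigh.
```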
ERIC Educational Resources Information Center
Seman, Laio Oriel; Hausmann, Romeu; Bezerra, Eduardo Augusto
2018-01-01
Contribution: This paper presents the "PBL classroom model," an agent-based simulation (ABS) that allows testing of several scenarios of a project-based learning (PBL) application by considering different levels of soft-skills, and students' perception of the methodology. Background: While the community has made great advances in…
Design of a Competency-Based Assessment Model in the Field of Accounting
ERIC Educational Resources Information Center
Ciudad-Gómez, Adelaida; Valverde-Berrocoso, Jesús
2012-01-01
This paper presents the phases involved in the design of a methodology to contribute both to the acquisition of competencies and to their assessment in the field of Financial Accounting, within the European Higher Education Area (EHEA) framework, which we call MANagement of COMpetence in the areas of Accounting (MANCOMA). Having selected and…
ERIC Educational Resources Information Center
Grace, Rebekah; Bowes, Jennifer
2011-01-01
This paper contributes to the discussion around methodologies effective in gathering the perspectives of young children for the purposes of research. It describes ecocultural theory, a theoretical model that has grown out of anthropology and cross-cultural psychology, and argues for the benefits of applying an ecocultural approach to interviews…
Handling Math Expressions in Economics: Recoding Spreadsheet Teaching Tool of Growth Models
ERIC Educational Resources Information Center
Moro-Egido, Ana I.; Pedauga, Luis E.
2017-01-01
In the present paper, we develop a teaching methodology for economic theory. The main contribution of this paper relies on combining the interactive characteristics of spreadsheet programs such as Excel and Unicode plain-text linear format for mathematical expressions. The advantage of Unicode standard rests on its ease for writing and reading…
Create full-scale predictive economic models on ROI and innovation with performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joseph, Earl C.; Conway, Steve
The U.S. Department of Energy (DOE), the world's largest buyer and user of supercomputers, awarded IDC Research, Inc. a grant to create two macroeconomic models capable of quantifying, respectively, financial and non-financial (innovation) returns on investments in HPC resources. Following a 2013 pilot study in which we created the models and tested them on about 200 real-world HPC cases, DOE authorized us to conduct a full-out, three-year grant study to collect and measure many more examples, a process that would also subject the methodology to further testing and validation. A secondary, "stretch" goal of the full-out study was to advance the methodology from association toward (but not all the way to) causation, by eliminating the effects of some of the other factors that might be contributing, along with HPC investments, to the returns produced in the investigated projects.
Analytical modeling of helicopter static and dynamic induced velocity in GRASP
NASA Technical Reports Server (NTRS)
Kunz, Donald L.; Hodges, Dewey H.
1987-01-01
The methodology used by the General Rotorcraft Aeromechanical Stability Program (GRASP) to model the characteristics of the flow through a helicopter rotor in hovering or axial flight is described. Since the induced flow plays a significant role in determining the aeroelastic properties of rotorcraft, the computation of the induced flow is an important aspect of the program. Because of the combined finite-element/multibody methodology used as the basis for GRASP, the implementation of induced velocity calculations presented an unusual challenge to the developers. To preserve the modelling flexibility and generality of the code, it was necessary to depart from the traditional methods of computing the induced velocity. This is accomplished by calculating the actuator disc contributions to the rotor loads in a separate element called the air mass element, and then performing the calculations of the aerodynamic forces on individual blade elements within the aeroelastic beam element.
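For context, the classical actuator-disc (momentum-theory) relation that such air-mass-element formulations generalize gives the uniform induced velocity of an ideal hovering rotor; this is the standard textbook result, not a statement of GRASP's specific implementation:

```latex
% Ideal actuator disc in hover: T = rotor thrust, \rho = air density, A = disc area
v_i = \sqrt{\frac{T}{2 \rho A}}, \qquad
P_{\mathrm{ideal}} = T\, v_i = \frac{T^{3/2}}{\sqrt{2 \rho A}}
```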
Vygotsky's Methodological Contribution to Sociocultural Theory.
ERIC Educational Resources Information Center
Mahn, Holbrook
1999-01-01
This article introduces major contributions of educational psychologist, Lev S. Vygotsky, through examination of his dialectical methodological approach. Topics discussed include semiotic mediation, social sources of development, verbal thinking, concept formation, spontaneous and scientific concepts, the zone of proximal development, and higher…
A Hybrid Methodology for Modeling Risk of Adverse Events in Complex Health-Care Settings.
Kazemi, Reza; Mosleh, Ali; Dierks, Meghan
2017-03-01
In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system "safer." In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain, this can be extremely challenging due to the wide variability in the way that health-care processes and interventions are executed and also due to the dynamic nature of risk in this particular domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors and also physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AE. BBNs are networks of probabilities that can capture probabilistic relations between variables and contain historical information about their relationship, and are powerful tools for modeling causes and effects in many domains. The model is intended to support hospital decisions with regard to staffing, length of stay, and investments in safety, which evolve dynamically over time. The methodology has been applied in modeling the two types of common AEs: pressure ulcers and vascular-catheter-associated infection, and the models have been validated with eight years of clinical data and use of expert opinion. © 2017 Society for Risk Analysis.
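As a toy illustration of the hybrid idea (not the authors' model), the sketch below Euler-integrates a single organizational "safety erosion" stock and feeds its final value into a two-node probability calculation for a patient-level adverse event; all stocks, rates, and conditional probabilities are invented placeholders.

```python
# --- System-dynamics part: organizational contribution to AE risk ---
dt, T = 0.1, 100.0
erosion = 0.2            # "safety erosion" stock, 0..1 (hypothetical initial level)
workload = 0.7           # constant staffing pressure (hypothetical)
recovery_rate = 0.05     # effect of safety investments (hypothetical)
for _ in range(int(T / dt)):
    inflow = 0.1 * workload * (1.0 - erosion)   # pressure builds, saturating near 1
    outflow = recovery_rate * erosion           # investments restore safety
    erosion += dt * (inflow - outflow)

# --- Bayesian-network part: patient-level AE probability ---
# P(AE) = sum over (frail, degraded) of P(AE | frail, degraded) * P(frail) * P(degraded)
p_degraded = erosion                             # SD output drives the BBN node
p_frail = 0.4                                    # hypothetical patient-level prior
cpt = {(True, True): 0.30, (True, False): 0.10,  # toy conditional probability table
       (False, True): 0.08, (False, False): 0.02}
p_ae = sum(cpt[(f, d)]
           * (p_frail if f else 1 - p_frail)
           * (p_degraded if d else 1 - p_degraded)
           for f in (True, False) for d in (True, False))
print(f"organizational erosion: {p_degraded:.2f}, P(adverse event): {p_ae:.3f}")
```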
Barba, Lida; Sánchez-Macías, Davinia; Barba, Iván; Rodríguez, Nibaldo
2018-06-01
Guinea pig meat consumption is increasing exponentially worldwide. Evaluating the contribution of carcass components to carcass quality can potentially allow estimation of the value added to foods of animal origin and make research in guinea pigs more practicable. The aim of this study was to propose a methodology for modelling the contribution of different carcass components to the overall carcass quality of guinea pigs by using non-invasive pre- and post-mortem carcass measurements. The selection of predictors was developed through correlation analysis and statistical significance, whereas the prediction models were based on Multiple Linear Regression. The prediction results showed higher accuracy in the prediction of carcass component contributions expressed in grams, compared to when expressed as a percentage of carcass quality components. The proposed prediction models can be useful for the guinea pig meat industry and research institutions by using non-invasive and time- and cost-efficient carcass component measuring techniques. Copyright © 2018 Elsevier Ltd. All rights reserved.
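A minimal sketch of the Multiple Linear Regression step, assuming a handful of hypothetical non-invasive predictors (live weight, carcass length, hind-limb width) and synthetic data; the actual predictors and coefficients in the paper will differ.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 120                                          # synthetic carcasses (invented data)
live_weight = rng.normal(1000, 150, n)           # g, pre-mortem measurement
carcass_length = rng.normal(28, 2, n)            # cm, non-invasive post-mortem measurement
hind_width = rng.normal(6, 0.5, n)               # cm
# Hypothetical "true" relation for one component (loin weight, g) plus noise
loin_g = 0.12 * live_weight + 3.0 * carcass_length + 5.0 * hind_width + rng.normal(0, 8, n)

X = np.column_stack([live_weight, carcass_length, hind_width])
model = LinearRegression().fit(X, loin_g)
print("coefficients:", model.coef_, "intercept:", round(model.intercept_, 1))
print("5-fold cross-validated R^2:", round(cross_val_score(model, X, loin_g, cv=5).mean(), 3))
```

Predicting the component in grams (rather than as a percentage) corresponds to the formulation the authors report as more accurate.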
Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.
Eom, Hwisoo; Lee, Sang Hun
2015-06-12
A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.
Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles
Eom, Hwisoo; Lee, Sang Hun
2015-01-01
A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406
Wäscher, Sebastian; Salloch, Sabine; Ritter, Peter; Vollmann, Jochen; Schildmann, Jan
2017-05-01
This article describes a process of developing, implementing and evaluating a clinical ethics support service intervention, with the goal of building up a context-sensitive minimal clinical ethics structure in an oncology department without a prior clinical ethics structure. Scholars from different disciplines have called for an improvement in the evaluation of clinical ethics support services (CESS) for different reasons over several decades. However, while a lot has been said about the concepts and methodological challenges of evaluating CESS up to the present time, relatively few empirical studies have been carried out. The aim of this article is twofold. On the one hand, it describes a process of developing, modifying and evaluating a CESS intervention as part of the ETHICO research project, using the approach of qualitative-formative evaluation. On the other hand, it provides a methodological analysis which specifies the contribution of qualitative empirical methods to the (formative) evaluation of CESS. We conclude with a consideration of the strengths and limitations of qualitative evaluation research with regard to the evaluation and development of context-sensitive CESS. We further discuss our own approach in contrast to rather traditional consult or committee models. © 2017 John Wiley & Sons Ltd.
Efficient free energy calculations of quantum systems through computer simulations
NASA Astrophysics Data System (ADS)
Antonelli, Alex; Ramirez, Rafael; Herrero, Carlos; Hernandez, Eduardo
2009-03-01
In general, the classical limit is assumed in computer simulation calculations of free energy. This approximation, however, is not justifiable for a class of systems in which quantum contributions to the free energy cannot be neglected. The inclusion of quantum effects is important for the determination of reliable phase diagrams of these systems. In this work, we present a new methodology to compute the free energy of many-body quantum systems [1]. This methodology results from the combination of the path integral formulation of statistical mechanics and efficient non-equilibrium methods to estimate free energy, namely, the adiabatic switching and reversible scaling methods. A quantum Einstein crystal is used as a model to show the accuracy and reliability of the methodology. This new method is applied to the calculation of solid-liquid coexistence properties of neon. Our findings indicate that quantum contributions to properties such as melting point, latent heat of fusion, entropy of fusion, and slope of the melting line can be up to 10% of the calculated values using the classical approximation. [1] R. M. Ramirez, C. P. Herrero, A. Antonelli, and E. R. Hernández, Journal of Chemical Physics 129, 064110 (2008)
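The quantum calculation itself is beyond a short example, but the generic thermodynamic-integration identity underlying switching methods, ΔF = ∫₀¹ ⟨∂U/∂λ⟩ dλ, can be checked on a classical toy system where the answer is known; the sketch below switches between two one-dimensional harmonic potentials and compares the estimate with the exact ΔF = (k_BT/2) ln(k₁/k₀). It is not the path-integral Einstein-crystal calculation of the paper.

```python
import numpy as np

kT = 1.0
k0, k1 = 1.0, 4.0               # spring constants of the reference and target systems
rng = np.random.default_rng(3)

lambdas = np.linspace(0.0, 1.0, 21)
means = []
for lam in lambdas:
    k_lam = (1 - lam) * k0 + lam * k1
    # Exact Boltzmann sampling for a 1-D harmonic potential: x ~ N(0, kT/k_lam)
    x = rng.normal(0.0, np.sqrt(kT / k_lam), size=50_000)
    means.append(np.mean(0.5 * (k1 - k0) * x**2))        # <dU/dlambda> at this lambda
means = np.array(means)

dF_ti = np.sum(np.diff(lambdas) * (means[:-1] + means[1:]) / 2)   # trapezoid rule
dF_exact = 0.5 * kT * np.log(k1 / k0)
print(f"thermodynamic-integration estimate: {dF_ti:.4f}, exact: {dF_exact:.4f}")
```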
Lykes, M Brinton; Scheib, Holly
2016-01-01
Recovery from disaster and displacement involves multiple challenges including accompanying survivors, documenting effects, and rethreading community. This paper demonstrates how African-American and Latina community health promoters and white university-based researchers engaged visual methodologies and participatory action research (photoPAR) as resources in cross-community praxis in the wake of Hurricane Katrina and the flooding of New Orleans. Visual techniques, including but not limited to photonarratives, facilitated the health promoters': (1) care for themselves and each other as survivors of and responders to the post-disaster context; (2) critical interrogation of New Orleans' entrenched pre- and post-Katrina structural racism as contributing to the racialised effects of and responses to Katrina; and (3) meaning-making and performances of women's community-based, cross-community health promotion within this post-disaster context. This feminist antiracist participatory action research project demonstrates how visual methodologies contributed to the co-researchers' cross-community self- and other caring, critical bifocality, and collaborative construction of a contextually and culturally responsive model for women's community-based health promotion post 'unnatural disaster'. Selected limitations as well as the potential for future cross-community antiracist feminist photoPAR in post-disaster contexts are discussed.
NASA Astrophysics Data System (ADS)
Hu, Tengfei; Mao, Jingqiao; Pan, Shunqi; Dai, Lingquan; Zhang, Peipei; Xu, Diandian; Dai, Huichao
2018-07-01
Reservoir operations significantly alter the hydrological regime of the downstream river and river-connected lake, which has far-reaching impacts on the lake ecosystem. To facilitate the management of lakes connected to regulated rivers, the following information must be provided: (1) the response of lake water levels to reservoir operation schedules in the near future and (2) the importance of different rivers in terms of affecting the water levels in different lake regions of interest. We develop an integrated modeling and analytical methodology for the water level management of such lakes. The data-driven method is used to model the lake level as it has the potential of producing quick and accurate predictions. A new genetic algorithm-based synchronized search is proposed to optimize input variable time lags and data-driven model parameters simultaneously. The methodology also involves the orthogonal design and range analysis for extracting the influence of an individual river from that of all the rivers. The integrated methodology is applied to the second largest freshwater lake in China, the Dongting Lake. The results show that: (1) the antecedent lake levels are of crucial importance for the current lake level prediction; (2) the selected river discharge time lags reflect the spatial heterogeneity of the rivers' impacts on lake level changes; (3) the predicted lake levels are in very good agreement with the observed data (RMSE ≤ 0.091 m; R2 ≥ 0.9986). This study demonstrates the practical potential of the integrated methodology, which can provide both the lake level responses to future dam releases and the relative contributions of different rivers to lake level changes.
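The paper's genetic-algorithm search and data-driven model are not specified in the abstract; as a simplified stand-in, the sketch below searches exhaustively over candidate lags of the antecedent lake level and one river discharge and scores each lag combination with a linear model. All series are synthetic and the variable names are hypothetical.

```python
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(4)
T = 1000
discharge = rng.gamma(3.0, 2.0, T)     # one upstream river discharge (synthetic)
level = np.zeros(T)                    # lake level responding with a 3-step delay
for t in range(5, T):
    level[t] = 0.8 * level[t - 1] + 0.05 * discharge[t - 3] + rng.normal(0, 0.02)

def score(lag_level, lag_q):
    """Fit the level from its own lag and a lagged discharge; return in-sample R^2."""
    start = max(lag_level, lag_q)
    X = np.column_stack([level[start - lag_level:T - lag_level],
                         discharge[start - lag_q:T - lag_q]])
    y = level[start:]
    return r2_score(y, LinearRegression().fit(X, y).predict(X))

best = max(itertools.product(range(1, 4), range(1, 8)), key=lambda lags: score(*lags))
print("selected lags (level, discharge):", best, "R^2 =", round(score(*best), 4))
```

A genetic algorithm plays the same role as this exhaustive search when the lag space (and the model's own hyperparameters) become too large to enumerate.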
Oxlade, Olivia; Pinto, Marcia; Trajman, Anete; Menzies, Dick
2013-01-01
Introduction: Cost-effectiveness analyses (CEA) can provide useful information on how to invest limited funds; however, they are less useful if different analyses of the same intervention provide unclear or contradictory results. The objective of our study was to conduct a systematic review of methodologic aspects of CEA that evaluate Interferon Gamma Release Assays (IGRA) for the detection of Latent Tuberculosis Infection (LTBI), in order to understand how differences affect study results. Methods: A systematic review of studies was conducted with particular focus on study quality and the variability in inputs used in models used to assess cost-effectiveness. A common decision analysis model of the IGRA versus Tuberculin Skin Test (TST) screening strategy was developed and used to quantify the impact on predicted results of observed differences of model inputs taken from the studies identified. Results: Thirteen studies were ultimately included in the review. Several specific methodologic issues were identified across studies, including how study inputs were selected, inconsistencies in the costing approach, the utility of the QALY (Quality Adjusted Life Year) as the effectiveness outcome, and how authors choose to present and interpret study results. When the IGRA versus TST test strategies were compared using our common decision analysis model, predicted effectiveness largely overlapped. Implications: Many methodologic issues that contribute to inconsistent results and reduced study quality were identified in studies that assessed the cost-effectiveness of the IGRA test. More specific and relevant guidelines are needed in order to help authors standardize modelling approaches, inputs, assumptions and how results are presented and interpreted. PMID:23505412
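To make the structure of such a decision analysis concrete, the sketch below computes the expected cost per screened contact for a simple two-branch test-and-treat tree; every probability and cost is an illustrative placeholder, not a value drawn from the reviewed studies.

```python
def expected_cost(test_cost, sens, spec, p_ltbi=0.25,
                  cost_treatment=350.0, cost_active_tb=15000.0,
                  p_progress=0.07, treatment_efficacy=0.65):
    """Expected cost per screened contact under a simple test-and-treat tree.

    All inputs are illustrative placeholders, not values from the reviewed studies.
    """
    p_tp = p_ltbi * sens               # infected and detected -> offered treatment
    p_fn = p_ltbi * (1 - sens)         # infected but missed
    p_fp = (1 - p_ltbi) * (1 - spec)   # uninfected but treated unnecessarily
    treated = p_tp + p_fp
    # Progression to active TB: missed cases plus treated cases where therapy fails
    tb_cases = p_fn * p_progress + p_tp * p_progress * (1 - treatment_efficacy)
    return test_cost + treated * cost_treatment + tb_cases * cost_active_tb, tb_cases

# Test characteristics below are rough illustrative values, not study inputs
for name, (cost, sens, spec) in {"TST": (5.0, 0.77, 0.66), "IGRA": (60.0, 0.85, 0.97)}.items():
    total, tb = expected_cost(cost, sens, spec)
    print(f"{name}: expected cost/contact = {total:7.2f}, expected TB cases/contact = {tb:.4f}")
```

Differences in exactly these kinds of inputs (prevalence, costs, progression rates) are what the review identifies as the source of contradictory published results.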
Rothman, Jason; Alemán Bañón, José; González Alonso, Jorge
2015-01-01
This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial stages transfer of morphosyntax, one goal of this program is to show how initial stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for and demonstrate how this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement the claims currently made on the basis of exclusively behavioral experiments. PMID:26300800
Manuel Stein's Five Decades of Structural Mechanics Contributions (1944-1988)
NASA Technical Reports Server (NTRS)
Mikulas, Martin M.; Card, Michael F.; Peterson, Jim P.; Starnes, James H., Jr.
1998-01-01
Manuel Stein went to work for NACA (National Advisory Committee for Aeronautics) in 1944 and left in 1988. His research contributions spanned five decades of extremely defining times for the aerospace industry. Problems arising from the analysis and design of efficient thin plate and shell aerospace structures have stimulated research over the past half century. The primary structural technology drivers during Dr. Stein's career included 1940's aluminum aircraft, 1950's jet aircraft, 1960's launch vehicles and advanced spacecraft, 1970's reusable launch vehicles and commercial aircraft, and 1980's composite aircraft. Dr. Stein's research was driven by these areas and he made lasting contributions for each. Dr. Stein's research can be characterized by a judicious mixture of physical insight into the problem, understanding of the basic mechanisms, mathematical modeling of the observed phenomena, and extraordinary analytical and numerical solution methodologies of the resulting mathematical models. This paper summarizes Dr. Stein's life and his contributions to the technical community.
[Qualitative research methodology in health care].
Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara
2017-03-01
Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs, non-random sampling by purpose, a circular process of knowledge construction, and methodological rigor throughout the research process, from quality design to the consistency of results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study "The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals". The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.
NASA Astrophysics Data System (ADS)
Schinckus, C.
2016-12-01
This article aimed at presenting the scattered econophysics literature as a unified and coherent field through a specific lens imported from philosophy of science. More precisely, I used the methodology developed by Imre Lakatos to cover the methodological evolution of econophysics over these last two decades. In this perspective, three co-existing approaches have been identified: statistical econophysics, bottom-up agent-based econophysics and top-down agent-based econophysics. Although the last of these is presented here as the latest step in the methodological evolution of econophysics, it is worth mentioning that this tradition is still very new. A quick look at the econophysics literature shows that the vast majority of works in this field deal with a strictly statistical approach or classical bottom-up agent-based modelling. In this context of diversification, the objective (and contribution) of this article is to emphasize the conceptual coherence of econophysics as a unique field of research. With this purpose, I used a theoretical framework coming from philosophy of science to characterize how econophysics evolved by combining a methodological enrichment with the preservation of its core conceptual statements.
The comparison of the use of holonic and agent-based methods in modelling of manufacturing systems
NASA Astrophysics Data System (ADS)
Foit, K.; Banaś, W.; Gwiazda, A.; Hryniewicz, P.
2017-08-01
The rapid evolution in the field of industrial automation and manufacturing is often called the 4th Industrial Revolution. Worldwide availability of internet access contributes to the competition between manufacturers and gives the opportunity for buying materials and parts and for creating partnership networks, like cloud manufacturing, grid manufacturing (MGrid), virtual enterprises etc. The effect of this industry evolution is the need to search for new solutions in the field of manufacturing systems modelling and simulation. During the last decade researchers have developed the agent-based approach to modelling. This methodology has been taken from computer science but was adapted to the philosophy of industrial automation and robotization. The operation of an agent-based system depends on the simultaneous acting of different agents that may have different roles. On the other hand, there is the holon-based approach that uses structures created by holons. It differs from the agent-based structure in some aspects, while others are quite similar in both methodologies. The aim of this paper is to present both methodologies and discuss the similarities and the differences. This may help in selecting the optimal method of modelling, according to the considered problem and software resources.
de Brouwer, Geoffrey; Wolmarans, De Wet
2018-04-22
Animal models of human psychiatric illness are valuable frameworks to investigate the etiology and neurobiology underlying the human conditions. Accurate behavioral measures that can be used to characterize animal behavior, thereby contributing to a model's validity, are crucial. One such measure, i.e. the rodent marble-burying test (MBT), is often applied as a measure of anxiety- and compulsive-like behaviors. However, the test is characterized by noteworthy between-laboratory methodological differences and demonstrates positive treatment responses to an array of pharmacotherapies that are often of little translational value. Therefore, using a naturalistic animal model of obsessive-compulsive disorder, i.e. the deer mouse (Peromyscus maniculatus bairdii), the current investigation attempted to illuminate the discrepancies reported in the literature by means of a methodological approach to the MBT. Five key aspects of the test that vary between laboratories, viz. observer/scoring, burying substrate, optional avoidance, the use of repeated testing, and determinations of locomotor activity, have been investigated. Following repeated MB tests in four different burying substrates and in two zone configurations, we have demonstrated that 1) observer bias may contribute to the significant differences in findings reported, 2) MB seems to be a natural exploratory response to a novel environment, rather than being triggered by aberrant cognition, 3) burying substrates with a small particle size and higher density deliver the most accurate results with respect to the burying phenotype, and 4) to exclude the influence of normal exploratory behavior on the number of marbles being covered, assessments of marble-burying should be based on preoccupation with the objects themselves. Copyright © 2018 Elsevier B.V. All rights reserved.
Methodology discourses as boundary work in the construction of engineering education.
Beddoes, Kacey
2014-04-01
Engineering education research is a new field that emerged in the social sciences over the past 10 years. This analysis of engineering education research demonstrates that methodology discourses have played a central role in the construction and development of the field of engineering education, and that they have done so primarily through boundary work. This article thus contributes to science and technology studies literature by examining the role of methodology discourses in an emerging social science field. I begin with an overview of engineering education research before situating the case within relevant bodies of literature on methodology discourses and boundary work. I then identify two methodology discourses--rigor and methodological diversity--and discuss how they contribute to the construction and development of engineering education research. The article concludes with a discussion of how the findings relate to prior research on methodology discourses and boundary work and implications for future research.
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how the landslide susceptibility and satellite derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decreases the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
NASA Technical Reports Server (NTRS)
Hoppa, Mary Ann; Wilson, Larry W.
1994-01-01
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
Alexakis, Dimitrios D.; Mexis, Filippos-Dimitrios K.; Vozinaki, Anthi-Eirini K.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.
2017-01-01
A methodology for elaborating multi-temporal Sentinel-1 and Landsat 8 satellite images for estimating topsoil Soil Moisture Content (SMC) to support hydrological simulation studies is proposed. After pre-processing the remote sensing data, backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC, collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation yielding R2 values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin which is ungauged for SMC. Results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies. PMID:28635625
Alexakis, Dimitrios D; Mexis, Filippos-Dimitrios K; Vozinaki, Anthi-Eirini K; Daliakopoulos, Ioannis N; Tsanis, Ioannis K
2017-06-21
A methodology for elaborating multi-temporal Sentinel-1 and Landsat 8 satellite images for estimating topsoil Soil Moisture Content (SMC) to support hydrological simulation studies is proposed. After pre-processing the remote sensing data, backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC, collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation yielding R² values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin which is ungauged for SMC. Results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies.
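A minimal sketch of the ANN regression step, using the four predictors named in the abstract (backscatter, NDVI, thermal temperature, incidence angle) but a synthetic dataset, an assumed single-hidden-layer network, and leave-one-out validation as described; the real network architecture and field data are not given in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 60                                   # synthetic station visits (invented data)
sigma0 = rng.uniform(-20, -5, n)         # Sentinel-1 backscatter coefficient, dB
ndvi = rng.uniform(0.1, 0.8, n)          # Landsat 8 NDVI
lst = rng.uniform(285, 320, n)           # thermal infrared temperature, K
inc = rng.uniform(30, 45, n)             # incidence angle, deg
# Hypothetical ground-truth relation for top-5-cm SMC (%) plus noise
smc = 18 + 0.9 * (sigma0 + 12) - 8 * ndvi - 0.15 * (lst - 300) + rng.normal(0, 1.5, n)

X = np.column_stack([sigma0, ndvi, lst, inc])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                   max_iter=5000, random_state=0))
pred = cross_val_predict(model, X, smc, cv=LeaveOneOut())      # leave-one-out validation
r2 = 1 - np.sum((smc - pred) ** 2) / np.sum((smc - smc.mean()) ** 2)
print("leave-one-out R^2:", round(r2, 3))
```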
Kansei, surfaces and perception engineering
NASA Astrophysics Data System (ADS)
Rosen, B.-G.; Eriksson, L.; Bergman, M.
2016-09-01
The aesthetic and pleasing properties of a product are important and add significantly to the meaning and relevance of a product. Customer sensation and perception are largely about psychological factors. There has been a strong industrial and academic need and interest for methods and tools to quantify and link product properties to the human response but a lack of studies of the impact of surfaces. In this study, affective surface engineering is used to illustrate and model the link between customer expectations and perception to controllable product surface properties. The results highlight the use of the soft metrology concept for linking physical and human factors contributing to the perception of products. Examples of surface applications of the Kansei methodology are presented from sauna bath, health care, architectural and hygiene tissue application areas to illustrate, discuss and confirm the strength of the methodology. In the conclusions of the study, future research in soft metrology is proposed to allow understanding and modelling of product perception and sensations in combination with a development of the Kansei surface engineering methodology and software tools.
Influence Map Methodology for Evaluating Systemic Safety Issues
NASA Technical Reports Server (NTRS)
2008-01-01
"Raising the bar" in safety performance is a critical challenge for many organizations, including Kennedy Space Center. Contributing-factor taxonomies organize information about the reasons accidents occur and therefore are essential elements of accident investigations and safety reporting systems. Organizations must balance efforts to identify causes of specific accidents with efforts to evaluate systemic safety issues in order to become more proactive about improving safety. This project successfully addressed the following two problems: (1) methods and metrics to support the design of effective taxonomies are limited and (2) influence relationships among contributing factors are not explicitly modeled within a taxonomy.
Eluru, Naveen; Chakour, Vincent; Chamberlain, Morgan; Miranda-Moreno, Luis F
2013-10-01
Vehicle operating speed measured on roadways is a critical component for a host of analysis in the transportation field including transportation safety, traffic flow modeling, roadway geometric design, vehicle emissions modeling, and road user route decisions. The current research effort contributes to the literature on examining vehicle speed on urban roads methodologically and substantively. In terms of methodology, we formulate a new econometric model framework for examining speed profiles. The proposed model is an ordered response formulation of a fractional split model. The ordered nature of the speed variable allows us to propose an ordered variant of the fractional split model in the literature. The proposed formulation allows us to model the proportion of vehicles traveling in each speed interval for the entire segment of roadway. We extend the model to allow the influence of exogenous variables to vary across the population. Further, we develop a panel mixed version of the fractional split model to account for the influence of site-specific unobserved effects. The paper contributes substantively by estimating the proposed model using a unique dataset from Montreal consisting of weekly speed data (collected in hourly intervals) for about 50 local roads and 70 arterial roads. We estimate separate models for local roads and arterial roads. The model estimation exercise considers a whole host of variables including geometric design attributes, roadway attributes, traffic characteristics and environmental factors. The model results highlight the role of various street characteristics including number of lanes, presence of parking, presence of sidewalks, vertical grade, and bicycle route on vehicle speed proportions. The results also highlight the presence of site-specific unobserved effects influencing the speed distribution. The parameters from the modeling exercise are validated using a hold-out sample not considered for model estimation. The results indicate that the proposed panel mixed ordered probit fractional split model offers promise for modeling such proportional ordinal variables. Copyright © 2013 Elsevier Ltd. All rights reserved.
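The full panel mixed formulation is beyond a short example, but the core ordered mechanism can be sketched: a latent "speed propensity" built from segment attributes is cut by ordered thresholds into predicted proportions of vehicles per speed interval. Thresholds, attributes, and coefficients below are invented.

```python
import numpy as np
from scipy.stats import norm

# Speed intervals (km/h) and ordered thresholds on the latent scale (hypothetical values)
bins = ["<30", "30-40", "40-50", "50-60", ">60"]
thresholds = np.array([-1.5, -0.5, 0.6, 1.8])          # 4 cut points -> 5 intervals

def predicted_shares(x, beta):
    """Proportion of vehicles expected in each speed interval on one segment."""
    v = float(np.dot(x, beta))                         # latent propensity for higher speed
    cdf = np.concatenate([[0.0], norm.cdf(thresholds - v), [1.0]])
    return np.diff(cdf)                                # non-negative shares summing to 1

# Hypothetical segment attributes: [lanes, parking present, sidewalk, grade %, bicycle route]
beta = np.array([0.35, -0.40, -0.15, -0.05, -0.25])    # invented coefficients
segment = np.array([2, 1, 1, 2.0, 0])
for label, share in zip(bins, predicted_shares(segment, beta)):
    print(f"{label:>6} km/h: {share:.1%}")
```

Estimating the coefficients from observed proportions, and letting them vary across segments, is what the fractional split and panel mixing machinery in the paper adds on top of this structure.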
NASA Astrophysics Data System (ADS)
Guler Yigitoglu, Askin
In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data which means that the true state of a specific plant is not reflected in a realistic manner on aging effects. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: The development of a methodology for the incorporation of aging modeling of passive SSC into a reactor simulation environment to provide a framework for evaluation of their risk contribution in both the dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe weld and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics based model is selected to represent the aging process. The model is modified via sojourn time approach to reflect the operational and maintenance history dependence of the transition rates. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment and uncertainties associated with both parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of : i) defining a process for selecting critical passive components and related aging mechanisms, ii) aging model selection, iii) calculating the probability that aging would cause the component to fail, iv) uncertainty/sensitivity analyses, v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures, and, vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
High-frequency measurements of aeolian saltation flux: Field-based methodology and applications
NASA Astrophysics Data System (ADS)
Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.
2018-02-01
Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
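A compressed sketch of the calibration steps, assuming one particle counter at a single height, synthetic trap data, and an exponential flux profile q(z) = q0·exp(−z/z̄) whose vertical integral is q0·z̄; instrument heights, rates, and values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)

# Step 1: fit an exponential profile q(z) = q0 * exp(-z / zbar) to LF trap data
z = np.array([0.05, 0.10, 0.20, 0.35, 0.50])                  # trap heights, m (invented)
q_obs = 12.0 * np.exp(-z / 0.10) * (1 + rng.normal(0, 0.05, z.size))   # g m^-2 s^-1

def profile(z, q0, zbar):
    return q0 * np.exp(-z / zbar)

(q0, zbar), _ = curve_fit(profile, z, q_obs, p0=(10.0, 0.1))
Q_lf = q0 * zbar          # vertically integrated (total) flux over the LF interval

# Step 2: calibration factor linking HF particle counts to the LF total flux
counts_hf = rng.poisson(40, size=25 * 60)       # 25 Hz counts over the same interval (invented)
cal = Q_lf / counts_hf.mean()                   # total flux per count

# Steps 3-4: calibrated 25 Hz time series of total saltation flux
Q_hf = cal * counts_hf
print(f"LF total flux {Q_lf:.2f} g m^-1 s^-1; HF series mean {Q_hf.mean():.2f}, max {Q_hf.max():.2f}")
```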
Evaluating the uncertainty of predicting future climate time series at the hourly time scale
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.
2011-12-01
A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion, which reflects the uncertainty of future climate projections as well as the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic expressions of certainty in reproducing future climates for the periods 2000-2009, 2046-2065 and 2081-2100, using 1962-1992 as the baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period 2000-2009 are tested against observations, permitting assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
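As a stand-in for the Bayesian multi-model step, the sketch below collapses hypothetical GCM-specific factors of change into a weighted distribution, samples it by Monte Carlo, and rescales one baseline statistic; the actual procedure derives full distributions for many statistics and feeds them to the AWE-GEN weather generator, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
baseline_mean = 22.0          # baseline (1962-1992) mean July precipitation, mm (invented)

# Stand-in for the Bayesian multi-model step: weighted GCM-specific factors of change
gcm_factors = np.array([0.85, 0.95, 0.80, 1.05, 0.90])   # hypothetical multiplicative changes
weights = np.array([0.3, 0.2, 0.2, 0.1, 0.2])            # hypothetical model weights (sum to 1)
mu = np.sum(weights * gcm_factors)
sd = np.sqrt(np.sum(weights * (gcm_factors - mu) ** 2))

# Monte Carlo sampling of the factor of change -> ensemble of future statistics
factors = rng.normal(mu, sd, size=1000)
future_means = baseline_mean * factors
lo, hi = np.percentile(future_means, [5, 95])
print(f"future mean July precipitation: {future_means.mean():.1f} mm "
      f"(90% interval {lo:.1f}-{hi:.1f} mm)")
```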
Use of digital technologies for nasal prosthesis manufacturing.
Palousek, David; Rosicky, Jiri; Koutny, Daniel
2014-04-01
Digital technology is becoming more accessible for common use in medical applications; however, its expansion in prosthetic and orthotic laboratories is not large because of the persistent image of difficult applicability to real patients. This article aims to offer a real example in the area of human facial prostheses. This article describes the utilization of optical digitization, computational modelling, rapid prototyping, mould fabrication and manufacturing of a nasal silicone prosthesis. This technical note defines the key points of the methodology and aspires to contribute to the introduction of a certified manufacturing procedure. The results show that the used technologies reduce the manufacturing time, reflect the patient's requirements and allow the manufacture of high-quality prostheses for missing facial asymmetric parts. The methodology provides a good position for further development issues and is usable for clinical practice. Clinical relevance: Utilization of digital technologies in the facial prosthesis manufacturing process can contribute to higher patient comfort and higher production efficiency, but with higher initial investment and demands for experience with software tools.
The French connection: some contributions of French-language research in the post-Piagetian era.
Larivée, S; Normandeau, S; Parent, S
2000-01-01
This article presents French-speaking researchers' contribution to the field of differential developmental psychology. Following a brief review of key Piagetian ideas pertaining to his conceptualization of individual differences, the core of the article traces methodological and theoretical transformations that were necessary for understanding individual differences within a general theory of cognitive development. On a methodological level, French-speaking researchers went from standardizing Piaget's clinical method to constructing developmental scales and operational tests. On a theoretical level, Reuchlin's writings guided Longeot, and several other French (Lautrey and Bideaud) and Genevan (de Ribaupierre and Rieben) researchers into a scientific quest for a genuine integration of differential and developmental psychology. We present an overview of the pluralistic and multidimensional model of cognitive functioning and development that emerged from the work of the French-Swiss team of researchers. Concluding remarks focus on the actual research agendas of researchers interested in resolving the challenging issue of understanding relationships between inter- and intraindividual differences and general tendencies in cognitive development.
A new methodology for modeling of direct landslide costs for transportation infrastructures
NASA Astrophysics Data System (ADS)
Klose, Martin; Terhorst, Birgit
2014-05-01
The world's transportation infrastructure is at risk of landslides in many areas across the globe. A safe and affordable operation of traffic routes are the two main criteria for transportation planning in landslide-prone areas. The right balancing of these often conflicting priorities requires, amongst others, profound knowledge of the direct costs of landslide damage. These costs include capital investments for landslide repair and mitigation as well as operational expenditures for first response and maintenance works. This contribution presents a new methodology for ex post assessment of direct landslide costs for transportation infrastructures. The methodology includes tools to compile, model, and extrapolate landslide losses on different spatial scales over time. A landslide susceptibility model enables regional cost extrapolation by means of a cost figure obtained from local cost compilation for representative case study areas. On local level, cost survey is closely linked with cost modeling, a toolset for cost estimation based on landslide databases. Cost modeling uses Landslide Disaster Management Process Models (LDMMs) and cost modules to simulate and monetize cost factors for certain types of landslide damage. The landslide susceptibility model provides a regional exposure index and updates the cost figure to a cost index which describes the costs per km of traffic route at risk of landslides. Both indexes enable the regionalization of local landslide losses. The methodology is applied and tested in a cost assessment for highways in the Lower Saxon Uplands, NW Germany, in the period 1980 to 2010. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. In the 7,000 km² large Lower Saxon Uplands, 77 km of highway are located in potential landslide hazard area. Annual average costs of 52k per km of highway at risk of landslides are identified as cost index for a local case study area in this region. The cost extrapolation for the Lower Saxon Uplands results in annual average costs for highways of 4.02mn. This test application as well as a validation of selected modeling tools verifies the functionality of this methodology.
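The regional figure follows, to first order, from multiplying the cost index by the exposed road length (the full methodology additionally weights exposure with the susceptibility model); currency units are as in the abstract, which does not state them:

```python
km_at_risk = 77        # km of highway in potential landslide hazard area
cost_index = 52_000    # annual average cost per km at risk (cost index from the case study)
print(km_at_risk * cost_index)   # 4,004,000 ≈ the reported 4.02mn per year
```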
Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henneke, Dennis W.; Robinson, James
In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.
Sea Ice Biogeochemistry: A Guide for Modellers
Tedesco, Letizia; Vichi, Marcello
2014-01-01
Sea ice is a fundamental component of the climate system and plays a key role in polar trophic food webs. Nonetheless, sea ice biogeochemical dynamics at large temporal and spatial scales are still rarely described. Numerical models may potentially contribute by integrating among sparse observations, but available models of sea ice biogeochemistry are still scarce, although their relevance for properly describing the current and future state of the polar oceans has recently been addressed. A general methodology to develop a sea ice biogeochemical model is presented, deriving it from an existing validated model application by extension of generic pelagic biogeochemistry model parameterizations. The described methodology is flexible and considers different levels of ecosystem complexity and vertical representation, while adopting a strategy of coupling that ensures mass conservation. We show how to apply this methodology step by step by building an intermediate-complexity model from a published realistic application and applying it to analyze theoretically a typical season of first-year sea ice in the Arctic, the one currently needing the most urgent understanding. The aim is to (1) introduce sea ice biogeochemistry and address its relevance to ocean modelers of polar regions, supporting them in adding a new sea ice component to their modelling framework for a more adequate representation of the sea ice-covered ocean ecosystem as a whole, and (2) extend our knowledge of the relevant controlling factors of sea ice algal production, showing that, beyond light and nutrient availability, the duration of the sea ice season may play a key role in shaping algal production during the ongoing and projected upcoming changes. PMID:24586604
An Analysis of USSPACECOM’s Space Surveillance Network (SSN) Sensor Tasking Methodology
1992-12-01
Table-of-contents fragment only; recoverable headings: Collateral Sensors; Contributing Sensors; Space Surveillance Network; The State Solution; The State-Transition Matrix; Model Execution; Model Verification; Differential Corrector.
Network Models of Entrepreneurial Ecosystems in Developing Economies
2014-01-01
Front-matter fragment only; recoverable content: Candice Price, Ph.D., Department of Mathematical Sciences, U.S. Military Academy; "Youth unemployment is a ticking time bomb" (Alexander Chikwanda, Finance Minister, Zambia); recent political and social changes in the region contribute to this high unemployment rate.
Towards quantifying uncertainty in Greenland's contribution to 21st century sea-level rise
NASA Astrophysics Data System (ADS)
Perego, M.; Tezaur, I.; Price, S. F.; Jakeman, J.; Eldred, M.; Salinger, A.; Hoffman, M. J.
2015-12-01
We present recent work towards developing a methodology for quantifying uncertainty in Greenland's 21st century contribution to sea-level rise. While we focus on uncertainties associated with the optimization and calibration of the basal sliding parameter field, the methodology is largely generic and could be applied to other (or multiple) sets of uncertain model parameter fields. The first step in the workflow is the solution of a large-scale, deterministic inverse problem, which minimizes the mismatch between observed and computed surface velocities by optimizing the two-dimensional coefficient field in a linear-friction sliding law. We then expand the deviation in this coefficient field from its estimated "mean" state using a reduced basis of Karhunen-Loeve Expansion (KLE) vectors. A Bayesian calibration is used to determine the optimal coefficient values for this expansion. The prior for the Bayesian calibration can be computed using the Hessian of the deterministic inversion or using an exponential covariance kernel. The posterior distribution is then obtained using Markov Chain Monte Carlo run on an emulator of the forward model. Finally, the uncertainty in the modeled sea-level rise is obtained by performing an ensemble of forward propagation runs. We present and discuss preliminary results obtained using a moderate-resolution model of the Greenland Ice sheet. As demonstrated in previous work, the primary difficulty in applying the complete workflow to realistic, high-resolution problems is that the effective dimension of the parameter space is very large.
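A minimal sketch of the parameter-field representation step described above, assuming a one-dimensional grid and an exponential covariance kernel; the grid size, variance, and correlation length are invented for illustration and this is not the authors' code (Python):

import numpy as np

n, L, sigma, corr_len, n_modes = 200, 1.0e3, 0.5, 1.0e2, 10          # assumed values
x = np.linspace(0.0, L, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)    # covariance matrix
eigval, eigvec = np.linalg.eigh(C)
idx = np.argsort(eigval)[::-1][:n_modes]                              # keep the leading modes
lam, phi = eigval[idx], eigvec[:, idx]

def sample_field(beta_mean, rng):
    # One realization: mean field plus truncated KLE perturbation.
    xi = rng.standard_normal(n_modes)                                 # KLE coefficients (to be calibrated)
    return beta_mean + phi @ (np.sqrt(lam) * xi)

rng = np.random.default_rng(0)
field = sample_field(np.zeros(n), rng)

In the calibrated workflow, the coefficients xi would be sampled from the Bayesian posterior rather than from a standard normal prior, and each sampled field would be propagated through the forward ice-sheet model.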
PRIORITIES FOR HEALTH ECONOMIC METHODOLOGICAL RESEARCH: RESULTS OF AN EXPERT CONSULTATION.
Tordrup, David; Chouaid, Christos; Cuijpers, Pim; Dab, William; van Dongen, Johanna Maria; Espin, Jaime; Jönsson, Bengt; Léonard, Christian; McDaid, David; McKee, Martin; Miguel, José Pereira; Patel, Anita; Reginster, Jean-Yves; Ricciardi, Walter; Rutten-van Molken, Maureen; Rupel, Valentina Prevolnik; Sach, Tracey; Sassi, Franco; Waugh, Norman; Bertollini, Roberto
2017-01-01
The importance of economic evaluation in decision making is growing with increasing budgetary pressures on health systems. Diverse economic evidence is available for a range of interventions across national contexts within Europe, but little attention has been given to identifying evidence gaps that, if filled, could contribute to more efficient allocation of resources. One objective of the Research Agenda for Health Economic Evaluation project is to determine the most important methodological evidence gaps for the ten highest burden conditions in the European Union (EU), and to suggest ways of filling these gaps. The highest burden conditions in the EU by Disability Adjusted Life Years were determined using the Global Burden of Disease study. Clinical interventions were identified for each condition based on published guidelines, and economic evaluations indexed in MEDLINE were mapped to each intervention. A panel of public health and health economics experts discussed the evidence during a workshop and identified evidence gaps. The literature analysis contributed to identifying cross-cutting methodological and technical issues, which were considered by the expert panel to derive methodological research priorities. The panel suggests a research agenda for health economics which incorporates the use of real-world evidence in the assessment of new and existing interventions; increased understanding of cost-effectiveness according to patient characteristics beyond the "-omics" approach to inform both investment and disinvestment decisions; methods for assessment of complex interventions; improved cross-talk between economic evaluations from health and other sectors; early health technology assessment; and standardized, transferable approaches to economic modeling.
Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yau, M.; Motamed, M.; Guarro, S.
2006-07-01
Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and accounting for the digital system contribution to the overall risk, are not generally available nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov Methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on an earlier ASCA work with the U.S. NRC [2], upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)
Pyrolysis Model Development for a Multilayer Floor Covering
McKinnon, Mark B.; Stoliarov, Stanislav I.
2015-01-01
Comprehensive pyrolysis models that are integral to computational fire codes have improved significantly over the past decade as the demand for improved predictive capabilities has increased. High fidelity pyrolysis models may improve the design of engineered materials for better fire response, the design of the built environment, and may be used in forensic investigations of fire events. A major limitation to widespread use of comprehensive pyrolysis models is the large number of parameters required to fully define a material and the lack of effective methodologies for measurement of these parameters, especially for complex materials. The work presented here details a methodology used to characterize the pyrolysis of a low-pile carpet tile, an engineered composite material that is common in commercial and institutional occupancies. The studied material includes three distinct layers of varying composition and physical structure. The methodology utilized a comprehensive pyrolysis model (ThermaKin) to conduct inverse analyses on data collected through several experimental techniques. Each layer of the composite was individually parameterized to identify its contribution to the overall response of the composite. The set of properties measured to define the carpet composite were validated against mass loss rate curves collected at conditions outside the range of calibration conditions to demonstrate the predictive capabilities of the model. The mean error between the predicted curve and the mean experimental mass loss rate curve was calculated as approximately 20% on average for heat fluxes ranging from 30 to 70 kW·m−2, which is within the mean experimental uncertainty. PMID:28793556
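A minimal sketch of the kind of model-measurement comparison mentioned above, assuming synthetic mass loss rate curves on a common time grid; the error metric and data are illustrative, not ThermaKin output, and the paper's exact error definition may differ (Python):

import numpy as np

def mean_relative_error(mlr_model, mlr_exp):
    # Mean absolute deviation between curves, normalized by the mean measured value.
    mlr_model, mlr_exp = np.asarray(mlr_model), np.asarray(mlr_exp)
    return np.mean(np.abs(mlr_model - mlr_exp)) / np.mean(mlr_exp)

# Illustrative data only.
t = np.linspace(0, 300, 61)
mlr_exp = 5.0 * np.exp(-((t - 150) / 60.0) ** 2)    # g/(m^2 s), made up
mlr_model = 1.1 * mlr_exp                            # a model that is 10% high everywhere
print(f"mean error ~ {mean_relative_error(mlr_model, mlr_exp):.0%}")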
NASA Astrophysics Data System (ADS)
Nieves, Ian
Dynamic finite element analysis (FEA) was used to verify the ability of a novel percussion instrument to characterize the composition and structure of laminated materials and glass columns and to elucidate key facets of this process. Initial simulations modeling the percussion process with varying probe geometries were performed to assess which configuration most accurately represented in situ diagnostic activity. Percussion testing of monoliths and laminated duplex scaffolds consisting of PTFE and 6061 Al was simulated to assess the ability of the numeric methodology to model intrinsic damping in laminated scaffolds and determine the potential contributions of size effects, gripping configurations, and probe friction to the loading response of the material being tested. Percussion testing of laminated scaffolds and monoliths composed of either PMMA or PLGA was modeled to investigate the effects of defects on the impact response and to evaluate promising strategies for enhancing damping that promotes tissue regeneration in biomedical materials. Percussion testing of virgin and cracked glass columns was modeled and the resulting probe acceleration predictions were compared to corresponding experimental findings to evaluate the overall accuracy of the methodology and to discern its capacity for elucidating facets of defect detection in rigid materials. Overall, the modeling results validated the effectiveness of the numeric methodology for modeling and elucidating the mechanics of percussion testing and suggested strategies whereby this procedure can facilitate the development of innovative biomedical materials designed to promote tissue regeneration.
Global Change adaptation in water resources management: the Water Change project.
Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine
2012-12-01
In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess the Global Change impacts on water resources, thus helping river basin agencies and water companies in their long-term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define strategies of adaptation. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models and performing a cost-benefit analysis to define optimal strategies of adaptation. This methodology was supported by the creation of a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among them automatically. This makes it possible to simulate the interactions among multiple components of the water cycle and to run a large number of Global Change scenarios quickly. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to an efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.
van Mil, Anke C C M; Greyling, Arno; Zock, Peter L; Geleijnse, Johanna M; Hopman, Maria T; Mensink, Ronald P; Reesink, Koen D; Green, Daniel J; Ghiadoni, Lorenzo; Thijssen, Dick H
2016-09-01
Brachial artery flow-mediated dilation (FMD) is a popular technique to examine endothelial function in humans. Identifying volunteer and methodological factors related to variation in FMD is important to improve measurement accuracy and applicability. Volunteer-related and methodology-related parameters were collected in 672 volunteers from eight affiliated centres worldwide who underwent repeated measures of FMD. All centres adopted contemporary expert-consensus guidelines for FMD assessment. After calculating the coefficient of variation (%) of the FMD for each individual, we constructed quartiles (n = 168 per quartile). Based on two regression models (volunteer-related factors and methodology-related factors), statistically significant components of these two models were added to a final regression model (calculated as β-coefficient and R). This allowed us to identify factors that independently contributed to the variation in FMD%. The median coefficient of variation was 17.5%, with healthy volunteers demonstrating a coefficient of variation of 9.3%. Regression models revealed age (β = 0.248, P < 0.001), hypertension (β = 0.104, P < 0.001), dyslipidemia (β = 0.331, P < 0.001), time between measurements (β = 0.318, P < 0.001), lab experience (β = -0.133, P < 0.001) and baseline FMD% (β = 0.082, P < 0.05) as contributors to the coefficient of variation. After including all significant factors in the final model, we found that time between measurements, hypertension, baseline FMD% and lab experience with FMD independently predicted brachial artery variability (total R = 0.202). Although FMD% showed good reproducibility, larger variation was observed in conditions with longer time between measurements, hypertension, less experience and lower baseline FMD%. Accounting for these factors may improve FMD% variability.
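A minimal sketch of the within-subject coefficient of variation and quartile construction described above, using invented FMD% values; this is not the study's analysis code (Python):

import numpy as np

def cov_percent(repeated_fmd):
    # CV (%) of one volunteer's repeated FMD% measurements.
    x = np.asarray(repeated_fmd, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

fmd = np.array([[6.1, 5.2], [4.0, 5.5], [7.3, 7.0]])         # illustrative, two visits each
cv = np.array([cov_percent(row) for row in fmd])
quartile = np.digitize(cv, np.percentile(cv, [25, 50, 75]))   # 0..3 quartile index per volunteer
print(np.median(cv), quartile)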
NASA Astrophysics Data System (ADS)
Gobbi, Gian Paolo; Wille, Holger; Sozzi, Roberto; Barnaba, Francesca; Costabile, Francesca; Angelini, Federico; Frey, Steffen; Bolignano, Andrea; Morelli, Matteo
2013-04-01
The contribution of Saharan-dust advections to both daily and annual PM average mass concentrations can be significant all over Southern Europe. The Directive 2008/50/EC allows subtraction of PM10 exceedances caused by natural contributions from the statistic used to determine air-quality levels in Europe. To this purpose, the Commission Staff Working Paper 6771/11 (EC, 2011) provides specific Guidelines on methods to quantify and subtract the contribution of these sources in the framework of the Air Quality Directive. For Saharan dust, the EC methodology is largely based on a thorough analysis performed over the Iberian Peninsula (Escudero et al., 2007), although revision of the current methodology is in progress. In line with the EC Guidelines, the DIAPASON project ("Desert-dust Impact on Air quality through model-Predictions and Advanced Sensors ObservatioNs"), funded under the EC LIFE+ program, has been formulated to provide a robust, user-oriented, and demonstrated method to assess the presence of desert dust and evaluate its contribution to PM10 levels at the monitoring sites. To this end, in addition to satellite-based data and model forecasts already included in the EC Guidelines, DIAPASON will take advantage, in both the Project implementation and demonstration phases, of innovative and affordable technologies (partly prototyped within the project itself), namely operational Polarization Lidar-Ceilometers (PLC) capable of detecting and profiling dust clouds from the ground up to 10 km altitude. The PLC prototypes have already been finalized during the initial phase of the Project. Three of them will be networked in relevant air quality monitoring stations located in the Rome metropolitan area (Italy) during the DIAPASON observational phase (one-year long field campaign) starting in March 2013. The Rome region was chosen as the DIAPASON pilot scale area since it is highly impacted by urban pollution and frequently affected by Saharan dust transport events. In fact, a preliminary assessment of the role of Saharan dust in this area, based on a four-year dataset (2001-2004), has shown average increases of PM10 levels of the order of 11.9 µg/m3 when Saharan dust presence is either predicted by models or observed by a depolarization lidar. Conversely, PM10 increases computed relying only on the Lidar detections (i.e., presence of dust layers actually observed) were of the order of 15.6 µg/m3. Both analyses indicate the annual average contribution of dust advections to the city PM10 mass concentrations to be of the order of 2.3 µg/m3 (Gobbi et al., 2013). These results confirm Saharan advections in the central Mediterranean as important modulators of PM10 loads and exceedances. After the demonstrative pilot scale study, the DIAPASON results will be spatially generalised to a wider area. The final DIAPASON methodology to detect/quantify the Saharan dust contribution to PM10 will be tailored for a national scale application, and easily transferable to other air-quality and meteorological agencies in Europe. In this work, preliminary results from the combined analysis of Saharan dust model predictions, PM10 data and lidar records performed within DIAPASON will be shown, with particular focus on the added-value provided by continuous polarization lidar data in integrating the present EC Methodology.
- EC: Commission Staff Working Paper 6771/11 establishing guidelines for demonstration and subtraction of exceedances attributable to natural sources under the Directive 2008/50/EC on ambient air quality and cleaner air for Europe, European Commission, 2011.
- Escudero, M., Querol, X., Pey, J., Alastuey, A., Pérez, N., Ferreira, F., Alonso, S., Rodríguez, S., and Cuevas, E.: A methodology for the quantification of the net African dust load in air quality monitoring networks, Atmos. Environ., 41, 5516-5524, 2007.
- Gobbi, G. P., Angelini, F., Barnaba, F., Costabile, F., Baldasano, J. M., Basart, S., Sozzi, R., and Bolignano, A.: Changes in Particulate Matter Physical Properties During Saharan Advections over Rome (Italy): A Four-Year Study, 2001-2004, Atmos. Chem. Phys. Discuss., 2013.
Partitioning uncertainty in streamflow projections under nonstationary model conditions
NASA Astrophysics Data System (ADS)
Chawla, Ila; Mujumdar, P. P.
2018-02-01
Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with unchanging hydrologic model parameters in future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters with changing land use and climate scenarios in future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set-up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that model stationarity assumption and GCMs along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them for future streamflow projections and segregate the contribution of various sources to the uncertainty.
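A minimal sketch of a first-order ANOVA-style partition of projection spread into main effects, using a small synthetic ensemble; the factor sizes and values are assumptions, not results from the Upper Ganga Basin study (Python):

import numpy as np

rng = np.random.default_rng(1)
Q = rng.normal(size=(4, 3, 2))      # illustrative mean flows indexed by [GCM, scenario, land use]

grand = Q.mean()
ss_total = ((Q - grand) ** 2).sum()
ss_gcm = Q.shape[1] * Q.shape[2] * ((Q.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_scen = Q.shape[0] * Q.shape[2] * ((Q.mean(axis=(0, 2)) - grand) ** 2).sum()
ss_lu = Q.shape[0] * Q.shape[1] * ((Q.mean(axis=(0, 1)) - grand) ** 2).sum()
ss_inter = ss_total - ss_gcm - ss_scen - ss_lu   # interactions plus residual variability

for name, ss in [("GCM", ss_gcm), ("scenario", ss_scen), ("land use", ss_lu), ("interactions", ss_inter)]:
    print(f"{name}: {ss / ss_total:.1%} of total variance")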
Speck-Planche, Alejandro; Kleandrova, Valeria V; Luan, Feng; Cordeiro, M Natália D S
2012-08-01
The discovery of new and more potent anti-cancer agents constitutes one of the most active fields of research in chemotherapy. Colorectal cancer (CRC) is one of the most studied cancers because of its high prevalence and number of deaths. In the current pharmaceutical design of more efficient anti-CRC drugs, the use of methodologies based on Chemoinformatics has played a decisive role, including Quantitative-Structure-Activity Relationship (QSAR) techniques. However, until now, there is no methodology able to predict anti-CRC activity of compounds against more than one CRC cell line, which should constitute the principal goal. In an attempt to overcome this problem we develop here the first multi-target (mt) approach for the virtual screening and rational in silico discovery of anti-CRC agents against ten cell lines. Here, two mt-QSAR classification models were constructed using a large and heterogeneous database of compounds. The first model was based on linear discriminant analysis (mt-QSAR-LDA) employing fragment-based descriptors while the second model was obtained using artificial neural networks (mt-QSAR-ANN) with global 2D descriptors. Both models correctly classified more than 90% of active and inactive compounds in training and prediction sets. Some fragments were extracted from the molecules and their contributions to anti-CRC activity were calculated using mt-QSAR-LDA model. Several fragments were identified as potential substructural features responsible for the anti-CRC activity and new molecules designed from those fragments with positive contributions were suggested and correctly predicted by the two models as possible potent and versatile anti-CRC agents. Copyright © 2012 Elsevier Ltd. All rights reserved.
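A minimal sketch of a linear discriminant classifier over fragment-count descriptors, with the assay cell line encoded as an extra input so a single model covers several cell lines; the descriptors, labels, and encoding are illustrative assumptions on synthetic data, not the published mt-QSAR models (Python):

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X_frag = rng.integers(0, 4, size=(300, 20)).astype(float)        # fragment counts (made up)
cell_line = rng.integers(0, 10, size=(300, 1)).astype(float)     # 10 hypothetical CRC cell lines
X = np.hstack([X_frag, cell_line])
y = (X_frag[:, 0] + 0.5 * X_frag[:, 3] + rng.normal(size=300) > 3).astype(int)  # toy "active" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
print("first fragment coefficients (sign indicates contribution):", clf.coef_[0][:5])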
A Fatigue Life Prediction Model of Welded Joints under Combined Cyclic Loading
NASA Astrophysics Data System (ADS)
Goes, Keurrie C.; Camarao, Arnaldo F.; Pereira, Marcos Venicius S.; Ferreira Batalha, Gilmar
2011-01-01
A practical and robust methodology is developed to evaluate the fatigue life in seam welded joints when subjected to combined cyclic loading. The fatigue analysis was conducted in virtual environment. The FE stress results from each loading were imported to fatigue code FE-Fatigue and combined to perform the fatigue life prediction using the S x N (stress x life) method. The measurement or modelling of the residual stresses resulting from the welded process is not part of this work. However, the thermal and metallurgical effects, such as distortions and residual stresses, were considered indirectly through fatigue curves corrections in the samples investigated. A tube-plate specimen was submitted to combined cyclic loading (bending and torsion) with constant amplitude. The virtual durability analysis result was calibrated based on these laboratory tests and design codes such as BS7608 and Eurocode 3. The feasibility and application of the proposed numerical-experimental methodology and contributions for the technical development are discussed. Major challenges associated with this modelling and improvement proposals are finally presented.
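A minimal sketch of a stress-life damage summation of the kind used in such assessments, with an assumed S-N curve of the form N = A/S^m and Miner's rule; the parameters and load spectrum are invented and are not BS7608 or Eurocode 3 class values (Python):

def cycles_to_failure(stress_range, A=1.0e12, m=3.0):
    # S-N curve N = A / S^m; A and m are assumed, not code-specified values.
    return A / stress_range ** m

def miner_damage(blocks):
    # blocks: list of (stress_range_MPa, applied_cycles) for the combined loading history.
    return sum(n / cycles_to_failure(s) for s, n in blocks)

history = [(120.0, 2.0e5), (80.0, 1.0e6)]      # toy combined bending/torsion spectrum
D = miner_damage(history)
print(f"damage = {D:.2f}; predicted life = {1.0 / D:.2f} repeats of this history")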
Karayanidis, Frini; Jamadar, Sharna; Ruge, Hannes; Phillips, Natalie; Heathcote, Andrew; Forstmann, Birte U.
2010-01-01
Recent research has taken advantage of the temporal and spatial resolution of event-related brain potentials (ERPs) and functional magnetic resonance imaging (fMRI) to identify the time course and neural circuitry of preparatory processes required to switch between different tasks. Here we overview some key findings contributing to understanding strategic processes in advance preparation. Findings from these methodologies are compatible with advance preparation conceptualized as a set of processes activated for both switch and repeat trials, but with substantial variability as a function of individual differences and task requirements. We then highlight new approaches that attempt to capitalize on this variability to link behavior and brain activation patterns. One approach examines correlations among behavioral, ERP and fMRI measures. A second “model-based” approach accounts for differences in preparatory processes by estimating quantitative model parameters that reflect latent psychological processes. We argue that integration of behavioral and neuroscientific methodologies is key to understanding the complex nature of advance preparation in task-switching. PMID:21833196
Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul
2012-11-26
The aim of this work is to develop group-contribution(+) (GC(+)) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality of parameter estimation, such as the parameter covariance, the standard errors in predicted properties, and the confidence intervals. For parameter estimation, large data sets of experimentally measured property values of a wide range of chemicals (hydrocarbons, oxygenated chemicals, nitrogenated chemicals, polyfunctional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox are used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and atom connectivity index method have been considered. In total, 22 environment-related properties, which include the fathead minnow 96-h LC(50), Daphnia magna 48-h LC(50), oral rat LD(50), aqueous solubility, bioconcentration factor, permissible exposure limit (OSHA-TWA), photochemical oxidation potential, global warming potential, ozone depletion potential, acidification potential, emission to urban air (carcinogenic and noncarcinogenic), emission to continental rural air (carcinogenic and noncarcinogenic), emission to continental fresh water (carcinogenic and noncarcinogenic), emission to continental seawater (carcinogenic and noncarcinogenic), emission to continental natural soil (carcinogenic and noncarcinogenic), and emission to continental agricultural soil (carcinogenic and noncarcinogenic) have been modeled and analyzed. The application of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes, giving useful insights into quality and reliability of the design of sustainable processes.
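A minimal sketch of the parameter-covariance and confidence-interval idea on a synthetic linear group-contribution fit; the group matrix and data are invented, and the Marrero-Gani parameter tables are not reproduced (Python):

import numpy as np

rng = np.random.default_rng(2)
N = rng.integers(0, 5, size=(60, 8)).astype(float)    # group occurrence matrix (made up)
true_c = rng.normal(size=8)
y = N @ true_c + rng.normal(scale=0.2, size=60)        # "measured" property values

c, *_ = np.linalg.lstsq(N, y, rcond=None)              # group-contribution parameters
dof = N.shape[0] - N.shape[1]
s2 = np.sum((y - N @ c) ** 2) / dof                    # residual variance
cov_c = s2 * np.linalg.inv(N.T @ N)                    # parameter covariance matrix

n_new = rng.integers(0, 5, size=8).astype(float)       # group counts of a new chemical
pred = n_new @ c
se = np.sqrt(n_new @ cov_c @ n_new)                    # standard error of the prediction
print(f"prediction = {pred:.2f} +/- {1.96 * se:.2f} (approx. 95% confidence interval)")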
The chlorine budget of the present-day atmosphere - A modeling study
NASA Technical Reports Server (NTRS)
Weisenstein, Debra K.; Ko, Malcolm K. W.; Sze, Nien-Dak
1992-01-01
The contribution of source gases to the total amount of inorganic chlorine (ClY) is examined analytically with a time-dependent model employing 11 source gases. The source-gas emission data are described, and the modeling methodology is set forth with attention given to the data interpretation. The abundances and distributions are obtained for all 11 source gases with corresponding ClY production rates and mixing ratios. It is shown that the ClY production rate and the ClY mixing ratio for each source gas are spatially dependent, and the change in the relative contributions from 1950 to 1990 is given. Ozone changes in the past decade are characterized by losses in the polar and midlatitude lower stratosphere. The values for CFC-11, CCl4, and CH3CCl3 suggest that they are more evident in the lower stratosphere than is suggested by steady-state estimates based on surface concentrations.
Two-factor theory – at the intersection of health care management and patient satisfaction
Bohm, Josef
2012-01-01
Using data obtained from the 2004 Joint Canadian/United States Survey of Health, an analytic model using principles derived from Herzberg’s motivational hygiene theory was developed for evaluating patient satisfaction with health care. The analysis sought to determine whether survey variables associated with consumer satisfaction act as Herzberg factors and contribute to survey participants’ self-reported levels of health care satisfaction. To validate the technique, data from the survey were analyzed using logistic regression methods and then compared with results obtained from the two-factor model. The findings indicate a high degree of correlation between the two methods. The two-factor analytical methodology offers advantages due to its ability to identify whether a factor assumes a motivational or hygienic role and to assess the influence of a factor within select populations. Its ease of use makes this methodology well suited for assessment of multidimensional variables. PMID:23055755
Heat Transfer Measurement and Modeling in Rigid High-Temperature Reusable Surface Insulation Tiles
NASA Technical Reports Server (NTRS)
Daryabeigi, Kamran; Knutson, Jeffrey R.; Cunnington, George R.
2011-01-01
Heat transfer in rigid reusable surface insulations was investigated. Steady-state thermal conductivity measurements in a vacuum were used to determine the combined contribution of radiation and solid conduction components of heat transfer. Thermal conductivity measurements at higher pressures were then used to estimate the effective insulation characteristic length for gas conduction modeling. The thermal conductivity of the insulation can then be estimated at any temperature and pressure in any gaseous media. The methodology was validated by comparing estimated thermal conductivities with published data on a rigid high-temperature silica reusable surface insulation tile. The methodology was also applied to the alumina enhanced thermal barrier tiles. Thermal contact resistance for thermal conductivity measurements on rigid tiles was also investigated. A technique was developed to effectively eliminate thermal contact resistance on the rigid tile's cold-side surface for the thermal conductivity measurements.
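A minimal sketch of an effective-conductivity decomposition into solid, radiative, and pressure-dependent gas terms using a characteristic length; the functional forms and constants are assumptions for illustration, not the measured tile properties (Python):

import math

def k_effective(T, p, k_solid0=0.02, C_rad=5.0e-11, k_gas0=0.026, L_c=50e-6):
    k_solid = k_solid0 * (T / 300.0) ** 0.5                    # assumed temperature dependence
    k_rad = C_rad * T ** 3                                      # radiative term scaling as T^3
    mfp = 6.6e-8 * (T / 300.0) * (101325.0 / max(p, 1e-3))      # approximate air mean free path, m
    Kn = mfp / L_c                                              # Knudsen number from characteristic length
    k_gas = k_gas0 * (T / 300.0) ** 0.7 / (1.0 + 2.0 * Kn)      # rarefied-gas interpolation
    return k_solid + k_rad + k_gas

print(k_effective(T=600.0, p=101325.0))   # ambient pressure: all three terms contribute
print(k_effective(T=600.0, p=10.0))       # near-vacuum: gas conduction largely suppressed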
Electrical cable utilization for wave energy converters
Bull, Diana; Baca, Michael; Schenkman, Benjamin
2018-04-27
Here, this paper investigates the suitability of sizing the electrical export cable based on the rating of the contributing WECs within a farm. These investigations have produced a new methodology to evaluate the probabilities associated with peak power values on an annual basis. It has been shown that the peaks in pneumatic power production will follow an exponential probability function for a linear model. A methodology to combine all the individual probability functions into an annual view has been demonstrated on pneumatic power production by a Backward Bent Duct Buoy (BBDB). These investigations have also resulted in a highly simplified and perfunctory model of installed cable cost as a function of voltage and conductor cross-section. This work solidifies the need to determine electrical export cable rating based on expected energy delivery as opposed to device rating as small decreases in energy delivery can result in cost savings.
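A minimal sketch of the exponential peak-power idea, estimating the chance that at least one peak in a year exceeds a candidate cable rating; the mean peak, threshold, and annual peak count are invented numbers, not BBDB results (Python):

import math

def p_exceed(threshold_kw, mean_peak_kw):
    # P(single peak > threshold) for an exponential peak-power model.
    return math.exp(-threshold_kw / mean_peak_kw)

def annual_p_exceed(threshold_kw, mean_peak_kw, peaks_per_year):
    # P(at least one peak in a year exceeds the threshold), peaks treated as independent.
    return 1.0 - (1.0 - p_exceed(threshold_kw, mean_peak_kw)) ** peaks_per_year

print(annual_p_exceed(threshold_kw=1000.0, mean_peak_kw=60.0, peaks_per_year=5.0e5))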
Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B
2008-01-01
Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense, so that it is capable of accommodating the modeling of multiple risk factors considered in offshore operations and can deal with different types of data that may come from different sources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and a Bayesian Network (BN) is tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels are the Root causes level, the Trigger events level, the Incidents level, the Accidents level, and the Consequences level. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities regarding the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BN can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs, which are capable of providing a graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. The Bayesian inference mechanism also makes it possible to monitor how a safety situation changes as information flows forwards and backwards within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific. The "Swiss cheese" model, in contrast, is a theoretical framework based on solid behavioral theory and can therefore be used to provide industry with a roadmap for BN modeling and its implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
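A minimal sketch of forward propagation through the five-level chain described above, with invented conditional probabilities rather than the FPSO case-study values; a full BN would also support backward (diagnostic) inference via Bayes' rule (Python):

p_root = 0.10                                    # P(root cause present), assumed
p_trigger_given = {True: 0.30, False: 0.02}      # P(trigger event | root cause), assumed
p_incident_given = {True: 0.25, False: 0.01}     # P(incident | trigger event), assumed
p_accident_given = {True: 0.20, False: 0.005}    # P(accident | incident), assumed

def marginal(p_parent, cond):
    # P(child) from P(parent) and P(child | parent) for a two-state node.
    return p_parent * cond[True] + (1.0 - p_parent) * cond[False]

p_trigger = marginal(p_root, p_trigger_given)
p_incident = marginal(p_trigger, p_incident_given)
p_accident = marginal(p_incident, p_accident_given)
print(p_trigger, p_incident, p_accident)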
Martin, François-Pierre J; Montoliu, Ivan; Kochhar, Sunil; Rezzi, Serge
2010-12-01
Over the past decade, the analysis of metabolic data with advanced chemometric techniques has offered the potential to explore functional relationships among biological compartments in relation to the structure and function of the intestine. However, the employed methodologies, generally based on regression modeling techniques, have given emphasis to region-specific metabolic patterns, while providing only limited insights into the spatiotemporal metabolic features of the complex gastrointestinal system. Hence, novel approaches are needed to analyze metabolic data to reconstruct the metabolic biological space associated with the evolving structures and functions of an organ such as the gastrointestinal tract. Here, we report the application of multivariate curve resolution (MCR) methodology to model metabolic relationships along the gastrointestinal compartments in relation to its structure and function using data from our previous metabonomic analysis. The method simultaneously summarizes metabolite occurrence and contribution to continuous metabolic signatures of the different biological compartments of the gut tract. This methodology sheds new light onto the complex web of metabolic interactions with gut symbionts that modulate host cell metabolism in surrounding gut tissues. In the future, such an approach will be key to provide new insights into the dynamic onset of metabolic deregulations involved in region-specific gastrointestinal disorders, such as Crohn's disease or ulcerative colitis.
Analysis and Reduction of Complex Networks Under Uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghanem, Roger G
2014-07-31
This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.
Abramovitch, Amitai; Mittelman, Andrew; Tankersley, Amelia P; Abramowitz, Jonathan S; Schweiger, Avraham
2015-07-30
The inconsistent nature of the neuropsychology literature pertaining to obsessive-compulsive disorder (OCD) has long been recognized. However, individual studies, systematic reviews, and recent meta-analytic reviews were unsuccessful in establishing a consensus regarding a disorder-specific neuropsychological profile. In an attempt to identify methodological factors that may contribute to the inconsistency that is characteristic of this body of research, a systematic review of methodological factors in studies comparing OCD patients and non-psychiatric controls on neuropsychological tests was conducted. This review covered 115 studies that included nearly 3500 patients. Results revealed a range of methodological weaknesses. Some of these weaknesses have been previously noted in the broader neuropsychological literature, while some are more specific to psychiatric disorders, and to OCD. These methodological shortcomings have the potential to hinder the identification of a specific neuropsychological profile associated with OCD as well as to obscure the association between neurocognitive dysfunctions and contemporary neurobiological models. Rectifying these weaknesses may facilitate replicability, and promote our ability to extract cogent, meaningful, and more unified inferences regarding the neuropsychology of OCD. To that end, we present a set of methodological recommendations to facilitate future neuropsychology research in psychiatric disorders in general, and in OCD in particular. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Hendricks, Michelle A.; Conway, Christopher M.; Kellogg, Ronald T.
2013-01-01
Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and…
Modeling methodology for supply chain synthesis and disruption analysis
NASA Astrophysics Data System (ADS)
Wu, Teresa; Blackhurst, Jennifer
2004-11-01
The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of a synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
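A minimal sketch of reachability analysis on a toy supply-chain state graph, checking whether the fulfilled state remains reachable when a node is disrupted; the states and transitions are assumptions, not the paper's model (Python):

from collections import deque

transitions = {                      # assumed states and transitions
    "order_received": ["supplier_A", "supplier_B"],
    "supplier_A": ["assembly"],
    "supplier_B": ["assembly"],
    "assembly": ["distribution"],
    "distribution": ["order_fulfilled"],
    "order_fulfilled": [],
}

def reachable(start, target, disrupted=frozenset()):
    # Breadth-first search that skips disrupted nodes.
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == target:
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen and nxt not in disrupted:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("order_received", "order_fulfilled", disrupted={"supplier_A"}))  # True
print(reachable("order_received", "order_fulfilled", disrupted={"assembly"}))    # False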
Computational Social Creativity.
Saunders, Rob; Bown, Oliver
2015-01-01
This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology, approaching different aspects of systems architecture such as business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and a demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue of any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by the HIS-DF and supported in HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.
Effects of Caffeine and Warrior Stress on Behavioral : An Animal Model
2016-03-14
Acknowledgments and abstract fragment only; recoverable content: thanks to Erin Barry for statistical expertise and methodological support; the study concerns behavioral health in rats; several ethical and logistical issues prevent the use of humans in true controlled experiments that manipulate stress or expose humans to high caffeine doses in studying the development or maintenance of behavioral problems.
Deans, Rachel; Wade, Shawna
2011-01-01
Growing demand from clients waiting to access vital services in a healthcare sector under economic constraint, coupled with the pressure for ongoing improvement within a multi-faceted organization, can have a significant impact on the front-line staff, who are essential to the successful implementation of any quality improvement initiative. The Lean methodology is a management system for continuous improvement based on the Toyota Production System; it focuses on two main themes: respect for people and the elimination of waste or non-value-added activities. Within the Lean process, value-added is used to describe any activity that contributes directly to satisfying the needs of the client, and non-value-added refers to any activity that takes time, space or resources but does not contribute directly to satisfying client needs. Through the revision of existing models of service delivery, the authors' organization has made an impact on increasing access to care and has supported successful engagement of staff in the process, while ensuring that the focus remains on the central needs of clients and families accessing services. While the performance metrics continue to exhibit respectable results for this strategic priority, further gains are expected over the next 18-24 months.
Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model
NASA Astrophysics Data System (ADS)
Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman
2016-08-01
e-Tendering is the electronic processing of tender documents via the internet, allowing tenderers to publish, communicate, access, receive and submit all tender-related information and documentation online. This study aims to design the e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach on how to assign tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting the requirements of projects with different scopes, problems and sizes. RUP is characterized as a use case driven, architecture centered, iterative and incremental process model. However, the scope of this study focuses only on the Inception and Elaboration phases as steps to develop the model, performing only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. The UML notation and the software program Star UML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can benefit e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is one of the best system development methodologies and can be used as a research methodology in the Software Engineering domain related to the secure design of any observed application. This methodology has been tested in various studies in certain domains, such as Simulation-based Decision Support, Security Requirement Engineering, Business Modeling and Secure System Requirement, and so forth. In conclusion, these studies showed that RUP is a good research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case models, misuse case models, activity diagrams, and an initial class diagram, from a list of requirements identified earlier by the SE researchers.
NASA Astrophysics Data System (ADS)
Arhonditsis, G.; Giourga, C.; Loumou, A.; Koulouri, M.
2002-09-01
Three mathematical models, the runoff curve number equation, the universal soil loss equation, and the mass response functions, were evaluated for predicting nonpoint source nutrient loading from agricultural watersheds of the Mediterranean region. These methodologies were applied to a catchment, the gulf of Gera Basin, that is a typical terrestrial ecosystem of the islands of the Aegean archipelago. The calibration of the model parameters was based on data from experimental plots from which edge-of-field losses of sediment, water runoff, and nutrients were measured. Special emphasis was given to the transport of dissolved and solid-phase nutrients from their sources in the farmers' fields to the outlet of the watershed in order to estimate respective attenuation rates. It was found that nonpoint nutrient loading due to surface losses was high during winter, the contribution being between 50% and 80% of the total annual nutrient losses from the terrestrial ecosystem. The good fit between simulated and experimental data supports the view that these modeling procedures should be considered as reliable and effective methodological tools in Mediterranean areas for evaluating potential control measures, such as management practices for soil and water conservation and changes in land uses, aimed at diminishing soil loss and nutrient delivery to surface waters. Furthermore, the modifications of the general mathematical formulations and the experimental values of the model parameters provided by the study can be used in further application of these methodologies in watersheds with similar characteristics.
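A minimal sketch of the runoff curve number equation in SI units; the curve number and rainfall depth below are illustrative, and the calibrated values for the Gulf of Gera basin plots are not reproduced (Python):

def runoff_depth_mm(P_mm, CN):
    # Direct runoff Q from rainfall P using the SCS curve-number equation.
    S = 25400.0 / CN - 254.0             # potential maximum retention, mm
    Ia = 0.2 * S                          # initial abstraction
    if P_mm <= Ia:
        return 0.0
    return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

print(runoff_depth_mm(P_mm=60.0, CN=78))   # a single winter storm, illustrative values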
Logical Modeling and Dynamical Analysis of Cellular Networks
Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine
2016-01-01
The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
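A minimal Python sketch of the most basic analysis task mentioned above, finding the attractors of a synchronous logical model by exhaustive state enumeration; the three-node rules are invented for illustration and do not correspond to the published T-helper or cell-cycle models, and the asynchronous updating and model-reduction methods surveyed in the review are not shown.

```python
from itertools import product

# Toy synchronous Boolean network: find all attractors by following every
# initial state until a previously visited state recurs.
rules = {
    "A": lambda s: s["C"] and not s["B"],
    "B": lambda s: s["A"],
    "C": lambda s: not s["B"],
}

def step(state):
    s = {"A": state[0], "B": state[1], "C": state[2]}
    return tuple(bool(rules[n](s)) for n in ("A", "B", "C"))

def attractors():
    found = set()
    for start in product([False, True], repeat=3):
        seen, state = [], start
        while state not in seen:
            seen.append(state)
            state = step(state)
        cycle = tuple(seen[seen.index(state):])   # periodic part reached from this start
        found.add(frozenset(cycle))
    return found

for att in attractors():
    print(sorted(att))
```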
NASA Astrophysics Data System (ADS)
Sushkevich, T. A.; Strelkov, S. A.; Maksakova, S. V.
2017-11-01
We discuss world-class national achievements in the theory of radiation transfer in the atmosphere-ocean system and the modern scientific potential developing in Russia, which provides an adequate methodological basis for theoretical and computational studies of radiation processes and radiation fields in natural environments, using supercomputers and massively parallel processing for problems of remote sensing and the Earth's climate. A model of the radiation field in the "cloudy atmosphere-ocean" system is presented that allows the contributions of clouds, atmosphere and ocean to be separated.
The Contributions of Vietnamese Learners of English to ELT Methodology
ERIC Educational Resources Information Center
Tomlinson, Brian; Dat, Bao
2004-01-01
This article reports a survey of 300 intermediate-level EFL adult learners' views about the instruction they receive and of 15 of their teachers at the National University of Vietnam in Ho Chi Minh City. Its main focus is on how learners can contribute to ELT methodology. The article reviews the literature on learner cultures and perceptions in…
Sampling methods to the statistical control of the production of blood components.
Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo
2017-12-01
The control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of the control depends on the sampling. However, a correct sampling methodology does not seem to be applied systematically. Commonly, the sampling is intended solely to comply with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model can be argued not to correspond to a consistent sampling technique. This is a severe limitation for detecting abnormal patterns and for assuring that production has a non-significant probability of yielding nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and related statistical process control decisions for the purposes for which they are proposed. Copyright © 2017 Elsevier Ltd. All rights reserved.
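A minimal sketch of one of the three proposed approaches, sampling based on the proportion of a finite population; the lot size, expected nonconforming proportion and margin of error below are illustrative, not values from the article.

```python
import math

def sample_size_finite_population(n_lot: int, p: float, e: float, z: float = 1.96) -> int:
    """Sample size to estimate a proportion p with margin of error e in a lot of n_lot units.

    Uses the usual finite-population-corrected formula; z = 1.96 corresponds to 95% confidence.
    """
    n0 = z ** 2 * p * (1.0 - p) / e ** 2        # infinite-population sample size
    n = n0 / (1.0 + (n0 - 1.0) / n_lot)         # finite-population correction
    return math.ceil(n)

# Illustrative numbers only: a monthly production of 3000 red cell concentrates,
# an expected nonconforming proportion of 1% and a 1% margin of error.
print(sample_size_finite_population(n_lot=3000, p=0.01, e=0.01))
```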
Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent
2018-06-01
Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT a crucial key to enhancing health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated performance evaluation of the HIS that combines formal modeling with Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to assess the impact of the HIS on the patient pathway. We show that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (measured through the amount of information available and accessed during the consultation to formulate the diagnosis). The method also allows the most cost-effective ICT elements to be identified for improving care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.
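A minimal discrete-event simulation sketch of a single-oncologist consultation queue, using only the Python standard library; the arrival and service-time parameters are hypothetical, and the ARIS process models and micro-costing layer of the paper are not reproduced.

```python
import heapq
import random

def simulate(n_patients=200, mean_interarrival=20.0, mean_service=18.0, seed=1):
    """Single-server consultation queue: report mean wait and oncologist occupation rate."""
    random.seed(seed)
    events, t = [], 0.0                         # event = (time, kind, patient_id)
    for i in range(n_patients):
        t += random.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, "arrival", i))
    queue, busy_until, busy_time, waits = [], 0.0, 0.0, []
    while events:
        time, kind, pid = heapq.heappop(events)
        if kind == "arrival":
            queue.append((time, pid))
        # start service whenever the oncologist is free and someone is waiting
        if time >= busy_until and queue:
            arrived, nxt = queue.pop(0)
            service = random.expovariate(1.0 / mean_service)
            waits.append(time - arrived)
            busy_until = time + service
            busy_time += service
            heapq.heappush(events, (busy_until, "departure", nxt))
    print(f"mean wait {sum(waits)/len(waits):.1f} min, occupation rate {busy_time/busy_until:.2f}")

simulate()
```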
NASA Technical Reports Server (NTRS)
Beckingham, Kathleen M.; Armstrong, J. Douglas; Texada, Michael J.; Munjaal, Ravi; Baker, Dean A.
2005-01-01
Drosophila melanogaster has been intensely studied for almost 100 years. The sophisticated array of genetic and molecular tools that has evolved for the analysis of gene function in this organism is unique. Further, Drosophila is a complex multi-cellular organism in which many aspects of development and behavior parallel those in human beings. These combined advantages have permitted research in Drosophila to make seminal contributions to the understanding of fundamental biological processes and ensure that Drosophila will continue to provide unique insights in the genomic era. An overview of the genetic methodologies available in Drosophila is given here, together with examples of outstanding recent contributions of Drosophila to our understanding of cell and organismal biology. The growing contribution of Drosophila to our knowledge of gravity-related responses is addressed.
Has 60 years of research in psychology really gone astray?
Yurevich, Andrey
2007-03-01
The author presents several arguments against Toomela's (Culture of science: Strange history of the methodological thinking in psychology. Integrative Psychological and Behavioral Science, 2007a, doi:10.1007/s12124-007-9004-0; History of methodology in psychology: Starting point, not the goal. Integrative Psychological and Behavioral Science, 2007b, doi:10.1007/s12124-007-9005-z) pessimistic thesis: "The last 60 years of research in psychology seems to have gone astray." Nevertheless he admits that Toomela's article, despite the excessively categorical assessments contained in it and the undue pessimism crowning its conclusion, represents a substantial contribution to highlighting the socio-cultural impact on various models of psychological cognition, which lurks behind the international unification of globalizing science.
Coevolution of Epidemics, Social Networks, and Individual Behavior: A Case Study
NASA Astrophysics Data System (ADS)
Chen, Jiangzhuo; Marathe, Achla; Marathe, Madhav
This research shows how a limited supply of antivirals can be distributed optimally between hospitals and the market so that the attack rate is minimized and enough revenue is generated to recover the cost of the antivirals. Results using an individual-based model find that prevalence-elastic demand behavior delays the epidemic and that the change in the social contact network induced by isolation significantly reduces the peak of the epidemic. A microeconomic analysis methodology combining behavioral economics and agent-based simulation is a major contribution of this work. In this paper we apply this methodology to analyze the fairness of the stockpile distribution and the response of human behavior to the disease prevalence level and its interaction with the market.
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.
2017-09-01
In this study, Computer Aided Engineering was used for injection moulding simulation. The Design of Experiments (DOE) method was applied according to a Latin square orthogonal array. The relationships between the injection moulding parameters and warpage were identified from the experimental data used. Response Surface Methodology (RSM) was used to validate the model accuracy. The RSM and GA methods were then combined to determine the optimum injection moulding process parameters. The optimisation of injection moulding is thereby largely improved, and the results show increased accuracy and reliability. The proposed method combining RSM and GA also contributes to minimising the occurrence of warpage.
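A minimal sketch of the RSM-plus-GA idea, assuming a synthetic warpage function in place of the CAE simulations: a quadratic response surface is fitted to sampled parameter-warpage pairs and then searched with an evolutionary optimizer (SciPy's differential evolution standing in for the GA). The parameter names and bounds are hypothetical.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

def simulated_warpage(x):                       # x = [melt temperature, packing pressure]
    return 0.4 + 0.002*(x[0]-230)**2 + 0.0005*(x[1]-80)**2 + 1e-4*(x[0]-230)*(x[1]-80)

def quadratic_features(x):
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2], axis=-1)

# DOE-style sample of the parameter space (random here; a Latin square would be used instead)
X = rng.uniform([200, 50], [260, 110], size=(30, 2))
y = np.array([simulated_warpage(x) for x in X]) + rng.normal(0, 0.01, 30)

coef, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)   # fitted response surface
surface = lambda x: quadratic_features(np.asarray(x)) @ coef

result = differential_evolution(surface, bounds=[(200, 260), (50, 110)], seed=0)
print("optimal parameters:", result.x, "predicted warpage:", result.fun)
```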
Calibration of Safecast dose rate measurements.
Cervone, Guido; Hultquist, Carolynne
2018-10-01
A methodology is presented to calibrate contributed Safecast dose rate measurements acquired between 2011 and 2016 in the Fukushima prefecture of Japan. The Safecast data are calibrated using observations acquired by the U.S. Department of Energy at the time of the 2011 Fukushima Daiichi power plant nuclear accident. The methodology performs a series of interpolations between the U.S. government and contributed datasets at specific temporal windows and at corresponding spatial locations. The coefficients found for all the different temporal windows are aggregated and interpolated using quadratic regressions to generate a time dependent calibration function. Normal background radiation, decay rates, and missing values are taken into account during the analysis. Results show that the standard Safecast static transformation function overestimates the official measurements because it fails to capture the presence of two different Cesium isotopes and their changing magnitudes with time. A model is created to predict the ratio of the isotopes from the time of the accident through 2020. The proposed time dependent calibration takes into account this Cesium isotopes ratio, and it is shown to reduce the error between U.S. government and contributed data. The proposed calibration is needed through 2020, after which date the errors introduced by ignoring the presence of different isotopes will become negligible. Copyright © 2018 Elsevier Ltd. All rights reserved.
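A small sketch of the physics behind the time-dependent correction: the Cs-134/Cs-137 activity ratio decays with two different half-lives, so a fixed dose-rate conversion drifts over time. The half-lives are standard literature values, and the initial activity ratio of about one is a commonly cited approximation, not the coefficients fitted in the paper.

```python
import math
from datetime import date

T_HALF_CS134 = 2.065   # years
T_HALF_CS137 = 30.17   # years
ACCIDENT = date(2011, 3, 11)

def cs134_to_cs137_ratio(on: date, initial_ratio: float = 1.0) -> float:
    """Activity ratio Cs-134 / Cs-137 at a given date, from exponential decay of both isotopes."""
    t = (on - ACCIDENT).days / 365.25
    decay = lambda half_life: math.exp(-math.log(2) * t / half_life)
    return initial_ratio * decay(T_HALF_CS134) / decay(T_HALF_CS137)

for year in (2011, 2013, 2016, 2020):
    print(year, round(cs134_to_cs137_ratio(date(year, 3, 11)), 3))
```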
Tîrnăucă, Cristina; Duque, Rafael; Montaña, José L.
2017-01-01
A relevant goal in human–computer interaction is to produce applications that are easy to use and well-adjusted to their users’ needs. To address this problem it is important to know how users interact with the system. This work constitutes a methodological contribution capable of identifying the context of use in which users perform interactions with a groupware application (synchronous or asynchronous) and provides, using machine learning techniques, generative models of how users behave. Additionally, these models are transformed into a text that describes in natural language the main characteristics of the interaction of the users with the system. PMID:28726762
Nursing research: can a feminist perspective make any contribution?
Ehlers, V
1999-03-01
As more than 90% of the RSA's nurses are women and as at least 50% of the health care clients are also women, nursing research can definitely benefit by incorporating feminist research approaches. Specific feminist research issues which could be relevant to nursing research include: inherent themes in feminist research; feminist research methodology; gender stereotypes and nursing research; gender-based stereotypes of researchers; and the potential benefits of incorporating feminist research approaches in nursing research. Most formal models of nursing, and thus also most nursing research based on these models, ignore gender issues. Thus they ignore part of the social reality of nursing and might provide distorted images of nursing. A feminist approach to nursing research could enhance the reality-based gender issues relevant to nursing specifically, and health care generally, and contribute towards rendering effective health care within a multidisciplinary health care context.
Computational biology for cardiovascular biomarker discovery.
Azuaje, Francisco; Devaux, Yvan; Wagner, Daniel
2009-07-01
Computational biology is essential in the process of translating biological knowledge into clinical practice, as well as in the understanding of biological phenomena based on the resources and technologies originating from the clinical environment. One such key contribution of computational biology is the discovery of biomarkers for predicting clinical outcomes using 'omic' information. This process involves the predictive modelling and integration of different types of data and knowledge for screening, diagnostic or prognostic purposes. Moreover, this requires the design and combination of different methodologies based on statistical analysis and machine learning. This article introduces key computational approaches and applications to biomarker discovery based on different types of 'omic' data. Although we emphasize applications in cardiovascular research, the computational requirements and advances discussed here are also relevant to other domains. We will start by introducing some of the contributions of computational biology to translational research, followed by an overview of methods and technologies used for the identification of biomarkers with predictive or classification value. The main types of 'omic' approaches to biomarker discovery will be presented with specific examples from cardiovascular research. This will include a review of computational methodologies for single-source and integrative data applications. Major computational methods for model evaluation will be described together with recommendations for reporting models and results. We will present recent advances in cardiovascular biomarker discovery based on the combination of gene expression and functional network analyses. The review will conclude with a discussion of key challenges for computational biology, including perspectives from the biosciences and clinical areas.
Workshops as a Research Methodology
ERIC Educational Resources Information Center
Ørngreen, Rikke; Levinsen, Karin
2017-01-01
This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…
Dealing with dissatisfaction in mathematical modelling to integrate QFD and Kano’s model
NASA Astrophysics Data System (ADS)
Retno Sari Dewi, Dian; Debora, Joana; Edy Sianto, Martinus
2017-12-01
The purpose of the study is to implement the integration of Quality Function Deployment (QFD) and Kano's model into a mathematical model. Voice-of-customer data for the QFD were collected using a questionnaire developed on the basis of Kano's model. An operational research methodology was then applied to build the objective function and constraints of the mathematical model. The relationship between the voice of the customer and the engineering characteristics was modelled with a linear regression model. The output of the mathematical model is the detailed set of engineering characteristics. The objective function of this model is to maximize satisfaction and, at the same time, minimize dissatisfaction. The result of this model is 62%. The major contribution of this research is the implementation of the existing mathematical model integrating QFD and Kano's model in a case study of a shoe cabinet.
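A minimal sketch of the integration idea under synthetic data: Kano-derived satisfaction and dissatisfaction scores are regressed on engineering characteristics, and a linear program then selects characteristic levels that maximize net satisfaction within bounds; the shoe-cabinet characteristics and the 62% figure are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(40, 3))                       # 3 engineering characteristics, scaled 0-1
sat = X @ np.array([0.5, 0.3, 0.1]) + rng.normal(0, 0.02, 40)    # synthetic Kano satisfaction scores
dis = X @ np.array([0.05, 0.02, 0.4]) + rng.normal(0, 0.02, 40)  # synthetic dissatisfaction scores

beta_sat, *_ = np.linalg.lstsq(X, sat, rcond=None)        # linear regression coefficients
beta_dis, *_ = np.linalg.lstsq(X, dis, rcond=None)

# linprog minimizes, so negate the net-satisfaction objective; bound each characteristic to [0, 1]
res = linprog(c=-(beta_sat - beta_dis), bounds=[(0, 1)] * 3, method="highs")
print("chosen characteristic levels:", res.x, "net satisfaction:", -res.fun)
```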
Hydrochemical characterization of a river affected by acid mine drainage in the Iberian Pyrite Belt.
Grande, J A; Santisteban, M; Valente, T; de la Torre, M L; Gomes, P
2017-06-01
This paper addresses the modelling of the processes associated with acid mine drainage affecting the Trimpancho River basin, chosen for this purpose because of its location and paradigmatic hydrological, geological, mining and environmental contexts. By using physical-chemical indicators it is possible to define the contamination degree of the system from the perspective of an entire river basin, due to its reduced dimension. This allows an exhaustive monitoring of the study area, considering the particularity that the stream flows directly into a water dam used for human supply. With such a perspective, and in order to find global solutions, the present study seeks to develop methodologies and tools for expeditious and accurate diagnosis of the pollution level of the affected stream that feeds the water reservoir. The implemented methodology can be applied to other water systems affected by similar problems, while the results will contribute to the development of the state of the art in a representative basin of the Iberian Pyrite Belt, whose pollutants' contributions are incorporated into the reservoir.
Chiang, Austin W T; Liu, Wei-Chung; Charusanti, Pep; Hwang, Ming-Jing
2014-01-15
A major challenge in mathematical modeling of biological systems is to determine how model parameters contribute to systems dynamics. As biological processes are often complex in nature, it is desirable to address this issue using a systematic approach. Here, we propose a simple methodology that first performs an enrichment test to find patterns in the values of globally profiled kinetic parameters with which a model can produce the required system dynamics; this is then followed by a statistical test to elucidate the association between individual parameters and different parts of the system's dynamics. We demonstrate our methodology on a prototype biological system of perfect adaptation dynamics, namely the chemotaxis model for Escherichia coli. Our results agreed well with those derived from experimental data and theoretical studies in the literature. Using this model system, we showed that there are motifs in kinetic parameters and that these motifs are governed by constraints of the specified system dynamics. A systematic approach based on enrichment statistical tests has been developed to elucidate the relationships between model parameters and the roles they play in affecting system dynamics of a prototype biological network. The proposed approach is generally applicable and therefore can find wide use in systems biology modeling research.
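A minimal sketch of the enrichment step, assuming synthetic counts rather than the chemotaxis model's: a hypergeometric test asks whether "high" values of one kinetic parameter are over-represented among the parameter sets that reproduce the required dynamics.

```python
from scipy.stats import hypergeom

n_total = 10000        # parameter sets sampled globally
n_success = 800        # sets reproducing the required dynamics (e.g. perfect adaptation)
n_high = 5000          # sets in which parameter k falls in the "high" bin
n_high_success = 620   # successful sets with k in the "high" bin

# P(X >= n_high_success) under the hypergeometric null of no association
p_value = hypergeom.sf(n_high_success - 1, n_total, n_high, n_success)
print(f"enrichment p-value for high k among successful sets: {p_value:.3g}")
```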
Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J
2008-04-01
Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
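A minimal sketch of the standard Taguchi signal-to-noise ratios and a main-effect summary over an L4 orthogonal array; the array assignment and response values are synthetic examples, not taken from the case studies in the review.

```python
import numpy as np

def sn_larger_the_better(y):   return -10 * np.log10(np.mean(1.0 / np.square(y)))
def sn_smaller_the_better(y):  return -10 * np.log10(np.mean(np.square(y)))
def sn_nominal_the_best(y):    return 10 * np.log10(np.mean(y) ** 2 / np.var(y, ddof=1))

# L4(2^3) orthogonal array: three two-level factors, four runs; two replicates per run.
l4 = np.array([[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]])
responses = np.array([[71, 73], [78, 80], [83, 82], [75, 77]])   # e.g. product yield, larger is better

sn = np.array([sn_larger_the_better(run) for run in responses])
for factor in range(3):
    effect = sn[l4[:, factor] == 1].mean() - sn[l4[:, factor] == 0].mean()
    print(f"factor {factor}: main effect on S/N = {effect:+.2f} dB")
```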
Postgraduate Conception of Research Methodology: Implications for Learning and Teaching
ERIC Educational Resources Information Center
Daniel, Ben; Kumar, Vijay; Omar, Noritah
2018-01-01
This qualitative inquiry investigates postgraduate students' conceptions of research methodology and how it contributes to their learning. It explores factors likely to motivate student choice of research methodology and challenges in understanding research methods. The research was carried out at research-intensive universities in New Zealand and…
Mikhail Geraskov (1874-1957): Methodological Concepts of Learning Physics
ERIC Educational Resources Information Center
Ilieva, Mariyana
2014-01-01
Mikhail Geraskov is a distinguished Bulgarian educator from the first half of the twentieth century, who developed the scientific foundations of didactics and methodology of training. His work contributed a lot to the development of the Bulgarian pedagogy. The subject of scientific research is didactical conceptions and methodological conceptions…
Determinants of Nontraditional Student Status: A Methodological Review of the Research
ERIC Educational Resources Information Center
Langrehr, Kimberly J.; Phillips, Julia C.; Melville, Alexis; Eum, Koun
2015-01-01
This article presents a review of 21 years (1990 to 2011) of multidisciplinary research on nontraditional college students that focuses on determinants of nontraditional student status and research methodology. The purpose is to address the methodological hindrances that have contributed to deficit-based views of nontraditional students in…
NASA Astrophysics Data System (ADS)
Vázquez-Suñé, Enric; Ángel Marazuela, Miguel; Velasco, Violeta; Diviu, Marc; Pérez-Estaún, Andrés; Álvarez-Marrón, Joaquina
2016-09-01
The overdevelopment of cities since the industrial revolution has shown the need to incorporate a sound geological knowledge in the management of required subsurface infrastructures and in the assessment of increasingly needed groundwater resources. Additionally, the scarcity of outcrops and the technical difficulty to conduct underground exploration in urban areas highlights the importance of implementing efficient management plans that deal with the legacy of heterogeneous subsurface information. To deal with these difficulties, a methodology has been proposed to integrate all the available spatio-temporal data into a comprehensive spatial database and a set of tools that facilitates the analysis and processing of the existing and newly added data for the city of Barcelona (NE Spain). Here we present the resulting actual subsurface 3-D geological model that incorporates and articulates all the information stored in the database. The methodology applied to Barcelona benefited from a good collaboration between administrative bodies and researchers that enabled the realization of a comprehensive geological database despite logistic difficulties. Currently, the public administration and also private sectors both benefit from the geological understanding acquired in the city of Barcelona, for example, when preparing the hydrogeological models used in groundwater assessment plans. The methodology further facilitates the continuous incorporation of new data in the implementation and sustainable management of urban groundwater, and also contributes to significantly reducing the costs of new infrastructures.
NASA Astrophysics Data System (ADS)
Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.
2009-11-01
Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
Identification of critical sediment source areas at regional level
NASA Astrophysics Data System (ADS)
Fargas, D.; Casasnovas, J. A. Martínez; Poch, R.
In order to identify critical sediment sources in large catchments, using easily available terrain information at regional scale, a methodology has been developed to obtain the qualitative assessment necessary for further studies. The main objective of the model is to use basic terrain data related to the erosive processes which contribute to the production, transport and accumulation of sediments through the main water paths in the watershed. The model is based on the selection of homogeneous zones regarding drainage density and lithology, achieved by joining the spatial basic units by a rating system. The values of drainage density are rated according to an erosion class (Bucko & Mazurova, 1958). The lithology is rated by erosion indexes, adapted from FAO (1977). The combination and reclassification of the results yields five qualitative classes of sediment emission risk. This methodology has been tested and validated for the watershed of the Joaquín Costa reservoir (NE Spain), with an area of 1500 km². The mapping scale was 1:100,000 and the model was implemented in a vector GIS (Arc/Info). The prediction was checked by means of photo-interpretation and field work, which gave an accuracy of 78.5%. The proposed methodology has proved useful as an initial approach for erosion assessment and soil conservation planning at the regional level, and also for selecting priority areas where further analyses can be developed.
Gubskaya, Anna V.; Khan, I. John; Valenzuela, Loreto M.; Lisnyak, Yuriy V.; Kohn, Joachim
2013-01-01
The objectives of this work were: (1) to select suitable compositions of tyrosine-derived polycarbonates for controlled delivery of voclosporin, a potent drug candidate to treat ocular diseases, (2) to establish a structure-function relationship between key molecular characteristics of biodegradable polymer matrices and drug release kinetics, and (3) to identify factors contributing in the rate of drug release. For the first time, the experimental study of polymeric drug release was accompanied by a hierarchical sequence of three computational methods. First, suitable polymer compositions used in subsequent neural network modeling were determined by means of response surface methodology (RSM). Second, accurate artificial neural network (ANN) models were built to predict drug release profiles for fifteen polymers located outside the initial design space. Finally, thermodynamic properties and hydrogen-bonding patterns of model drug-polymer complexes were studied using molecular dynamics (MD) technique to elucidate a role of specific interactions in drug release mechanism. This research presents further development of methodological approaches to meet challenges in the design of polymeric drug delivery systems. PMID:24039300
NASA Astrophysics Data System (ADS)
Ohlsson, Stellan; Cosejo, David G.
2014-07-01
The problem of how people process novel and unexpected information—deep learning (Ohlsson in Deep learning: how the mind overrides experience. Cambridge University Press, New York, 2011)—is central to several fields of research, including creativity, belief revision, and conceptual change. Researchers have not converged on a single theory for conceptual change, nor has any one theory been decisively falsified. One contributing reason is the difficulty of collecting informative data in this field. We propose that the commonly used methodologies of historical analysis, classroom interventions, and developmental studies, although indispensable, can be supplemented with studies of laboratory models of conceptual change. We introduce re-categorization, an experimental paradigm in which learners transition from one definition of a categorical concept to another, incompatible definition of the same concept, a simple form of conceptual change. We describe a re-categorization experiment and report some descriptive findings pertaining to the effects of category complexity, the temporal unfolding of learning, and the nature of the learner's final knowledge state. We end with a brief discussion of ways in which the re-categorization model can be improved.
Santelices C, Emilio; Muñoz P, Fernando; Muñiz, Patricio; Rojas, José
2016-03-01
Health care must be provided with strong primary health care models, emphasizing prevention and continued, integrated and interdisciplinary care. Tools should be used to allow better planning and a more efficient use of resources. The aim of this study was to assess risk adjustment methodologies, such as the Adjusted Clinical Groups (ACG) developed by The Johns Hopkins University, that allow the identification of chronic condition patterns and the allocation of resources accordingly. We report the results obtained by applying the ACG methodology in the primary care systems of 22 counties for three chronic diseases, namely Diabetes Mellitus, Hypertension and Heart Failure. The outcomes show great variability in the prevalence of these conditions in the different health centers. There is also great diversity in the use of resources for a given condition in the different health care centers. This methodology should contribute to a better distribution of health care resources, which should be based on the disease burden of each health care center.
Gilbert-Ouimet, Mahée; Trudel, Xavier; Brisson, Chantal; Milot, Alain; Vézina, Michel
2014-03-01
A growing body of research has investigated the adverse effects of psychosocial work factors on blood pressure (BP) elevation. There is now a clear need for an up-to-date, critical synthesis of reliable findings on this topic. This systematic review aimed to evaluate the adverse effects of psychosocial work factors of both the demand-control-support (DCS) and effort-reward imbalance (ERI) models on BP among men and women, according to the methodological quality of the studies. To be eligible, studies had to: (i) evaluate at least one psychosocial work factor, (ii) evaluate BP or hypertension, (iii) comprise ≥100 workers, (iv) be written in English or French, and (v) be published in a peer-reviewed journal. A total of 74 studies were included. Of these, 64 examined the DCS model, and 12 looked at the ERI model, with 2 studies considering both models. Approximately half the studies observed a significant adverse effect of psychosocial work factors on BP. A more consistent effect was observed, however, among men than women. For job strain, a more consistent effect was also observed in studies of higher methodological quality, ie, studies using a prospective design and ambulatory BP measures. A more consistent adverse effect of psychosocial work factors was observed among men than women and in studies of higher methodological quality. These findings contribute to the current effort of primary prevention of cardiovascular disease by documenting the psychosocial etiology of elevated BP, a major cardiovascular risk factor.
Methodological pluralism in the teaching of Astronomy
NASA Astrophysics Data System (ADS)
de Macedo, Josué Antunes; Voelzke, Marcos Rincon
2015-04-01
This paper discusses the feasibility of using a teaching strategy called methodological pluralism, consisting of the use of various methodological resources in order to provide meaningful learning. It is part of a doctoral thesis that investigates the contributions of traditional resources combined with digital technologies to creating autonomy for future teachers of Natural Sciences and Mathematics in relation to themes in Astronomy. An extension course was offered at the "Federal Institution of Education, Science and Technology" in the North of Minas Gerais (FINMG), Campus Januaria, for thirty-two students of licentiate courses in Physics, Mathematics and Biological Sciences, involving themes of Astronomy, in order to contribute to improving the training of future teachers. A mixed methodology with a pre-experimental design, combined with content analysis, was used. The results indicate that students' prior knowledge of Astronomy was low, that there were indications of meaningful learning of concepts related to Astronomy, and that it is feasible to use digital technological resources articulated with traditional materials in the teaching of Astronomy. This research sought to contribute to initial teacher training, especially in relation to Astronomy teaching, proposing new alternatives to promote the teaching of this area of knowledge and extending the methodological options of future teachers.
A model to calculate consistent atmospheric emission projections and its application to Spain
NASA Astrophysics Data System (ADS)
Lumbreras, Julio; Borge, Rafael; de Andrés, Juan Manuel; Rodríguez, Encarnación
Global warming and air quality are headline environmental issues of our time and policy must preempt negative international effects with forward-looking strategies. As part of the revision of the European National Emission Ceilings Directive, atmospheric emission projections for European Union countries are being calculated. These projections are useful to drive European air quality analyses and to support wide-scale decision-making. However, when evaluating specific policies and measures at sectoral level, a more detailed approach is needed. This paper presents an original methodology to evaluate emission projections. Emission projections are calculated for each emitting activity under three scenarios: without measures (business as usual), with measures (baseline) and with additional measures (target). The methodology developed allows the estimation of highly disaggregated, multi-pollutant, consistent emissions for a whole country or region. In order to assure consistency with past emissions included in atmospheric emission inventories and coherence among the individual activities, the consistent emission projection (CEP) model incorporates harmonization and integration criteria as well as quality assurance/quality check (QA/QC) procedures. This study includes a sensitivity analysis as a first approach to uncertainty evaluation. The aim of the model presented in this contribution is to support the decision-making process through the assessment of future emission scenarios, taking into account the effect of different detailed technical and non-technical measures, and it may also constitute the basis for air quality modelling. The system is designed to produce the information and formats related to international reporting requirements and it allows a comparison of national results with lower-resolution models such as RAINS/GAINS. The methodology has been successfully applied and tested to evaluate Spanish emission projections up to 2020 for 26 pollutants, but it could be adapted to any particular region for different purposes, especially for European countries.
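A minimal sketch of the activity-times-emission-factor structure behind such scenario projections, with the same activity forecast combined with different abatement assumptions for the without-measures, with-measures and with-additional-measures scenarios; the growth rate and abatement efficiencies are placeholders, not values from the Spanish projections.

```python
# Same activity forecast, three abatement assumptions: the core arithmetic of a
# without/with/with-additional-measures emission projection for one activity.

BASE_YEAR, TARGET_YEAR = 2005, 2020
activity_2005 = 1.0e6          # e.g. tonnes of product for one emitting activity (placeholder)
emission_factor = 2.5          # kg pollutant per tonne of product in the base year (placeholder)
activity_growth = 0.02         # 2% per year (placeholder)

scenarios = {
    "without measures": 0.00,            # no additional abatement of the emission factor
    "with measures": 0.25,               # adopted measures cut the emission factor by 25%
    "with additional measures": 0.45,    # planned extra measures cut it by 45%
}

activity = activity_2005 * (1 + activity_growth) ** (TARGET_YEAR - BASE_YEAR)
for name, abatement in scenarios.items():
    emissions_t = activity * emission_factor * (1 - abatement) / 1000.0
    print(f"{name:>25}: {emissions_t:,.0f} t in {TARGET_YEAR}")
```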
NASA Astrophysics Data System (ADS)
Fekete, Tamás
2018-05-01
Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific engineering paradigm, structural integrity, has been developing; it is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well-grounded theoretical foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.
Using Model Replication to Improve the Reliability of Agent-Based Models
NASA Astrophysics Data System (ADS)
Zhong, Wei; Kim, Yushim
The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the community of artificial society and simulation due to the challenges of model verification and validation. Illustrating the replication of an ABM representing fraudulent behavior in a public service delivery system originally developed in the Java-based MASON toolkit for NetLogo by a different author, this paper exemplifies how model replication exercises provide unique opportunities for model verification and validation process. At the same time, it helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.
Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor
NASA Astrophysics Data System (ADS)
Mkhabela, Peter Tshepo
The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slower and much longer transients (of time scale in hours and days) and fast and short transients (of time scale in minutes and seconds). There are limited operational and experimental data available for HTRs for the validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to a greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account by developing and implementing improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross section tables. To achieve this, the NEM/THERMIX coupled code system was developed, creating a system that is efficient and stable over the duration of transient calculations lasting several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Different aspects of the coupled methodology were investigated and an efficient kinetics treatment for the PBMR was developed, which accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in the implementation of an improved mapping methodology based on user-defined criteria. The second aspect studied and optimized was the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time-step selection algorithms. Coupled code convergence was achieved, supplemented by the application of methods to accelerate it. Finally, the modeling of all feedback phenomena in PBMRs was investigated and a novel treatment of cross-section dependencies was introduced for improving the representation of cross-section variations. The added benefit was that, in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety.
One unique contribution of the PhD research is the investigation of the importance of the correct representation of the three-dimensional (3-D) effects in the PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
Gregori, Dario; Rosato, Rosalba; Zecchin, Massimo; Di Lenarda, Andrea
2005-01-01
This paper discusses the use of bivariate survival curve estimators within the competing risks framework. Competing risks models are used for the analysis of medical data with more than one cause of death. The case of dilated cardiomyopathy is explored. Bivariate survival curves plot the conjoint mortality processes. The different graphical representation of bivariate survival analysis is the major contribution of this methodology to competing risks analysis.
Torsional Ultrasound Sensor Optimization for Soft Tissue Characterization
Melchor, Juan; Muñoz, Rafael; Rus, Guillermo
2017-01-01
Torsional mechanical waves have the capability to characterize the shear stiffness moduli of soft tissue. Under this hypothesis, a computational methodology is proposed to design and optimize a piezoelectric transmitter and receiver to generate and measure the response of torsional ultrasonic waves. The procedure is divided into two steps: (i) a finite element method (FEM) is developed to obtain the transmitted and received waveforms as well as the resonance frequency of a previous geometry, validated against a simplified semi-analytical model, and (ii) a probabilistic optimality criterion for the design, based on an inverse problem using the estimation of the robust probability of detection (RPOD), is applied to maximize the detection of the pathology defined in terms of changes of shear stiffness. This study collects different design options in two separate models, in transmission and in contact, respectively. The main contribution of this work is a framework establishing the forward, inverse and optimization procedures used to choose an appropriate set of transducer parameters. This methodological framework may be generalizable to other applications. PMID:28617353
Surface Connectivity and Interocean Exchanges From Drifter-Based Transition Matrices
NASA Astrophysics Data System (ADS)
McAdam, Ronan; van Sebille, Erik
2018-01-01
Global surface transport in the ocean can be represented by using the observed trajectories of drifters to calculate probability distribution functions. The oceanographic applications of the Markov Chain approach to modeling include tracking of floating debris and water masses, globally and on yearly-to-centennial time scales. Here we analyze the error inherent in mapping trajectories onto a grid and its consequences for ocean transport modeling and the detection of accumulation structures. A sensitivity analysis of Markov Chain parameters is performed in an idealized Stommel gyre and western boundary current as well as with observed ocean drifters, complementing previous studies on widespread floating debris accumulation. Focusing on two key areas of interocean exchange—the Agulhas system and the North Atlantic intergyre transport barrier—we assess the capacity of the Markov Chain methodology to detect surface connectivity and dynamic transport barriers. Finally, we extend the methodology's functionality to separate the geostrophic and nongeostrophic contributions to interocean exchange in these key regions.
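A minimal sketch of the transition-matrix construction, assuming random-walk trajectories in an idealized box in place of the drifter data: positions at a fixed time lag are binned into grid cells, cell-to-cell transitions are counted and row-normalized, and a tracer distribution is evolved by repeated matrix multiplication.

```python
import numpy as np

rng = np.random.default_rng(7)
n_cells_x, n_cells_y = 10, 10
n_drifters, n_steps = 500, 40

pos = rng.uniform(0, 1, size=(n_drifters, 2))                 # drifter positions in a unit box
P_counts = np.zeros((n_cells_x * n_cells_y,) * 2)

def cell_index(p):
    """Map positions to flat grid-cell indices."""
    i = np.clip((p[:, 0] * n_cells_x).astype(int), 0, n_cells_x - 1)
    j = np.clip((p[:, 1] * n_cells_y).astype(int), 0, n_cells_y - 1)
    return i * n_cells_y + j

for _ in range(n_steps):
    new_pos = np.clip(pos + rng.normal(0, 0.05, pos.shape), 0, 1)   # surrogate displacements
    np.add.at(P_counts, (cell_index(pos), cell_index(new_pos)), 1)  # count cell-to-cell transitions
    pos = new_pos

row_sums = P_counts.sum(axis=1, keepdims=True)
P = np.divide(P_counts, row_sums, out=np.zeros_like(P_counts), where=row_sums > 0)

tracer = np.zeros(n_cells_x * n_cells_y); tracer[0] = 1.0           # release in one corner cell
for _ in range(20):                                                  # evolve 20 transition steps
    tracer = tracer @ P
print("tracer mass remaining in the release cell:", round(tracer[0], 4))
```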
Bocknek, Erika L; Hossain, Ziarat; Roggman, Lori
2014-01-01
Research on fathering and the father-child relationship has made substantial progress in the most recent 15 years since the last special issue of the Infant Mental Health Journal on fathers and young children. This special issue on fathers and young children contains a series of papers exemplifying this progress, including advances in methodology-more direct assessment and more observational measures-in addition to the increasing dynamic complexity of the conceptual models used to study fathers, the diversity of fathers studied, and the growth of programs to support early father involvement. In assessing the current state of the field, special attention is given to contributions made by the papers contained in this special issue, and two critical areas for continued progress are addressed: (1) methodological and measurement development that specifically address fathers and fathering relationships and (2) cross-cultural and ecologically valid research examining the diversity of models of fathering. © 2014 Michigan Association for Infant Mental Health.
Using a nursing theory or a model in nursing PhD dissertations: a qualitative study from Turkey.
Mete, Samiye; Gokçe İsbir, Gozde
2015-04-01
The aim of this study was to reveal the experiences of nursing students and their advisors in using theories and models in their PhD dissertations. The study adopted a descriptive qualitative approach and was performed with 10 PhD candidates and their five advisors from a nursing faculty. The results were grouped into four categories: reasons for using a theory/model in a PhD dissertation, reasons for preferring a given model, causes of difficulty in using models in PhD dissertations, and factors facilitating the use of theories and models in PhD dissertations. Using a theory or model was also reported to contribute to the research methodology and to the professional development of the students and advisors. © 2014 NANDA International, Inc.
Emerging Concepts and Methodologies in Cancer Biomarker Discovery.
Lu, Meixia; Zhang, Jinxiang; Zhang, Lanjing
2017-01-01
Cancer biomarker discovery is a critical part of cancer prevention and treatment. Despite the decades of effort, only a small number of cancer biomarkers have been identified for and validated in clinical settings. Conceptual and methodological breakthroughs may help accelerate the discovery of additional cancer biomarkers, particularly their use for diagnostics. In this review, we have attempted to review the emerging concepts in cancer biomarker discovery, including real-world evidence, open access data, and data paucity in rare or uncommon cancers. We have also summarized the recent methodological progress in cancer biomarker discovery, such as high-throughput sequencing, liquid biopsy, big data, artificial intelligence (AI), and deep learning and neural networks. Much attention has been given to the methodological details and comparison of the methodologies. Notably, these concepts and methodologies interact with each other and will likely lead to synergistic effects when carefully combined. Newer, more innovative concepts and methodologies are emerging as the current emerging ones became mainstream and widely applied to the field. Some future challenges are also discussed. This review contributes to the development of future theoretical frameworks and technologies in cancer biomarker discovery and will contribute to the discovery of more useful cancer biomarkers.
ERIC Educational Resources Information Center
Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa
2016-01-01
This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…
Methodological, Theoretical, Infrastructural, and Design Issues in Conducting Good Outcome Studies
ERIC Educational Resources Information Center
Kelly, Michael P.; Moore, Tessa A.
2011-01-01
This article outlines a set of methodological, theoretical, and other issues relating to the conduct of good outcome studies. The article begins by considering the contribution of evidence-based medicine to the methodology of outcome research. The lessons which can be applied in outcome studies in nonmedical settings are described. The article…
A Robustness Testing Campaign for IMA-SP Partitioning Kernels
NASA Astrophysics Data System (ADS)
Grixti, Stephen; Lopez Trecastro, Jorge; Sammut, Nicholas; Zammit-Mangion, David
2015-09-01
With time and space partitioned architectures becoming increasingly appealing to the European space sector, the dependability of partitioning kernel technology is a key factor to its applicability in European Space Agency projects. This paper explores the potential of the data type fault model, which injects faults through the Application Program Interface, in partitioning kernel robustness testing. This fault injection methodology has been tailored to investigate its relevance in uncovering vulnerabilities within partitioning kernels and potentially contributing towards fault removal campaigns within this domain. This is demonstrated through a robustness testing case study of the XtratuM partitioning kernel for SPARC LEON3 processors. The robustness campaign exposed a number of vulnerabilities in XtratuM, exhibiting the potential benefits of using such a methodology for the robustness assessment of partitioning kernels.
The Maritime Cultural Landscape of Northern Patagonia
NASA Astrophysics Data System (ADS)
Lira, Nicolás
2017-12-01
This article is a contribution to the study of the indigenous navigation and its boats in the region of northern Patagonia. This article also aims to contribute to the understanding of indigenous navigation practices and technologies and their origins from prehistoric times to the mid-twentieth century. It presents and discusses the concept of Westerdahl's Maritime Cultural Landscape in relation to other landscape concepts. This model is applied to northern Patagonia in order to discuss if it is possible to speak of a true maritime culture in the region. For this purpose, archaeological, historical and ethnographic data are presented in an integrative and innovative methodology for the discipline. Finally, the Maritime Cultural Landscape model will allow the integration of aquatic and terrestrial landscapes as routes traveled by native inhabitants of northern Patagonia and southern Chile, and propose an important and diversified maritime, river and lake tradition.
Armario, Antonio; Nadal, Roser
2013-01-01
Despite the development of valuable new techniques (i.e., genetics, neuroimaging) for the study of the neurobiological substrate of psychiatric diseases, there are strong limitations in the information that can be gathered from human studies. It is thus critical to develop appropriate animal models of psychiatric diseases to characterize their putative biological bases and to support the development of new therapeutic strategies. The present review tries to offer a general perspective and several examples of how individual differences in animals can contribute to explaining differential susceptibility to developing behavioral alterations, but also emphasizes methodological problems that can lead to inappropriate or over-simplistic interpretations. A critical analysis of the approaches currently used could contribute to obtaining more reliable data and allow full advantage to be taken of new and sophisticated technologies. The discussion is mainly focused on anxiety-like and, to a lesser extent, on depression-like behavior in rodents. PMID:24265618
NASA Astrophysics Data System (ADS)
Zolfaghari, Mohammad R.
2009-07-01
Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples for such applications are risk mitigation, disaster management, post disaster recovery planning and catastrophe loss estimation and risk management. Due to the lack of proper knowledge with regard to factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain with high uncertainties that contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides practical facility for better capture of spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples for such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard. The proposed methodology makes the best use of the recent advancements in computer technology in both software and hardware. The proposed approach is well structured to be implemented using conventional GIS tools.
Tien, Christopher J; Winslow, James F; Hintenlang, David E
2011-01-31
In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed as dose-length product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to DLPs of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions, similar to previous publications. Over-ranging is quantified with both absolute length and DLP, and contributes about 60 mGy-cm, or about 10% of the DLP, for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies. Current uncertainties are less than 1%, in comparison with 5% in other methodologies. Clinical implementation can be made easier by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and post-processing methods, which have been shown to produce over-ranging lengths differing by 125%.
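For orientation, the over-ranging contribution to DLP can be approximated as CTDIvol times the over-ranging length; in the sketch below the over-ranging lengths are those quoted above, while the CTDIvol and clinical scan length are assumed values, so the printed percentages are purely illustrative.

# Illustrative arithmetic for the over-ranging contribution to dose-length product (DLP).
# CTDIvol and clinical scan length are assumed example values; the over-ranging lengths
# (4.38 cm at pitch 0.5, 6.72 cm at pitch 1.5) follow the measurements quoted above.
def dlp(ctdi_vol_mgy, length_cm):
    return ctdi_vol_mgy * length_cm          # DLP in mGy-cm

ctdi_vol = 30.0          # mGy, assumed routine abdominal setting
clinical_length = 25.0   # cm, assumed planned scan range

for pitch, over_range_cm in [(0.5, 4.38), (1.5, 6.72)]:
    total = dlp(ctdi_vol, clinical_length + over_range_cm)
    extra = dlp(ctdi_vol, over_range_cm)
    print(f"pitch {pitch}: over-ranging adds {extra:.0f} mGy-cm "
          f"({100 * extra / total:.1f}% of total DLP)")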
Quantification of groundwater recharge in urban environments.
Tubau, Isabel; Vázquez-Suñé, Enric; Carrera, Jesús; Valhondo, Cristina; Criollo, Rotman
2017-08-15
Groundwater management in urban areas requires a detailed knowledge of the hydrogeological system as well as adequate tools for predicting the amount of groundwater and the evolution of water quality. In that context, a key difference between urban and natural areas lies in recharge evaluation. A large number of studies evaluating recharge in urban areas have been published since the 1990s, but without a common methodology. Most of these methods show that recharge rates are generally higher in urban settings than in natural settings. Methods such as mixing ratios or groundwater modeling can be used to better estimate the relative importance of different sources of recharge and may prove to be a good tool for total recharge evaluation. However, accurate evaluation of this input is difficult. The objective is to present a methodology that helps overcome those difficulties and allows the variability in space and time of the recharge into urban aquifers to be quantified. Recharge calculations are initially performed by defining and applying analytical equations, and validation is assessed based on groundwater flow and solute transport modeling. This methodology is applicable to complex systems by considering the temporal variability of all water sources. This allows managers of urban groundwater to evaluate the relative contribution of different recharge sources at a city scale by considering quantity and quality factors. The methodology is applied to the assessment of recharge sources in the Barcelona city aquifers. Copyright © 2017 Elsevier B.V. All rights reserved.
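A minimal sketch of the mixing-ratio idea mentioned above: recharge fractions for a groundwater sample are estimated from a few conservative tracers by non-negative least squares. The end-member compositions and tracer choices are invented for illustration and are not the Barcelona data.

# Minimal sketch of a mixing-ratio estimate for urban recharge sources.
# Each column of `endmembers` holds tracer concentrations of one recharge source;
# `sample` is the groundwater sample. All values are invented.
import numpy as np
from scipy.optimize import nnls

endmembers = np.array([        # columns: water mains, sewer leakage, rainfall infiltration
    [ 20.0, 120.0,  8.0],      # chloride (mg/L)
    [ 30.0,  90.0,  5.0],      # sulphate (mg/L)
    [  1.0,   1.0,  1.0],      # mass-balance row forcing fractions to sum to ~1
])
sample = np.array([55.0, 45.0, 1.0])

fractions, _ = nnls(endmembers, sample)     # non-negative least squares
print(dict(zip(["mains", "sewer", "rain"], fractions.round(2))))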
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
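A schematic of the sequential-refinement loop, stripped to its essentials: estimate first-order variance-based sensitivity indices by Monte Carlo, narrow the interval of the most influential epistemic variable, and repeat. The toy model, the intervals and the binned index estimator are stand-ins, not the NASA-LUQC formulation.

# Sketch of "refine one epistemic variable at a time": estimate first-order
# variance-based sensitivity indices, refine (narrow) the most influential interval,
# and repeat. The model g() and the intervals are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def g(x):                       # stand-in for the system-level performance model
    return x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.1 * x[:, 2] * x[:, 1]

def first_order_indices(intervals, n=20000):
    x = np.column_stack([rng.uniform(lo, hi, n) for lo, hi in intervals])
    y = g(x)
    var_y = y.var()
    indices = []
    for j in range(len(intervals)):
        # crude binned estimate of Var(E[Y|Xj]) / Var(Y)
        bins = np.quantile(x[:, j], np.linspace(0, 1, 21))
        which = np.clip(np.digitize(x[:, j], bins) - 1, 0, 19)
        cond_means = np.array([y[which == b].mean() for b in range(20)])
        indices.append(cond_means.var() / var_y)
    return np.array(indices)

intervals = [[-1, 1], [-1, 1], [-1, 1]]          # epistemic intervals
for step in range(2):                            # sequential refinement loop
    s = first_order_indices(intervals)
    top = int(np.argmax(s))
    print(f"step {step}: indices {s.round(2)}, refining variable {top}")
    lo, hi = intervals[top]
    mid = 0.5 * (lo + hi)
    intervals[top] = [mid - 0.25 * (hi - lo), mid + 0.25 * (hi - lo)]   # pretend new data halves the interval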
NASA Astrophysics Data System (ADS)
dall'Acqua, Luisa
2011-08-01
The aim of our research is to respond to the demand for "innovative, creative teaching" by proposing a methodology to educate creative students in a society characterized by multiple reference points and hyper-dynamic knowledge that is continuously subject to review and discussion. We apply a multi-prospective Instructional Design Model (PENTHA ID Model), defined and developed by our research group, which adopts a hybrid pedagogical approach consisting of elements of didactical connectivism intertwined with aspects of social constructivism and enactivism. The contribution proposes an e-course structure and approach that applies the theoretical design principles of the above-mentioned ID Model, describing methods, techniques, technologies and assessment criteria for the definition of lesson modes in an e-course.
Hou, Ying; Chen, Weiping; Liao, Yuehua; Luo, Yueping
2017-11-03
Considerable growth in the economy and population of the Dongting Lake watershed in Southern China has increased phosphorus loading to the lake and resulted in a growing risk of lake eutrophication. This study aimed to reveal the spatial pattern and sources of phosphorus export and loading from the watershed. We applied an export coefficient model and the Dillon-Rigler model to quantify the contributions of different sub-watersheds and sources to the total phosphorus (TP) export and loading in 2010. Together, the upper and lower reaches of the Xiang River watershed and the Dongting Lake Area contributed 60.9% of the TP exported from the entire watershed. Livestock husbandry appeared to be the largest anthropogenic source of TP, contributing more than 50% of the TP exported from each secondary sub-watershed. The actual TP loading to the lake in 2010 was 62.9% more than the permissible annual TP loading for compliance with the Class III water quality standard for lakes. Three primary sub-watersheds (the Dongting Lake Area, the Xiang River, and the Yuan River watersheds) contributed 91.2% of the total TP loading. As the largest contributor among all sources, livestock husbandry contributed nearly 50% of the TP loading from the Dongting Lake Area and more than 60% from each of the other primary sub-watersheds. This study provides a methodology to identify the key sources and locations of TP export and loading in large lake watersheds and can serve as a reference for decision-making on controlling P pollution in the Dongting Lake watershed.
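A toy version of the two-step calculation (export coefficient model, then a Dillon-Rigler-type loading check); the export coefficients, land areas and lake parameters are placeholders, not the Dongting Lake values.

# (1) Export-coefficient model: TP export = sum(area_i * export_coefficient_i);
# (2) Dillon-Rigler-type check of loading against a critical in-lake TP concentration.
export_coeff = {"cropland": 1.2, "forest": 0.1, "livestock": 3.5, "urban": 1.8}   # kg TP / (ha*yr), illustrative
areas_ha = {"cropland": 50000, "forest": 80000, "livestock": 20000, "urban": 10000}

tp_export_kg = sum(areas_ha[k] * export_coeff[k] for k in areas_ha)
shares = {k: round(areas_ha[k] * export_coeff[k] / tp_export_kg, 2) for k in areas_ha}
print(f"TP export: {tp_export_kg / 1000:.0f} t/yr, shares: {shares}")

# Dillon-Rigler-type permissible areal loading: Lc = Pc * z * rho / (1 - R)
Pc = 0.025            # g/m3, critical TP concentration (assumed target)
z = 6.4               # m, mean lake depth (assumed)
rho = 1.5             # 1/yr, flushing rate (assumed)
R = 0.4               # retention coefficient (assumed)
lake_area_m2 = 2.6e9  # assumed
permissible_t = Pc * z * rho / (1 - R) * lake_area_m2 / 1e6   # tonnes TP per year
print(f"permissible loading: {permissible_t:.0f} t TP/yr")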
Chen, Hongming; Carlsson, Lars; Eriksson, Mats; Varkonyi, Peter; Norinder, Ulf; Nilsson, Ingemar
2013-06-24
A novel methodology was developed to build Free-Wilson-like local QSAR models by combining R-group signatures and the SVM algorithm. Unlike Free-Wilson analysis, this method is able to make predictions for compounds with R-groups not present in the training set. Eleven public data sets were chosen as test cases for comparing the performance of our new method with several other traditional modeling strategies, including Free-Wilson analysis. Our results show that the R-group signature SVM models achieve better prediction accuracy compared with Free-Wilson analysis in general. Moreover, the predictions of R-group signature models are also comparable to those of models using ECFP6 fingerprints and signatures for the whole compound. Most importantly, R-group contributions to the SVM model can be obtained by calculating the gradient for R-group signatures. For most of the studied data sets, these contributions show a significant correlation with those of a corresponding Free-Wilson analysis. These results suggest that the R-group contribution can be used to interpret bioactivity data and highlight that the R-group signature-based SVM modeling method is as interpretable as Free-Wilson analysis. Hence the signature SVM model can be a useful modeling tool for any drug discovery project.
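A conceptual sketch, not the authors' implementation: an SVM regressor trained on concatenated R-group descriptor blocks, with per-R-group "contributions" read off as numerical gradients of the prediction with respect to each block. The descriptors and activity values are synthetic.

# Conceptual sketch of R-group contributions from an SVM model via numerical gradients.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n_cpds, n_per_rgroup = 200, 8                       # 2 R-group positions, 8 descriptors each
X = rng.normal(size=(n_cpds, 2 * n_per_rgroup))     # stand-in for R-group signature descriptors
y = X[:, :n_per_rgroup].sum(1) * 0.8 + X[:, n_per_rgroup:].sum(1) * 0.2 + rng.normal(0, 0.1, n_cpds)

model = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X, y)

def rgroup_contributions(x, eps=1e-3):
    """Numerical gradient of the SVR prediction, summed over each R-group block."""
    grad = np.zeros_like(x)
    for j in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp[j] += eps
        xm[j] -= eps
        grad[j] = (model.predict(xp[None]) - model.predict(xm[None]))[0] / (2 * eps)
    return grad[:n_per_rgroup].sum(), grad[n_per_rgroup:].sum()

print(rgroup_contributions(X[0]))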
NASA Astrophysics Data System (ADS)
Zhang, Yizhen
Biofuels are often forecast to provide significant reductions in the greenhouse gas (GHG) emissions from the transportation sector globally. Many countries have regarded bioenergy development as a solution to both climate change mitigation and foreign energy dependence. It is projected that biofuel production may contribute up to a quarter of transportation fuel supply by 2050. But uncertainties and concerns still remain with respect to the environmental performance of biofuels, including their contribution to GHGs. Life cycle assessment (LCA) is a powerful tool for evaluating the environmental impacts of emerging technologies. However, existing LCAs are inconsistent in their selection of system boundaries, modeling assumptions, and treatment of co-products, which leads to wide variations in results and makes comparisons of biofuel pathways challenging. Co-products usually play an essential role in biofuel production systems, both economically and environmentally, so co-product treatment strategies are considered critical to LCA results. Studies presented in this dissertation assess several types of biofuels, including first-generation, second-generation and advanced biofuels, which are produced from terrestrial feedstocks (e.g., corn grain and corn stover) and algae. A variety of researchers have identified the importance of treating co-products in LCAs. This study focuses on the improvement of LCA methodology for assessing biofuel co-products. This dissertation contributes to current knowledge and methodology in the following ways: 1) it develops a comprehensive life cycle energy, carbon and water model for microalgae biofuel production; 2) it improves co-product allocation strategies in LCA; and 3) it explores the indirect impacts on ocean resources induced by large-scale algal oil production, which have not been examined previously.
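As a toy illustration of why co-product treatment matters, the snippet below compares an energy-content allocation with a displacement (system expansion) credit for the same process emissions; all numbers are invented.

# Toy comparison of two co-product treatments for the same biofuel process emissions.
process_emissions = 45.0      # g CO2e per MJ of total process output, assumed
fuel_energy, coproduct_energy = 1.0, 0.35          # MJ co-product per MJ fuel, assumed
displaced_emissions = 12.0    # g CO2e credit per MJ fuel from the displaced product, assumed

energy_alloc = process_emissions * fuel_energy / (fuel_energy + coproduct_energy)
displacement = process_emissions - displaced_emissions

print(f"energy allocation:     {energy_alloc:.1f} g CO2e/MJ fuel")
print(f"displacement (credit): {displacement:.1f} g CO2e/MJ fuel")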
The effect of training methodology on knowledge representation in categorization.
Hélie, Sébastien; Shamloo, Farzin; Ell, Shawn W
2017-01-01
Category representations can be broadly classified as containing within-category information or between-category information. Although such representational differences can have a profound impact on decision-making, relatively little is known about the factors contributing to the development and generalizability of different types of category representations. These issues are addressed by investigating the impact of training methodology and category structures using a traditional empirical approach as well as the novel adaptation of computational modeling techniques from the machine learning literature. Experiment 1 focused on rule-based (RB) category structures thought to promote between-category representations. Participants learned two sets of two categories during training and were subsequently tested on a novel categorization problem using the training categories. Classification training resulted in a bias toward between-category representations whereas concept training resulted in a bias toward within-category representations. Experiment 2 focused on information-integration (II) category structures thought to promote within-category representations. With II structures, there was a bias toward within-category representations regardless of training methodology. Furthermore, in both experiments, computational modeling suggests that only within-category representations could support generalization during the test phase. These data suggest that within-category representations may be dominant and more robust for supporting the reconfiguration of current knowledge to support generalization.
An approach to accidents modeling based on compounds road environments.
Fernandes, Ana; Neves, Jose
2013-04-01
The most common approach to study the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into quite homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established where the pavement surface properties significantly influence the occurrence of accidents. Results have clearly shown that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. Copyright © 2013 Elsevier Ltd. All rights reserved.
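A sketch of the two-stage structure described above, on synthetic data: road segments are grouped into compound environments with k-means, then a Poisson GLM of accident counts on skid resistance and texture depth is fitted within each cluster. The variables and coefficients are illustrative only.

# Two-stage sketch: cluster road segments, then fit a Poisson GLM per cluster.
import numpy as np
import statsmodels.api as sm
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
n = 600
segments = np.column_stack([
    rng.uniform(0, 1, n),          # curvature (illustrative)
    rng.uniform(60, 120, n),       # speed limit
    rng.uniform(0, 0.1, n),        # gradient
])
skid = rng.uniform(0.3, 0.7, n)
texture = rng.uniform(0.4, 1.2, n)
accidents = rng.poisson(np.exp(1.0 - 2.0 * skid - 0.5 * texture))   # synthetic counts

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(segments)
for c in range(3):
    m = clusters == c
    X = sm.add_constant(np.column_stack([skid[m], texture[m]]))
    fit = sm.GLM(accidents[m], X, family=sm.families.Poisson()).fit()
    print(f"cluster {c}: skid coef {fit.params[1]:.2f}, texture coef {fit.params[2]:.2f}")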
Persechino, Benedetta; Valenti, Antonio; Ronchetti, Matteo; Rondinone, Bruna Maria; Di Tecco, Cristina; Vitali, Sara; Iavicoli, Sergio
2013-06-01
Work-related stress is one of the major causes of occupational ill health. In line with the regulatory framework on occupational health and safety (OSH), adequate models for assessing and managing risk need to be identified so as to minimize the impact of this stress not only on workers' health, but also on productivity. After close analysis of the Italian and European reference regulatory framework and work-related stress assessment and management models used in some European countries, we adopted the UK Health and Safety Executive's (HSE) Management Standards (MS) approach, adapting it to the Italian context in order to provide a suitable methodological proposal for Italy. We have developed a work-related stress risk assessment strategy, meeting regulatory requirements, now available on a specific web platform that includes software, tutorials, and other tools to assist companies in their assessments. This methodological proposal is new on the Italian work-related stress risk assessment scene. Besides providing an evaluation approach using scientifically validated instruments, it ensures the active participation of occupational health professionals in each company. The assessment tools provided enable companies not only to comply with the law, but also to contribute to a database for monitoring and assessment and give access to a reserved area for data analysis and comparisons.
Ubago Pérez, Ruth; Castillo Muñoz, María Auxiliadora; Banqueri, Mercedes Galván; García Estepa, Raúl; Alfaro Lara, Eva Rocío; Vega Coca, María Dolores; Beltrán Calvo, Carmen; Molina López, Teresa
The European network for Health Technology Assessment (EUnetHTA) is the network of public health technology assessment (HTA) agencies and entities from across the EU. In this context, the HTA Core Model® has been developed. The Andalusian Agency for Health Technology Assessment (AETSA) is a member of the Spanish HTA Network and of the EUnetHTA collaboration. In addition, AETSA participates in the new EUnetHTA Joint Action 3 (JA, 2016-2019). Furthermore, AETSA works on pharmaceutical assessments. Part of this work involves drafting therapeutic positioning reports (TPRs) on drugs that have recently been granted marketing authorisation, which is overseen by the Spanish Agency of Medicines and Medical Devices (AEMPS). AETSA contributes by drafting "Evidence synthesis reports: pharmaceuticals", in which a rapid comparative efficacy and safety assessment is performed for drugs for which a TPR will be created. To create this type of report, AETSA follows its own methodological guideline based on EUnetHTA guidelines and the HTA Core Model®. In this paper, the methodology that AETSA has developed to create the guideline for "Evidence synthesis reports: pharmaceuticals" is described. The structure of the report itself is also presented. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
Hammarlund-Udenaes, Margareta
2017-09-01
Microdialysis has contributed very important knowledge to the understanding of target-specific concentrations and their relationship to pharmacodynamic effects from a systems pharmacology perspective, aiding in the global understanding of drug effects. This review focuses on the historical development of microdialysis as a method to quantify the pharmacologically important unbound tissue concentrations, and on recent findings relating to modeling microdialysis data to extrapolate from rodents to humans and to understand the distribution of drugs in different tissues and disease conditions. Quantitative microdialysis developed very rapidly during the early 1990s. Method development was the focus in the early years, including the development of quantitative microdialysis to estimate true extracellular concentrations. Microdialysis has significantly contributed to the understanding of active transport at the blood-brain barrier and in other organs. Examples are presented where microdialysis together with modeling has increased the knowledge of tissue distribution between species, in overweight patients and in tumors, and of metabolite contributions to drug effects. More integrated metabolomic studies are still sparse within the microdialysis field, although a great potential for tissue- and disease-specific measurements is evident.
NASA Astrophysics Data System (ADS)
Develaki, Maria
2017-11-01
Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.
A system-of-systems modeling methodology for strategic general aviation design decision-making
NASA Astrophysics Data System (ADS)
Won, Henry Thome
General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. Additionally, results indicate the ability to find synergetic solutions, that is solutions in which two systems might collaborate to achieve a better result than acting independently. Implementation of this methodology can afford engineers a more autonomous perspective in the concept exploration process, providing dynamic feedback about a design's potential success in specific market segments. The method also has potential to strengthen the connection between design and business departments, as well as between manufacturers, service providers, and infrastructure planners---bringing information about how the respective systems interact, and what might be done to improve synergism of systems.
Hierarchical modeling of heat transfer in silicon-based electronic devices
NASA Astrophysics Data System (ADS)
Goicochea Pineda, Javier V.
In this work a methodology for the hierarchical modeling of heat transfer in silicon-based electronic devices is presented. The methodology includes three steps to integrate the different scales involved in the thermal analysis of these devices. The steps correspond to: (i) the estimation of input parameters and thermal properties required to solve the Boltzmann transport equation (BTE) for phonons by means of molecular dynamics (MD) simulations, (ii) the quantum correction of some of the properties estimated with MD to make them suitable for the BTE and (iii) the numerical solution of the BTE using the lattice Boltzmann method (LBM) under the single mode relaxation time approximation subject to different initial and boundary conditions, including non-linear dispersion relations and different polarizations in the [100] direction. Each step of the methodology is validated with numerical, analytical or experimental reported data. In the first step of the methodology, properties such as phonon relaxation times, dispersion relations, group and phase velocities and specific heat are obtained with MD at 300 and 1000 K (i.e., molecular temperatures). The estimation of the properties considers the anharmonic nature of the potential energy function, including the thermal expansion of the crystal. Both effects are found to modify the dispersion relations with temperature. The behavior of the phonon relaxation times for each mode (i.e., longitudinal and transverse, acoustic and optical phonons) is identified using power functions. The exponents of the acoustic modes agree with those predicted theoretically by perturbation theory at high temperatures, while those of the optical modes are higher. All properties estimated with MD are validated against values of the thermal conductivity obtained from the Green-Kubo method. It is found that the relative contribution of acoustic modes to the overall thermal conductivity is approximately 90% at both temperatures. In the second step, two new quantum correction alternatives are applied to correct the results obtained with MD. The alternatives consider the quantization of the energy per phonon mode. In addition, the effect of isotope scattering is included in the phonon-phonon relaxation time values previously determined in the first step. It is found that both the quantization of the energy and the inclusion of scattering with isotopes significantly reduce the contribution of high-frequency modes to the overall thermal conductivity. After these two effects are considered, the contribution of optical modes reduces to less than 2.4%. In this step, two sets of properties are obtained. The first results from the application of quantum corrections to the abovementioned properties, while the second also includes isotope scattering. These sets of properties are identified in this work as isotope-enriched silicon (isoSi) and natural silicon (natSi) and are used along with other phonon relaxation time models in the last step of our methodology. Before we solve the BTE using the LBM, a new dispersive lattice Boltzmann formulation is proposed. The new dispersive formulation is based on constant lattice spacings (CLS) and flux limiters, rather than constant time steps (as previously reported). It is found that the new formulation significantly reduces the computational cost and complexity of the solution of the BTE, without affecting the thermal predictions. Finally, in the last step of our methodology, we solve the BTE.
The equation is solved under the relaxation time approximation using our thermal properties estimated for isoSi and natSi and using two phonon formulations. The phonon formulations include a gray model and the new dispersive method. For comparison purposes, the BTE is also solved using the phenomenological and theoretical phonon relaxation time models of Holland, and Han and Klemens. Different thermal predictions in steady and transient states are performed to illustrate the application of the methodology in one- and two-dimensional silicon films and in silicon-over-insulator (SOI) transistors. These include the determination of bulk and film thermal conductivities (i.e. out-of-plane and in-plane), and the transient evolution of the wall heat flux and temperature for films of different thicknesses. In addition, the physics of phonons is further analyzed in terms of the influence and behavior of acoustic and optical modes in the thermal predictions and the effect of phonon confinement in the thermal response of SOI-like transistors subject to different self-heating conditions.
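As a small, self-contained example of one sub-step (fitting mode-resolved phonon relaxation times to a power function of frequency), the following sketch uses synthetic (omega, tau) pairs rather than MD output.

# Fitting phonon relaxation times to tau(omega) = A * omega**(-n) for one mode.
# The (omega, tau) pairs below are synthetic placeholders, not MD results.
import numpy as np
from scipy.optimize import curve_fit

def power_law(omega, A, n):
    return A * omega ** (-n)

omega = np.linspace(2.0, 60.0, 30)                    # rad/ps, illustrative
tau = power_law(omega, 800.0, 2.0) * np.random.default_rng(3).lognormal(0, 0.1, omega.size)

(A_fit, n_fit), _ = curve_fit(power_law, omega, tau, p0=[500.0, 1.5])
print(f"tau ~ {A_fit:.0f} * omega^-{n_fit:.2f}")      # an exponent near 2 is expected for acoustic modes at high T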
You, Zhiqiang; Zhu, Yun; Jang, Carey; Wang, Shuxiao; Gao, Jian; Lin, Che-Jen; Li, Minhui; Zhu, Zhenghua; Wei, Hao; Yang, Wenwei
2017-01-01
To develop a sound ozone (O3) pollution control strategy, it is important to understand and characterize the source contributions, given the complex chemical and physical processes of O3 formation. Using the city of Shunde as a pilot summer case study, we apply an innovative response surface modeling (RSM) methodology based on the Community Multi-Scale Air Quality (CMAQ) modeling simulations to identify the O3 regime and provide a dynamic analysis of the precursor contributions to effectively assess the O3 impacts of volatile organic compound (VOC) control strategies. Our results show that Shunde is a typical VOC-limited, O3-polluted urban area. The city of Jiangmen, the main upwind area during July 2014, makes the largest contribution (9.06%) through its VOC and nitrogen oxide (NOx) emissions. By contrast, the contribution from local (Shunde) emissions is the lowest (6.35%) among the seven neighboring regions. Among Shunde's own precursor emission sectors, local industrial VOC sources make the largest contribution. The dynamic source contribution analysis further shows that local NOx control could slightly increase ground-level O3 at low (10.00%) and medium (40.00%) reduction ratios, and only begins to decrease ground-level O3 at a high NOx abatement ratio (75.00%). The real-time assessment of O3 impacts from VOC control strategies in the Pearl River Delta (PRD) shows that a joint regional VOC emission control policy will effectively reduce ground-level O3 concentrations in Shunde. Copyright © 2016. Published by Elsevier B.V.
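A stripped-down illustration of the response-surface step: a quadratic surface of peak O3 versus VOC and NOx emission-reduction ratios is fitted to a handful of scenario runs and then queried in real time. The scenario table is synthetic, not CMAQ output.

# Fit a quadratic response surface to scenario runs and query it for arbitrary control ratios.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# columns: VOC reduction ratio, NOx reduction ratio (0-1); response: peak O3 (ppb), synthetic
scenarios = np.array([[0.0, 0.0], [0.3, 0.0], [0.6, 0.0], [0.0, 0.3],
                      [0.0, 0.6], [0.3, 0.3], [0.6, 0.3], [0.3, 0.6], [0.6, 0.6]])
o3 = np.array([95, 88, 80, 98, 100, 90, 83, 94, 86], dtype=float)   # VOC-limited behaviour

rsm = LinearRegression().fit(PolynomialFeatures(2).fit_transform(scenarios), o3)

def predict_o3(voc_cut, nox_cut):
    return rsm.predict(PolynomialFeatures(2).fit_transform([[voc_cut, nox_cut]]))[0]

print(predict_o3(0.4, 0.1))   # real-time lookup once the surface is fitted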
Environment, genes, and experience: lessons from behavior genetics.
Barsky, Philipp I
2010-11-01
The article reviews the theoretical analysis of the problems inherent in studying the environment within behavior genetics across several periods in the development of environmental studies in behavior genetics and proposes some possible alternatives to traditional approaches to studying the environment in behavior genetics. The first period (from the end of the 1920s to the end of the 1970s), when the environment was not actually studied, is called pre-environmental; during this time, the basic principles and theoretical models of understanding environmental effects in behavior genetics were developed. The second period is characterized by the development of studies on environmental influences within the traditional behavior genetics paradigm; several approaches to studying the environment emerged in behavior genetics during this period, from the beginning of the 1980s until today. At the present time, the field is undergoing paradigmatic changes, concerned with methodology, theory, and mathematical models of genotype-environment interplay; this might be the beginning of a third period of development of environmental studies in behavior genetics. In another part, the methodological problems related to environmental studies in behavior genetics are discussed. Although the methodology used in differential psychology is applicable for assessment of differences between individuals, it is insufficient to explain the sources of these differences. In addition, we stress that psychoanalytic studies of twins and their experiences, initiated in the 1930s and continued episodically until the 1980s, could bring an interesting methodology and contribute to the explanation of puzzling findings from environmental studies of behavior genetics. Finally, we will conclude with implications from the results of environmental studies in behavior genetics, including methodological issues. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Burton-Johnson, A.; Halpin, J. A.; Whittaker, J. M.; Graham, F. S.; Watson, S. J.
2017-06-01
A new method for modeling heat flux shows that the upper crust contributes up to 70% of the Antarctic Peninsula's subglacial heat flux and that heat flux values are more variable at smaller spatial resolutions than geophysical methods can resolve. Results indicate a higher heat flux on the east and south of the Peninsula (mean 81 mW m-2) where silicic rocks predominate, than on the west and north (mean 67 mW m-2) where volcanic arc and quartzose sediments are dominant. While the data supports the contribution of heat-producing element-enriched granitic rocks to high heat flux values, sedimentary rocks can be of comparative importance dependent on their provenance and petrography. Models of subglacial heat flux must utilize a heterogeneous upper crust with variable radioactive heat production if they are to accurately predict basal conditions of the ice sheet. Our new methodology and data set facilitate improved numerical model simulations of ice sheet dynamics.
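For orientation, a back-of-envelope layered heat-flux calculation of the kind implied above; the layer thicknesses, heat-production rates and basal flux are generic placeholder values, not the study's data.

# Surface heat flux = mantle (basal) flux + sum of radiogenic heat production * layer thickness.
layers = [
    ("granite (upper crust)", 10_000.0, 4.0e-6),   # thickness (m), heat production (W/m3), assumed
    ("volcanic arc rocks",     8_000.0, 0.5e-6),
    ("lower crust",           16_000.0, 0.4e-6),
]
basal_flux = 0.030    # W/m2 mantle contribution, assumed

surface_flux = basal_flux + sum(t * hp for _, t, hp in layers)
upper_crust_share = layers[0][1] * layers[0][2] / surface_flux
print(f"surface heat flux: {surface_flux * 1e3:.0f} mW/m2, "
      f"upper crust share: {upper_crust_share:.0%}")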
The PHM-Ethics methodology: interdisciplinary technology assessment of personal health monitoring.
Schmidt, Silke; Verweij, Marcel
2013-01-01
The contribution briefly introduces the PHM Ethics project and the PHM methodology. Within the PHM-Ethics project, a set of tools and modules had been developed that may assist in the evaluation and assessment of new technologies for personal health monitoring, referred to as "PHM methodology" or "PHM toolbox". An overview on this interdisciplinary methodology and its comprising modules is provided, areas of application and intended target groups are indicated.
NASA Astrophysics Data System (ADS)
Jackson, C.; Todhunter, P. E.
2017-12-01
Since 1993, Devils Lake in North Dakota has experienced a prolonged rise in lake level and flooding of the lake's neighboring areas within the closed basin system. Understanding the relative contribution of climate change and land use change is needed to explain the historical rise in lake level, and to evaluate the potential impact of anthropogenic climate change upon future lake conditions and management. Four methodologies were considered to examine the relative contribution of climatic and human landscape drivers to streamflow variations: statistical, ecohydrologic, physically-based modeling, and elasticity of streamflow; for this study, the ecohydrologic and climate elasticity methods were selected. Agricultural statistics determined that Towner and Ramsey counties underwent a crop conversion from small grains to row crops within the last 30 years. Through the Topographic Wetness Index (TWI), a 10 meter resolution DEM confirmed the presence of innumerable wetland depressions within the non-contributing area of the Mauvais Coulee Sub-basin. Although the ecohydrologic and climate elasticity methodologies are the most commonly used in the literature, they make assumptions that are not applicable to basin conditions. A modified and more informed approach to the use of these methods was applied to account for these unique sub-basin characteristics. Ultimately, hydroclimatic variability was determined to be the largest driver of streamflow variation in Mauvais Coulee and Devils Lake.
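A minimal sketch of a non-parametric precipitation elasticity of streamflow estimate (one common form of the climate elasticity approach); the annual precipitation and flow series are synthetic.

# Non-parametric elasticity: median over years of ((Q_t - Qbar)/Qbar) / ((P_t - Pbar)/Pbar).
import numpy as np

rng = np.random.default_rng(4)
precip = rng.normal(450.0, 60.0, 30)                    # mm/yr, synthetic
flow = 0.002 * precip ** 2 + rng.normal(0, 30, 30)      # strongly nonlinear response, synthetic

def precipitation_elasticity(q, p):
    dq = (q - q.mean()) / q.mean()
    dp = (p - p.mean()) / p.mean()
    return np.median(dq / dp)

print(f"elasticity ~ {precipitation_elasticity(flow, precip):.2f}")
# values well above 1 indicate streamflow that is highly sensitive to climate variability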
NASA Astrophysics Data System (ADS)
Razafindratsima, Stephen; Guérin, Roger; Bendjoudi, Hocine; de Marsily, Ghislain
2014-09-01
A methodological approach is described which combines geophysical and geochemical data to delineate the extent of a chlorinated ethenes plume in northern France; the methodology was used to calibrate a hydrogeological model of the contaminants' migration and degradation. The existence of strong reducing conditions in some parts of the aquifer is first determined by measuring in situ the redox potential and dissolved oxygen, dissolved ferrous iron and chloride concentrations. Electrical resistivity imaging and electromagnetic mapping, using the Slingram method, are then used to determine the shape of the pollutant plume. A decreasing empirical exponential relation between measured chloride concentrations in the water and aquifer electrical resistivity is observed; the resistivity formation factor calculated at a few points also shows a major contribution of chloride concentration in the resistivity of the saturated porous medium. MODFLOW software and MT3D99 first-order parent-daughter chain reaction and the RT3D aerobic-anaerobic model for tetrachloroethene (PCE)/trichloroethene (TCE) dechlorination are finally used for a first attempt at modeling the degradation of the chlorinated ethenes. After calibration, the distribution of the chlorinated ethenes and their degradation products simulated with the model approximately reflects the mean measured values in the observation wells, confirming the data-derived image of the plume.
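The decreasing empirical exponential relation mentioned above can be fitted in a few lines; the paired chloride-resistivity observations below are invented for illustration.

# Fit rho = a * exp(-b * [Cl]) between aquifer resistivity and chloride concentration.
import numpy as np
from scipy.optimize import curve_fit

chloride = np.array([50, 120, 300, 600, 900, 1500], dtype=float)    # mg/L, illustrative
resistivity = np.array([85, 60, 35, 20, 14, 9], dtype=float)        # ohm.m, illustrative

def model(cl, a, b):
    return a * np.exp(-b * cl)

(a, b), _ = curve_fit(model, chloride, resistivity, p0=[100.0, 1e-3])
print(f"rho ~ {a:.0f} * exp(-{b:.2e} * Cl)")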
Towards a voxel-based geographic automata for the simulation of geospatial processes
NASA Astrophysics Data System (ADS)
Jjumba, Anthony; Dragićević, Suzana
2016-07-01
Many geographic processes evolve in a three dimensional space and time continuum. However, when they are represented with the aid of geographic information systems (GIS) or geosimulation models they are modelled in a framework of two-dimensional space with an added temporal component. The objective of this study is to propose the design and implementation of voxel-based automata as a methodological approach for representing spatial processes evolving in the four-dimensional (4D) space-time domain. Similar to geographic automata models which are developed to capture and forecast geospatial processes that change in a two-dimensional spatial framework using cells (raster geospatial data), voxel automata rely on the automata theory and use three-dimensional volumetric units (voxels). Transition rules have been developed to represent various spatial processes which range from the movement of an object in 3D to the diffusion of airborne particles and landslide simulation. In addition, the proposed 4D models demonstrate that complex processes can be readily reproduced from simple transition functions without complex methodological approaches. The voxel-based automata approach provides a unique basis to model geospatial processes in 4D for the purpose of improving representation, analysis and understanding their spatiotemporal dynamics. This study contributes to the advancement of the concepts and framework of 4D GIS.
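A tiny voxel-automata sketch: a 3D numpy array of voxels whose state is updated by a local transition rule over the six face neighbours (here a simple diffusion rule with periodic boundaries), the volumetric analogue of a raster cellular automaton.

# Minimal voxel automaton: diffusion of a point release over the 6-neighbourhood.
import numpy as np

grid = np.zeros((20, 20, 20))
grid[10, 10, 10] = 1.0                      # point release of airborne particles

def step(g, d=0.1):
    new = g.copy()
    for axis in range(3):                   # exchange with the 6 face neighbours (periodic boundaries)
        for shift in (1, -1):
            new += d * (np.roll(g, shift, axis=axis) - g)
    return new

for _ in range(50):
    grid = step(grid)
print(grid.sum(), grid.max())               # total mass conserved, peak spreads out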
ERIC Educational Resources Information Center
Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna
2004-01-01
Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Chen, Xingyuan; Ye, Ming
Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
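A toy version of the grouped, variance-based index: the inputs are organised into a scalar "boundary condition" group and a spatially correlated "permeability field" group, and each group's first-order index is estimated as Var(E[Y|group])/Var(Y) over Monte Carlo realizations. The flow model is a stand-in, not the Hanford model.

# Grouped first-order sensitivity indices via a nested (outer/inner) Monte Carlo loop.
import numpy as np

rng = np.random.default_rng(5)
n_outer, n_inner = 200, 50

def head_model(bc, perm_field):
    # toy "head" output depending on a boundary condition and a permeability field
    return bc * 0.8 + perm_field.mean() * 0.3 + perm_field.std() * 0.1

def sample_bc():
    return rng.normal(10.0, 2.0)

def sample_perm():
    base = rng.normal(0.0, 1.0)
    return base + 0.3 * rng.normal(size=100)    # crude spatially correlated field

def grouped_index(sample_fixed, sample_free):
    cond_means = []
    for _ in range(n_outer):
        fixed = sample_fixed()
        ys = [head_model(*sample_free(fixed)) for _ in range(n_inner)]
        cond_means.append(np.mean(ys))
    return np.var(cond_means)

total_var = np.var([head_model(sample_bc(), sample_perm()) for _ in range(5000)])
s_bc = grouped_index(sample_bc, lambda bc: (bc, sample_perm())) / total_var
s_perm = grouped_index(sample_perm, lambda k: (sample_bc(), k)) / total_var
print(f"S_boundary ~ {s_bc:.2f}, S_permeability ~ {s_perm:.2f}")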
A Methodology for Phased Array Radar Threshold Modeling Using the Advanced Propagation Model (APM)
2017-10-01
TECHNICAL REPORT 3079, October 2017. Executive summary: This report summarizes the methodology developed to improve radar threshold modeling using the Advanced Propagation Model (APM).
Statistical power calculations for mixed pharmacokinetic study designs using a population approach.
Kloprogge, Frank; Simpson, Julie A; Day, Nicholas P J; White, Nicholas J; Tarning, Joel
2014-09-01
Simultaneous modelling of dense and sparse pharmacokinetic data is possible with a population approach. To determine the number of individuals required to detect the effect of a covariate, simulation-based power calculation methodologies can be employed. The Monte Carlo Mapped Power method (a simulation-based power calculation methodology using the likelihood ratio test) was extended in the current study to perform sample size calculations for mixed pharmacokinetic studies (i.e. both sparse and dense data collection). A workflow guiding an easy and straightforward pharmacokinetic study design, considering also the cost-effectiveness of alternative study designs, was used in this analysis. Initially, data were simulated for a hypothetical drug and then for the anti-malarial drug, dihydroartemisinin. Two datasets (sampling design A: dense; sampling design B: sparse) were simulated using a pharmacokinetic model that included a binary covariate effect and subsequently re-estimated using (1) the same model and (2) a model not including the covariate effect in NONMEM 7.2. Power calculations were performed for varying numbers of patients with sampling designs A and B. Study designs with statistical power >80% were selected and further evaluated for cost-effectiveness. The simulation studies of the hypothetical drug and the anti-malarial drug dihydroartemisinin demonstrated that the simulation-based power calculation methodology, based on the Monte Carlo Mapped Power method, can be utilised to evaluate and determine the sample size of mixed (part sparsely and part densely sampled) study designs. The developed method can contribute to the design of robust and efficient pharmacokinetic studies.
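A conceptual sketch of a simulation-based power calculation: simulate data under a covariate effect, fit models with and without the covariate, and count how often the likelihood-ratio test is significant. A simple normal model with known variance stands in for the population pharmacokinetic model; this is not the Monte Carlo Mapped Power implementation.

# Simulation-based power for detecting a binary covariate effect via a likelihood-ratio test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def power(n_per_group, effect=0.3, sd=1.0, n_sim=500, alpha=0.05):
    crit = stats.chi2.ppf(1 - alpha, df=1)         # LRT threshold for one extra parameter
    hits = 0
    for _ in range(n_sim):
        y0 = rng.normal(0.0, sd, n_per_group)      # covariate absent
        y1 = rng.normal(effect, sd, n_per_group)   # covariate present
        y = np.concatenate([y0, y1])
        ll_full = stats.norm.logpdf(y0, y0.mean(), sd).sum() + stats.norm.logpdf(y1, y1.mean(), sd).sum()
        ll_reduced = stats.norm.logpdf(y, y.mean(), sd).sum()
        hits += 2 * (ll_full - ll_reduced) > crit
    return hits / n_sim

for n in (20, 40, 80, 160):
    print(n, power(n))        # smallest n with power > 0.8 guides the study design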
NASA Astrophysics Data System (ADS)
Arendt, Carli A.; Aciego, Sarah M.; Hetland, Eric A.
2015-05-01
The implementation of isotopic tracers as constraints on source contributions has become increasingly relevant to understanding Earth surface processes. Interpretation of these isotopic tracers has become more accessible with the development of Bayesian Monte Carlo (BMC) mixing models, which allow for uncertainty in mixing end-members and provide a methodology for systems with multicomponent mixing. This study presents an open source multiple isotope BMC mixing model that is applicable to Earth surface environments with sources exhibiting distinct end-member isotopic signatures. Our model is first applied to new δ18O and δD measurements from the Athabasca Glacier; the results show the expected seasonal melt evolution trends, and the statistical relevance of the resulting fraction estimates is rigorously assessed. To highlight the broad applicability of our model to a variety of Earth surface environments and relevant isotopic systems, we expand our model to two additional case studies: deriving melt sources from δ18O, δD, and 222Rn measurements of Greenland Ice Sheet bulk water samples and assessing nutrient sources from ɛNd and 87Sr/86Sr measurements of Hawaiian soil cores. The model produces results for the Greenland Ice Sheet and Hawaiian soil data sets that are consistent with the originally published fractional contribution estimates. The advantage of this method is that it quantifies the error induced by variability in the end-member compositions, which was not captured by the models previously applied to the above case studies. Results from all three case studies demonstrate the broad applicability of this statistical BMC isotopic mixing model for estimating source contribution fractions in a variety of Earth surface systems.
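An open sketch of the BMC mixing idea for two tracers and three sources: draw fractions from a Dirichlet prior, draw end-member compositions from their uncertainty distributions, and keep draws whose predicted mixture matches the measured sample within a tolerance. The end-member values, tolerances and source names are illustrative, not the published data.

# Bayesian Monte Carlo mixing for two tracers (d18O, dD) and three sources.
import numpy as np

rng = np.random.default_rng(7)

# end-member means and 1-sigma for (d18O, dD): snow melt, ice melt, rain (illustrative)
em_mean = np.array([[-20.0, -152.0], [-23.0, -176.0], [-12.0, -90.0]])
em_sd = np.array([[0.5, 4.0], [0.5, 4.0], [1.0, 8.0]])
sample = np.array([-19.5, -150.0])
tol = np.array([0.3, 2.0])                    # acceptance tolerance ~ analytical error (assumed)

kept = []
for _ in range(100_000):
    f = rng.dirichlet(np.ones(3))             # uninformative prior on source fractions
    em = rng.normal(em_mean, em_sd)           # uncertain end-member compositions
    if np.all(np.abs(f @ em - sample) < tol):
        kept.append(f)

kept = np.array(kept)
print(kept.mean(axis=0).round(2), kept.std(axis=0).round(2))   # posterior fraction estimates and spread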
NASA Astrophysics Data System (ADS)
Krapp, Mario; Gütschow, Johannes; Rocha, Marcia; Schaeffer, Michiel
2016-04-01
The notion of historical responsibility is central to the equity debate, and the measure of responsibility as a country's share of historical global emissions remains one of the essential parameters in so-called equity proposals, which attempt to distribute effort among countries in an equitable manner. The focus of this contribution is on the historical contribution of countries, but it takes this one step further: its general objective lies in estimating countries' contributions directly to the change in climate. The historical responsibility is not based on cumulative emissions but is instead measured in terms of the countries' estimated contributions to the increase in global-mean surface-air temperature. This is achieved by (1) compiling a historical emissions dataset for the period from 1850 until 2012 for each individual Kyoto greenhouse gas and each UNFCCC Party using a consistent methodology and (2) applying those historical emissions to a revised version of the so-called Policy-maker Model put forward by the Ministry of Science and Technology of the Federative Republic of Brazil, which is a simple, yet powerful tool that allows historical GHG emissions of individual countries to be directly related to their effect on global temperature changes. We estimate that the cumulative GHG emissions until 2012 from the USA, the European Union and China contribute a total temperature increase of about 0.50°C in 2100, which is equivalent to about 50% of the temperature increase from total global GHG emissions by that year (of about 1.0°C). Respectively, the USA, the European Union, and China are responsible for 20.2%, 17.3%, and 12.1% of global temperature increase in 2100. Russian historical emissions are responsible for a 0.06°C temperature increase by 2100, ranking as the fourth largest contributor with 6.2% of the total contribution. India ranks fifth: Indian emissions to date would contribute roughly 0.05°C of global mean temperature increase by 2100, or about 5.3%. Brazilian historical emissions would contribute 0.04°C to global temperature increase by 2100, or 4.4% of the total. If the European Union countries were considered independently, Germany and Great Britain would be responsible for 3.9% and 3.4%, respectively, of global temperature increase in 2100. We present the results on countries' historical responsibilities and then outline in detail the methodology employed to obtain the historical emissions dataset and final temperature contributions, including the different approaches used to derive a revised version of the Policy-maker Model, its underlying assumptions, and its advantages and limitations for estimating countries' historical contribution to temperature increase.
NASA Astrophysics Data System (ADS)
Krell, Moritz; Walzer, Christine; Hergert, Susann; Krüger, Dirk
2017-09-01
As part of their professional competencies, science teachers need an elaborate meta-modelling knowledge as well as modelling skills in order to guide and monitor modelling practices of their students. However, qualitative studies about (pre-service) science teachers' modelling practices are rare. This study provides a category system which is suitable to analyse and to describe pre-service science teachers' modelling activities and to infer modelling strategies. The category system was developed based on theoretical considerations and was inductively refined within the methodological frame of qualitative content analysis. For the inductive refinement, modelling practices of pre-service teachers (n = 4) have been video-taped and analysed. In this study, one case was selected to demonstrate the application of the category system to infer modelling strategies. The contribution of this study for science education research and science teacher education is discussed.
Dicks, Sean Glenton; Ranse, Kristen; van Haren, Frank MP; Boer, Douglas P
2017-01-01
Information and compassion assist families of potential organ donors to make informed decisions. However, psychological implications of the in-hospital process are not well described with past research focusing on decision-making. To enhance understanding and improve service delivery, a systematic review was conducted. Inductive analysis and synthesis utilised Grounded Theory Methodology within a systems theory framework and contributed to a model proposing that family and staff form a System of Systems with shared responsibility for process outcomes. This model can guide evaluation and improvement of care and will be tested by means of a longitudinal study of family experiences. PMID:28680696
Watershed Complexity Impacts on Rainfall-Runoff Modeling
NASA Astrophysics Data System (ADS)
Goodrich, D. C.; Grayson, R.; Willgoose, G.; Palacios-Velez, O.; Bloeschl, G.
2002-12-01
Application of distributed hydrologic watershed models fundamentally requires watershed partitioning or discretization. Beyond the partitioning itself, these elements typically represent a further abstraction of the actual watershed surface and its relevant hydrologic properties. A critical issue that must be addressed by any user of these models prior to their application is definition of an acceptable level of watershed discretization or geometric model complexity. A quantitative methodology to define a level of geometric model complexity commensurate with a specified level of model performance is developed for watershed rainfall-runoff modeling. In the case where watershed contributing areas are represented by overland flow planes, equilibrium discharge storage was used to define the transition from overland to channel dominated flow response. The methodology is tested on four subcatchments which cover a range of watershed scales of over three orders of magnitude in the USDA-ARS Walnut Gulch Experimental Watershed in Southeastern Arizona. It was found that distortion of the hydraulic roughness can compensate for a lower level of discretization (fewer channels) to a point. Beyond this point, hydraulic roughness distortion cannot compensate for topographic distortion of representing the watershed by fewer elements (e.g. less complex channel network). Similarly, differences in representation of topography by different model or digital elevation model (DEM) types (e.g. Triangular Irregular Elements - TINs; contour lines; and regular grid DEMs) also result in differences in runoff routing responses that can be largely compensated for by a distortion in hydraulic roughness.
Towards the unification of inference structures in medical diagnostic tasks.
Mira, J; Rives, J; Delgado, A E; Martínez, R
1998-01-01
The central purpose of artificial intelligence applied to medicine is to develop models for diagnosis and therapy planning at the knowledge level, in the Newell sense, and software environments to facilitate the reduction of these models to the symbol level. The usual methodology (KADS, Common-KADS, GAMES, HELIOS, Protégé, etc) has been to develop libraries of generic tasks and reusable problem-solving methods with explicit ontologies. The principal problem which clinicians have with these methodological developments concerns the diversity and complexity of new terms whose meaning is not sufficiently clear, precise, unambiguous and consensual for them to be accessible in the daily clinical environment. As a contribution to the solution of this problem, we develop in this article the conjecture that one inference structure is enough to describe the set of analysis tasks associated with medical diagnoses. To this end, we first propose a modification of the systematic diagnostic inference scheme to obtain an analysis generic task and then compare it with the monitoring and the heuristic classification task inference schemes using as comparison criteria the compatibility of domain roles (data structures), the similarity in the inferences, and the commonality in the set of assumptions which underlie the functionally equivalent models. The equivalences proposed are illustrated with several examples. Note that though our ongoing work aims to simplify the methodology and to increase the precision of the terms used, the proposal presented here should be viewed more in the nature of a conjecture.
Laner, David; Rechberger, Helmut
2009-02-01
Waste prevention is a principal means of achieving the goals of waste management and a key element for developing sustainable economies. Small and medium sized enterprises (SMEs) contribute substantially to environmental degradation, often not even being aware of their environmental effects. Therefore, several initiatives have been launched in Austria aimed at supporting waste prevention measures on the level of SMEs. To promote the most efficient projects, they have to be evaluated with respect to their contribution to the goals of waste management. It is the aim of this paper to develop a methodology for evaluating waste prevention measures in SMEs based on their goal orientation. At first, conceptual problems of defining and delineating waste prevention activities are briefly discussed. Then an approach to evaluate waste prevention activities with respect to their environmental performance is presented, and benchmarks that allow for an efficient use of the available funds are developed. Finally, the evaluation method is applied to a number of former projects and the calculated results are analysed with respect to shortcomings and limitations of the model. It is found that the developed methodology can provide a tool for a more objective and comprehensible evaluation of waste prevention measures.
Greenhouse gas emissions from reservoir water surfaces: A ...
Collectively, reservoirs created by dams are thought to be an important source of greenhouse gases (GHGs) to the atmosphere. So far, efforts to quantify, model, and manage these emissions have been limited by data availability and inconsistencies in methodological approach. Here we synthesize worldwide reservoir methane, carbon dioxide, and nitrous oxide emission data with three main objectives: (1) to generate a global estimate of GHG emissions from reservoirs, (2) to identify the best predictors of these emissions, and (3) to consider the effect of methodology on emission estimates. We estimate that GHG emissions from reservoir water surfaces account for 0.8 (0.5-1.2) Pg CO2-equivalents per year, equal to ~1.3% of all anthropogenic GHG emissions, with the majority (79%) of this forcing due to methane. We also discuss the potential for several alternative pathways such as dam degassing and downstream emissions to contribute significantly to overall GHG emissions. Although prior studies have linked reservoir GHG emissions to system age and latitude, we find that factors related to reservoir productivity are better predictors of emission. Finally, as methane contributed the most to total reservoir GHG emissions, it is important that future monitoring campaigns incorporate methane emission pathways, especially ebullition.
Ng, Kim Hoong; Cheng, Yoke Wang; Khan, Maksudur R; Cheng, Chin Kui
2016-12-15
This paper reports on the optimization of palm oil mill effluent (POME) degradation in a UV-activated-ZnO system based on central composite design (CCD) in response surface methodology (RSM). Three potential factors, viz. O2 flowrate (A), ZnO loading (B) and initial concentration of POME (C), were evaluated for significance using a 2³ full factorial design before the optimization process. It was found that all three main factors were significant, with contributions of 58.27% (A), 15.96% (B) and 13.85% (C), respectively, to the POME degradation. In addition, the interactions between the factors AB, AC and BC also contributed 4.02%, 3.12% and 1.01% to the POME degradation. Subsequently, all three factors were subjected to statistical central composite design (CCD) analysis. Quadratic models were developed and rigorously checked. A 3D response surface was subsequently generated. Two successive validation experiments were carried out, and the degradations achieved were 55.25% and 55.33%, compared with the predicted value of 52.45%. Copyright © 2016 Elsevier Ltd. All rights reserved.
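The factorial-contribution and quadratic response-surface analysis summarised above can be illustrated with a short sketch; the coded design, factor names (A, B, C) and response values below are synthetic stand-ins, not the study's measurements.

```python
# Hypothetical sketch of a quadratic response-surface fit of the kind used in CCD/RSM studies.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Coded factor levels (-1, 0, +1): A = O2 flowrate, B = ZnO loading, C = initial POME concentration
X = np.array([[a, b, c] for a in (-1, 0, 1) for b in (-1, 0, 1) for c in (-1, 0, 1)], float)
# Synthetic degradation response with main, interaction and quadratic effects plus noise
y = (50 + 8 * X[:, 0] + 4 * X[:, 1] - 3 * X[:, 2]
     + 1.5 * X[:, 0] * X[:, 1] - 2.0 * X[:, 0] ** 2 + rng.normal(0, 0.5, len(X)))

# Full quadratic model: linear, interaction and squared terms
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

for name, coef in zip(poly.get_feature_names_out(["A", "B", "C"]), model.coef_):
    print(f"{name:8s} {coef:+6.2f}")

# Predicted degradation at the coded centre point, analogous to a validation run
print("centre-point prediction:", model.predict(poly.transform([[0, 0, 0]]))[0])
```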
OPUS: Optimal Projection for Uncertain Systems. Volume 1
1991-09-01
unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for... effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that... characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of
Going Public with Pedagogical Inquiries: SoTL as a Methodology for Faculty Professional Development
ERIC Educational Resources Information Center
Fanghanel, Joëlle
2013-01-01
In this paper, I discuss SoTL as a methodology for the professional development of academics. I propose that as an agentic form of inquiry that focuses on processes, boundary-crossing, and making public its findings, SoTL is a sophisticated methodology that brings the activities of teaching and research in close alignment, and contributes to…
New methodology for adjusting rotating shadowband irradiometer measurements
NASA Astrophysics Data System (ADS)
Vignola, Frank; Peterson, Josh; Wilbert, Stefan; Blanc, Philippe; Geuder, Norbert; Kern, Chris
2017-06-01
A new method is developed for correcting systematic errors found in rotating shadowband irradiometer measurements. Since the responsivity of photodiode-based pyranometers typically utilized for RSI sensors is dependent upon the wavelength of the incident radiation, and the spectral distribution of the incident radiation is different for the Direct Normal Irradiance and the Diffuse Horizontal Irradiance, spectral effects have to be considered. These cause the most problematic errors when applying currently available correction functions to RSI measurements. Hence, direct normal and diffuse contributions are analyzed and modeled separately. An additional advantage of this methodology is that it provides a prescription for how to modify the adjustment algorithms to locations with different atmospheric characteristics from the location where the calibration and adjustment algorithms were developed. A summary of results and areas for future efforts are then discussed.
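The component-wise adjustment idea can be sketched as follows; the correction factors and the recombination into global irradiance are illustrative assumptions, not the published correction functions.

```python
# Illustrative only: separate spectral adjustment of the direct and diffuse components of a
# photodiode-based RSI measurement. The factors k_direct and k_diffuse are placeholders.
import numpy as np

def adjusted_ghi(dni_raw, dhi_raw, zenith_deg, k_direct=1.03, k_diffuse=0.97):
    """Apply component-specific spectral corrections, then recombine to global irradiance."""
    dni = k_direct * dni_raw    # direct-normal correction (beam-dominated spectrum)
    dhi = k_diffuse * dhi_raw   # diffuse correction (sky-dominated spectrum)
    return dni * np.cos(np.radians(zenith_deg)) + dhi

print(adjusted_ghi(dni_raw=800.0, dhi_raw=120.0, zenith_deg=35.0))
```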
Methodological Concerns in Experimental Reading Research: All That Glitters...
ERIC Educational Resources Information Center
Henk, William A.
1987-01-01
Describes the nature and consequences of liberally or improperly applying the traditional reading research methodology and provides an argument for tempering judgments about the relative contributions that experimental studies make to the professional literature in reading. (SKC)
[Decision modeling for economic evaluation of health technologies].
de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh
2014-10-01
Most economic evaluations that participate in decision-making processes for incorporation and financing of technologies of health systems use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need to conduct an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, simulation of discrete and dynamic events; it discusses the elements involved in the choice of model; and exemplifies the models addressed in national economic evaluation studies of diagnostic and therapeutic preventive technologies and health programs.
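As an illustration of the simplest of the model types listed above, the following is a minimal three-state Markov cohort sketch; the transition probabilities, costs and utilities are invented for illustration only.

```python
# Toy Markov cohort model (Well / Sick / Dead) of the kind reviewed above.
import numpy as np

P = np.array([[0.85, 0.10, 0.05],   # transitions from Well
              [0.00, 0.80, 0.20],   # transitions from Sick
              [0.00, 0.00, 1.00]])  # Dead is absorbing
cost_per_cycle = np.array([100.0, 1500.0, 0.0])
utility = np.array([0.95, 0.60, 0.0])

cohort = np.array([1.0, 0.0, 0.0])   # the whole cohort starts in Well
total_cost = total_qaly = 0.0
for cycle in range(20):              # 20 yearly cycles
    total_cost += cohort @ cost_per_cycle
    total_qaly += cohort @ utility
    cohort = cohort @ P              # advance the cohort one cycle

print(f"expected cost {total_cost:.0f}, expected QALYs {total_qaly:.2f}")
```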
Modeling Common-Sense Decisions in Artificial Intelligence
NASA Technical Reports Server (NTRS)
Zak, Michail
2010-01-01
A methodology has been conceived for efficient synthesis of dynamical models that simulate common-sense decision-making processes. This methodology is intended to contribute to the design of artificial-intelligence systems that could imitate human common-sense decision making or assist humans in making correct decisions in unanticipated circumstances. This methodology is a product of continuing research on mathematical models of the behaviors of single- and multi-agent systems known in biology, economics, and sociology, ranging from a single-cell organism at one extreme to the whole of human society at the other extreme. Earlier results of this research were reported in several prior NASA Tech Briefs articles, the three most recent and relevant being Characteristics of Dynamics of Intelligent Systems (NPO-21037), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 48; Self-Supervised Dynamical Systems (NPO-30634), NASA Tech Briefs, Vol. 27, No. 3 (March 2003), page 72; and Complexity for Survival of Living Systems (NPO-43302), NASA Tech Briefs, Vol. 33, No. 7 (July 2009), page 62. The methodology involves the concepts reported previously, albeit viewed from a different perspective. One of the main underlying ideas is to extend the application of physical first principles to the behaviors of living systems. Models of motor dynamics are used to simulate the observable behaviors of systems or objects of interest, and models of mental dynamics are used to represent the evolution of the corresponding knowledge bases. For a given system, the knowledge base is modeled in the form of probability distributions and the mental dynamics is represented by models of the evolution of the probability densities or, equivalently, models of flows of information. Autonomy is imparted to the decision-making process by feedback from mental to motor dynamics. This feedback replaces unavailable external information by information stored in the internal knowledge base. Representation of the dynamical models in a parameterized form reduces the task of common-sense-based decision making to a solution of the following hetero-associative memory problem: store a set of m predetermined stochastic processes given by their probability distributions in such a way that when presented with an unexpected change in the form of an input out of the set of M inputs, the coupled motor-mental dynamics converges to the corresponding one of the m pre-assigned stochastic processes, and a sample of this process represents the decision.
On uncertainty quantification in hydrogeology and hydrogeophysics
NASA Astrophysics Data System (ADS)
Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud
2017-12-01
Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.
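The multi-resolution (multilevel Monte Carlo) idea mentioned above can be sketched with a toy two-level estimator; the "fine" and "coarse" functions below are stand-ins for an expensive and a cheap forward model, and all numbers are synthetic.

```python
# Toy two-level multilevel Monte Carlo estimator of the mean of a model output.
import numpy as np

rng = np.random.default_rng(42)

def fine_model(k):    # expensive, accurate forward simulation (stand-in)
    return np.exp(-k) + 0.01 * np.sin(50 * k)

def coarse_model(k):  # cheap, biased approximation of the same quantity
    return np.exp(-k)

# Level 0: many cheap coarse runs
k0 = rng.lognormal(mean=0.0, sigma=0.5, size=20000)
level0 = coarse_model(k0).mean()

# Level 1: few paired runs estimating the fine-minus-coarse correction
k1 = rng.lognormal(mean=0.0, sigma=0.5, size=500)
correction = (fine_model(k1) - coarse_model(k1)).mean()

print("MLMC estimate of E[output]:", level0 + correction)
```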
Antoniou, Stavros A; Andreou, Alexandros; Antoniou, George A; Koch, Oliver O; Köhler, Gernot; Luketina, Ruzica-R; Bertsias, Antonios; Pointner, Rudolph; Granderath, Frank-Alexander
2015-11-01
Measures have been taken to improve methodological quality of randomized controlled trials (RCTs). This review systematically assessed the trends in volume and methodological quality of RCTs on minimally invasive surgery within a 10-year period. RCTs on minimally invasive surgery were searched in the 10 most cited general surgical journals and the 5 most cited journals of laparoscopic interest for the years 2002 and 2012. Bibliometric and methodological quality components were abstracted using the Scottish Intercollegiate Guidelines Network. The pooled number of RCTs from low-contribution regions demonstrated an increasing proportion of the total published RCTs, compensating for a concomitant decrease of the respective contributions from Europe and North America. International collaborations were more frequent in 2012. Acceptable or high quality RCTs accounted for 37.9% and 54.4% of RCTs published in 2002 and 2012, respectively. Components of external validity were poorly reported. Both the volume and the reporting quality of laparoscopic RCTs have increased from 2002 to 2012, but there seems to be ample room for improvement of methodological quality. Copyright © 2015 Elsevier Inc. All rights reserved.
Gramatica, Ruggero; Di Matteo, T; Giorgetti, Stefano; Barbiani, Massimo; Bevec, Dorian; Aste, Tomaso
2014-01-01
We introduce a methodology to efficiently exploit natural-language expressed biomedical knowledge for repurposing existing drugs towards diseases for which they were not initially intended. Leveraging on developments in Computational Linguistics and Graph Theory, a methodology is defined to build a graph representation of knowledge, which is automatically analysed to discover hidden relations between any drug and any disease: these relations are specific paths among the biomedical entities of the graph, representing possible Modes of Action for any given pharmacological compound. We propose a measure for the likeliness of these paths based on a stochastic process on the graph. This measure depends on the abundance of indirect paths between a peptide and a disease, rather than solely on the strength of the shortest path connecting them. We provide real-world examples, showing how the method successfully retrieves known pathophysiological Mode of Action and finds new ones by meaningfully selecting and aggregating contributions from known bio-molecular interactions. Applications of this methodology are presented, and prove the efficacy of the method for selecting drugs as treatment options for rare diseases.
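A minimal sketch of a path-abundance score of the kind described above follows, using a damped sum of k-step random-walk probabilities on a tiny, invented knowledge graph; this is an assumption-laden simplification for illustration, not the paper's exact stochastic process.

```python
# Path-abundance score between a drug node and a disease node of a toy knowledge graph.
import numpy as np

nodes = ["drugX", "proteinA", "proteinB", "pathwayP", "diseaseY"]
A = np.zeros((5, 5))
edges = [(0, 1), (1, 2), (1, 3), (2, 3), (3, 4), (2, 4)]   # hypothetical interactions
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Row-normalise the adjacency matrix into a random-walk transition matrix
T = A / A.sum(axis=1, keepdims=True)

def path_abundance(src, dst, alpha=0.5, max_len=6):
    """Damped sum of k-step walk probabilities from src to dst (indirect paths contribute)."""
    score, Tk = 0.0, np.eye(len(nodes))
    for k in range(1, max_len + 1):
        Tk = Tk @ T
        score += (alpha ** k) * Tk[src, dst]
    return score

print("drugX -> diseaseY:", path_abundance(0, 4))
```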
Study of the uncertainty in estimation of the exposure of non-human biota to ionising radiation.
Avila, R; Beresford, N A; Agüero, A; Broed, R; Brown, J; Iospje, M; Robles, B; Suañez, A
2004-12-01
Uncertainty in estimations of the exposure of non-human biota to ionising radiation may arise from a number of sources including values of the model parameters, empirical data, measurement errors and biases in the sampling. The significance of the overall uncertainty of an exposure assessment will depend on how the estimated dose compares with reference doses used for risk characterisation. In this paper, we present the results of a study of the uncertainty in estimation of the exposure of non-human biota using some of the models and parameters recommended in the FASSET methodology. The study was carried out for semi-natural terrestrial, agricultural and marine ecosystems, and for four radionuclides (137Cs, 239Pu, 129I and 237Np). The parameters of the radionuclide transfer models showed the highest sensitivity and contributed the most to the uncertainty in the predictions of doses to biota. The most important ones were related to the bioavailability and mobility of radionuclides in the environment, for example soil-to-plant transfer factors, the bioaccumulation factors for marine biota and the gut uptake fraction for terrestrial mammals. In contrast, the dose conversion coefficients showed low sensitivity and contributed little to the overall uncertainty. Radiobiological effectiveness contributed to the overall uncertainty of the dose estimations for alpha emitters although to a lesser degree than a number of transfer model parameters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tribbia, Joseph
NCAR brought the latest version of the Community Earth System Model (version 1, CESM1) into the mix of models in the NMME effort. This new version uses our newest atmospheric model CAM5 and produces a coupled climate and ENSO that are generally as good or better than those of the Community Climate System Model version 4 (CCSM4). Compared to CCSM4, the new coupled model has a superior climate response with respect to low clouds in both the subtropical stratus regimes and the Arctic. However, CESM1 has been run to date using a prognostic aerosol model that more than doubles its computational cost. We are currently evaluating a version of the new model using prescribed aerosols and expect it will be ready for integrations in summer 2012. Because of this, NCAR has not been able to complete the hindcast integrations using the NCAR loosely-coupled ensemble Kalman filter assimilation method, nor has it contributed to the current (Stage I) NMME operational utilization. The expectation is that this model will be included in the NMME in late 2012 or early 2013. The initialization method will utilize the Ensemble Kalman Filter Assimilation methods developed at NCAR using the Data Assimilation Research Testbed (DART) in conjunction with Jeff Anderson’s team in CISL. This methodology has been used in our decadal prediction contributions to CMIP5. During the course of this project, NCAR has set up and performed all the needed hindcast and forecast simulations and provided the requested fields to our collaborators. In addition, NCAR researchers have participated fully in research themes (i) and (ii). Specifically, (i) we have begun to evaluate and optimize our system in hindcast mode, focusing on the optimal number of ensemble members, methodologies to recalibrate individual dynamical models, and assessing our forecasts across multiple time scales, i.e., beyond two weeks; and (ii) we have begun investigation of the role of different ocean initial conditions in seasonal forecasts. The completion of the calibration hindcasts for Seasonal to Interannual (SI) predictions and the maintenance of the data archive associated with the NCAR portion of this effort has been the responsibility of the Project Scientist I (Alicia Karspeck), who was partially supported on this project.
Effective Swimmer’s Action during the Grab Start Technique
Mourão, Luis; de Jesus, Karla; Roesler, Hélio; Machado, Leandro J.; Fernandes, Ricardo J.; Vilas-Boas, João Paulo; Vaz, Mário A. P.
2015-01-01
The external forces applied in swimming starts have often been studied, but using direct analysis and simple data interpretation processes. This study aimed to develop a tool for vertical and horizontal force assessment based on the swimmers’ propulsive and structural forces (passive forces due to dead weight) applied during the block phase. Four methodological pathways were followed: the experimental fall of a rigid body, the swimmers’ inertia effect, the development of a mathematical model to describe the outcome of the rigid body fall and its generalization to include the effects of inertia, and the experimental swimmers’ starting protocol analysed with the inclusion of the developed mathematical tool. The first three methodological steps resulted in the description and computation of the passive force components. At the fourth step, six well-trained swimmers performed three 15 m maximal grab start trials and three-dimensional (3D) kinetic data were obtained using a six degrees of freedom force plate. The passive force contribution to the start performance obtained from the model was subtracted from the experimental force due to the swimmers, resulting in the swimmers’ active forces. As expected, the swimmers’ vertical and horizontal active forces accounted for the maximum variability contribution of the experimental forces. It was found that the active force profiles for the vertical and horizontal components resembled one another. These findings should be considered in clarifying the swimmers’ active force variability and the respective geometrical profile as indicators to redefine steering strategies. PMID:25978370
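The decomposition used above (active force = measured force minus modelled passive force) can be written in a few lines; the signals below are synthetic placeholders, not the force-plate data.

```python
# Toy illustration of subtracting a modelled passive (dead-weight) force from a force-plate signal.
import numpy as np

t = np.linspace(0.0, 0.8, 400)                           # block phase, s
measured_vertical = 820 + 250 * np.sin(6 * np.pi * t)    # force-plate reading, N (synthetic)
passive_vertical = 75.0 * 9.81 * np.ones_like(t)         # modelled structural/dead-weight force, N

active_vertical = measured_vertical - passive_vertical   # swimmer's propulsive contribution
print("peak active vertical force: %.0f N" % active_vertical.max())
```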
Kumar, Abhishek; Clement, Shibu; Agrawal, V P
2010-07-15
An attempt is made to address a few ecological and environment issues by developing different structural models for effluent treatment system for electroplating. The effluent treatment system is defined with the help of different subsystems contributing to waste minimization. Hierarchical tree and block diagram showing all possible interactions among subsystems are proposed. These non-mathematical diagrams are converted into mathematical models for design improvement, analysis, comparison, storage retrieval and commercially off-the-shelf purchases of different subsystems. This is achieved by developing graph theoretic model, matrix models and variable permanent function model. Analysis is carried out by permanent function, hierarchical tree and block diagram methods. Storage and retrieval is done using matrix models. The methodology is illustrated with the help of an example. Benefits to the electroplaters/end user are identified. 2010 Elsevier B.V. All rights reserved.
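The permanent-function idea can be illustrated with a brute-force computation on a small, hypothetical system matrix (diagonal entries for subsystem strengths, off-diagonal entries for their interactions); the values are illustrative only.

```python
# Brute-force permanent of a small system matrix, enumerating all subsystem-interaction terms.
import itertools
import numpy as np

def permanent(M):
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

# Hypothetical 4-subsystem effluent treatment structure
M = np.array([[3.0, 1.0, 0.0, 1.0],
              [1.0, 2.0, 1.0, 0.0],
              [0.0, 1.0, 4.0, 1.0],
              [1.0, 0.0, 1.0, 2.0]])
print("permanent of the system matrix:", permanent(M))
```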
K-Means Subject Matter Expert Refined Topic Model Methodology
2017-01-01
K-means Subject Matter Expert Refined Topic Model Methodology: Topic Model Estimation via K-Means. Theodore T. Allen, Ph.D.; Zhenhuan... U.S. Army TRADOC Analysis Center-Monterey, 700 Dyer Road. January 2017. Contract number W9124N-15-P-0022.
Learning representations for the early detection of sepsis with deep neural networks.
Kam, Hye Jin; Kim, Ha Young
2017-10-01
Sepsis is one of the leading causes of death in intensive care unit patients. Early detection of sepsis is vital because mortality increases as the sepsis stage worsens. This study aimed to develop detection models for the early stage of sepsis using deep learning methodologies, and to compare the feasibility and performance of the new deep learning methodology with those of the regression method with conventional temporal feature extraction. Study group selection adhered to the InSight model. The results of the deep learning-based models and the InSight model were compared. With deep feedforward networks, the areas under the ROC curve (AUC) of the models were 0.887 and 0.915 for the InSight and the new feature sets, respectively. For the model with the combined feature set, the AUC was the same as that of the basic feature set (0.915). For the long short-term memory model, only the basic feature set was applied and the AUC improved to 0.929, compared with the 0.887 of the InSight model. The contributions of this paper can be summarized in three ways: (i) improved performance without feature extraction using domain knowledge, (ii) verification of the feature extraction capability of deep neural networks through comparison with reference features, and (iii) improved performance over feedforward neural networks by using long short-term memory, a neural network architecture that can learn sequential patterns. Copyright © 2017 Elsevier Ltd. All rights reserved.
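A minimal sketch of the long short-term memory architecture referred to above is given below, written in PyTorch with synthetic vital-sign sequences; the layer sizes, feature counts and labels are illustrative assumptions, not the study's configuration.

```python
# Toy LSTM classifier over sequences of hourly vital-sign features.
import torch
import torch.nn as nn

class SepsisLSTM(nn.Module):
    def __init__(self, n_features=6, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                         # x: (batch, time, features)
        _, (h_n, _) = self.lstm(x)                # last hidden state summarises the sequence
        return self.head(h_n[-1]).squeeze(-1)     # logit for "sepsis onset"

model = SepsisLSTM()
x = torch.randn(8, 24, 6)                         # 8 patients, 24 hourly steps, 6 vitals (synthetic)
y = torch.randint(0, 2, (8,)).float()
loss = nn.BCEWithLogitsLoss()(model(x), y)
loss.backward()
print("toy loss:", float(loss))
```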
Yardley, Sarah; Brosnan, Caragh; Richardson, Jane
2013-01-01
Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique on existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose from students participating in AEE. Our approach offers a mechanism for clarification without which evidence-based effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice we hope to assist others seeking to conduct similar research.
ERIC Educational Resources Information Center
Besnard, Christine
1995-01-01
Contributions of the field of cognitive psychology to second language instruction are reviewed. It is proposed that these concepts can contribute not only to classroom language instruction, but also to methodology of language teacher education. (MSE)
NASA Astrophysics Data System (ADS)
Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.
2017-05-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
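The variance-based (Sobol) first-order indices underlying the approach above can be estimated with a pick-freeze scheme; the stand-in model and the input names below are assumptions for illustration, not the Hanford flow and transport simulations.

```python
# Pick-freeze (Saltelli 2010) estimator of first-order Sobol sensitivity indices.
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Stand-in for the forward model: columns = boundary head, log-permeability, porosity
    return 2.0 * x[:, 0] + 5.0 * x[:, 1] ** 2 + 0.2 * x[:, 2]

N, d = 20000, 3
A = rng.uniform(0, 1, (N, d))
B = rng.uniform(0, 1, (N, d))
fA, fB = model(A), model(B)
varY = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(["boundary", "permeability", "porosity"]):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                                  # vary only the i-th input
    Si = np.mean(fB * (model(ABi) - fA)) / varY          # first-order index estimate
    print(f"S_{name}: {Si:.2f}")
```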
Díaz-Requejo, M Mar; Belderrain, Tomás R; Nicasio, M Carmen; Pérez, Pedro J
2006-12-21
This contribution intends to highlight the use of the metal-catalyzed functionalization of unreactive carbon-hydrogen bonds by the carbene insertion methodology, which employs diazo compounds as the carbene source.
Alcalá, Francisco J; Martín-Martín, Manuel; Guerrera, Francesco; Martínez-Valderrama, Jaime; Robles-Marín, Pedro
2018-07-15
In a previous paper, the Amtoudi Oasis, a remote area in the northern Sahara in southern Morocco, was chosen to model the dynamics of groundwater-dependent economics under different scenarios of water availability, both the wet 2009-2010 and the average 2010-2011 hydrological years. Groundwater imbalance was reflected by net aquifer recharge (R) less than groundwater allotment for agriculture and urban uses in the average year 2010-2011. Three key groundwater sustainability issues from the hydrologic perspective were raised for future research, which are addressed in this paper. Introducing a feasible methodology for groundwater resource modelling for sustainable use in sparse-data drylands, this paper updates available databases, compiles new databases, and introduces new formulations to: (1) refine the net groundwater balance (W) modelling for years 2009-2010 and 2010-2011, providing the magnitude of net lateral inflow from adjacent formations (RL), the largest R component contributing to the oasis; (2) evaluate the non-evaporative fraction (B) of precipitation (P) from 1973 onward as a proxy of the potential renewable water resource available for use; and (3) define the critical balance period for variables to reach a comparable stationary condition, as a prerequisite for long-term modelling of W. RL was about 0.07-fold P and 0.85-fold R. Historical yearly B-to-P ratios were 0.02 for dry, 0.04 for average, and 0.07 for wet hydrological years; the average yearly P being 124 mm. A critical 17-year balance period with stable relative error below 0.1 was defined from the 44-year P and B time-series statistical study. This is the monitoring period proposed for the stationary evaluation of the variables involved in the long-term modelling of W. This paper seeks to offer a feasible methodology for groundwater modelling for planning sustainable water policies in sparse-data drylands. Copyright © 2018 Elsevier B.V. All rights reserved.
An Evolutionary Approach for Identifying Driver Mutations in Colorectal Cancer
Leder, Kevin; Riester, Markus; Iwasa, Yoh; Lengauer, Christoph; Michor, Franziska
2015-01-01
The traditional view of cancer as a genetic disease that can successfully be treated with drugs targeting mutant onco-proteins has motivated whole-genome sequencing efforts in many human cancer types. However, only a subset of mutations found within the genomic landscape of cancer is likely to provide a fitness advantage to the cell. Distinguishing such “driver” mutations from innocuous “passenger” events is critical for prioritizing the validation of candidate mutations in disease-relevant models. We design a novel statistical index, called the Hitchhiking Index, which reflects the probability that any observed candidate gene is a passenger alteration, given the frequency of alterations in a cross-sectional cancer sample set, and apply it to a mutational data set in colorectal cancer. Our methodology is based upon a population dynamics model of mutation accumulation and selection in colorectal tissue prior to cancer initiation as well as during tumorigenesis. This methodology can be used to aid in the prioritization of candidate mutations for functional validation and contributes to the process of drug discovery. PMID:26379039
Generalized causal mediation and path analysis: Extensions and practical considerations.
Albert, Jeffrey M; Cho, Jang Ik; Liu, Yiying; Nelson, Suchitra
2018-01-01
Causal mediation analysis seeks to decompose the effect of a treatment or exposure among multiple possible paths and provide causally interpretable path-specific effect estimates. Recent advances have extended causal mediation analysis to situations with a sequence of mediators or multiple contemporaneous mediators. However, available methods still have limitations, and computational and other challenges remain. The present paper provides an extended causal mediation and path analysis methodology. The new method, implemented in the new R package, gmediation (described in a companion paper), accommodates both a sequence (two stages) of mediators and multiple mediators at each stage, and allows for multiple types of outcomes following generalized linear models. The methodology can also handle unsaturated models and clustered data. Addressing other practical issues, we provide new guidelines for the choice of a decomposition, and for the choice of a reference group multiplier for the reduction of Monte Carlo error in mediation formula computations. The new method is applied to data from a cohort study to illuminate the contribution of alternative biological and behavioral paths in the effect of socioeconomic status on dental caries in adolescence.
NASA Astrophysics Data System (ADS)
Shukla, Nagesh; Wickramasuriya, Rohan; Miller, Andrew; Perez, Pascal
2015-05-01
This paper proposes an integrated modelling process to assess future population accessibility to radiotherapy treatment services based on projected cancer incidence and road network-based accessibility. Previous research has assessed travel distance/time barriers affecting access to cancer treatment services, and epidemiological studies have shown that cancer incidence rates vary with population demography. It is established that travel distances to treatment centres and the demographic profiles of the accessible regions greatly influence the demand for cancer radiotherapy (RT) services. However, an integrated service planning approach that combines spatially explicit cancer incidence projections with road network-based accessibility to RT services has never been attempted. This research presents such a methodology for the accessibility assessment of RT services and demonstrates its viability by modelling New South Wales (NSW) cancer incidence rates for different age-sex groups based on observed cancer incidence trends; estimating the road network-based access to current NSW treatment centres; and projecting the demand for RT services in New South Wales, Australia from year 2011 to 2026.
Darajeh, Negisa; Idris, Azni; Fard Masoumi, Hamid Reza; Nourani, Abolfazl; Truong, Paul; Sairi, Nor Asrina
2016-10-01
While the oil palm industry has been recognized for its contribution towards economic growth and rapid development, it has also contributed to environmental pollution due to the production of huge quantities of by-products from the oil extraction process. A phytoremediation technique (floating Vetiver system) was used to treat Palm Oil Mill Secondary Effluent (POMSE). A batch study using 40 L treatment tanks was carried out under different conditions, and Response Surface Methodology (RSM) was applied to optimize the treatment process. A three-factor central composite design (CCD) was used to evaluate the experimental variables (POMSE concentration, Vetiver plant density, and time). An extraordinary decrease in organic matter as measured by BOD and COD (96% and 94% respectively) was recorded during the experimental duration of 4 weeks using a density of 30 Vetiver plants. The best and lowest final BOD of 2 mg/L was obtained when using 15 Vetiver plants after 13 days for low concentration POMSE (initial BOD = 50 mg/L). The next best result of BOD at 32 mg/L was obtained when using 30 Vetiver plants after 24 days for medium concentration POMSE (initial BOD = 175 mg/L). These results confirmed the validity of the model, and the experimental value was determined to be quite close to the predicted value, implying that the empirical model derived from the RSM experimental design can adequately describe the relationship between the independent variables and the response. The study showed that the Vetiver system is an effective method of treating POMSE. Copyright © 2016 Elsevier Ltd. All rights reserved.
Permafrost Favourability Index: Spatial modelling in the French Alps using a Rock Glacier Inventory
NASA Astrophysics Data System (ADS)
Marcer, Marco; Bodin, Xavier; Brenning, Alexander; Schoeneich, Philippe; Charvet, Raphaële; Gottardi, Frédéric
2017-12-01
In the present study, we used the first rock glacier inventory for the entire French Alps to model spatial permafrost distribution in the region. The inventory, which was not originally compiled for this study, was revised by the authors in order to obtain a database suitable for statistical modelling. Climatic and topographic data evaluated at the rock glacier locations were used as predictor variables in a Generalized Linear Model. Model performance is strong, suggesting that, in agreement with several previous studies, this methodology is able to accurately model rock glacier distribution. A methodology to estimate model uncertainties is proposed, revealing that the subjectivity in the interpretation of rock glacier activity and contours may substantially bias the model. The model highlights a North-South trend in the regional pattern of permafrost distribution, which is attributed to the respective influences of the Atlantic and Mediterranean climates. Further analysis suggests that lower amounts of precipitation in the early winter and a thinner snow cover, as typically found in the Mediterranean area, could contribute to the existence of permafrost at higher temperatures compared to the Northern Alps. A comparison with the Alpine Permafrost Index Map (APIM) shows no major differences with our model, highlighting the very good predictive power of the APIM despite its tendency to slightly overestimate permafrost extension with respect to our database. The use of rock glaciers as indicators of permafrost existence despite their time response to climate change is discussed and an interpretation key is proposed in order to ensure the proper use of the model for research as well as for operational purposes.
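The statistical core of this approach, a logistic-link Generalized Linear Model relating rock-glacier presence to topo-climatic predictors, can be sketched as follows; the simulated predictors and coefficients are illustrative assumptions, not the inventory data.

```python
# Logistic GLM sketch: probability of intact rock-glacier (permafrost-favourable) conditions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000
elevation = rng.uniform(2000, 3500, n)                        # m a.s.l.
maat = 10.0 - 0.006 * elevation + rng.normal(0, 0.5, n)       # mean annual air temperature, degC
radiation = rng.uniform(100, 300, n)                          # potential solar radiation, W/m2

# Simulated labels: colder, less irradiated sites are more likely to host intact rock glaciers
logit = -1.5 * maat - 0.01 * (radiation - 200)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([maat, radiation])
glm = LogisticRegression().fit(X, y)

# Favourability (probability of permafrost-indicative conditions) for a new site
print(glm.predict_proba([[-2.0, 180.0]])[0, 1])
```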
A consistent modelling methodology for secondary settling tanks in wastewater treatment.
Bürger, Raimund; Diehl, Stefan; Nopens, Ingmar
2011-03-01
The aim of this contribution is partly to build consensus on a consistent modelling methodology (CMM) of complex real processes in wastewater treatment by combining classical concepts with results from applied mathematics, and partly to apply it to the clarification-thickening process in the secondary settling tank. In the CMM, the real process should be approximated by a mathematical model (process model; ordinary or partial differential equation (ODE or PDE)), which in turn is approximated by a simulation model (numerical method) implemented on a computer. These steps have often not been carried out in a correct way. The secondary settling tank was chosen as a case since this is one of the most complex processes in a wastewater treatment plant and simulation models developed decades ago have no guarantee of satisfying fundamental mathematical and physical properties. Nevertheless, such methods are still used in commercial tools to date. This particularly becomes of interest as the state-of-the-art practice is moving towards plant-wide modelling. Then all submodels interact and errors propagate through the model and severely hamper any calibration effort and, hence, the predictive purpose of the model. The CMM is described by applying it first to a simple conversion process in the biological reactor yielding an ODE solver, and then to the solid-liquid separation in the secondary settling tank, yielding a PDE solver. Time has come to incorporate established mathematical techniques into environmental engineering, and wastewater treatment modelling in particular, and to use proven reliable and consistent simulation models. Copyright © 2011 Elsevier Ltd. All rights reserved.
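The first CMM step mentioned above, a simple conversion process in the biological reactor approximated by an ODE and then by a numerical solver, can be sketched as follows; the Monod kinetics and constants are assumed for illustration, not taken from the paper.

```python
# Substrate-conversion ODE (Monod growth) handed to a standard initial-value solver.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_s, Y = 4.0, 10.0, 0.6   # 1/d, mg/L, gX/gS (hypothetical parameters)

def reactor(t, state):
    S, X = state                            # substrate and biomass concentrations
    growth = mu_max * S / (K_s + S) * X     # Monod growth rate
    return [-growth / Y, growth]            # dS/dt, dX/dt

sol = solve_ivp(reactor, t_span=(0.0, 2.0), y0=[200.0, 20.0], dense_output=True)
print("substrate after 2 d:", sol.y[0, -1])
```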
Decision models in the evaluation of psychotropic drugs : useful tool or useless toy?
Barbui, Corrado; Lintas, Camilla
2006-09-01
A current contribution in the European Journal of Health Economics employs a decision model to compare health care costs of olanzapine and risperidone treatment for schizophrenia. The model suggests that a treatment strategy of first-line olanzapine is cost-saving over a 1-year period, with additional clinical benefits in the form of avoided relapses in the long term. From a clinical perspective, this finding is indubitably relevant, but can physicians and policy makers believe it? The study is presented in a balanced way, assumptions are based on data extracted from clinical trials published in major psychiatric journals, and the theoretical underpinnings of the model are reasonable. Despite these positive aspects, we believe that the methodology used in this study (the decision model approach) is an unsuitable and potentially misleading tool for evaluating psychotropic drugs. In this commentary, taking the olanzapine vs. risperidone model as an example, arguments are provided to support this statement.
2013-01-01
Background Individual family planning service delivery organisations currently rely on service provision data and couple-years of protection as health impact measures. Due to the substitution effect and the continuation of users of long-term methods, these metrics cannot estimate an organisation's contribution to the national modern contraceptive prevalence rate (CPR), the standard metric for measuring family planning programme impacts. Increasing CPR is essential for addressing the unmet need for family planning, a recognized global health priority. Current health impact estimation models cannot isolate the impact of an organisation in these efforts. Marie Stopes International designed the Impact 2 model to measure an organisation's contribution to increases in national CPR, as well as resulting health and demographic impacts. This paper aims to describe the methodology for modelling increasing national-level CPR as well as to discuss its benefits and limitations. Methods Impact 2 converts service provision data into estimates of the number of family planning users, accounting for continuation among users of long-term methods and addressing the challenges of converting commodity distribution data of short-term methods into user numbers. These estimates, combined with the client profile and data on the organisation's previous year's CPR contribution, enable Impact 2 to estimate which clients maintain an organisation's baseline contribution, which ones fulfil population growth offsets, and ultimately, which ones increase CPR. Results Illustrative results from Marie Stopes Madagascar show how Impact 2 can be used to estimate an organisation's contribution to national changes in the CPR. Conclusions Impact 2 is a useful tool for service delivery organisations to move beyond cruder output measures to a better understanding of their role in meeting the global unmet need for family planning. By considering health impact from the perspective of an individual organisation, Impact 2 addresses gaps not met by other models for family planning service outcomes. Further, the model helps organisations improve service delivery by demonstrating that increases in the national CPR are not simply about expanding user numbers; rather, the type of user (e.g. adopters, provider changers) must be considered. Impact 2 can be downloaded at http://www.mariestopes.org/impact-2. PMID:23902699
[Ethical considerations about research with women in situations of violence].
Rafael, Ricardo de Mattos Russo; Soares de Moura, Anna Tereza Miranda
2013-01-01
This essay aims at reflecting on the ethical and methodological principles involved in research with women in situations of violence. The text raises the discussion of the application of the principles of beneficence and non-maleficence during research involving this issue, pointing to recommendations regarding privacy, autonomy and immediate contributions to volunteers. Then, taking as theoretical reference the principles of justice and equity, the authors propose a debate on the methodological aspects involved in the protection of respondents, with a view to improving the quality of the data obtained and the possible social contributions.
Design methodology and results evaluation of a heating functionality in modular lab-on-chip systems
NASA Astrophysics Data System (ADS)
Streit, Petra; Nestler, Joerg; Shaporin, Alexey; Graunitz, Jenny; Otto, Thomas
2018-06-01
Lab-on-a-chip (LoC) systems offer the opportunity of fast and customized biological analyses executed at the ‘point of need’ without expensive lab equipment. Some biological processes need a temperature treatment. Therefore, it is important to ensure a defined and stable temperature distribution in the biosensor area. An integrated heating functionality is realized with discrete resistive heating elements including temperature measurement. The focus of this contribution is a design methodology and an evaluation technique for the temperature distribution in the biosensor area with regard to the thermal-electrical behaviour of the heat sources. Furthermore, a sophisticated control of the biosensor temperature is proposed. A finite element (FE) model with one or more integrated heat sources in a polymer-based LoC system is used to investigate the impact of the number and arrangement of heating elements on the temperature distribution around the heating elements and in the biosensor area. Based on this model, various LoC systems are designed and fabricated. Electrical characterization of the heat sources and independent temperature measurements with an infrared technique are performed to verify the model parameters and prove the simulation approach. The FE model and the proposed methodology are the foundation for optimization and evaluation of new designs with regard to temperature requirements of the biosensor. Furthermore, a linear dependency of the heater temperature on the electric current is demonstrated in the targeted temperature range of 20 °C to 70 °C, enabling the use of the heating functionality for biological reactions requiring a steady-state temperature up to 70 °C. The correlation between heater and biosensor area temperature is derived for direct control through the heating current.
NASA Astrophysics Data System (ADS)
Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.
2017-04-01
Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. Results are presented and discussed in terms of their reliability and resolution.
WAVELET-DOMAIN REGRESSION AND PREDICTIVE INFERENCE IN PSYCHIATRIC NEUROIMAGING
Reiss, Philip T.; Huo, Lan; Zhao, Yihong; Kelly, Clare; Ogden, R. Todd
2016-01-01
An increasingly important goal of psychiatry is the use of brain imaging data to develop predictive models. Here we present two contributions to statistical methodology for this purpose. First, we propose and compare a set of wavelet-domain procedures for fitting generalized linear models with scalar responses and image predictors: sparse variants of principal component regression and of partial least squares, and the elastic net. Second, we consider assessing the contribution of image predictors over and above available scalar predictors, in particular via permutation tests and an extension of the idea of confounding to the case of functional or image predictors. Using the proposed methods, we assess whether maps of a spontaneous brain activity measure, derived from functional magnetic resonance imaging, can meaningfully predict presence or absence of attention deficit/hyperactivity disorder (ADHD). Our results shed light on the role of confounding in the surprising outcome of the recent ADHD-200 Global Competition, which challenged researchers to develop algorithms for automated image-based diagnosis of the disorder. PMID:27330652
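A minimal sketch of wavelet-domain sparse regression of the kind proposed above is given below, using 1-D synthetic signals in place of brain maps; the wavelet choice and penalty settings are illustrative assumptions.

```python
# Wavelet-domain regression: decompose each predictor signal, fit a sparse linear model on the coefficients.
import numpy as np
import pywt
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(3)
n_subjects, length = 120, 256
signals = rng.normal(size=(n_subjects, length))                        # stand-in for activity maps
y = signals[:, 40:60].mean(axis=1) + rng.normal(0, 0.1, n_subjects)    # synthetic scalar outcome

def to_wavelet_features(sig):
    coeffs = pywt.wavedec(sig, "db4", level=4)   # multiresolution decomposition
    return np.concatenate(coeffs)

X = np.array([to_wavelet_features(s) for s in signals])
model = ElasticNet(alpha=0.05, l1_ratio=0.5).fit(X, y)   # sparse fit in the wavelet domain
print("non-zero wavelet coefficients:", int((model.coef_ != 0).sum()))
```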
Assessment, development, and application of combustor aerothermal models
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Mongia, H. C.; Mularz, E. J.
1989-01-01
The gas turbine combustion system design and development effort is an engineering exercise to obtain an acceptable solution to the conflicting design trade-offs between combustion efficiency, gaseous emissions, smoke, ignition, restart, lean blowout, burner exit temperature quality, structural durability, and life cycle cost. For many years, these combustor design trade-offs have been carried out with the help of fundamental reasoning and extensive component and bench testing, backed by empirical and experience correlations. Recent advances in the capability of computational fluid dynamics codes have led to their application to complex 3-D flows such as those in the gas turbine combustor. A number of U.S. Government and industry sponsored programs have made significant contributions to the formulation, development, and verification of an analytical combustor design methodology which will better define the aerothermal loads in a combustor, and be a valuable tool for design of future combustion systems. The contributions made by NASA Hot Section Technology (HOST) sponsored Aerothermal Modeling and supporting programs are described.
Gateuille, David; Evrard, Olivier; Lefevre, Irène; Moreau-Guigon, Elodie; Alliot, Fabrice; Chevreuil, Marc; Mouchel, Jean-Marie
2014-06-01
Various sources supply PAHs that accumulate in soils. The methodology we developed provided an evaluation of the contribution of local sources (road traffic, local industries) versus remote sources (long range atmospheric transport, fallout and gaseous exchanges) to PAH stocks in two contrasting subcatchments (46-614 km²) of the Seine River basin (France). Soil samples (n = 336) were analysed to investigate the spatial pattern of soil contamination across the catchments, and an original combination with radionuclide measurements provided new insights into the evolution of the contamination with depth. Relationships between PAH concentrations and the distance to the potential sources were modelled. Although both subcatchments are mainly rural, roadside areas appeared to concentrate 20% of the contamination inside the catchment while a local industry was found to be responsible for up to 30% of the stocks. These results have important implications for understanding and controlling PAH contamination in rural areas of early-industrialized regions. Copyright © 2014 Elsevier Ltd. All rights reserved.
Evolutionary and ecological approaches to the study of personality
Réale, Denis; Dingemanse, Niels J.; Kazem, Anahita J. N.; Wright, Jonathan
2010-01-01
This introduction to the themed issue on Evolutionary and ecological approaches to the study of personality provides an overview of conceptual, theoretical and methodological progress in research on animal personalities over the last decade, and places the contributions to this volume in context. The issue has three main goals. First, we aimed to bring together theoreticians to contribute to the development of models providing adaptive explanations for animal personality that could guide empiricists, and stimulate exchange of ideas between the two groups of researchers. Second, we aimed to stimulate cross-fertilization between different scientific fields that study personality, namely behavioural ecology, psychology, genomics, quantitative genetics, neuroendocrinology and developmental biology. Third, we aimed to foster the application of an evolutionary framework to the study of personality. PMID:21078646
Walsh, Kieran; O'Shea, Eamon
2008-12-01
Older adult active retirement groups encompass health promotion, social and community psychological potential. However, little is known about the internal dynamics of these groups or their contribution to individual well-being and the community. This paper examines the Third Age Foundation as an example of one such group operating in a rural area in Ireland and explores the various relationships at work internally and externally. Methodology included: structured and semi-structured interviews, focus groups and a postal survey. A substantial contribution to members' well-being and community competence and cohesion was found. Findings are discussed in reference to the importance of individual and community empowerment, sustainability, social entrepreneurship/leadership and the potential of such models to support community-based living in older age.
Research and Training on White Dialectics: Some next Steps
ERIC Educational Resources Information Center
Ponterotto, Joseph G.
2011-01-01
This article reviews and extends the contribution of Todd and Abrams (2011). Paradigmatic and methodological strengths of their study are highlighted, and future directions for research, training, and practice are presented. Counseling research anchored in critical theory and incorporating diverse methodologies is encouraged.
Nasir, Hina; Javaid, Nadeem; Sher, Muhammad; Qasim, Umar; Khan, Zahoor Ali; Alrajeh, Nabil; Niaz, Iftikhar Azim
2016-01-01
This paper makes a two-fold contribution to Underwater Wireless Sensor Networks (UWSNs): a performance analysis of incremental relaying in terms of outage and error probability, and, based on that analysis, the proposal of two new cooperative routing protocols. For the first contribution, a three-step procedure is carried out: a system model is presented, the number of available relays is determined, and, based on a cooperative incremental retransmission methodology, closed-form expressions for outage and error probability are derived. For the second contribution, Adaptive Cooperation in Energy (ACE) efficient depth based routing and Enhanced-ACE (E-ACE) are presented. In the proposed model, a feedback mechanism indicates success or failure of data transmission. If direct transmission is successful, there is no need for relaying by cooperative relay nodes. In case of failure, all the available relays retransmit the data one by one until the desired signal quality is achieved at the destination. Simulation results show that ACE and E-ACE significantly improve network performance, i.e., throughput, when compared with other incremental relaying protocols like Cooperative Automatic Repeat reQuest (CARQ). E-ACE and ACE achieve 69% and 63% more throughput, respectively, compared with CARQ in a hard underwater environment. PMID:27420061
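The incremental-relaying logic described above can be checked with a toy Monte Carlo outage simulation; the Rayleigh-fading channel model, SNR values and threshold are assumptions for illustration, not the paper's closed-form analysis.

```python
# Monte Carlo outage estimate for incremental relaying: the relay retransmits only when the
# direct link fails; an outage occurs when even the combined SNR stays below the threshold.
import numpy as np

rng = np.random.default_rng(11)
trials, snr_mean, threshold = 200_000, 10.0, 4.0   # linear SNR values (hypothetical)

direct = rng.exponential(snr_mean, trials)         # Rayleigh fading -> exponentially distributed SNR
relay = rng.exponential(snr_mean, trials)

direct_fail = direct < threshold
combined = direct + relay                          # maximal-ratio combining after retransmission
outage = direct_fail & (combined < threshold)

print("relay used in %.1f%% of trials" % (100 * direct_fail.mean()))
print("outage probability: %.4f" % outage.mean())
```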
Keuleers, Emmanuel; Balota, David A
2015-01-01
This paper introduces and summarizes the special issue on megastudies, crowdsourcing, and large datasets in psycholinguistics. We provide a brief historical overview and show how the papers in this issue have extended the field by compiling new databases and making important theoretical contributions. In addition, we discuss several studies that use text corpora to build distributional semantic models to tackle various interesting problems in psycholinguistics. Finally, as is the case across the papers, we highlight some methodological issues that are brought forth via the analyses of such datasets.
Integrated Assessment and the Relation Between Land-Use Change and Climate Change
DOE R&D Accomplishments Database
Dale, V. H.
1994-10-07
Integrated assessment is an approach that is useful in evaluating the consequences of global climate change. Understanding the consequences requires knowledge of the relationship between land-use change and climate change. Methodologies for assessing the contribution of land-use change to atmospheric CO2 concentrations are considered with reference to a particular case study area: south and southeast Asia. The use of models to evaluate the consequences of climate change on forests must also consider an assessment approach. Each of these points is discussed in the following four sections.
MacDonnell, Judith Ann
2014-01-01
The aim of this analysis is to contribute to an understanding of emancipatory nursing in the context of higher education. Engagement with formative studies that used critical feminist methodologies led to my research focus on lesbian, gay, bisexual, and transgender (LGBT) health in my academic research program. Dimensions of emancipatory nursing include reflexivity, transformative learning, interdisciplinarity, praxis, and situated privilege. Several critical feminist methodologies are addressed: feminist ethnography, community-based participatory action research (CBPAR), and comparative life history. Commonalities across methodologies illustrate the potential for emancipatory outcomes/goals.
Students' Perceptions and Emotions Toward Learning in a Flipped General Science Classroom
NASA Astrophysics Data System (ADS)
Jeong, Jin Su; González-Gómez, David; Cañada-Cañada, Florentina
2016-10-01
Recently, inverted (flipped) instruction methodologies have been gaining attention in higher education, on the claim that flipping the classroom engages students more effectively with the learning process. In addition, students' perceptions and emotions involved in their learning process must be assessed in order to gauge the usability of this relatively new instruction methodology, since these factors are vital in educational formation. For this reason, this study evaluates students' perceptions and emotions when a flipped classroom setting is used as the instruction methodology. The research was conducted in a general science course in the sophomore year of the Primary Education bachelor degree in the Training Teaching School of the University of Extremadura (Spain). The results show that students have overall positive perceptions of the flipped classroom setting. In particular, over 80% of them considered the course a valuable learning experience. They also found the course more interactive and were willing to take more courses following a flipped model. Regarding students' emotions toward the flipped classroom course, the highest scores were given to positive emotions, namely fun and enthusiasm, a pattern also reflected in a keyword frequency test. The lowest scores corresponded to negative emotions, namely boredom and fear. Students attending a flipped course therefore reported more positive and fewer negative emotions. The results obtained in this study point to a promising tendency in students' perceptions and emotions toward the flipped classroom methodology and will contribute to fully framing this relatively new instruction methodology.
Acidity in DMSO from the embedded cluster integral equation quantum solvation model.
Heil, Jochen; Tomazic, Daniel; Egbers, Simon; Kast, Stefan M
2014-04-01
The embedded cluster reference interaction site model (EC-RISM) is applied to the prediction of acidity constants of organic molecules in dimethyl sulfoxide (DMSO) solution. EC-RISM is based on a self-consistent treatment of the solute's electronic structure and the solvent's structure by coupling quantum-chemical calculations with three-dimensional (3D) RISM integral equation theory. We compare available DMSO force fields with reference calculations obtained using the polarizable continuum model (PCM). The results are evaluated statistically using two different approaches to eliminating the proton contribution: a linear regression model and an analysis of pKa shifts for compound pairs. Suitable levels of theory for the integral equation methodology are benchmarked. The results are further analyzed and illustrated by visualizing solvent site distribution functions and comparing them with an aqueous environment.
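The regression-based elimination of the proton contribution mentioned above can be sketched with a simple linear fit between computed free energies and experimental pKa values; the numbers below are placeholders chosen only to show the mechanics, not EC-RISM results.

```python
import numpy as np

# Hypothetical computed deprotonation free energies (kcal/mol) and
# experimental DMSO pKa values; all numbers are placeholders.
delta_g_calc = np.array([290.1, 301.5, 310.2, 285.7, 295.0])
pka_exp = np.array([10.3, 18.0, 24.5, 7.9, 13.2])

# A linear regression pKa ≈ a * ΔG_calc + b absorbs the constant proton
# contribution (and systematic level-of-theory errors) into a and b.
a, b = np.polyfit(delta_g_calc, pka_exp, 1)
pka_pred = a * delta_g_calc + b
rmse = np.sqrt(np.mean((pka_pred - pka_exp) ** 2))
print(f"slope={a:.3f}, intercept={b:.2f}, RMSE={rmse:.2f} pKa units")
```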
Engineering Large Animal Species to Model Human Diseases.
Rogers, Christopher S
2016-07-01
Animal models are an important resource for studying human diseases. Genetically engineered mice are the most commonly used species and have made significant contributions to our understanding of basic biology, disease mechanisms, and drug development. However, they often fail to recreate important aspects of human diseases and thus can have limited utility as translational research tools. Developing disease models in species more similar to humans may provide a better setting in which to study disease pathogenesis and test new treatments. This unit provides an overview of the history of genetically engineered large animals and the techniques that have made their development possible. Factors to consider when planning a large animal model, including choice of species, type of modification and methodology, characterization, production methods, and regulatory compliance, are also covered. © 2016 John Wiley & Sons, Inc.
Discrete element weld model, phase 2
NASA Technical Reports Server (NTRS)
Prakash, C.; Samonds, M.; Singhal, A. K.
1987-01-01
A numerical method was developed for analyzing the tungsten inert gas (TIG) welding process. The phenomena being modeled include melting under the arc and the flow in the melt under the action of buoyancy, surface tension, and electromagnetic forces. The latter entails the calculation of the electric potential and the computation of electric current and magnetic field therefrom. Melting may occur at a single temperature or over a temperature range, and the electrical and thermal conductivities can be a function of temperature. Results of sample calculations are presented and discussed at length. A major research contribution has been the development of numerical methodology for the calculation of phase change problems in a fixed grid framework. The model has been implemented on CHAM's general purpose computer code PHOENICS. The inputs to the computer model include: geometric parameters, material properties, and weld process parameters.
Predicting Dissertation Methodology Choice among Doctoral Candidates at a Faith-Based University
ERIC Educational Resources Information Center
Lunde, Rebecca
2017-01-01
Limited research has investigated dissertation methodology choice and the factors that contribute to this choice. Quantitative research is based in mathematics and scientific positivism, and qualitative research is based in constructivism. These underlying philosophical differences posit the question if certain factors predict dissertation…
Jones, Mirkka M; Tuomisto, Hanna; Borcard, Daniel; Legendre, Pierre; Clark, David B; Olivas, Paulo C
2008-03-01
The degree to which variation in plant community composition (beta-diversity) is predictable from environmental variation, relative to other spatial processes, is of considerable current interest. We addressed this question in Costa Rican rain forest pteridophytes (1,045 plots, 127 species). We also tested the effect of data quality on the results, which has largely been overlooked in earlier studies. To do so, we compared two alternative spatial models [polynomial vs. principal coordinates of neighbour matrices (PCNM)] and ten alternative environmental models (all available environmental variables vs. four subsets, and including their polynomials vs. not). Of the environmental data types, soil chemistry contributed most to explaining pteridophyte community variation, followed in decreasing order of contribution by topography, soil type and forest structure. Environmentally explained variation increased moderately when polynomials of the environmental variables were included. Spatially explained variation increased substantially when the multi-scale PCNM spatial model was used instead of the traditional, broad-scale polynomial spatial model. The best model combination (PCNM spatial model and full environmental model including polynomials) explained 32% of pteridophyte community variation, after correcting for the number of sampling sites and explanatory variables. Overall evidence for environmental control of beta-diversity was strong, and the main floristic gradients detected were correlated with environmental variation at all scales encompassed by the study (c. 100-2,000 m). Depending on model choice, however, total explained variation differed more than fourfold, and the apparent relative importance of space and environment could be reversed. Therefore, we advocate a broader recognition of the impacts that data quality has on analysis results. A general understanding of the relative contributions of spatial and environmental processes to species distributions and beta-diversity requires that methodological artefacts are separated from real ecological differences.
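The kind of variation partitioning described above, between environmental and spatial predictor sets using adjusted R², can be outlined as follows; the data are simulated, adjusted R² is computed from ordinary multi-response regression rather than canonical ordination, and generic spatial predictors stand in for PCNM eigenvectors, so this is an illustrative sketch rather than the authors' analysis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

def adj_r2(X, y):
    """Adjusted R^2 of a multi-response linear model (averaged over species)."""
    n, p = X.shape
    r2 = LinearRegression().fit(X, y).score(X, y)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Placeholder data: 200 plots, 5 species, 3 environmental variables and
# 4 spatial predictors (stand-ins for PCNM eigenvectors).
n = 200
env = rng.normal(size=(n, 3))
space = rng.normal(size=(n, 4))
community = (env @ rng.normal(size=(3, 5))
             + 0.5 * space @ rng.normal(size=(4, 5))
             + rng.normal(size=(n, 5)))

r_env = adj_r2(env, community)
r_spa = adj_r2(space, community)
r_all = adj_r2(np.hstack([env, space]), community)

print(f"pure environment : {r_all - r_spa:.3f}")
print(f"pure space       : {r_all - r_env:.3f}")
print(f"shared fraction  : {r_env + r_spa - r_all:.3f}")
print(f"unexplained      : {1 - r_all:.3f}")
```

The partition into pure, shared and unexplained fractions mirrors the logic of comparing spatial and environmental model choices discussed in the abstract.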
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
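A minimal sketch of the general PFA idea, propagating parameter uncertainty through an analytical failure model via Monte Carlo sampling, is given below; the toy crack-growth life model and all distributions are assumptions for illustration, not the engineering models documented in the report.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hedged sketch: sample uncertain parameters of an analytical failure model
# and estimate the probability of failing within the design life.
n_samples = 50_000
design_life_cycles = 3_000

log_C = rng.normal(loc=-11.0, scale=0.3, size=n_samples)    # material coefficient (log10), assumed
stress = rng.normal(loc=300.0, scale=30.0, size=n_samples)  # load amplitude in MPa, assumed

# Toy analytical model: cycles to failure fall with the coefficient and
# with the cube of the stress amplitude.
cycles_to_failure = 10 ** (-log_C) / stress ** 3

prob_failure = np.mean(cycles_to_failure < design_life_cycles)
print(f"estimated failure probability within design life: {prob_failure:.3f}")
```

In the full methodology such prior failure-probability distributions would then be updated with test and flight experience; that Bayesian-style updating step is omitted here.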
NASA Astrophysics Data System (ADS)
McGuire, A. D.
2016-12-01
The Model Integration Group of the Permafrost Carbon Network (see http://www.permafrostcarbon.org/) has conducted studies to evaluate the sensitivity of offline terrestrial permafrost and carbon models to both historical and projected climate change. These studies indicate that there is a wide range in (1) the initial states of permafrost extent and carbon stocks simulated by these models and (2) the responses of permafrost extent and carbon stocks to both historical and projected climate change. In this study, we synthesize what has been learned about the variability in initial states among models and the driving factors that contribute to variability in the sensitivity of responses. We conclude the talk with a discussion of efforts needed by (1) the modeling community to standardize structural representation of permafrost and carbon dynamics among models that are used to evaluate the permafrost carbon feedback and (2) the modeling and observational communities to jointly develop data sets and methodologies to more effectively benchmark models.
NASA Astrophysics Data System (ADS)
Shafii, Mahyar; Basu, Nandita; Schiff, Sherry; Van Cappellen, Philippe
2017-04-01
The dramatic increase in nitrogen circulating in the biosphere due to anthropogenic activities has impaired water quality in groundwater and surface water, causing eutrophication in coastal regions. Understanding the fate and transport of nitrogen from the landscape to coastal areas requires exploring the drivers of nitrogen processes in both time and space, as well as the identification of appropriate flow pathways. Conceptual models can be used as diagnostic tools to provide insights into such controls. However, diagnostic evaluation of coupled hydrological-biogeochemical models is challenging. This research proposes a top-down methodology utilizing hydrochemical signatures to develop conceptual models for simulating the integrated streamflow and nitrate responses while taking into account dominant controls on nitrate variability (e.g., climate, soil water content, etc.). Our main objective is to seek appropriate model complexity that sufficiently reproduces multiple hydrological and nitrate signatures. Having developed a suitable conceptual model for a given watershed, we employ it in sensitivity studies to demonstrate the dominant process controls that contribute to the nitrate response at scales of interest. We apply the proposed approach to nitrate simulation in a range of small to large sub-watersheds in the Grand River Watershed (GRW) located in Ontario. Such a multi-basin modeling experiment will enable us to address process scaling and investigate the consequences of lumping processes in terms of models' predictive capability. The proposed methodology can be applied to the development of large-scale models that can support decision-making associated with nutrient management at the regional scale.
Modeling Common Cause Failures of Thrusters on ISS Visiting Vehicles
NASA Technical Reports Server (NTRS)
Haught, Megan
2014-01-01
This paper discusses the methodology used to model common cause failures of thrusters on the International Space Station (ISS) Visiting Vehicles. The ISS Visiting Vehicles each have as many as 32 thrusters, whose redundancy makes them susceptible to common cause failures. The Global Alpha Model (as described in NUREG/CR-5485) can be used to represent the system common cause contribution, but NUREG/CR-5496 supplies global alpha parameters for groups only up to size six. Because of the large number of redundant thrusters on each vehicle, regression is used to determine parameter values for groups of size larger than six. An additional challenge is that Visiting Vehicle thruster failures must occur in specific combinations in order to fail the propulsion system; not all failure groups of a certain size are critical.
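The extrapolation step described above can be illustrated with a small sketch that fits a regression to tabulated alpha factors and evaluates it for larger redundancy groups; the alpha values, the power-law form of the fit, and the group sizes below are placeholders, not the NUREG/CR-5496 parameters or the regression actually used for the ISS analysis.

```python
import numpy as np

# Hypothetical illustration: published common-cause alpha-factor tables stop
# at groups of six components, so a regression on group size is used to
# estimate parameters for larger groups. Values below are placeholders.
group_size = np.array([2, 3, 4, 5, 6])
alpha_2 = np.array([0.047, 0.035, 0.028, 0.023, 0.020])  # e.g. alpha_2 versus group size

# Fit a power-law trend, alpha ≈ a * size**b, via a log-log linear fit.
b, log_a = np.polyfit(np.log(group_size), np.log(alpha_2), 1)
predict = lambda m: np.exp(log_a) * m ** b

for m in (8, 16, 32):
    print(f"group of {m}: extrapolated alpha_2 ≈ {predict(m):.4f}")
```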
Methodologies in the modeling of combined chemo-radiation treatments
NASA Astrophysics Data System (ADS)
Grassberger, C.; Paganetti, H.
2016-11-01
The variety of treatment options for cancer patients has increased significantly in recent years. Not only do we combine radiation with surgery and chemotherapy; new therapeutic approaches such as immunotherapy and targeted therapies are also starting to play a bigger role. Physics has made significant contributions to radiation therapy treatment planning and delivery. In particular, treatment plan optimization using inverse planning techniques has improved dose conformity considerably. Furthermore, medical physics is often the driving force behind tumor control and normal tissue complication modeling. While treatment optimization and outcome modeling focus mainly on the effects of radiation, other treatment modalities such as chemotherapy are treated independently or even neglected entirely. This review summarizes the published efforts to model combined-modality treatments combining radiation and chemotherapy. These models will play an increasing role in optimizing cancer therapy not only from a radiation and drug dosage standpoint, but also in terms of spatial and temporal optimization of treatment schedules.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heo, Yeonsook; Augenbroe, Godfried; Graziano, Diane
2015-05-01
The increasing interest in retrofitting of existing buildings is motivated by the need to make a major contribution to enhancing building energy efficiency and reducing energy consumption and CO2 emission by the built environment. This paper examines the relevance of calibration in model-based analysis to support decision-making for energy and carbon efficiency retrofits of individual buildings and portfolios of buildings. The authors formulate a set of real retrofit decision-making situations and evaluate the role of calibration by using a case study that compares predictions and decisions from an uncalibrated model with those of a calibrated model. The case study illustrates both the mechanics and outcomes of a practical alternative to the expert- and time-intense application of dynamic energy simulation models for large-scale retrofit decision-making under uncertainty.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Hagemann, Martin; Ndambi, Asaah; Hemme, Torsten; Latacz-Lohmann, Uwe
2012-02-01
Studies on the contribution of milk production to global greenhouse gas (GHG) emissions are rare (FAO 2010) and often based on crude data which do not appropriately reflect the heterogeneity of farming systems. This article estimates GHG emissions from milk production in different dairy regions of the world based on harmonised farm data and assesses the contribution of milk production to global GHG emissions. The methodology comprises three elements: (1) the International Farm Comparison Network (IFCN) concept of typical farms and the related globally standardised dairy model farms representing 45 dairy regions in 38 countries; (2) a partial life cycle assessment model for estimating GHG emissions of the typical dairy farms; and (3) standard regression analysis to estimate GHG emissions from milk production in countries for which no typical farms are available in the IFCN database. Across the 117 typical farms in the 38 countries analysed, the average emission rate is 1.50 kg CO2 equivalents (CO2-eq.)/kg milk. The contribution of milk production to the global anthropogenic emissions is estimated at 1.3 Gt CO2-eq./year, accounting for 2.65% of total global anthropogenic emissions (49 Gt; IPCC, Synthesis Report for Policy Maker, Valencia, Spain, 2007). We emphasise that our estimates of the contribution of milk production to global GHG emissions are subject to uncertainty. Part of the uncertainty stems from the choice of the appropriate methods for estimating emissions at the level of the individual animal.
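The headline figures above can be related by simple arithmetic; the global milk output used below is not given in the abstract and is inferred from its own numbers (1.3 Gt divided by 1.50 kg CO2-eq./kg), so treat it as an illustrative assumption rather than a reported statistic.

```python
# Hedged back-of-envelope check of the abstract's headline numbers.
intensity_kg_per_kg_milk = 1.50     # kg CO2-eq per kg milk (from the abstract)
milk_production_mt = 870.0          # million tonnes per year (assumption, implied by the abstract)
total_anthropogenic_gt = 49.0       # IPCC (2007) total used in the abstract

dairy_gt = intensity_kg_per_kg_milk * milk_production_mt * 1e9 / 1e12  # Gt CO2-eq per year
share_pct = 100 * dairy_gt / total_anthropogenic_gt
print(f"milk-related emissions ≈ {dairy_gt:.2f} Gt CO2-eq/yr ({share_pct:.2f}% of total)")
```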
Mearns, Susan Lesley
2011-01-01
This paper seeks to highlight the need for employment relations academics and researchers to expand their use of research methodologies in order to advance theoretical debate within their discipline. It focuses on the contribution that pragmatic critical realism has made to the field of perception and argues that it would add value to the subject of employment relations. It is a theoretically centred review of pragmatic critical realism and the possible contribution this methodology would make to the field of employment relations. The paper concludes that the employment relationship does not take place in a vacuum; rather, it centres on the interaction between imperfect individuals. These interactions are moulded by emotions which cannot be explored thoroughly, or even acknowledged, through a positivist's rigorous but limited view of what constitutes 'knowledge' and of how theory develops. While not rejecting the contribution that quantitative data or positivism have made to the field, the study concludes that pragmatic critical realism has a lot to offer the development of the area and its theoretical foundations.
Cisler, Josh M.; Bush, Keith; James, G. Andrew; Smitherman, Sonet; Kilts, Clinton D.
2015-01-01
Posttraumatic Stress Disorder (PTSD) is characterized by intrusive recall of the traumatic memory. While numerous studies have investigated the neural processing mechanisms engaged during trauma memory recall in PTSD, these analyses have only focused on group-level contrasts that reveal little about the predictive validity of the identified brain regions. By contrast, a multivariate pattern analysis (MVPA) approach towards identifying the neural mechanisms engaged during trauma memory recall would entail testing whether a multivariate set of brain regions is reliably predictive of (i.e., discriminates) whether an individual is engaging in trauma or non-trauma memory recall. Here, we use a MVPA approach to test 1) whether trauma memory vs neutral memory recall can be predicted reliably using a multivariate set of brain regions among women with PTSD related to assaultive violence exposure (N=16), 2) the methodological parameters (e.g., spatial smoothing, number of memory recall repetitions, etc.) that optimize classification accuracy and reproducibility of the feature weight spatial maps, and 3) the correspondence between brain regions that discriminate trauma memory recall and the brain regions predicted by neurocircuitry models of PTSD. Cross-validation classification accuracy was significantly above chance for all methodological permutations tested; mean accuracy across participants was 76% for the methodological parameters selected as optimal for both efficiency and accuracy. Classification accuracy was significantly better for a voxel-wise approach relative to voxels within restricted regions-of-interest (ROIs); classification accuracy did not differ when using PTSD-related ROIs compared to randomly generated ROIs. ROI-based analyses suggested the reliable involvement of the left hippocampus in discriminating memory recall across participants and that the contribution of the left amygdala to the decision function was dependent upon PTSD symptom severity. These results have methodological implications for real-time fMRI neurofeedback of the trauma memory in PTSD and conceptual implications for neurocircuitry models of PTSD that attempt to explain core neural processing mechanisms mediating PTSD. PMID:26241958
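The cross-validated classification analysis described above can be illustrated with a brief, hedged sketch using scikit-learn on simulated single-trial patterns; the data, feature count, and classifier choice below are placeholders and do not reproduce the study's pipeline or parameters.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Placeholder data standing in for one participant's single-trial activation
# patterns: 40 memory-recall trials (20 trauma, 20 neutral) x 500 voxels.
X = rng.normal(size=(40, 500))
y = np.repeat([0, 1], 20)      # 0 = neutral recall, 1 = trauma recall
X[y == 1, :50] += 0.8          # inject a weak, distributed signal in a subset of voxels

# Linear classifier with within-pipeline standardization, scored by
# stratified cross-validation; accuracy above chance (0.5) indicates that the
# multivariate pattern discriminates trauma from neutral recall.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```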
Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte
2018-01-25
The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach was developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess the methodological limitations of data contributing to a review finding, and examples of methodological limitations assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.
Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time
Avellar, Gustavo S. C.; Pereira, Guilherme A. S.; Pimenta, Luciano C. A.; Iscold, Paulo
2015-01-01
This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem’s (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles’ maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs. PMID:26540055
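A simplified stand-in for the team-size selection embedded in the paper's optimization is sketched below; it replaces the mixed integer program with an even split of the coverage path among vehicles and an explicit endurance check, and the path length, endurance, and setup times are illustrative assumptions.

```python
def choose_team_size(total_path_minutes, max_flight_minutes, setup_minutes, max_uavs=10):
    """Pick the number of UAVs minimizing mission completion time.

    Simplified stand-in for the paper's MILP: the coverage path is assumed
    to split evenly among UAVs, vehicles are launched sequentially (each
    launch costs setup_minutes), and every UAV must respect its endurance
    limit. All numbers are illustrative assumptions.
    """
    best = None
    for k in range(1, max_uavs + 1):
        per_uav = total_path_minutes / k
        if per_uav > max_flight_minutes:
            continue  # endurance constraint violated
        completion = k * setup_minutes + per_uav
        if best is None or completion < best[1]:
            best = (k, completion)
    return best

k, t = choose_team_size(total_path_minutes=75, max_flight_minutes=30, setup_minutes=5)
print(f"use {k} UAVs, estimated completion time {t:.1f} min")
```

The example reproduces the trade-off noted in the abstract: adding vehicles shortens each flight but adds setup time, so the optimal team size is neither the smallest feasible nor the largest available.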
Race, racism and health: disparities, mechanisms, and interventions.
Brondolo, Elizabeth; Gallo, Linda C; Myers, Hector F
2009-02-01
The goals of this special section are to examine the state-of-the-science regarding race/ethnicity and racism as they contribute to health disparities and to articulate a research agenda to guide future research. In the first paper, Myers presents an integrative theoretical framework for understanding how racism, poverty, and other major stressors relate to health through inter-related psychosocial and bio-behavioral pathways. Williams and Mohammed review the evidence concerning associations between racism and health, addressing the multiple levels at which racism can operate and commenting on important methodological issues. Klonoff provides a review and update of the literature concerning ethnicity-related disparities in healthcare, and addresses factors that may contribute to these disparities. Brondolo and colleagues consider racism from a stress and coping perspective, and review the literature concerning racial identity, anger coping, and social support as potential moderators of the racism-health association. Finally, Castro and colleagues describe an ecodevelopmental model that can serve as an integrative framework to examine multi-level social-cultural influences on health and health behavior. In aggregate, the special section papers address theoretical and methodological issues central to understanding the determinants of health disparities, with the aim of providing direction for future research critical to developing effective interventions to reduce these disparities.
VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.
Little, Todd D; Wang, Eugene W; Gorrall, Britt K
2017-06-01
This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.
ERIC Educational Resources Information Center
Dyehouse, Jeremiah
2007-01-01
Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... statistical and other methodological consultation to this collaborative project. Discussion: Grantees under... and technical assistance must be designed to contribute to the following outcomes: (a) Maintenance of... methodological consultation available for research projects that use the BMS Database, as well as site- specific...
Feminist Research Methodology Groups: Origins, Forms, Functions.
ERIC Educational Resources Information Center
Reinharz, Shulamit
Feminist Research Methodology Groups (FRMGs) have developed as a specific type of women's group in which feminist academics can find supportive audiences for their work while contributing to a feminist redefinition of research methods. An analysis of two FRMGs reveals common characteristics, dynamics, and outcomes. Both were limited to small…
A Review of Traditional Cloze Testing Methodology.
ERIC Educational Resources Information Center
Heerman, Charles E.
To analyze the validity of W. L. Taylor's cloze testing methodology, this paper first examines three areas contributing to Taylor's thinking: communications theory, the psychology of speech and communication, and the theory of dispositional mechanisms--or nonessential words--in speech. It then evaluates Taylor's research to determine how he…
ERIC Educational Resources Information Center
Zelnio, Ryan J.
2013-01-01
This dissertation seeks to contribute to a fuller understanding of how international scientific collaboration has affected national scientific systems. It does this by developing three methodological approaches grounded in social complexity theory and applying them to the evaluation of national scientific systems. The first methodology identifies…
Developing International Managers: The Contribution of Cultural Experience to Learning
ERIC Educational Resources Information Center
Townsend, Peter; Regan, Padraic; Li, Liang Liang
2015-01-01
Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…
Integrating Shamanic Methodology into the Spirituality of Addictions Recovery Work
ERIC Educational Resources Information Center
Rich, Marcia L.
2012-01-01
Responding to an increased recognition of the importance of spirituality in the aetiology and treatment of addictions, this article provides an overview of the potential contributions of both transpersonal psychology and shamanic methodology for the addictions field. A case study is provided to illustrate the integration of conventional,…
Researching Assessment as Social Practice: Implications for Research Methodology
ERIC Educational Resources Information Center
Shay, Suellen
2008-01-01
Recent educational journals on both sides of the Atlantic have seen a resurgence of debate about the nature of educational research. As a contribution to these debates, this paper draws on theoretical and methodological "thinking tools" of French sociologist Pierre Bourdieu. Specifically, the paper explores what Jenkins [Jenkins, R.…
How Methodological Features Affect Effect Sizes in Education
ERIC Educational Resources Information Center
Cheung, Alan; Slavin, Robert
2016-01-01
As evidence-based reform becomes increasingly important in educational policy, it is becoming essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. The purpose of this study was to examine how methodological features such as types of publication, sample sizes, and…
ERIC Educational Resources Information Center
Pucci, Bruno
2000-01-01
Considers the differences between quantitative and qualitative research. Cites some essays by Adorno when he was living in New York which led to the conclusion that empirical data has much to say and discusses the theoretical-methodological contributions in a recent master's thesis in education. (BT)
Impact Evaluation of Quality Assurance in Higher Education: Methodology and Causal Designs
ERIC Educational Resources Information Center
Leiber, Theodor; Stensaker, Bjørn; Harvey, Lee
2015-01-01
In this paper, the theoretical perspectives and general methodological elements of impact evaluation of quality assurance in higher education institutions are discussed, which should be a cornerstone of quality development in higher education and contribute to improving the knowledge about the effectiveness (or ineffectiveness) of quality…
SUDOQU, a new dose-assessment methodology for radiological surface contamination.
van Dillen, Teun; van Dijk, Arjan
2018-06-12
A new methodology has been developed for the assessment of the annual effective dose resulting from removable and fixed radiological surface contamination. It is entitled SUDOQU (SUrface DOse QUantification) and it can for instance be used to derive criteria for surface contamination related to the import of non-food consumer goods, containers and conveyances, e.g., limiting values and operational screening levels. SUDOQU imposes mass (activity)-balance equations based on radioactive decay, removal and deposition processes in indoor and outdoor environments. This leads to time-dependent contamination levels that may be of particular importance in exposure scenarios dealing with one or a few contaminated items only (usually public exposure scenarios, therefore referred to as the 'consumer' model). Exposure scenarios with a continuous flow of freshly contaminated goods also fall within the scope of the methodology (typically occupational exposure scenarios, thus referred to as the 'worker model'). In this paper we describe SUDOQU, its applications, and its current limitations. First, we delineate the contamination issue, present the assumptions and explain the concepts. We describe the relevant removal, transfer, and deposition processes, and derive equations for the time evolution of the radiological surface-, air- and skin-contamination levels. These are then input for the subsequent evaluation of the annual effective dose with possible contributions from external gamma radiation, inhalation, secondary ingestion (indirect, from hand to mouth), skin contamination, direct ingestion and skin-contact exposure. The limiting effective surface dose is introduced for issues involving the conservatism of dose calculations. SUDOQU can be used by radiation-protection scientists/experts and policy makers in the field of e.g. emergency preparedness, trade and transport, exemption and clearance, waste management, and nuclear facilities. Several practical examples are worked out demonstrating the potential applications of the methodology.
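The time-dependent activity balance at the heart of such a methodology can be sketched in a few lines; the half-life, removal rate, initial activity, and dose-rate coefficient below are illustrative assumptions, only the external-gamma pathway is included, and this is not the published SUDOQU implementation.

```python
import numpy as np

# Hedged sketch: removable surface activity decaying and being removed over
# time, integrated into an annual external-exposure dose.
half_life_d = 30.0                 # radionuclide half-life in days (assumed)
lam_decay = np.log(2) / half_life_d
lam_removal = 0.01                 # removal rate by wear/cleaning, 1/day (assumed)
A0 = 1.0e3                         # initial surface activity, Bq/m^2 (assumed)
dose_coeff = 1.0e-6                # effective dose rate per unit activity, mSv/day per Bq/m^2 (assumed)

t = np.linspace(0.0, 365.0, 366)   # one year, daily steps
activity = A0 * np.exp(-(lam_decay + lam_removal) * t)

# Annual effective dose from the external-gamma pathway (trapezoidal sum).
annual_dose = np.trapz(dose_coeff * activity, t)
print(f"annual external dose ≈ {annual_dose:.3f} mSv")
```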
A general methodology for population analysis
NASA Astrophysics Data System (ADS)
Lazov, Petar; Lazov, Igor
2014-12-01
For a given population with N the current and M the maximum number of entities, modeled by a Birth-Death Process (BDP) with size M+1, we introduce a utilization parameter ρ, the ratio of the primary birth and death rates in that BDP, which physically determines the (equilibrium) macrostates of the population, and an information parameter ν, which can be interpreted as the population's information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. Given these two key metrics, applying the continuity law, the equilibrium balance equations for the probability distribution pn, n=0,1,…,M, of the quantity N, pn=Prob{N=n}, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; here, by definition, population entropy is the uncertainty related to the population. The essential contribution of this approach is that the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information-elastic, antielastic or inelastic regime. In an information-linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology: if a population of infinite size is assumed, most of the key quantities and results that emerge in this methodology for finite-size populations vanish.
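The equilibrium distribution underlying this kind of analysis follows from detailed balance in the birth-death process, which the following sketch computes together with the resulting population entropy; the constant-rate example with utilization ρ is an assumption chosen only for illustration.

```python
import numpy as np

def equilibrium_distribution(birth, death):
    """Equilibrium distribution of a finite birth-death process.

    birth[n] is the birth rate in state n (n = 0..M-1) and death[n] the
    death rate in state n+1; detailed balance gives
    p_n ∝ prod_{i<n} birth[i] / death[i].
    """
    ratios = np.concatenate(([1.0], np.cumprod(np.asarray(birth) / np.asarray(death))))
    return ratios / ratios.sum()

# Illustrative example: constant birth and death rates with utilization
# rho = birth/death and maximum population size M = 10.
M, rho = 10, 0.8
p = equilibrium_distribution([rho] * M, [1.0] * M)

# Population entropy as the mean of the population information -log p_n.
entropy = -np.sum(p * np.log(p))
print(f"P(N=0) = {p[0]:.3f}, E[N] = {np.dot(np.arange(M + 1), p):.2f}, entropy = {entropy:.3f} nats")
```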
Comparing Models and Methods for the Delineation of Stream Baseflow Contribution Areas
NASA Astrophysics Data System (ADS)
Chow, R.; Frind, M.; Frind, E. O.; Jones, J. P.; Sousa, M.; Rudolph, D. L.; Nowak, W.
2016-12-01
This study addresses the delineation of areas that contribute baseflow to a stream reach, also known as stream capture zones. Such areas can be delineated using standard well capture zone delineation methods, with three important differences: (1) natural gradients are smaller compared to those produced by supply wells and are therefore subject to greater numerical errors, (2) stream discharge varies seasonally, and (3) stream discharge varies spatially. This study focuses on model-related uncertainties due to parameter non-uniqueness, discretization schemes, and particle tracking algorithms. The methodology is applied to the Alder Creek watershed in southwestern Ontario. Four different model codes are compared: HydroGeoSphere, WATFLOW, MODFLOW, and FEFLOW. In addition, two delineation methods are compared: reverse particle tracking and reverse transport, where the latter considers local-scale parameter uncertainty by using a macrodispersion term to produce a capture probability plume. The results from this study indicate that different models can calibrate acceptably well to the same data and produce very similar distributions of hydraulic head, but can produce different capture zones. The stream capture zone is found to be highly sensitive to the particle tracking algorithm. It was also found that particle tracking by itself, if applied to complex systems such as the Alder Creek watershed, would require considerable subjective judgement in the delineation of stream capture zones. Reverse transport is an alternate approach that provides probability intervals for the baseflow contribution areas. In situations where the two approaches agree, the confidence in the delineation is reinforced.
Research methods from social science can contribute much to the health sciences.
Wensing, Michel
2008-06-01
Research methods from social science, such as social network analysis, random coefficient modeling, and advanced measurement techniques, can contribute much to the health sciences. There is, however, a slow rate of transmission of social science methodology into the health sciences. This paper identifies some of the barriers for adoption and proposes ideas for the future. Commentary. Contributions of social science to the health sciences are not always recognized as such. It may help if the professional profile of social science in the health sciences were higher and if its focus were more on making useful predictions. Clinical epidemiologists may assume that their discipline includes all relevant methods and that social science is largely based on qualitative research. These perceptions need to be challenged in order to widen the scope of clinical epidemiology and include relevant methods from other sciences. New methods help to ask new research questions and to provide better answers to old questions. This paper has sketched challenges for both social science researchers and clinical epidemiologists.
The Educational Situation Quality Model: Recent Advances.
Doménech-Betoret, Fernando
2018-01-01
The purpose of this work was to present an educational model developed in recent years entitled "The Educational Situation Quality Model" (MOCSE, acronym in Spanish). MOCSE can be defined as an instructional model that simultaneously considers the teaching-learning process, in which motivation plays a central role. It explains the functioning of an educational setting by organizing and relating the most important variables which, according to the literature, contribute to student learning. Besides being a conceptual framework, this model also provides a methodological procedure to guide research and to promote reflection in the classroom. It allows teachers to implement effective research-action programs to improve teacher and student satisfaction and learning outcomes in the classroom context. This work explains the model's characteristics and functioning, recent advances, and how teachers can use it in an educational setting with a specific subject. This proposal integrates approaches from several relevant psycho-educational theories and introduces a new perspective into the existing literature that will allow researchers to make progress in studying educational setting functioning. The initial MOCSE configuration has been refined over time in accordance with the empirical results obtained from previous research, carried out within the MOCSE framework, and with the subsequent reflections that derived from these results. Finally, the contribution of the model to improving learning outcomes and satisfaction, and its applicability in the classroom, are also discussed.
Grainger, Matthew James; Aramyan, Lusine; Piras, Simone; Quested, Thomas Edward; Righi, Simone; Setti, Marco; Vittuari, Matteo; Stewart, Gavin Bruce
2018-01-01
Food waste from households contributes the greatest proportion to total food waste in developed countries. Therefore, food waste reduction requires an understanding of the socio-economic (contextual and behavioural) factors that lead to its generation within the household. Addressing such a complex subject calls for sound methodological approaches that until now have been conditioned by the large number of factors involved in waste generation, by the lack of a recognised definition, and by limited available data. This work contributes to food waste generation literature by using one of the largest available datasets that includes data on the objective amount of avoidable household food waste, along with information on a series of socio-economic factors. In order to address one aspect of the complexity of the problem, machine learning algorithms (random forests and boruta) for variable selection integrated with linear modelling, model selection and averaging are implemented. Model selection addresses model structural uncertainty, which is not routinely considered in assessments of food waste in literature. The main drivers of food waste in the home selected in the most parsimonious models include household size, the presence of fussy eaters, employment status, home ownership status, and the local authority. Results, regardless of which variable set the models are run on, point toward large households as being a key target element for food waste reduction interventions.
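The two-stage strategy described above, machine-learning variable selection followed by an interpretable linear model, can be outlined as follows; the simulated survey variables stand in for the household dataset used in the paper, and a random-forest importance ranking is used as a simple stand-in for the Boruta procedure.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# Placeholder survey-style data; the variable names echo the abstract but the
# values are simulated, not the household dataset analysed in the paper.
n = 500
df = pd.DataFrame({
    "household_size": rng.integers(1, 6, n),
    "fussy_eaters": rng.integers(0, 2, n),
    "employed": rng.integers(0, 2, n),
    "home_owner": rng.integers(0, 2, n),
    "age_of_respondent": rng.integers(18, 80, n),
})
waste_kg = 0.9 * df["household_size"] + 1.2 * df["fussy_eaters"] + rng.normal(0, 0.5, n)

# Step 1: rank candidate drivers by random-forest importance (a simple
# stand-in for Boruta, which compares importances against randomised
# "shadow" copies of the variables).
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(df, waste_kg)
ranked = sorted(zip(df.columns, rf.feature_importances_), key=lambda kv: -kv[1])
selected = [name for name, _ in ranked[:2]]
print("top-ranked drivers:", selected)

# Step 2: fit an interpretable linear model on the selected drivers.
lm = LinearRegression().fit(df[selected], waste_kg)
print(dict(zip(selected, lm.coef_.round(2))))
```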
Moving Beyond ERP Components: A Selective Review of Approaches to Integrate EEG and Behavior
Bridwell, David A.; Cavanagh, James F.; Collins, Anne G. E.; Nunez, Michael D.; Srinivasan, Ramesh; Stober, Sebastian; Calhoun, Vince D.
2018-01-01
Relationships between neuroimaging measures and behavior provide important clues about brain function and cognition in healthy and clinical populations. While electroencephalography (EEG) provides a portable, low cost measure of brain dynamics, it has been somewhat underrepresented in the emerging field of model-based inference. We seek to address this gap in this article by highlighting the utility of linking EEG and behavior, with an emphasis on approaches for EEG analysis that move beyond focusing on peaks or “components” derived from averaging EEG responses across trials and subjects (generating the event-related potential, ERP). First, we review methods for deriving features from EEG in order to enhance the signal within single-trials. These methods include filtering based on user-defined features (i.e., frequency decomposition, time-frequency decomposition), filtering based on data-driven properties (i.e., blind source separation, BSS), and generating more abstract representations of data (e.g., using deep learning). We then review cognitive models which extract latent variables from experimental tasks, including the drift diffusion model (DDM) and reinforcement learning (RL) approaches. Next, we discuss ways to access associations among these measures, including statistical models, data-driven joint models and cognitive joint modeling using hierarchical Bayesian models (HBMs). We think that these methodological tools are likely to contribute to theoretical advancements, and will help inform our understandings of brain dynamics that contribute to moment-to-moment cognitive function. PMID:29632480
The Evolving Contribution of Mass Spectrometry to Integrative Structural Biology
NASA Astrophysics Data System (ADS)
Faini, Marco; Stengel, Florian; Aebersold, Ruedi
2016-06-01
Protein complexes are key catalysts and regulators for the majority of cellular processes. Unveiling their assembly and structure is essential to understanding their function and mechanism of action. Although conventional structural techniques such as X-ray crystallography and NMR have solved the structure of important protein complexes, they cannot consistently deal with dynamic and heterogeneous assemblies, limiting their applications to small scale experiments. A novel methodological paradigm, integrative structural biology, aims at overcoming such limitations by combining complementary data sources into a comprehensive structural model. Recent applications have shown that a range of mass spectrometry (MS) techniques are able to generate interaction and spatial restraints (cross-linking MS) information on native complexes or to study the stoichiometry and connectivity of entire assemblies (native MS) rapidly, reliably, and from small amounts of substrate. Although these techniques by themselves do not solve structures, they do provide invaluable structural information and are thus ideally suited to contribute to integrative modeling efforts. The group of Brian Chait has made seminal contributions in the use of mass spectrometric techniques to study protein complexes. In this perspective, we honor the contributions of the Chait group and discuss concepts and milestones of integrative structural biology. We also review recent examples of integration of structural MS techniques with an emphasis on cross-linking MS. We then speculate on future MS applications that would unravel the dynamic nature of protein complexes upon diverse cellular states.
Vilaprinyo, Ester; Puig, Teresa; Rue, Montserrat
2012-01-01
Background: Reductions in breast cancer (BC) mortality in Western countries have been attributed to the use of screening mammography and adjuvant treatments. The goal of this work was to analyze the contributions of both interventions to the decrease in BC mortality between 1975 and 2008 in Catalonia. Methodology/Principal Findings: A stochastic model was used to quantify the contribution of each intervention. Age standardized BC mortality rates for calendar years 1975–2008 were estimated in four hypothetical scenarios: 1) Only screening, 2) Only adjuvant treatment, 3) Both interventions, and 4) No intervention. For the 30–69 age group, observed Catalan BC mortality rates per 100,000 women-year rose from 29.4 in 1975 to 38.3 in 1993, and afterwards continuously decreased to 23.2 in 2008. If neither of the two interventions had been used, in 2008 the estimated BC mortality would have been 43.5, which, compared to the observed BC mortality rate, indicates a 46.7% reduction. In 2008 the reduction attributable to screening was 20.4%, to adjuvant treatments was 15.8% and to both interventions 34.1%. Conclusions/Significance: Screening and adjuvant treatments similarly contributed to reducing BC mortality in Catalonia. Mathematical models have been useful to assess the impact of interventions addressed to reduce BC mortality that occurred over nearly the same periods. PMID:22272292
Further Simplification of the Simple Erosion Narrowing Score With Item Response Theory Methodology.
Oude Voshaar, Martijn A H; Schenk, Olga; Ten Klooster, Peter M; Vonkeman, Harald E; Bernelot Moens, Hein J; Boers, Maarten; van de Laar, Mart A F J
2016-08-01
To further simplify the simple erosion narrowing score (SENS) by removing scored areas that contribute the least to its measurement precision according to analysis based on item response theory (IRT) and to compare the measurement performance of the simplified version to the original. Baseline and 18-month data of the Combinatietherapie Bij Reumatoide Artritis (COBRA) trial were modeled using longitudinal IRT methodology. Measurement precision was evaluated across different levels of structural damage. SENS was further simplified by omitting the least reliably scored areas. Discriminant validity of SENS and its simplification were studied by comparing their ability to differentiate between the COBRA and sulfasalazine arms. Responsiveness was studied by comparing standardized change scores between versions. SENS data showed good fit to the IRT model. Carpal and feet joints contributed the least statistical information to both erosion and joint space narrowing scores. Omitting the joints of the foot reduced measurement precision for the erosion score in cases with below-average levels of structural damage (relative efficiency compared with the original version ranged 35-59%). Omitting the carpal joints had minimal effect on precision (relative efficiency range 77-88%). Responsiveness of a simplified SENS without carpal joints closely approximated the original version (i.e., all Δ standardized change scores were ≤0.06). Discriminant validity was also similar between versions for both the erosion score (relative efficiency = 97%) and the SENS total score (relative efficiency = 84%). Our results show that the carpal joints may be omitted from the SENS without notable repercussion for its measurement performance. © 2016, American College of Rheumatology.
NASA Astrophysics Data System (ADS)
Sayol, J. M.; Marcos, M.
2018-02-01
This study presents a novel methodology to estimate the impact of local sea level rise and extreme surges and waves in coastal areas under climate change scenarios. The methodology is applied to the Ebro Delta, a valuable and vulnerable low-lying wetland located in the northwestern Mediterranean Sea. Projections of local sea level accounting for all contributions to mean sea level changes, including thermal expansion, dynamic changes, fresh water addition and glacial isostatic adjustment, have been obtained from regionalized sea level projections during the 21st century. Particular attention has been paid to the uncertainties, which have been derived from the spread of the multi-model ensemble combined with seasonal/inter-annual sea level variability from local tide gauge observations. In addition, vertical land movements have been integrated to estimate local relative sea level rise. On the other hand, regional projections over the Mediterranean basin of storm surges and wind-waves have been used to evaluate changes in extreme events. The compound effects of surges and extreme waves have been quantified using their joint probability distributions. Finally, offshore sea level projections from extreme events superimposed on mean sea level have been propagated onto a high resolution digital elevation model of the study region in order to construct flood hazard maps for the mid and end of the 21st century and under two different climate change scenarios. The effect of each contribution has been evaluated in terms of the percentage of the area exposed to coastal hazards, which will help to design more efficient protection and adaptation measures.
NASA Astrophysics Data System (ADS)
Vaughan, A. R.; Lee, J. D.; Lewis, A. C.; Purvis, R.; Carslaw, D.; Misztal, P. K.; Metzger, S.; Beevers, S.; Goldstein, A. H.; Hewitt, C. N.; Shaw, M.; Karl, T.; Davison, B.
2015-12-01
The emission of pollutants is a major problem in today's cities. Emission inventories are a key tool for air quality management, with the United Kingdom's National and London Atmospheric Emission Inventories (NAEI & LAEI) being good examples. Assessing the validity of such inventories is important. Here we report on the technical methodology of matching flux measurements of NOx over a city to inventory estimates. We used an eddy covariance technique to directly measure NOx fluxes from central London on an aircraft flown at low altitude. NOx mixing ratios were measured at 10 Hz time resolution using chemiluminescence (to measure NO) and highly specific photolytic conversion of NO2 to NO (to measure NO2). Wavelet transformation was used to calculate instantaneous fluxes along the flight track for each flight leg. The transformation allows for both frequency and time information to be extracted from a signal, where we quantify the covariance between the de-trended vertical wind and concentration to derive a flux. Comparison between the calculated fluxes and emission inventory data was achieved using a footprint model, which accounts for the contributing sources. Using both a backwards Lagrangian model and a cross-wind dispersion function, we find the footprint extent ranges from 5 to 11 km from the sample point. We then calculate a relative weighting matrix for each emission inventory within the calculated footprint. The inventories are split into their contributing source sectors, with each scaled using up-to-date emission factors, giving monthly, daily, and hourly scaled estimates which are then compared to the measurements.
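The wavelet-based instantaneous flux calculation is beyond a short example, but the core eddy covariance quantity, the covariance of de-trended vertical wind and concentration fluctuations, can be sketched as below. The synthetic data and variable names are assumptions for illustration only.

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Classic eddy-covariance flux: covariance of the fluctuations of
    vertical wind speed w and scalar concentration c after removing a
    linear trend from each series."""
    t = np.arange(len(w))
    w_fluct = w - np.polyval(np.polyfit(t, w, 1), t)  # detrend vertical wind
    c_fluct = c - np.polyval(np.polyfit(t, c, 1), t)  # detrend concentration
    return np.mean(w_fluct * c_fluct)

# synthetic 10 Hz example data (one hour of samples)
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, 36000)               # vertical wind (m/s)
c = 40 + 5.0 * w + rng.normal(0, 2, 36000)    # scalar correlated with updrafts
print("flux ~", round(eddy_covariance_flux(w, c), 2))
```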
[Income-related health inequalities in France in 2004: Decomposition and explanations].
Tubeuf, S
2009-10-01
This analysis supplements existing work on social health inequalities at two levels: the measurement of health and the measurement of inequalities. Firstly, individual health status was measured using a subjective health indicator corrected using a promising cardinalisation method that had not yet been applied to French data. Secondly, this study used an innovative methodology to measure income-related health inequalities, to understand the relationships between income, income inequality, various social determinants, and health. The analysis was based on a sample of working-age adults from the 2004 Health and Health Insurance Survey. The methodology used in the study measures the total income-related health inequality using the concentration index. This index is based on a linear model explaining health according to several individual characteristics, such as age, sex, and various socioeconomic characteristics. The method thus takes into account both the causal relationships between the various explanatory factors introduced in the model and their relationship with health. Furthermore, it concretely measures the contribution of the social determinants to income-related health inequalities. The results show an income-related health inequality favouring individuals with a higher income. Moreover, income level, supplementary private health insurance, education level, and social class account for the main contributions to inequality. Therefore, the decomposition method highlights population groups that policies should target. The study suggests that reducing income inequality is not sufficient to lower income-related health inequalities in France in 2004 and needs to be supplemented with a reduction of the relationship between income and health and a reduction of income inequality across socioeconomic status.
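The concentration index mentioned above is commonly computed with the "convenient covariance" formula, C = 2·cov(h, r)/mean(h), where r is the fractional income rank. A hedged numpy sketch of that estimator (not the paper's exact decomposition) follows; the synthetic data are for illustration only.

```python
import numpy as np

def concentration_index(health, income):
    """Income-related health concentration index via the convenient
    covariance formula C = 2 * cov(h, r) / mean(h), where r is the
    fractional rank of each individual in the income distribution."""
    n = len(income)
    rank = np.empty(n)
    rank[np.argsort(income)] = (np.arange(1, n + 1) - 0.5) / n  # fractional rank
    return 2.0 * np.cov(health, rank, bias=True)[0, 1] / health.mean()

rng = np.random.default_rng(1)
income = rng.lognormal(10, 0.5, 5000)
health = 60 + 3 * np.log(income) + rng.normal(0, 5, 5000)  # pro-rich gradient
print("CI =", round(concentration_index(health, income), 3))  # > 0: favours the richer
```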
Deserno, Lorenz; Boehme, Rebecca; Heinz, Andreas; Schlagenhauf, Florian
2013-01-01
Abnormalities in reinforcement learning are a key finding in schizophrenia and have been proposed to be linked to elevated levels of dopamine neurotransmission. Behavioral deficits in reinforcement learning and their neural correlates may contribute to the formation of clinical characteristics of schizophrenia. The ability to form predictions about future outcomes is fundamental for environmental interactions and depends on neuronal teaching signals, like reward prediction errors. While aberrant prediction errors, that encode non-salient events as surprising, have been proposed to contribute to the formation of positive symptoms, a failure to build neural representations of decision values may result in negative symptoms. Here, we review behavioral and neuroimaging research in schizophrenia and focus on studies that implemented reinforcement learning models. In addition, we discuss studies that combined reinforcement learning with measures of dopamine. Thereby, we suggest how reinforcement learning abnormalities in schizophrenia may contribute to the formation of psychotic symptoms and may interact with cognitive deficits. These ideas point toward an interplay of more rigid versus flexible control over reinforcement learning. Pronounced deficits in the flexible or model-based domain may allow for a detailed characterization of well-established cognitive deficits in schizophrenia patients based on computational models of learning. Finally, we propose a framework based on the potentially crucial contribution of dopamine to dysfunctional reinforcement learning on the level of neural networks. Future research may strongly benefit from computational modeling but also requires further methodological improvement for clinical group studies. These research tools may help to improve our understanding of disease-specific mechanisms and may help to identify clinically relevant subgroups of the heterogeneous entity schizophrenia. PMID:24391603
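As a concrete illustration of the reward prediction error signal discussed above, a minimal Rescorla-Wagner/Q-learning update is sketched below. The learning rate, reward probability, and function name are assumptions for illustration, not parameters from the reviewed studies.

```python
import numpy as np

def q_learning_prediction_errors(rewards, alpha=0.1, q0=0.5):
    """Toy single-option value update: the reward prediction error
    delta = r - Q drives learning of the expected value Q."""
    q = q0
    deltas = []
    for r in rewards:
        delta = r - q          # reward prediction error
        q += alpha * delta     # value update
        deltas.append(delta)
    return np.array(deltas), q

rng = np.random.default_rng(0)
rewards = rng.binomial(1, 0.8, size=200)   # option rewarded on 80% of trials
deltas, q_final = q_learning_prediction_errors(rewards)
print(f"final value estimate Q = {q_final:.2f}")
```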
NASA Technical Reports Server (NTRS)
Daryabeigi, Kamran; Cunnington, George R.; Miller, Steve D.; Knutson, Jeffry R.
2010-01-01
Combined radiation and conduction heat transfer through various high-temperature, high-porosity, unbonded (loose) fibrous insulations was modeled based on first principles. The diffusion approximation was used for modeling the radiation component of heat transfer in the optically thick insulations. The relevant parameters needed for the heat transfer model were derived from experimental data. Semi-empirical formulations were used to model the solid conduction contribution of heat transfer in fibrous insulations with the relevant parameters inferred from thermal conductivity measurements at cryogenic temperatures in a vacuum. The specific extinction coefficient for radiation heat transfer was obtained from high-temperature steady-state thermal measurements with large temperature gradients maintained across the sample thickness in a vacuum. Standard gas conduction modeling was used in the heat transfer formulation. This heat transfer modeling methodology was applied to silica, two types of alumina, and a zirconia-based fibrous insulation, and to a variation of opacified fibrous insulation (OFI). OFI is a class of insulations manufactured by embedding efficient ceramic opacifiers in various unbonded fibrous insulations to significantly attenuate the radiation component of heat transfer. The heat transfer modeling methodology was validated by comparison with more rigorous analytical solutions and with standard thermal conductivity measurements. The validated heat transfer model is applicable to various densities of these high-porosity insulations as long as the fiber properties are the same (index of refraction, size distribution, orientation, and length). Furthermore, the heat transfer data for these insulations can be obtained at any static pressure in any working gas environment without the need to perform tests in various gases at various pressures.
Lambeth, Christopher; Amatoury, Jason; Wang, Ziyu; Foster, Sheryl; Amis, Terence; Kairaitis, Kristina
2017-03-01
Macroscopic pharyngeal anatomical abnormalities are thought to contribute to the pathogenesis of upper airway (UA) obstruction in obstructive sleep apnea (OSA). Microscopic changes in the UA mucosal lining of OSA subjects are reported; however, the impact of these changes on UA mucosal surface topography is unknown. This study aimed to 1 ) develop methodology to measure UA mucosal surface topography, and 2 ) compare findings from healthy and OSA subjects. Ten healthy and eleven OSA subjects were studied. Awake, gated (end expiration), head and neck position controlled magnetic resonance images (MRIs) of the velopharynx (VP) were obtained. VP mucosal surfaces were segmented from axial images, and three-dimensional VP mucosal surface models were constructed. Curvature analysis of the models was used to study the VP mucosal surface topography. Principal, mean, and Gaussian curvatures were used to define surface shape composition and surface roughness of the VP mucosal surface models. Significant differences were found in the surface shape composition, with more saddle/spherical and less flat/cylindrical shapes in OSA than healthy VP mucosal surface models ( P < 0.01). OSA VP mucosal surface models were also found to have more mucosal surface roughness ( P < 0.0001) than healthy VP mucosal surface models. Our novel methodology was utilized to model the VP mucosal surface of OSA and healthy subjects. OSA subjects were found to have different VP mucosal surface topography, composed of increased irregular shapes and increased roughness. We speculate increased irregularity in VP mucosal surface may increase pharyngeal collapsibility as a consequence of friction-related pressure loss. NEW & NOTEWORTHY A new methodology was used to model the upper airway mucosal surface topography from magnetic resonance images of patients with obstructive sleep apnea and healthy adults. Curvature analysis was used to analyze the topography of the models, and a new metric was derived to describe the mucosal surface roughness. Increased roughness was found in the obstructive sleep apnea vs. healthy group, but further research is required to determine the functional effects of the measured difference on upper airway airflow mechanics. Copyright © 2017 the American Physiological Society.
Bansemir, G
1987-01-01
The conception and evaluation of standardized oral or written questioning as quantifying research instruments are oriented by the basic premises of Marxist-Leninist theory of knowledge and general scientific logic. In the present contribution the socio-gerontological research process is outlined in extracts. By referring to the intrinsic connection between some of its essential components (problem, formation of hypotheses, obtaining indicators/measurement, preliminary examination, and evaluation), as well as to typical errors and (fictitious) examples of practical research, this contribution contrasts the natural, apparently uncomplicated course of structured questioning with its qualitative methodological fundamentals and demands.
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems
2015-12-01
Dissertation by Sang M. Sok, December 2015; distribution is unlimited. The improved conceptual models methodology (ICoMM) is developed in support of improving the structure of the conceptual model (CoM) for the validation of non-observable systems.
Peasura, Prachya
2015-01-01
This research studied the application of the response surface methodology (RSM) and a central composite design (CCD) experiment to build a mathematical model and optimize postweld heat treatment (PWHT). The material of study is the pressure vessel steel ASTM A516 grade 70, joined by gas metal arc welding. PWHT parameters examined in this study included PWHT temperature and time. The resulting materials were examined using the CCD experiment and RSM to determine tensile strength, and were observed with optical microscopy and scanning electron microscopy. The experimental results show that the proposed full quadratic model is YTS = -285.521 + 15.706X1 + 2.514X2 - 0.004X1² - 0.001X2² - 0.029X1X2. The optimized PWHT parameters for tensile strength were a PWHT time of 5.00 hr and a PWHT temperature of 645.75°C. The results show that PWHT time is the dominant mechanism modifying the tensile strength compared to PWHT temperature. This phenomenon could be explained by the fact that pearlite can contribute to higher tensile strength: a greater pearlite content results in increased material tensile strength. The research described here can be used as material data on PWHT parameters for an ASTM A516 grade 70 weld.
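The reported quadratic response surface can be evaluated directly. The sketch below assumes X1 is the PWHT time in hours and X2 the PWHT temperature in °C, both in natural (uncoded) units; the abstract does not state the coding, so the numerical output should be read as illustrative only.

```python
def predicted_tensile_strength(x1, x2):
    """Reported full quadratic response surface for tensile strength.
    Assumption: x1 = PWHT time (hr), x2 = PWHT temperature (deg C),
    both in natural units; coded units would give different numbers."""
    return (-285.521 + 15.706 * x1 + 2.514 * x2
            - 0.004 * x1**2 - 0.001 * x2**2 - 0.029 * x1 * x2)

# predicted response at the reported optimum settings (5.00 hr, 645.75 deg C)
print(round(predicted_tensile_strength(5.00, 645.75), 1))
```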
Reges, José E. O.; Salazar, A. O.; Maitelli, Carla W. S. P.; Carvalho, Lucas G.; Britto, Ursula J. B.
2016-01-01
This work is a contribution to the development of flow sensors in the oil and gas industry. It presents a methodology to measure the flow rates into multiple-zone water-injection wells from fluid temperature profiles and estimate the measurement uncertainty. First, a method to iteratively calculate the zonal flow rates using the Ramey (exponential) model was described. Next, this model was linearized to perform an uncertainty analysis. Then, a computer program to calculate the injected flow rates from experimental temperature profiles was developed. In the experimental part, a fluid temperature profile from a dual-zone water-injection well located in the Northeast Brazilian region was collected. Thus, calculated and measured flow rates were compared. The results proved that linearization error is negligible for practical purposes and the relative uncertainty increases as the flow rate decreases. The calculated values from both the Ramey and linear models were very close to the measured flow rates, presenting a difference of only 4.58 m³/d and 2.38 m³/d, respectively. Finally, the measurement uncertainties from the Ramey and linear models were equal to 1.22% and 1.40% (for injection zone 1); 10.47% and 9.88% (for injection zone 2). Therefore, the methodology was successfully validated and all objectives of this work were achieved. PMID:27420068
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
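The rank-ordering step described above can be caricatured with a short sketch: sort concepts by individual benefit-to-cost ratio and accumulate the cumulative ratio in that order. The concept names and numbers below are hypothetical, not taken from the report.

```python
# Hypothetical concepts: (name, benefit, cost) in consistent units
concepts = [("datalink", 40.0, 10.0), ("wake-vortex advisory", 18.0, 9.0),
            ("4D FMS", 25.0, 20.0), ("surface guidance", 6.0, 8.0)]

# Rank-order by individual benefit-to-cost ratio (preferred order of implementation)
ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)

cum_benefit = cum_cost = 0.0
for name, benefit, cost in ranked:
    cum_benefit += benefit
    cum_cost += cost
    print(f"{name:22s} B/C = {benefit / cost:4.2f}   "
          f"cumulative B/C = {cum_benefit / cum_cost:4.2f}")
```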
Sewell, Fiona; Doe, John; Gellatly, Nichola; Ragan, Ian; Burden, Natalie
2017-10-01
The current animal-based paradigm for safety assessment must change. In September 2016, the UK National Centre for Replacement, Refinement and Reduction of Animals in Research (NC3Rs) brought together scientists from regulatory authorities, academia and industry to review progress in bringing new methodology into regulatory use, and to identify ways to expedite progress. Progress has been slow. Science is advancing to make this possible but changes are necessary. The new paradigm should allow new methodology to be adopted once it is developed rather than being based on a fixed set of studies. Regulatory authorities can help by developing Performance-Based Standards. The most pressing need is in repeat dose toxicology, although setting standards will be more complex than in areas such as sensitization. Performance standards should be aimed directly at human safety, not at reproducing the results of animal studies. Regulatory authorities can also aid progress towards the acceptance of non-animal based methodology by promoting "safe-haven" trials where traditional and new methodology data can be submitted in parallel to build up experience in the new methods. Industry can play its part in the acceptance of new methodology, by contributing to the setting of performance standards and by actively contributing to "safe-haven" trials. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Delineating baseflow contribution areas for streams - A model and methods comparison
NASA Astrophysics Data System (ADS)
Chow, Reynold; Frind, Michael E.; Frind, Emil O.; Jones, Jon P.; Sousa, Marcelo R.; Rudolph, David L.; Molson, John W.; Nowak, Wolfgang
2016-12-01
This study addresses the delineation of areas that contribute baseflow to a stream reach, also known as stream capture zones. Such areas can be delineated using standard well capture zone delineation methods, with three important differences: (1) natural gradients are smaller compared to those produced by supply wells and are therefore subject to greater numerical errors, (2) stream discharge varies seasonally, and (3) stream discharge varies spatially. This study focuses on model-related uncertainties due to model characteristics, discretization schemes, delineation methods, and particle tracking algorithms. The methodology is applied to the Alder Creek watershed in southwestern Ontario. Four different model codes are compared: HydroGeoSphere, WATFLOW, MODFLOW, and FEFLOW. In addition, two delineation methods are compared: reverse particle tracking and reverse transport, where the latter considers local-scale parameter uncertainty by using a macrodispersion term to produce a capture probability plume. The results from this study indicate that different models can calibrate acceptably well to the same data and produce very similar distributions of hydraulic head, but can produce different capture zones. The stream capture zone is found to be highly sensitive to the particle tracking algorithm. It was also found that particle tracking by itself, if applied to complex systems such as the Alder Creek watershed, would require considerable subjective judgement in the delineation of stream capture zones. Reverse transport is an alternative and more reliable approach that provides probability intervals for the baseflow contribution areas, taking uncertainty into account. The two approaches can be used together to enhance the confidence in the final outcome.
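Reverse particle tracking, one of the two delineation methods compared above, amounts to integrating particle positions backwards along the velocity field from the stream cells. The heavily simplified 2D sketch below (uniform grid, steady velocities, explicit Euler steps, nearest-cell lookup) is an assumption-laden toy, not the algorithm of any of the four codes compared.

```python
import numpy as np

def track_backwards(start, vx, vy, dx, dy, dt, n_steps):
    """Trace a particle backwards (reversed velocity) through a steady
    2D velocity field given on a uniform cell-centred grid."""
    x, y = float(start[0]), float(start[1])
    path = [(x, y)]
    ny, nx = vx.shape
    for _ in range(n_steps):
        i = min(max(int(y // dy), 0), ny - 1)   # nearest-cell velocity lookup
        j = min(max(int(x // dx), 0), nx - 1)
        x -= vx[i, j] * dt                      # minus sign: reverse tracking
        y -= vy[i, j] * dt
        path.append((x, y))
        if not (0 <= x < nx * dx and 0 <= y < ny * dy):
            break                               # particle left the model domain
    return np.array(path)

# uniform flow towards a stream located along the right-hand boundary
vx = np.full((50, 100), 1e-6)   # m/s
vy = np.zeros((50, 100))
path = track_backwards(start=(99.0, 25.0), vx=vx, vy=vy,
                       dx=1.0, dy=1.0, dt=8.64e4, n_steps=2000)  # daily steps
print("backward track endpoint:", path[-1])
```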
Resilience through adaptation.
Ten Broeke, Guus A; van Voorn, George A K; Ligtenberg, Arend; Molenaar, Jaap
2017-01-01
Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover's distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system.
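The central comparison in this methodology, the earth-mover's (Wasserstein) distance between output distributions with and without adaptation at each output time, can be sketched with scipy. The synthetic ensemble outputs below are assumptions for illustration.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(42)
times = np.arange(10)
# synthetic ABM output samples (e.g., resource level) per time step,
# for ensembles of runs with and without agent adaptation
with_adapt = [rng.normal(100 - 2 * t, 5, size=500) for t in times]
no_adapt = [rng.normal(100 - 6 * t, 5, size=500) for t in times]

emd = [wasserstein_distance(a, b) for a, b in zip(with_adapt, no_adapt)]
for t, d in zip(times, emd):
    print(f"t={t}: earth-mover's distance = {d:.1f}")
```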
Applying graphs and complex networks to football metric interpretation.
Arriaza-Ardiles, E; Martín-González, J M; Zuniga, M D; Sánchez-Flores, J; de Saa, Y; García-Manso, J M
2018-02-01
This work presents a methodology for analysing the interactions between players in a football team, from the point of view of graph theory and complex networks. We model the complex network of passing interactions between players of the same team in 32 official matches of the Liga de Fútbol Profesional (Spain), using a passing/reception graph. This methodology allows us to understand the play structure of the team, by analysing the offensive phases of game-play. We utilise two different strategies for characterising the contribution of the players to the team: the clustering coefficient, and centrality metrics (closeness and betweenness). We show the application of this methodology by analyzing the performance of a professional Spanish team according to these metrics and the distribution of passing/reception in the field. Keeping in mind the dynamic nature of collective sports, in the future we will incorporate metrics which allow us to analyse the performance of the team also according to the circumstances of game-play and to different contextual variables such as the utilisation of the field space, the time, and the ball, according to specific tactical situations. Copyright © 2017 Elsevier B.V. All rights reserved.
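A passing/reception graph and the metrics named above can be computed with networkx, as in the toy sketch below. The pass list and player labels are hypothetical; real analyses would use one weighted directed graph per match.

```python
import networkx as nx

# toy list of completed passes (passer, receiver)
passes = [("GK", "CB1"), ("CB1", "CB2"), ("CB2", "CM"), ("CM", "W"),
          ("W", "ST"), ("CM", "ST"), ("CB1", "CM"), ("CM", "CB2")]

G = nx.DiGraph()
for passer, receiver in passes:
    if G.has_edge(passer, receiver):
        G[passer][receiver]["weight"] += 1      # repeated passes add weight
    else:
        G.add_edge(passer, receiver, weight=1)

clustering = nx.clustering(G.to_undirected(), weight="weight")  # local cohesion
closeness = nx.closeness_centrality(G)                          # reachability of a player
betweenness = nx.betweenness_centrality(G)                      # intermediation in play
for player in G.nodes:
    print(player, round(clustering[player], 2),
          round(closeness[player], 2), round(betweenness[player], 2))
```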
Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E
2014-09-01
This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model components stoichiometry and formation enthalpies, the proposed modelling methodology has integrated successfully the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in the biochemical reactors. The methodology has been implemented in a plant-wide modelling methodology in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, as illustrative examples of the capability of the methodology, two case studies have been described. In the first one, a predenitrification-nitrification dynamic process has been analysed, with the aim of demonstrating the easy integration of the methodology in any system. In the second case study, the simulation of a thermal model for an ATAD has shown the potential of the proposed methodology for analysing the effect of ventilation and influent characterization. Copyright © 2014 Elsevier Ltd. All rights reserved.
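The enthalpy bookkeeping rests on Hess's law: the enthalpy change of reaction is the stoichiometrically weighted sum of formation enthalpies, ΔH_rxn = Σ ν_i ΔH_f,i. A minimal sketch follows; the example reaction and enthalpy values are illustrative, not the paper's component set.

```python
def reaction_enthalpy(stoichiometry, formation_enthalpy):
    """Hess's law: enthalpy change of reaction from standard formation
    enthalpies, with signed stoichiometric coefficients
    (negative for consumed components, positive for produced ones)."""
    return sum(nu * formation_enthalpy[comp] for comp, nu in stoichiometry.items())

# Illustrative aerobic oxidation of acetate (kJ/mol, liquid-phase values)
formation_enthalpy = {"CH3COOH": -484.5, "O2": 0.0, "CO2": -393.5, "H2O": -285.8}
stoichiometry = {"CH3COOH": -1, "O2": -2, "CO2": 2, "H2O": 2}
print(reaction_enthalpy(stoichiometry, formation_enthalpy), "kJ/mol")  # ~ -874 kJ/mol
```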
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.
In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
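The heart of a manufactured-solutions verification is checking that the observed order of accuracy matches the nominal order of the numerical scheme. A generic convergence check is sketched below (illustrative error values, not GBS verification data).

```python
import numpy as np

def observed_order(errors, grid_spacings):
    """Observed order of accuracy from discretization errors measured
    against a manufactured solution on successively refined grids:
    p = log(e1/e2) / log(h1/h2) for each pair of resolutions."""
    e, h = np.asarray(errors, float), np.asarray(grid_spacings, float)
    return np.log(e[:-1] / e[1:]) / np.log(h[:-1] / h[1:])

# illustrative L2 errors from a nominally second-order scheme
h = [0.1, 0.05, 0.025, 0.0125]
e = [4.1e-3, 1.0e-3, 2.6e-4, 6.4e-5]
print(observed_order(e, h))   # should approach 2 as the grid is refined
```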
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
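The statistical backbone of such an assessment, propagating parameter and modeling-accuracy uncertainty through an analytical failure model to obtain a failure probability, can be caricatured with a simple stress-strength Monte Carlo. The distributions below are purely illustrative; PFA itself is a far richer prescribed structure.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# uncertain analysis parameters (illustrative distributions, not PFA inputs)
applied_stress = rng.normal(300, 30, n)          # MPa, load uncertainty
strength = rng.lognormal(np.log(420), 0.08, n)   # MPa, material capability
model_error = rng.normal(1.0, 0.05, n)           # multiplicative modeling accuracy

failure = applied_stress * model_error > strength
p_fail = failure.mean()
print(f"estimated failure probability = {p_fail:.4f} "
      f"(+/- {1.96 * np.sqrt(p_fail * (1 - p_fail) / n):.4f})")
```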
Computational and Statistical Models: A Comparison for Policy Modeling of Childhood Obesity
NASA Astrophysics Data System (ADS)
Mabry, Patricia L.; Hammond, Ross; Ip, Edward Hak-Sing; Huang, Terry T.-K.
As systems science methodologies have begun to emerge as a set of innovative approaches to address complex problems in behavioral, social science, and public health research, some apparent conflicts with traditional statistical methodologies for public health have arisen. Computational modeling is an approach set in context that integrates diverse sources of data to test the plausibility of working hypotheses and to elicit novel ones. Statistical models are reductionist approaches geared towards proving the null hypothesis. While these two approaches may seem contrary to each other, we propose that they are in fact complementary and can be used jointly to advance solutions to complex problems. Outputs from statistical models can be fed into computational models, and outputs from computational models can lead to further empirical data collection and statistical models. Together, this presents an iterative process that refines the models and contributes to a greater understanding of the problem and its potential solutions. The purpose of this panel is to foster communication and understanding between statistical and computational modelers. Our goal is to shed light on the differences between the approaches and convey what kinds of research inquiries each one is best for addressing and how they can serve complementary (and synergistic) roles in the research process, to mutual benefit. For each approach the panel will cover the relevant "assumptions" and how the differences in what is assumed can foster misunderstandings. The interpretations of the results from each approach will be compared and contrasted and the limitations for each approach will be delineated. We will use illustrative examples from CompMod, the Comparative Modeling Network for Childhood Obesity Policy. The panel will also incorporate interactive discussions with the audience on the issues raised here.
The contribution of organization theory to nursing health services research.
Mick, Stephen S; Mark, Barbara A
2005-01-01
We review nursing and health services research on health care organizations over the period 1950 through 2004 to reveal the contribution of nursing to this field. Notwithstanding this rich tradition and the unique perspective of nursing researchers grounded in patient care production processes, the following gaps in nursing research remain: (1) the lack of theoretical frameworks about organizational factors relating to internal work processes; (2) the need for sophisticated methodologies to guide empirical investigations; (3) the difficulty in understanding how organizations adapt models for patient care delivery in response to market forces; (4) the paucity of attention to the impact of new technologies on the organization of patient care work processes. Given nurses' deep understanding of the inner workings of health care facilities, we hope to see an increasing number of research programs that tackle these deficiencies.
ERIC Educational Resources Information Center
Huesca, Robert
The participatory method of image production holds enormous potential for communication and journalism scholars operating out of a critical/cultural framework. The methodological potentials of mechanical reproduction were evident in the 1930s, when Walter Benjamin contributed three enduring concepts: questioning the art/document dichotomy; placing…
ERIC Educational Resources Information Center
Jennings, Jerry L.; Apsche, Jack A.; Blossom, Paige; Bayles, Corliss
2013-01-01
Although mindfulness has become a mainstream methodology in mental health treatment, it is a relatively new approach with adolescents, and perhaps especially youth with sexual behavior problems. Nevertheless, clinical experience and several empirical studies are available to show the effectiveness of a systematic mindfulness- based methodology for…
Teaching of Computer Science Topics Using Meta-Programming-Based GLOs and LEGO Robots
ERIC Educational Resources Information Center
Štuikys, Vytautas; Burbaite, Renata; Damaševicius, Robertas
2013-01-01
The paper's contribution is a methodology that integrates two educational technologies (GLO and LEGO robot) to teach Computer Science (CS) topics at the school level. We present the methodology as a framework of 5 components (pedagogical activities, technology driven processes, tools, knowledge transfer actors, and pedagogical outcomes) and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-29
... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2013-2014 Award Year: Federal... student's expected family contribution (EFC) for award year 2013-2014 for the student financial aid... in the Consumer Price Index (CPI). For award year 2013-2014, the Secretary is charged with updating...
Speaking Back to the Deficit Discourses: A Theoretical and Methodological Approach
ERIC Educational Resources Information Center
Hogarth, Melitta
2017-01-01
The educational attainment of Aboriginal and Torres Strait Islander students is often presented within a deficit view. The need for Aboriginal and Torres Strait Islander researchers to challenge the societal norms is necessary to contribute to the struggle for self-determination. This paper presents a theoretical and methodological approach that…
Neuroethics and animals: methods and philosophy.
Takala, Tuija; Häyry, Matti
2014-04-01
This article provides an overview of the six other contributions in the Neuroethics and Animals special section. In addition, it discusses the methodological and theoretical problems of interdisciplinary fields. The article suggests that interdisciplinary approaches without established methodological and theoretical bases are difficult to assess scientifically. This might cause these fields to expand without actually advancing.
Developmental Methodology as a Context for Interdisciplinary Dialogue in Developmental Science
ERIC Educational Resources Information Center
Card, Noel A.
2014-01-01
In this comment, I first highlight the contributions of Robinson-Cimpian, Lubienski, Ganley, and Copur-Gencturk (2014) in particular and a more interdisciplinary approach in general for the subdiscipline of developmental psychology. Second, I identify some historic methodological foci of psychology and encourage Robinson-Cimpian et al. to consider…
Use of Comparative Case Study Methodology for US Public Health Policy Analysis: A Review.
Dinour, Lauren M; Kwan, Amy; Freudenberg, Nicholas
There is growing recognition that policies influence population health, highlighting the need for evidence to inform future policy development and reform. This review describes how comparative case study methodology has been applied to public health policy research and discusses the methodology's potential to contribute to this evidence. English-language, peer-reviewed articles published between 1995 and 2012 were sought from 4 databases. Articles were included if they described comparative case studies addressing US public health policy. Two researchers independently assessed the 20 articles meeting review criteria. Case-related characteristics and research design tactics utilized to minimize threats to reliability and validity, such as the use of multiple sources of evidence and a case study protocol, were extracted from each article. Although comparative case study methodology has been used to analyze a range of public health policies at all stages and levels, articles reported an average use of only 3.65 (out of 10) research design tactics. By expanding the use of accepted research design tactics, public health policy researchers can contribute to expanding the evidence needed to advance health-promoting policies.
Diffusion orientation transform revisited.
Canales-Rodríguez, Erick Jorge; Lin, Ching-Po; Iturria-Medina, Yasser; Yeh, Chun-Hung; Cho, Kuan-Hung; Melie-García, Lester
2010-01-15
Diffusion orientation transform (DOT) is a powerful imaging technique that allows the reconstruction of the microgeometry of fibrous tissues based on diffusion MRI data. The three main error sources involved in this methodology are the finite sampling of the q-space, the practical truncation of the series of spherical harmonics and the use of a mono-exponential model for the attenuation of the measured signal. In this work, a detailed mathematical description that provides an extension to the DOT methodology is presented. In particular, the limitations implied by the use of measurements with a finite support in q-space are investigated and clarified, as well as the impact of the harmonic series truncation. Near- and far-field analytical patterns for the diffusion propagator are examined. The near-field pattern enables the direct computation of the probability of return to the origin. The far-field pattern allows probing the limitations of the mono-exponential model, which suggests the existence of a limit of validity for DOT. In the regime from moderate to large displacement lengths, the isosurfaces of the diffusion propagator reveal aberrations in the form of artifactual peaks. Finally, the major contribution of this work is the derivation of analytical equations that facilitate the accurate reconstruction of some orientational distribution functions (ODFs) and skewness ODFs that are relatively immune to these artifacts. The new formalism was tested using synthetic and real data from a phantom of intersecting capillaries. The results support the hypothesis that the revisited DOT methodology could enhance the estimation of the microgeometry of fiber tissues.
White, Andrew A; Wright, Seth W; Blanco, Roberto; Lemonds, Brent; Sisco, Janice; Bledsoe, Sandy; Irwin, Cindy; Isenhour, Jennifer; Pichert, James W
2004-10-01
Identifying the etiologies of adverse outcomes is an important first step in improving patient safety and reducing malpractice risks. However, relatively little is known about the causes of emergency department-related adverse outcomes. The objective was to describe a method for identification of common causes of adverse outcomes in an emergency department. This methodology potentially can suggest ways to improve care and might provide a model for identification of factors associated with adverse outcomes. This was a retrospective analysis of 74 consecutive files opened by a malpractice insurer between 1995 and 2000. Each risk-management file was analyzed to identify potential causes of adverse outcomes. The main outcomes were rater-assigned codes for alleged problems with care (e.g., failures of communication or problems related to diagnosis). About 50% of cases were related to injuries or abdominal complaints. A contributing cause was found in 92% of cases, and most had more than one contributing cause. The most frequent contributing categories included failure to diagnose (45%), supervision problems (31%), communication problems (30%), patient behavior (24%), administrative problems (20%), and documentation (20%). Specific relating factors within these categories, such as lack of timely resident supervision and failure to follow policies and procedures, were identified. This project documented that an aggregate analysis of risk-management files has the potential to identify shared causes related to real or perceived adverse outcomes. Several potentially correctable systems problems were identified using this methodology. These simple, descriptive management tools may be useful in identifying issues for problem solving and can be easily learned by physicians and managers.
Thelen, Brian; French, Nancy H F; Koziol, Benjamin W; Billmire, Michael; Owen, Robert Chris; Johnson, Jeffrey; Ginsberg, Michele; Loboda, Tatiana; Wu, Shiliang
2013-11-05
A study of the impacts on respiratory health of the 2007 wildland fires in and around San Diego County, California is presented. This study helps to address the impact of fire emissions on human health by modeling the exposure potential of proximate populations to atmospheric particulate matter (PM) from vegetation fires. Currently, there is no standard methodology to model and forecast the potential respiratory health effects of PM plumes from wildland fires, and in part this is due to a lack of methodology for rigorously relating the two. The contribution in this research specifically targets that absence by modeling explicitly the emission, transmission, and distribution of PM following a wildland fire in both space and time. Coupled empirical and deterministic models describing particulate matter (PM) emissions and atmospheric dispersion were linked to spatially explicit syndromic surveillance health data records collected through the San Diego Aberration Detection and Incident Characterization (SDADIC) system using a Generalized Additive Modeling (GAM) statistical approach. Two levels of geographic aggregation were modeled, a county-wide regional level and division of the county into six sub regions. Selected health syndromes within SDADIC from 16 emergency departments within San Diego County relevant for respiratory health were identified for inclusion in the model. The model captured the variability in emergency department visits due to several factors by including nine ancillary variables in addition to wildfire PM concentration. The model coefficients and nonlinear function plots indicate that at peak fire PM concentrations the odds of a person seeking emergency care is increased by approximately 50% compared to non-fire conditions (40% for the regional case, 70% for a geographically specific case). The sub-regional analyses show that demographic variables also influence respiratory health outcomes from smoke. The model developed in this study allows a quantitative assessment and prediction of respiratory health outcomes as it relates to the location and timing of wildland fire emissions relevant for application to future wildfire scenarios. An important aspect of the resulting model is its generality thus allowing its ready use for geospatial assessments of respiratory health impacts under possible future wildfire conditions in the San Diego region. The coupled statistical and process-based modeling demonstrates an end-to-end methodology for generating reasonable estimates of wildland fire PM concentrations and health effects at resolutions compatible with syndromic surveillance data.
The 2014 update to the National Seismic Hazard Model in California
Powers, Peter; Field, Edward H.
2015-01-01
The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.
Moral judgment as information processing: an integrative review
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022
Electron Flux Models for Different Energies at Geostationary Orbit
NASA Technical Reports Server (NTRS)
Boynton, R. J.; Balikhin, M. A.; Sibeck, D. G.; Walker, S. N.; Billings, S. A.; Ganushkina, N.
2016-01-01
Forecast models were derived for energetic electrons at all energy ranges sampled by the third-generation Geostationary Operational Environmental Satellites (GOES). These models were based on Multi-Input Single-Output Nonlinear Autoregressive Moving Average with Exogenous inputs methodologies. The model inputs include the solar wind velocity, density and pressure, the fraction of time that the interplanetary magnetic field (IMF) was southward, the IMF contribution of a solar wind-magnetosphere coupling function proposed by Boynton et al. (2011b), and the Dst index. As such, this study has deduced five new 1 h resolution models for the low-energy electrons measured by GOES (30-50 keV, 50-100 keV, 100-200 keV, 200-350 keV, and 350-600 keV) and extended the existing >800 keV and >2 MeV Geostationary Earth Orbit electron fluxes models to forecast at a 1 h resolution. All of these models were shown to provide accurate forecasts, with prediction efficiencies ranging between 66.9% and 82.3%.
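Full NARMAX identification is beyond a short example, but the underlying idea, regressing the flux on lagged solar wind drivers and its own past values and then scoring prediction efficiency, can be sketched with a linear ARX regression. The synthetic data, lag choices, and variable names are assumptions for illustration only.

```python
import numpy as np

def lagged_design(series_dict, target, lags=(1, 2)):
    """Build a design matrix of lagged exogenous inputs and lagged target
    values (a linear ARX structure; real NARMAX models add nonlinear terms)."""
    n, max_lag = len(target), max(lags)
    cols = [np.ones(n - max_lag)]
    for x in series_dict.values():
        for lag in lags:
            cols.append(x[max_lag - lag:n - lag])
    for lag in lags:
        cols.append(target[max_lag - lag:n - lag])
    return np.column_stack(cols), target[max_lag:]

rng = np.random.default_rng(0)
n = 2000
vsw = 400 + 100 * rng.random(n)                      # solar wind speed proxy
dst = -30 * rng.random(n)                            # Dst index proxy
flux = np.zeros(n)
for t in range(1, n):                                # synthetic flux dynamics
    flux[t] = (0.8 * flux[t - 1] + 0.002 * vsw[t - 1]
               + 0.01 * dst[t - 1] + 0.05 * rng.standard_normal())

X, y = lagged_design({"vsw": vsw, "dst": dst}, flux)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
pe = 1 - np.mean((y - pred) ** 2) / np.var(y)        # prediction efficiency
print(f"prediction efficiency = {100 * pe:.1f}%")
```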
NASA Astrophysics Data System (ADS)
Huespe, A. E.; Oliver, J.; Mora, D. F.
2013-12-01
A finite element methodology for simulating the failure of high performance fiber reinforced concrete composites (HPFRC), with arbitrarily oriented short fibers, is presented. The composite material model is based on a micromorphic approach. Using the framework provided by this theory, the body configuration space is described through two kinematical descriptors. At the structural level, the displacement field represents the standard kinematical descriptor. Additionally, a morphological kinematical descriptor, the micromorphic field, is introduced. It describes the fiber-matrix relative displacement, or slipping mechanism of the bond, observed at the mesoscale level. In the first part of this paper, we summarize the model formulation of the micromorphic approach presented in a previous work by the authors. In the second part, and as the main contribution of the paper, we address specific issues related to the numerical aspects involved in the computational implementation of the model. The developed numerical procedure is based on a mixed finite element technique. The number of dofs per node changes according to the number of fiber bundles simulated in the composite. Then, a specific solution scheme is proposed to solve the variable number of unknowns in the discrete model. The HPFRC composite model takes into account the important effects produced by concrete fracture. A procedure for simulating quasi-brittle fracture is introduced into the model and is described in the paper. The present numerical methodology is assessed by simulating a selected set of experimental tests, which proves its viability and accuracy in capturing a number of mechanical phenomena interacting at the macro- and mesoscale and leading to failure of HPFRC composites.
NASA Astrophysics Data System (ADS)
Leandro, J.; Schumann, A.; Pfister, A.
2016-04-01
Some of the major challenges in modelling rainfall-runoff in urbanised areas are the complex interaction between the sewer system and the overland surface, and the spatial heterogeneity of key urban features. The former requires the sewer network and the system of surface flow paths to be solved simultaneously. The latter is still an unresolved issue because the heterogeneity of runoff formation requires highly detailed information and includes a large variety of feature-specific rainfall-runoff dynamics. This paper presents a methodology for considering the variability of building types and the spatial heterogeneity of land surfaces. The former is achieved by developing a specific conceptual rainfall-runoff model and the latter by defining a fully distributed approach for infiltration processes in urban areas with limited storage capacity based on OpenStreetMap (OSM). The model complexity is increased stepwise by adding components to an existing 2D overland flow model. The different steps are defined as modelling levels. The methodology is applied in a German case study. Results highlight that: (a) spatial heterogeneity of urban features has a medium to high impact on the estimated overland flood-depths, (b) the addition of multiple urban features has a higher cumulative effect due to the dynamic effects simulated by the model, (c) connecting the runoff from buildings to the sewer contributes to the non-linear effects observed on the overland flood-depths, and (d) OSM data is useful in identifying ponding areas (for which infiltration plays a decisive role) and permeable natural surface flow paths (which delay the flood propagation).
NASA Astrophysics Data System (ADS)
Geslin, Pierre-Antoine; Gatti, Riccardo; Devincre, Benoit; Rodney, David
2017-11-01
We propose a framework to study thermally-activated processes in dislocation glide. This approach is based on an implementation of the nudged elastic band method in a nodal mesoscale dislocation dynamics formalism. Special care is paid to develop a variational formulation to ensure convergence to well-defined minimum energy paths. We also propose a methodology to rigorously parametrize the model on atomistic data, including elastic, core and stacking fault contributions. To assess the validity of the model, we investigate the homogeneous nucleation of partial dislocation loops in aluminum, recovering the activation energies and loop shapes obtained with atomistic calculations and extending these calculations to lower applied stresses. The present method is also applied to heterogeneous nucleation on spherical inclusions.
In Memoriam - Marvin L. Wesely.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaffney, J. S.; Environmental Research
2003-06-01
Marvin L. Wesely, senior meteorologist at Argonne National Laboratory, died January 20, 2003, from a rare form of heart cancer. He was an internationally known and highly respected leader in the scientific measurement and modeling of atmospheric boundary layer turbulence and dry deposition of air pollutants. His fundamental contributions to the development of methodologies for formulating dry deposition processes are used in atmospheric and biospheric models applied on all scales, worldwide. His extensive research aimed at finding solutions to such environmental problems as air pollution and global warming resulted in more than 150 published articles. Dr. Wesely was also an editor for the Journal of Applied Meteorology and chief scientist of the atmospheric chemistry program in Washington, DC.
A nanomaterial release model for waste shredding using a Bayesian belief network
NASA Astrophysics Data System (ADS)
Shandilya, Neeraj; Ligthart, Tom; van Voorde, Imelda; Stahlmecke, Burkhard; Clavaguera, Simon; Philippot, Cecile; Ding, Yaobo; Goede, Henk
2018-02-01
The shredding of waste of electrical and electronic equipment (WEEE) and other products incorporating nanomaterials can lead to a substantial release of nanomaterials. Considering the uncertainty, complexity, and scarcity of experimental data on release, we present the development of a Bayesian belief network (BBN) model. This baseline model aims to give a first prediction of the release of nanomaterials (excluding nanofibers) during their mechanical shredding. With a focus on the description of the model development methodology, we characterize nanomaterial release in terms of number, size, mass, and composition of released particles. A sensitivity analysis of the model shows that material-specific parameters, such as the affinity of the nanomaterials to the matrix of the composite and their state of dispersion inside the matrix, can reduce the nanomaterial release by up to 50%. Shredder-specific parameters, such as the number of shafts in a shredder and the input and output size of the material for shredding, can reduce it by up to 98%. The comparison with two experimental test cases shows promising outcomes regarding the prediction capacity of the model. As additional experimental data on nanomaterial release become available, the model is able to further adapt and update risk forecasts. When adapting the model with additional expert beliefs, experts should be selected using criteria such as substantial contribution to the nanomaterial and/or particulate matter release-related scientific literature, the capacity and willingness to contribute to further development of the BBN model, and openness to accepting deviating opinions.
Multi-scaling modelling in financial markets
NASA Astrophysics Data System (ADS)
Liu, Ruipeng; Aste, Tomaso; Di Matteo, T.
2007-12-01
In recent years, a new wave of interest has spurred the application of complexity science to finance, which might provide a guideline for understanding the mechanisms of financial markets, and researchers with different backgrounds have made increasing contributions by introducing new techniques and methodologies. In this paper, Markov-switching multifractal models (MSM) are briefly reviewed and the multi-scaling properties of different financial data are analyzed by computing the scaling exponents by means of the generalized Hurst exponent H(q). In particular, we have considered H(q) for price data, absolute returns and squared returns of different empirical financial time series. We have also computed H(q) for simulated data based on the MSM models with Binomial and Lognormal distributions of the volatility components. The results demonstrate the capacity of the multifractal (MF) models to capture the stylized facts in finance, and the ability of the generalized Hurst exponent approach to detect the scaling features of financial time series.
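To make the scaling analysis above concrete, the sketch below estimates the generalized Hurst exponent H(q) from the scaling relation E[|X(t+τ) − X(t)|^q] ∝ τ^{qH(q)}. It is a minimal Python illustration on synthetic random-walk data with an arbitrarily chosen lag range, not the authors' code or data.

```python
import numpy as np

def generalized_hurst(x, q=2.0, max_lag=20):
    """Estimate H(q) from the scaling of the q-th moment of absolute increments.

    Fits log E[|x(t+tau) - x(t)|^q] against log tau; the slope divided by q
    approximates the generalized Hurst exponent H(q).
    """
    lags = np.arange(2, max_lag + 1)
    moments = [np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(moments), 1)
    return slope / q

# Synthetic example: a Gaussian random walk should give H(q) close to 0.5.
rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(size=5000))
for q in (1.0, 2.0, 3.0):
    print(f"H({q:.0f}) ~ {generalized_hurst(prices, q):.3f}")
```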
NASA Astrophysics Data System (ADS)
Al-Kuhali, K.; Hussain M., I.; Zain Z., M.; Mullenix, P.
2015-05-01
Aim: This paper contributes to the flat panel display industry in terms of aggregate production planning. Methodology: To minimize the total production cost of LCD manufacturing, linear programming was applied. The decision variables are general production costs, additional cost incurred for overtime production, additional cost incurred for subcontracting, inventory carrying cost, backorder costs, and adjustments for changes incurred within labour levels. The model was developed for a manufacturer with several product types, up to a maximum of N, over a total time period of T. Results: An industrial case study based in Malaysia is presented to test and validate the developed linear programming model for aggregate production planning. Conclusion: The developed model fits under stable environmental conditions. Overall, it can be recommended to adapt the proven linear programming model to production planning in the Malaysian flat panel display industry.
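The following sketch illustrates the kind of linear program involved, in a heavily simplified form: a single product over three periods with regular production, overtime, and inventory only (subcontracting, backorders, and labour-level adjustments are omitted), solved with scipy. All cost figures and capacities are hypothetical, not taken from the Malaysian case study.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data for a single product over T = 3 periods.
demand = np.array([100.0, 150.0, 120.0])
T = len(demand)
c_reg, c_ot, c_inv = 10.0, 15.0, 2.0   # unit costs: regular, overtime, holding
cap_reg, cap_ot = 120.0, 40.0          # per-period production capacities

# Decision variables: [r_1..r_T, o_1..o_T, i_1..i_T]
c = np.concatenate([np.full(T, c_reg), np.full(T, c_ot), np.full(T, c_inv)])

# Flow balance per period: r_t + o_t + i_{t-1} - i_t = d_t  (with i_0 = 0)
A_eq = np.zeros((T, 3 * T))
for t in range(T):
    A_eq[t, t] = 1.0            # regular production
    A_eq[t, T + t] = 1.0        # overtime production
    A_eq[t, 2 * T + t] = -1.0   # ending inventory
    if t > 0:
        A_eq[t, 2 * T + t - 1] = 1.0  # inventory carried in from t-1
b_eq = demand

bounds = [(0, cap_reg)] * T + [(0, cap_ot)] * T + [(0, None)] * T
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("total cost:", res.fun)
print("regular:", res.x[:T], "overtime:", res.x[T:2*T], "inventory:", res.x[2*T:])
```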
Multi-factor energy price models and exotic derivatives pricing
NASA Astrophysics Data System (ADS)
Hikspoors, Samuel
The high pace at which many of the world's energy markets have gradually been opened to competition has generated a significant amount of new financial activity. Academics and practitioners alike have recently started to develop the tools of energy derivatives pricing/hedging as a quantitative topic of its own. The energy contract structures as well as their underlying asset properties set the energy risk management industry apart from its more standard equity and fixed income counterparts. This thesis contributes to these broad market developments by participating in the advance of the mathematical tools aimed at a better theory of energy contingent claim pricing/hedging. We propose many realistic two-factor and three-factor models for spot and forward price processes that generalize some well known and standard modeling assumptions. We develop the associated pricing methodologies and propose stable calibration algorithms that motivate the application of the relevant modeling schemes.
Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes
NASA Astrophysics Data System (ADS)
Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.
2015-12-01
Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete models of glacial isostatic adjustment (GIA), probabilistically computing the most likely GIA model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.
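As a rough illustration of the second technique, the sketch below fits a Gaussian process with an RBF-plus-noise kernel to a synthetic, tide-gauge-like annual series and reads off a smooth trend with posterior uncertainty. It is a one-dimensional toy with arbitrary kernel hyperparameters and noise levels; the study's actual prior covariance, derived from physics-based GIA model geometries, is not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic annual "tide gauge" record: a slow trend plus observational noise.
rng = np.random.default_rng(1)
years = np.arange(1900, 2001).reshape(-1, 1)
true_trend = 0.0017 * (years.ravel() - 1900)            # ~1.7 mm/yr, in metres
obs = true_trend + rng.normal(scale=0.01, size=years.size)

# RBF kernel for the smooth signal, WhiteKernel for the observational noise.
kernel = 1.0 * RBF(length_scale=30.0) + WhiteKernel(noise_level=1e-4)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(years, obs)

mean, std = gpr.predict(years, return_std=True)
print("estimated 20th-century rise (m):", mean[-1] - mean[0])
print("posterior std at year 2000 (m):", std[-1])
```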
Algebra for Enterprise Ontology: towards analysis and synthesis of enterprise models
NASA Astrophysics Data System (ADS)
Suga, Tetsuya; Iijima, Junichi
2018-03-01
Enterprise modeling methodologies have made enterprises more likely to be the object of systems engineering rather than craftsmanship. However, the current state of research in enterprise modeling methodologies lacks investigations of the mathematical background embedded in these methodologies. Abstract algebra, a broad subfield of mathematics concerned with the study of algebraic structures, may provide interesting implications in both theory and practice. Therefore, this research takes up the challenge of establishing an algebraic structure for one aspect model proposed in Design & Engineering Methodology for Organizations (DEMO), a major enterprise modeling methodology in the spotlight as a modeling principle for capturing the skeleton of enterprises when developing enterprise information systems. The results show that the aspect model behaves well in the sense of algebraic operations and indeed constitutes a Boolean algebra. This article also discusses comparisons with other modeling languages and suggests future work.
Applications of information theory, genetic algorithms, and neural models to predict oil flow
NASA Astrophysics Data System (ADS)
Ludwig, Oswaldo; Nunes, Urbano; Araújo, Rui; Schnitman, Leizer; Lepikson, Herman Augusto
2009-07-01
This work introduces a new information-theoretic methodology for choosing variables and their time lags in a prediction setting, particularly when neural networks are used in non-linear modeling. The first contribution of this work is the Cross Entropy Function (XEF), proposed to select input variables and their lags in order to compose the input vector of black-box prediction models. The proposed XEF method is more appropriate than the usually applied Cross Correlation Function (XCF) when the relationship among the input and output signals comes from a non-linear dynamic system. The second contribution is a method that minimizes the Joint Conditional Entropy (JCE) between the input and output variables by means of a Genetic Algorithm (GA). The aim is to take into account the dependence among the input variables when selecting the most appropriate set of inputs for a prediction problem. In short, these methods can be used to assist the selection of input training data that have the necessary information to predict the target data. The proposed methods are applied to a petroleum engineering problem: predicting oil production. Experimental results obtained with a real-world dataset are presented, demonstrating the feasibility and effectiveness of the method.
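As an information-theoretic stand-in for the lag-selection idea (not the authors' exact XEF or JCE formulation), the sketch below ranks candidate input lags by their estimated mutual information with the target using scikit-learn; the synthetic system and lag range are assumptions chosen for illustration.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

# Synthetic nonlinear dynamic system: y(t) depends nonlinearly on x(t - 3).
rng = np.random.default_rng(2)
T = 2100
x = rng.normal(size=T)
y = np.zeros(T)
y[3:] = np.tanh(x[:-3]) ** 2 + 0.1 * rng.normal(size=T - 3)

max_lag = 10
# Column l-1 holds x delayed by l samples, aligned with y[max_lag:].
lagged = np.column_stack([x[max_lag - lag:T - lag] for lag in range(1, max_lag + 1)])
target = y[max_lag:]

mi = mutual_info_regression(lagged, target, random_state=0)
best = np.argsort(mi)[::-1] + 1  # candidate lags ranked by information content
print("mutual information for lags 1..10:", np.round(mi, 3))
print("most informative lag:", best[0])   # expected: 3
```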
Mutel, Christopher L; de Baan, Laura; Hellweg, Stefanie
2013-06-04
Comprehensive sensitivity analysis is a significant tool to interpret and improve life cycle assessment (LCA) models, but is rarely performed. Sensitivity analysis will increase in importance as inventory databases become regionalized, increasing the number of system parameters, and parametrized, adding complexity through variables and nonlinear formulas. We propose and implement a new two-step approach to sensitivity analysis. First, we identify parameters with high global sensitivities for further examination and analysis with a screening step, the method of elementary effects. Second, the more computationally intensive contribution to variance test is used to quantify the relative importance of these parameters. The two-step sensitivity test is illustrated on a regionalized, nonlinear case study of the biodiversity impacts from land use of cocoa production, including a worldwide cocoa products trade model. Our simplified trade model can be used for transformable commodities where one is assessing market shares that vary over time. In the case study, the highly uncertain characterization factors for the Ivory Coast and Ghana contributed more than 50% of variance for almost all countries and years examined. The two-step sensitivity test allows for the interpretation, understanding, and improvement of large, complex, and nonlinear LCA systems.
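A minimal sketch of the screening step is given below: a simplified, radial one-at-a-time variant of the method of elementary effects applied to a toy three-parameter model. The model, step size, and number of trajectories are arbitrary assumptions; the LCA case study and the subsequent contribution-to-variance test are not reproduced.

```python
import numpy as np

def toy_model(x):
    """Stand-in nonlinear model with three parameters on [0, 1]."""
    return x[0] ** 2 + 5.0 * x[1] + 0.1 * x[2] + 2.0 * x[0] * x[1]

def elementary_effects(model, k=3, trajectories=50, delta=0.25, seed=0):
    """Screening via elementary effects: mu* (mean absolute effect) and sigma."""
    rng = np.random.default_rng(seed)
    effects = np.zeros((trajectories, k))
    for r in range(trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # random base point
        for i in range(k):
            x_step = x.copy()
            x_step[i] += delta                       # perturb one factor at a time
            effects[r, i] = (model(x_step) - model(x)) / delta
    return np.abs(effects).mean(axis=0), effects.std(axis=0)

mu_star, sigma = elementary_effects(toy_model)
for i, (m, s) in enumerate(zip(mu_star, sigma)):
    print(f"parameter {i}: mu* = {m:.2f}, sigma = {s:.2f}")
```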
Why Medical Informatics (still) Needs Cognitive and Social Sciences.
Declerck, G; Aimé, X
2013-01-01
To summarize current excellent medical informatics research in the field of human factors and organizational issues. Using PubMed, a total of 3,024 papers were selected from 17 journals. The papers were evaluated on the basis of their title, keywords, and abstract, using several exclusion and inclusion criteria. 15 preselected papers were carefully evaluated by six referees using a standard evaluation grid. Six best papers were selected exemplifying the central role cognitive and social sciences can play in medical informatics research. Among other contributions, those studies: (i) make use of the distributed cognition paradigm to model and understand clinical care situations; (ii) take into account organizational issues to analyse the impact of HIT on information exchange and coordination processes; (iii) illustrate how models and empirical data from cognitive psychology can be used in medical informatics; and (iv) highlight the need of qualitative studies to analyze the unexpected side effects of HIT on cognitive and work processes. The selected papers demonstrate that paradigms, methodologies, models, and results from cognitive and social sciences can help to bridge the gap between HIT and end users, and contribute to limit adoption failures that are reported regularly.
BioSPICE: access to the most current computational tools for biologists.
Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark
2003-01-01
The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories, integrated under the BioSPICE Dashboard, and a methodology for continued software integration. These contributed software modules are brought together by the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.
Novel methodologies in marine fish larval nutrition.
Conceição, Luis E C; Aragão, Cláudia; Richard, Nadège; Engrola, Sofia; Gavaia, Paulo; Mira, Sara; Dias, Jorge
2010-03-01
Major gaps in knowledge on fish larval nutritional requirements still remain. Small larval size, and difficulties in acceptance of inert microdiets, makes progress slow and cumbersome. This lack of knowledge in fish larval nutritional requirements is one of the causes of high mortalities and quality problems commonly observed in marine larviculture. In recent years, several novel methodologies have contributed to significant progress in fish larval nutrition. Others are emerging and are likely to bring further insight into larval nutritional physiology and requirements. This paper reviews a range of new tools and some examples of their present use, as well as potential future applications in the study of fish larvae nutrition. Tube-feeding and incorporation into Artemia of (14)C-amino acids and lipids allowed studying Artemia intake, digestion and absorption and utilisation of these nutrients. Diet selection by fish larvae has been studied with diets containing different natural stable isotope signatures or diets where different rare metal oxides were added. Mechanistic modelling has been used as a tool to integrate existing knowledge and reveal gaps, and also to better understand results obtained in tracer studies. Population genomics may assist in assessing genotype effects on nutritional requirements, by using progeny testing in fish reared in the same tanks, and also in identifying QTLs for larval stages. Functional genomics and proteomics enable the study of gene and protein expression under various dietary conditions, and thereby identify the metabolic pathways which are affected by a given nutrient. Promising results were obtained using the metabolic programming concept in early life to facilitate utilisation of certain nutrients at later stages. All together, these methodologies have made decisive contributions, and are expected to do even more in the near future, to build a knowledge basis for development of optimised diets and feeding regimes for different species of larval fish.
Shi, Chune; Fernando, H J S; Hyde, Peter
2012-02-01
Phoenix, Arizona, has been an ozone nonattainment area for the past several years and it remains so. Mitigation strategies call for improved modeling methodologies as well as understanding of ozone formation and destruction mechanisms during seasons of high ozone events. To this end, the efficacy of lateral boundary conditions (LBCs) based on satellite measurements (adjusted-LBCs) was investigated, vis-à-vis the default-LBCs, for improving the predictions of the Models-3/CMAQ photochemical air quality modeling system. The model evaluations were conducted using hourly ground-level ozone and NO(2) concentrations as well as tropospheric NO(2) columns and ozone concentrations in the middle to upper troposphere, with the 'design' periods being June and July of 2006. Both included high ozone episodes, but the June (pre-monsoon) period was characterized by local thermal circulation whereas the July (monsoon) period was dominated by synoptic influence. Overall, improved simulations were noted for adjusted-LBC runs for ozone concentrations both at the ground level and in the middle to upper troposphere, based on EPA-recommended model performance metrics. The probability of detection (POD) of ozone exceedances (>75 ppb, 8-h averages) for the entire domain increased from 20.8% for the default-LBC run to 33.7% for the adjusted-LBC run. A process analysis of the modeling results revealed that ozone within the PBL during the bulk of the pre-monsoon season is contributed by local photochemistry and vertical advection, while the contributions of horizontal and vertical advection are comparable in the monsoon season. The process analysis with adjusted-LBC runs confirms the contribution of vertical advection to episodic high ozone days, and hence elucidates the importance of improving the predictability of upper levels with improved LBCs. Copyright © 2011 Elsevier B.V. All rights reserved.
Bonacim, Carlos Alberto Grespan; Araújo, Adriana Maria Procópio de
2010-06-01
This paper contributes to public institutions by adapting a performance evaluation tool originally designed for private companies. The objective is to demonstrate how the impact of educational activity might be measured in the economic value added to society by a public university hospital. The paper is divided into four parts, in addition to the introductory and methodological sections and the final remarks. First, the hospital sector is explained, specifically in the context of public university hospitals. Then, the definitions, nature, and measurement of intellectual capital are presented, followed by a presentation of the main economic performance evaluation models. Finally, an adapted model is presented under the value-based management approach, considering adjustments to the return and the respective investment measures and showing the impacts of intellectual capital management and educational activity on the economic results of these institutions. The study was developed with a methodology supported by bibliographical research, using a comparative method in the descriptive modality. Finally, the importance of accountability to society regarding the use of public resources is highlighted, along with how this study can contribute in this respect.
NASA Astrophysics Data System (ADS)
Pavlak, Gregory S.
Building energy use is a significant contributing factor to growing worldwide energy demands. In pursuit of a sustainable energy future, commercial building operations must be intelligently integrated with the electric system to increase efficiency and enable renewable generation. Toward this end, a model-based methodology was developed to estimate the capability of commercial buildings to participate in frequency regulation ancillary service markets. This methodology was integrated into a supervisory model predictive controller to optimize building operation in consideration of energy prices, demand charges, and ancillary service revenue. The supervisory control problem was extended to building portfolios to evaluate opportunities for synergistic effect among multiple, centrally-optimized buildings. Simulation studies performed showed that the multi-market optimization was able to determine appropriate opportunities for buildings to provide frequency regulation. Total savings were increased by up to thirteen percentage points, depending on the simulation case. Furthermore, optimizing buildings as a portfolio achieved up to seven additional percentage points of savings, depending on the case. Enhanced energy and cost savings opportunities were observed by taking the novel perspective of optimizing building portfolios in multiple grid markets, motivating future pursuits of advanced control paradigms that enable a more intelligent electric grid.
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
Healthcare Information Systems for the epidemiologic surveillance within the community.
Diomidous, Marianna; Pistolis, John; Mechili, Aggelos; Kolokathi, Aikaterini; Zimeras, Stelios
2013-01-01
Public health and health care are important issues for developing countries, and access to health care is a significant factor that contributes to a healthy population. In response to these issues, the World Health Organization (WHO) has been working on the development of methods and models for measuring physical accessibility to health care using several layers of information integrated in a GIS. This paper describes the methodological approach for the development of a real-time electronic health record, based on statistical and geographic information, for the identification of various diseases and accidents that can occur in a specific place.
Tiruta-Barna, Ligia; Fantozzi-Merle, Catherine; de Brauer, Christine; Barna, Radu
2006-11-16
The aim of this paper is the investigation of the leaching behaviour of different porous materials containing organic pollutants (PAH: naphthalene and phenanthrene). The assessment methodology for the long-term leaching behaviour of inorganic materials was extended to cement-solidified organic pollutants. Based on a scenario approach considering environmental factors and matrix and pollutant specificities, the applied methodology is composed of adapted equilibrium and dynamic leaching tests. The contributions of different physical and chemical mechanisms were identified and the leaching behaviour was modelled. The physical parameters of the analysed reference and polluted materials are similar. A difference in the pore size distribution appears for higher naphthalene content. The solubility of the PAH contained in the material is affected by the ionic strength and by the presence of a co-solvent; the solution pH does not influence PAH solubility. The solubility of the major mineral species is influenced neither by the presence of the two PAHs nor by the presence of methanol as a co-solvent in the range of the tested material compositions. In the case of the leaching of a monolithic material, the main transport mechanism is diffusion in the porous system. For both mineral and organic species we observed at least two dynamic domains. At the beginning of the leaching process the released flux is due to surface dissolution and to the diffusion of the main quantity dissolved in the initial pore solution. The second period is governed by a stationary regime between dissolution in pore water and diffusion. The model, coupling transport and chemical phenomena in the pore solution, at the monolith surface and in the leachate, satisfactorily simulates the release of both mineral and organic species.
[Financing, organization, costs and services performance of the Argentinean health sub-systems].
Yavich, Natalia; Báscolo, Ernesto Pablo; Haggerty, Jeannie
2016-01-01
To analyze the relationship of health system financing and services organization models with costs and health services performance in each of Rosario's health sub-systems. The financing and organization models were characterized using secondary data. Costs were calculated using the WHO/SHA methodology. Healthcare quality was measured by a household survey (n=822). Public subsystem: Vertically integrated funding and primary healthcare as a leading strategy to provide services produced low costs and individual-oriented healthcare, but with weak accessibility conditions and comprehensiveness. Private subsystem: Contractual integration and weak regulatory and coordination mechanisms produced effects opposed to those of the public sub-system. Social security: Contractual integration and strong regulatory and coordination mechanisms contributed to intermediate costs and overall high performance. Each subsystem's financing and services organization model had a strong and heterogeneous influence on costs and health services performance.
Flight test planning and parameter extraction for rotorcraft system identification
NASA Technical Reports Server (NTRS)
Wang, J. C.; Demiroz, M. Y.; Talbot, P. D.
1986-01-01
The present study is concerned with the mathematical modelling of aircraft dynamics on the basis of an investigation conducted with the aid of the Rotor System Research Aircraft (RSRA). The particular characteristics of the RSRA make it possible to investigate aircraft properties which cannot be readily studied elsewhere, for example in the wind tunnel. The main objective of the experiment was to develop an improved understanding of the physics of rotor flapping dynamics and rotor loads in maneuvers. The approach employed is based on the use of parameter identification (PID) methodology applied to helicopters. A better understanding of the contribution of the main rotor to the overall aircraft forces and moments is also to be obtained. Attention is given to the mathematical model of a rotorcraft system, an integrated identification method, flight data processing, and the identification of RSRA mathematical models.
Modeling Common Cause Failures of Thrusters on ISS Visiting Vehicles
NASA Technical Reports Server (NTRS)
Haught, Megan; Duncan, Gary
2014-01-01
This paper discusses the methodology used to model common cause failures of thrusters on the International Space Station (ISS) Visiting Vehicles. The ISS Visiting Vehicles each have as many as 32 thrusters, whose redundancy and similar design make them susceptible to common cause failures. The Global Alpha Model (as described in NUREG/CR-5485) can be used to represent the system common cause contribution, but NUREG/CR-5496 supplies global alpha parameters for groups only up to size six. Because of the large number of redundant thrusters on each vehicle, regression is used to determine parameter values for groups of size larger than six. An additional challenge is that Visiting Vehicle thruster failures must occur in specific combinations in order to fail the propulsion system; not all failure groups of a certain size are critical.
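The regression-based extrapolation can be sketched as follows: fit a log-linear trend to alpha factors for group sizes up to six and extrapolate to a larger redundancy group, renormalizing so the factors sum to one. The alpha values below are illustrative placeholders, not the NUREG/CR-5496 parameters, and the simple log-linear form is an assumption rather than the paper's actual regression.

```python
import numpy as np

# Hypothetical (illustrative, not NUREG/CR-5496) global alpha factors for a
# common cause group of size 6: alpha_k = probability that a failure event
# involves exactly k components.
k_known = np.arange(1, 7)
alpha_known = np.array([9.5e-1, 3.0e-2, 1.0e-2, 5.0e-3, 3.0e-3, 2.0e-3])

# Fit log(alpha_k) as a linear function of k for k >= 2 (k = 1 dominates and
# is kept as-is), then extrapolate to a larger redundancy group.
coef = np.polyfit(k_known[1:], np.log(alpha_known[1:]), 1)

k_target = np.arange(1, 33)                    # e.g. 32 redundant thrusters
alpha_ext = np.exp(np.polyval(coef, k_target))
alpha_ext[0] = alpha_known[0]
alpha_ext /= alpha_ext.sum()                   # renormalize so alphas sum to 1

print("extrapolated alpha for k = 10:", alpha_ext[9])
print("sum of alphas:", alpha_ext.sum())
```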
Pedroza, Mesias; Schneider, Daniel J.; Karmouty-Quintana, Harry; Coote, Julie; Shaw, Stevan; Corrigan, Rebecca; Molina, Jose G.; Alcorn, Joseph L.; Galas, David; Gelinas, Richard; Blackburn, Michael R.
2011-01-01
Background Chronic lung diseases are the third leading cause of death in the United States due in part to an incomplete understanding of pathways that govern the progressive tissue remodeling that occurs in these disorders. Adenosine is elevated in the lungs of animal models and humans with chronic lung disease where it promotes air-space destruction and fibrosis. Adenosine signaling increases the production of the pro-fibrotic cytokine interleukin-6 (IL-6). Based on these observations, we hypothesized that IL-6 signaling contributes to tissue destruction and remodeling in a model of chronic lung disease where adenosine levels are elevated. Methodology/Principal Findings We tested this hypothesis by neutralizing or genetically removing IL-6 in adenosine deaminase (ADA)-deficient mice that develop adenosine dependent pulmonary inflammation and remodeling. Results demonstrated that both pharmacologic blockade and genetic removal of IL-6 attenuated pulmonary inflammation, remodeling and fibrosis in this model. The pursuit of mechanisms involved revealed adenosine and IL-6 dependent activation of STAT-3 in airway epithelial cells. Conclusions/Significance These findings demonstrate that adenosine enhances IL-6 signaling pathways to promote aspects of chronic lung disease. This suggests that blocking IL-6 signaling during chronic stages of disease may provide benefit in halting remodeling processes such as fibrosis and air-space destruction. PMID:21799929
Do Joint Fighter Programs Save Money? Technical Appendixes on Methodology
2013-01-01
Genetic determination of height-mediated mate choice.
Tenesa, Albert; Rawlik, Konrad; Navarro, Pau; Canela-Xandri, Oriol
2016-01-19
Numerous studies have reported positive correlations among couples for height. This suggests that humans find individuals of similar height attractive. However, the answer to whether the choice of a mate with a similar phenotype is genetically or environmentally determined has been elusive. Here we provide an estimate of the genetic contribution to height choice in mates in 13,068 genotyped couples. Using a mixed linear model we show that 4.1% of the variation in the mate height choice is determined by a person's own genotype, as expected in a model where one's height determines the choice of mate height. Furthermore, the genotype of an individual predicts their partners' height in an independent dataset of 15,437 individuals with 13% accuracy, which is 64% of the theoretical maximum achievable with a heritability of 0.041. Theoretical predictions suggest that approximately 5% of the heritability of height is due to the positive covariance between allelic effects at different loci, which is caused by assortative mating. Hence, the coupling of alleles with similar effects could substantially contribute to the missing heritability of height. These estimates provide new insight into the mechanisms that govern mate choice in humans and warrant the search for the genetic causes of choice of mate height. They have important methodological implications and contribute to the missing heritability debate.
3D modeling of a dolerite intrusion from the photogrammetric and geophysical data integration.
NASA Astrophysics Data System (ADS)
Duarte, João; Machadinho, Ana; Figueiredo, Fernando; Mira, Maria
2015-04-01
The aim of this study is to create a methodology based on the integration of data obtained from various available technologies, allowing a credible and complete evaluation of rock masses. The particular case studied is a dolerite intrusion hosting an aggregate quarry belonging to Jobasaltos - Extracção e Britagem, S.A. The dolerite intrusion is situated in the volcanic complex of Serra de Todo-o-Mundo, Casais Gaiola, intruded into Jurassic sandstones. The integration of surface and subsurface mapping, obtained by UAV (drone) technology and geophysical surveys (Electromagnetic Method - TEM 48 FAST), allows the construction of 2D and 3D models of the study site. The combination of the 3D point clouds produced from two distinct processes, the modeling of photogrammetric and geophysical data, will be the basis for the construction of a single combined model. The rock mass is thus viewed from an integral perspective, with its development visible both above and below the surface. The presentation of 2D and 3D models will give a perspective of structures, fracturing, lithology and their spatial correlations, contributing to better local knowledge as well as to assessing the site's potential for the intended purpose. From these local models it will be possible to characterize and quantify the geological structures. These models will also be important as a tool to assist in the analysis and drafting of regional models. The qualitative improvement in geological/structural modeling seeks to improve the characterization/cost ratio in the prospecting phase, improving the investment/benefit ratio. This methodology helps to assess more accurately the economic viability of such projects.
[Customer and patient satisfaction. An appropriate management tool in hospitals?].
Pawils, S; Trojan, A; Nickel, S; Bleich, C
2012-09-01
Recently, the concept of patient satisfaction has been established as an essential part of the quality management of hospitals. Despite the concept's lack of theoretical and methodological foundations, patient surveys on subjective hospital experiences contribute immensely to the improvement of hospitals. What needs to be considered critically in this context is the concept of customer satisfaction for patients, the theoretical integration of empirical results, the reduction of false satisfaction indications and the application of risk-adjusted versus naïve benchmarking of data. This paper aims to contribute to the theoretical discussion of the topic and to build a basis for planning methodologically sound patient surveys.
Design and Customization of Telemedicine Systems
Martínez-Alcalá, Claudia I.; Muñoz, Mirna; Monguet-Fierro, Josep
2013-01-01
In recent years, advances in information and communication technology (ICT) have resulted in the development of systems and applications aimed at supporting rehabilitation therapy that contribute to enriching patients' quality of life. This work is focused on the improvement of telemedicine systems with the purpose of customizing therapies according to the profile and disability of patients. To this end, as a salient contribution, this work proposes the adoption of the user-centered design (UCD) methodology for the design and development of telemedicine systems in order to support the rehabilitation of patients with neurological disorders. Finally, some applications of the UCD methodology in the telemedicine field are presented as a proof of concept. PMID:23762191
Hartz, Susanne; John, Jürgen
2008-01-01
Economic evaluation as an integral part of health technology assessment is today mostly applied to established technologies. Evaluating healthcare innovations in their early stages of development has recently attracted attention. Although it offers several benefits, it also holds methodological challenges. The aim of our study was to investigate the possible contributions of economic evaluation to industry's decision making early in product development and to confront the results with the actual use of early data in economic assessments. We conducted a literature search to detect methodological contributions as well as economic evaluations that used data from early phases of product development. Economic analysis can be beneficially used in early phases of product development for various purposes, including early market assessment, R&D portfolio management, and first estimations of pricing and reimbursement scenarios. Analytical tools available for these purposes have been identified. Numerous empirical works were detected, but most do not disclose any concrete decision context and could not be directly matched with the suggested applications. Industry can benefit from starting economic evaluation early in product development in several ways. Empirical evidence suggests that there is still potential left unused.
The present study investigated whether combining targeted analytical chemistry methods with unsupervised, data-rich methodologies (i.e., transcriptomics) can be utilized to evaluate the relative contributions of wastewater treatment plant (WWTP) effluents to biological effects. The...
NASA Astrophysics Data System (ADS)
Jayaweera, H. M. P. C.; Muhtaroğlu, Ali
2016-11-01
A novel model-based methodology is presented to determine optimal device parameters for a fully integrated ultra-low-voltage DC-DC converter for energy harvesting applications. The proposed model makes it feasible to determine the most efficient number of charge pump stages needed to fulfill the voltage requirement of the energy harvesting application. The proposed DC-DC converter power consumption model enables the analytical derivation of the charge pump efficiency when utilized together with the known LC tank oscillator behavior under resonant conditions and the voltage step-up characteristics of the cross-coupled charge pump topology. The model has been verified using a circuit simulator. The system optimized through the established model achieves more than 40% maximum efficiency, yielding a 0.45 V output with a single stage, 0.75 V with two stages, and 0.9 V with three stages for 2.5 kΩ, 3.5 kΩ and 5 kΩ loads, respectively, with a 0.2 V input.
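A minimal sketch of the stage-selection idea, assuming an idealized textbook charge-pump relation Vout ≈ (N+1)·Vin − N·Iload/(f·C) and made-up component values (not the paper's device-level power consumption model), is shown below; it picks the smallest stage count meeting a target voltage and reports a crude ideal efficiency.

```python
# Idealized cross-coupled charge pump: V_out ~ (N + 1) * V_in - N * I_load / (f * C).
# Hypothetical values; the paper's model accounts for device-level losses.
V_in = 0.2        # harvester output voltage (V)
f = 50e6          # oscillator frequency (Hz)
C = 20e-12        # per-stage pump capacitance (F)
I_load = 20e-6    # load current (A)
V_target = 0.75   # required supply voltage (V)

def v_out(n_stages):
    return (n_stages + 1) * V_in - n_stages * I_load / (f * C)

def efficiency(n_stages):
    # An ideal pump draws roughly (N + 1) * I_load from the source.
    return v_out(n_stages) * I_load / (V_in * (n_stages + 1) * I_load)

for n in range(1, 8):
    print(f"N={n}: Vout={v_out(n):.3f} V, ideal eff={100 * efficiency(n):.1f}%")

n_min = next(n for n in range(1, 20) if v_out(n) >= V_target)
print("minimum stages meeting target:", n_min)
```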
The Influence of Consumer Goals and Marketing Activities on Product Bundling
NASA Astrophysics Data System (ADS)
Haijun, Wang
Upon entering a store, consumers are faced with the questions of whether to buy, what to buy, and how much to buy. Consumers include products from different categories in their decision process, and product categories can be related in different ways. Product bundling is a process that involves the choice of at least two non-substitutable items. This research focuses on consumers' explicit product bundling activity at the point of sale. It takes the retailer's perspective and therefore leaves out consumers' brand choice decisions, concentrating on purchase incidence and quantity. Building on existing models in the literature, we integrate behavioural choice analysis and predictive choice modelling through the underlying behavioural models, known as random utility maximization (RUM) models. The methodological contribution of this research lies in combining a nested logit choice model with a latent variable factor model. We point out several limitations for both theory and practice at the end.
Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.
2009-01-01
The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
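The sketch below illustrates the underlying idea in a stand-alone form, assuming a toy two-input raster model, hypothetical parameter ranges, and a crude variance-partitioning step; it uses scipy's Latin Hypercube sampler and is not REPTool's actual Python packages or its ArcGIS integration.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(3)
rows, cols, n_samples = 50, 50, 500

# Toy raster model: recharge = a * precip + b * slope (arbitrary coefficients).
precip = rng.uniform(400.0, 900.0, size=(rows, cols))
slope = rng.uniform(0.0, 30.0, size=(rows, cols))
b = -2.0                                                  # treated as known

# Latin Hypercube samples for two uncertain sources: coefficient a and a
# spatially invariant additive error on the precipitation raster.
sampler = qmc.LatinHypercube(d=2, seed=3)
u = sampler.random(n_samples)
a_samples = qmc.scale(u[:, [0]], 0.08, 0.12).ravel()      # uncertain coefficient
precip_err = qmc.scale(u[:, [1]], -50.0, 50.0).ravel()    # raster error, same units

outputs = np.stack([a * (precip + e) + b * slope
                    for a, e in zip(a_samples, precip_err)])

cell_std = outputs.std(axis=0)               # per-cell output uncertainty
print("mean per-cell std:", cell_std.mean())

# Crude relative variance contribution: variance lost when one source is fixed.
var_total = outputs.var(axis=0).mean()
var_fix_a = np.stack([0.10 * (precip + e) + b * slope for e in precip_err]).var(axis=0).mean()
var_fix_e = np.stack([a * precip + b * slope for a in a_samples]).var(axis=0).mean()
print("share from coefficient a :", 1 - var_fix_a / var_total)
print("share from precip error  :", 1 - var_fix_e / var_total)
```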
ERIC Educational Resources Information Center
Zembylas, Michalinos
2017-01-01
This paper follows recent debates on the ontological turn in the social sciences and humanities to exemplify how this turn creates important openings of methodological and political potential in education. In particular, the paper makes an attempt to show two things: first, the new questions and possibilities that are opened from explicitly…
ERIC Educational Resources Information Center
Karabenick, Stuart A.; Zusho, Akane
2015-01-01
We provide a conceptual commentary on the articles in this special issue, first by describing the unique features of each study, focusing on what we consider to be their theoretical and methodological contributions, and then by highlighting significant crosscutting themes and future directions in the study of SRL. Specifically, we define SRL to be…
Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates
ERIC Educational Resources Information Center
Im, Piljae
2009-01-01
A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge easily to evaluate accurately energy efficient measures for K-5 schools, which would contribute to the…
ERIC Educational Resources Information Center
Bonometti, Patrizia
2012-01-01
Purpose: The aim of this contribution is to describe a new complexity-science-based approach for improving safety, quality and efficiency and the way it was implemented by TenarisDalmine. Design/methodology/approach: This methodology is called "a safety-building community". It consists of a safety-behaviour social self-construction…
ERIC Educational Resources Information Center
Hunleth, Jean
2011-01-01
By taking a reflexive approach to research methodology, this article contributes to discussions on power dynamics and knowledge production in the social studies of children. The author describes and analyzes three research methods that she used with children--drawing, child-led tape-recording and focus group discussions. These methods were carried…
Daniel R. Williams; Michael E. Patterson
2007-01-01
Place ideas in natural resource management have grown in recent years. But with that growth have come greater complexity and diversity in thinking and mounting confusion about the ontological and epistemological assumptions underlying any specific investigation. Beckley et al. (2007) contribute to place research by proposing a new methodological approach to analyzing...
PDF-based heterogeneous multiscale filtration model.
Gong, Jian; Rutland, Christopher J
2015-04-21
Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
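A schematic of the central idea, assuming a classical packed-bed unit-collector expression, a toy single-collector efficiency law, and made-up lognormal pore-size parameters (none taken from the paper), is sketched below: the filter-scale efficiency is computed once with a single mean collector size and once as a PDF-weighted average over collector sizes.

```python
import numpy as np

# Schematic PDF-weighted filtration estimate (not the published HMF model).
# Classical packed-bed result for one collector size d_c:
#   E(d_c) = 1 - exp(-3 * (1 - eps) * eta * w / (2 * eps * d_c))
# The heterogeneous estimate averages E over a lognormal collector-size PDF
# instead of using a single mean collector size.
eps = 0.5          # wall porosity (assumed)
w = 300e-6         # wall thickness in metres (assumed)

def eta_single(d_c, d_p=100e-9):
    """Toy single-collector efficiency, decreasing with collector size."""
    return 5e-4 * (d_p / d_c) ** 0.5

def efficiency(d_c):
    return 1.0 - np.exp(-3.0 * (1.0 - eps) * eta_single(d_c) * w / (2.0 * eps * d_c))

# Assumed lognormal collector-size PDF (median 15 um, geometric sigma ~ 1.5).
d = np.linspace(2e-6, 80e-6, 2000)
dd = d[1] - d[0]
pdf = np.exp(-0.5 * (np.log(d / 15e-6) / np.log(1.5)) ** 2) / (d * np.log(1.5) * np.sqrt(2 * np.pi))
pdf /= np.sum(pdf) * dd

E_mean_model = efficiency(np.sum(d * pdf) * dd)       # classical mean-size model
E_pdf_weighted = np.sum(efficiency(d) * pdf) * dd     # PDF-weighted average
print(f"mean-size model: {E_mean_model:.4f}, PDF-weighted: {E_pdf_weighted:.4f}")
```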
Overview of the Aeroelastic Prediction Workshop
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Chwalowski, Pawel; Florance, Jennifer P.; Wieseman, Carol D.; Schuster, David M.; Perry, Raleigh B.
2013-01-01
The Aeroelastic Prediction Workshop brought together an international community of computational fluid dynamicists as a step in defining the state of the art in computational aeroelasticity. This workshop's technical focus was the prediction of unsteady pressure distributions resulting from forced motion, benchmarking the results first using unforced system data. The most challenging aspects of the physics were identified as capturing oscillatory shock behavior, dynamic shock-induced separated flow, and tunnel wall boundary layer influences. The majority of the participants used unsteady Reynolds-averaged Navier-Stokes codes. These codes were exercised at transonic Mach numbers for three configurations and comparisons were made with existing experimental data. Substantial variations were observed among the computational solutions as well as differences relative to the experimental data. Contributing issues to these differences include wall effects and wall modeling, non-standardized convergence criteria, inclusion of static aeroelastic deflection, methodology for oscillatory solutions, and post-processing methods. Contributing issues pertaining principally to the experimental data sets include the position of the model relative to the tunnel wall, splitter plate size, wind tunnel expansion slot configuration, spacing and location of pressure instrumentation, and data processing methods.
An Implementing Strategy for Improving Wildland Fire Environmental Literacy
NASA Astrophysics Data System (ADS)
McCalla, M. R.; Andrus, D.; Barnett, K.
2007-12-01
Wildland fire is any planned or unplanned fire which occurs in wildland ecosystems. Wildland fires affect millions of acres annually in the U.S. An average of 5.4 million acres a year were burned in the U.S. between 1995 and 2004, approximately 142 percent of the average burned area between 1984 and 1994. In 2005 alone, Federal agencies spent nearly $1 billion on fire suppression and state and local agencies contributed millions more. Many Americans prefer to live and vacation in relatively remote surroundings, (i.e., woods and rangelands). These choices offer many benefits, but they also present significant risks. Most of North America is fire-prone and every day developed areas and home sites are extending further into natural wildlands, which increases the chances of catastrophic fire. In addition, an abundance of accumulated biomass in forests and rangelands and persistent drought conditions are contributing to larger, costlier wildland fires. To effectively prevent, manage, suppress, respond to, and recover from wildland fires, fire managers, and other communities which are impacted by wildland fires (e.g., the business community; healthcare providers; federal, state, and local policymakers; the media; the public, etc.) need timely, accurate, and detailed wildland fire weather and climate information to support their decision-making activities. But what are the wildland fire weather and climate data, products, and information, as well as information dissemination technologies, needed to reach out and promote wildland fire environmental literacy in these communities? The Office of the Federal Coordinator for Meteorological Services and Supporting Research (OFCM) conducted a comprehensive review and assessment of weather and climate needs of providers and users in their wildland fire and fuels management activities. The assessment has nine focus areas, one of which is environmental literacy (e.g., education, training, outreach, partnering, and collaboration). The OFCM model for promoting wildland fire environmental literacy, the model's component parts, as well as an implementing strategy to execute the model will be presented. That is, the presentation will lay out the framework and methodology which the OFCM used to systematically define the wildland fire weather and climate education and outreach needs through interdepartmental collaboration within the OFCM coordinating infrastructure. A key element of the methodology is to improve the overall understanding and use of wildland fire forecast and warning climate and weather products and to exploit current and emerging technologies to improve the dissemination of customer-tailored forecast and warning information and products to stakeholders and users. Thus, the framework and methodology define the method used to determine the target public, private, and academic sector audiences. The methodology also identifies the means for determining the optimal channels, formats, and content for informing end users in time for effective action to be taken.
Nassar, Dalia
2016-08-01
In contrast to the previously widespread view that Kant's work was largely in dialogue with the physical sciences, recent scholarship has highlighted Kant's interest in and contributions to the life sciences. Scholars are now investigating the extent to which Kant appealed to and incorporated insights from the life sciences and considering the ways he may have contributed to a new conception of living beings. The scholarship remains, however, divided in its interest: historians of science are concerned with the content of Kant's claims, and the ways in which they may or may not have contributed to the emerging science of life, while historians of philosophy focus on the systematic justifications for Kant's claims, e.g., the methodological and theoretical underpinnings of Kant's statement that living beings are mechanically inexplicable. My aim in this paper is to bring together these two strands of scholarship into dialogue by showing how Kant's methodological concerns (specifically, his notion of reflective judgment) contributed to his conception of living beings and to the ontological concern with life as a distinctive object of study. I argue that although Kant's explicit statement was that biology could not be a science, his implicit and more fundamental claim was that the study of living beings necessitates a distinctive mode of thought, a mode that is essentially analogical. I consider the implications of this view, and argue that it is by developing a new methodology for grasping organized beings that Kant makes his most important contribution to the new science of life. Copyright © 2016. Published by Elsevier Ltd.
JEDI Methodology | Jobs and Economic Development Impact Models | NREL
Methodology: The Jobs and Economic Development Impact (JEDI) models are intended to demonstrate the employment and economic impacts that will likely result from specific project scenarios, providing an estimate of overall economic impacts. Please see Limitations of JEDI Models for further information.
Mejia, Christian R.; Valladares-Garrido, Mario J.; Miñan-Tapia, Armando; Serrano, Felipe T.; Tobler-Gómez, Liz E.; Pereda-Castro, William; Mendoza-Flores, Cynthia R.; Mundaca-Manay, Maria Y.; Valladares-Garrido, Danai
2017-01-01
Introduction Sci-Hub is a useful web portal for people working in science as it provides access to millions of free scientific articles. Satisfaction and usage should be explored in the Latino student population. The objective of this study was to evaluate the use, knowledge, and perception of the scientific contribution of Sci-Hub in medical students from Latin America. Methodology A multicenter, observational, analytical study was conducted among 6632 medical students from 6 countries in Latin America. We surveyed using a previously validated instrument, delving into knowledge, monthly average usage, satisfaction level, and perception of the scientific contribution provided by Sci-Hub. Frequencies and percentages are described, and generalized linear models were used to establish statistical associations. Results Only 19.2% of study participants knew of Sci-Hub and its function, while the median use was twice a month. 29.9% of Sci-Hub-aware participants claimed they always find the desired scientific information in their Sci-Hub searches; 62.5% of participants affirmed that Sci-Hub contributes to scientific investigation; only 2.2% reported that Sci-Hub does not contribute to science. Conclusion The majority of Latino students are not aware of Sci-Hub. PMID:28982181
NASA Astrophysics Data System (ADS)
Babonis, G. S.; Csatho, B. M.; Schenk, A. F.
2016-12-01
We present a new record of Antarctic ice thickness changes, reconstructed from ICESat laser altimetry observations from 2004-2009, at over 100,000 locations across the Antarctic Ice Sheet (AIS). This work generates elevation time series at ICESat groundtrack crossover regions on an observation-by-observation basis, with rigorous, quantified error estimates using the SERAC approach (Schenk and Csatho, 2012). The results include average and annual elevation, volume and mass changes in Antarctica, fully corrected for glacial isostatic adjustment (GIA) and known intercampaign biases, and partitioned into contributions from surficial processes (e.g. firn densification) and ice dynamics. The modular flexibility of the SERAC framework allows for the assimilation of multiple ancillary datasets (e.g. GIA models, Intercampaign Bias Corrections, IBC) in a common framework, to calculate mass changes for several different combinations of GIA models and IBCs and to arrive at a measure of variability from these results. We are able to determine the effect these corrections have on annual and average volume and mass change calculations in Antarctica, and to explore how these differences vary between drainage basins and with elevation. As such, this contribution presents a method that complements, and is consistent with, the 2012 Ice sheet Mass Balance Inter-comparison Exercise (IMBIE) results (Shepherd 2012). Additionally, this work will contribute to the 2016 IMBIE, which seeks to reconcile ice sheet mass changes from different observations, including laser altimetry, using different methodologies and ancillary datasets including GIA models, Firn Densification Models, and Intercampaign Bias Corrections.
Investigation of pedestrian crashes on two-way two-lane rural roads in Ethiopia.
Tulu, Getu Segni; Washington, Simon; Haque, Md Mazharul; King, Mark J
2015-05-01
Understanding pedestrian crash causes and contributing factors in developing countries is critically important as they account for about 55% of all traffic crashes. Not surprisingly, considerable attention in the literature has been paid to road traffic crash prediction models and methodologies in developing countries of late. Despite this interest, there are significant challenges confronting safety managers in developing countries. For example, in spite of the prominence of pedestrian crashes occurring on two-way two-lane rural roads, it has proven difficult to develop pedestrian crash prediction models due to a lack of both traffic and pedestrian exposure data. This general lack of available data has further hampered identification of pedestrian crash causes and subsequent estimation of pedestrian safety performance functions. The challenges are similar across developing nations, where little is known about the relationship between pedestrian crashes, traffic flow, and road environment variables on rural two-way roads, and where unique predictor variables may be needed to capture the unique crash risk circumstances. This paper describes pedestrian crash safety performance functions for two-way two-lane rural roads in Ethiopia as a function of traffic flow, pedestrian flows, and road geometry characteristics. In particular, a random parameter negative binomial model was used to investigate pedestrian crashes. The models and their interpretations make important contributions to road crash analysis and prevention in developing countries. They also assist in the identification of the contributing factors to pedestrian crashes, with the intent to identify potential design and operational improvements. Copyright © 2015. Published by Elsevier Ltd.
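As a simplified stand-in for the modelling approach (a fixed-parameter rather than random-parameter negative binomial, fitted to synthetic segment data with hypothetical variable names), the sketch below fits a negative binomial GLM with statsmodels; it is illustrative only and uses none of the Ethiopian data.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic road-segment data (hypothetical, for illustration only).
rng = np.random.default_rng(4)
n = 400
log_aadt = rng.normal(7.5, 0.6, n)       # log of motorised traffic volume
log_ped = rng.normal(5.0, 0.8, n)        # log of pedestrian flow
curve = rng.binomial(1, 0.3, n)          # horizontal curve indicator

# Data-generating process with gamma heterogeneity, giving NB-like counts.
mu = np.exp(-6.0 + 0.6 * log_aadt + 0.4 * log_ped + 0.3 * curve)
crashes = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

X = sm.add_constant(np.column_stack([log_aadt, log_ped, curve]))
nb = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(nb.summary())
```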
How spatio-temporal habitat connectivity affects amphibian genetic structure.
Watts, Alexander G; Schlichting, Peter E; Billerman, Shawn M; Jesmer, Brett R; Micheletti, Steven; Fortin, Marie-Josée; Funk, W Chris; Hapeman, Paul; Muths, Erin; Murphy, Melanie A
2015-01-01
Heterogeneous landscapes and fluctuating environmental conditions can affect species dispersal, population genetics, and genetic structure, yet understanding how biotic and abiotic factors affect population dynamics in a fluctuating environment is critical for species management. We evaluated how spatio-temporal habitat connectivity influences dispersal and genetic structure in a population of boreal chorus frogs (Pseudacris maculata) using a landscape genetics approach. We developed gravity models to assess the contribution of various factors to the observed genetic distance as a measure of functional connectivity. We selected (a) wetland (within-site) characteristics, (b) landscape matrix (between-site) characteristics, and (c) wetland connectivity metrics using a unique methodology. Specifically, we developed three networks that quantify wetland connectivity based on: (i) P. maculata dispersal ability, (ii) temporal variation in wetland quality, and (iii) contribution of wetland stepping-stones to frog dispersal. We examined 18 wetlands in Colorado, and quantified 12 microsatellite loci from 322 individual frogs. We found that genetic connectivity was related to topographic complexity, within- and between-wetland differences in moisture, and wetland functional connectivity as contributed by stepping-stone wetlands. Our results highlight the role that dynamic environmental factors have on dispersal-limited species and illustrate how complex asynchronous interactions contribute to the structure of spatially explicit metapopulations.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for being model-independent and for providing accurate sensitivity measures. However, the conventional variance-based method only considers the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility, using different grouping strategies for uncertainty components. The variance-based sensitivity analysis is thus improved to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measures for uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and for decision-makers formulating policies and strategies.
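As a rough illustration of a variance-based index computed for a group of inputs rather than a single parameter, the numpy-only sketch below uses a pick-freeze (Sobol'-type) Monte Carlo estimator on a toy stand-in model; the Bayesian-network framework and the actual reactive transport model described above are not reproduced, and the input names are invented.

```python
# Pick-freeze estimate of a grouped first-order sensitivity index on a toy model.
import numpy as np

def toy_model(x):
    # stand-in response; columns: [recharge, conductivity, dispersivity, decay]
    return x[:, 0] * np.exp(-x[:, 3]) + 0.5 * x[:, 1] * x[:, 2]

rng = np.random.default_rng(1)
n, d = 100_000, 4
A = rng.uniform(0.0, 1.0, (n, d))
B = rng.uniform(0.0, 1.0, (n, d))

def group_first_order(group_cols):
    """S_G = V[E(Y | X_G)] / V[Y], Saltelli-style estimator for a column group."""
    ABg = A.copy()
    ABg[:, group_cols] = B[:, group_cols]          # replace the group's columns by B's
    yA, yB, yABg = toy_model(A), toy_model(B), toy_model(ABg)
    var_y = np.var(np.concatenate([yA, yB]))
    return np.mean(yB * (yABg - yA)) / var_y

print("recharge alone  :", round(group_first_order([0]), 3))
print("transport group :", round(group_first_order([1, 2]), 3))  # conductivity + dispersivity
```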
Inoue, K; Ochi, H; Taketsuka, M; Saito, H; Sakurai, K; Ichihashi, N; Iwatsuki, K; Kokubo, S
2008-05-01
A systematic analysis was carried out by using response surface methodology to create a quantitative model of the synergistic effects of conditions in a continuous freezer [mix flow rate (L/h), overrun (%), cylinder pressure (kPa), drawing temperature (°C), and dasher speed (rpm)] on the principal constituent parameters of ice cream [rate of fat destabilization (%), mean air cell diameter (μm), and mean ice crystal diameter (μm)]. A central composite face-centered design was used for this study. Thirty-one combinations of the 5 above-mentioned freezer conditions were designed (including replicates at the center point), and ice cream samples were manufactured and examined in a continuous freezer under the selected conditions. The responses were the 3 variables given above. A quadratic model was constructed, with the freezer conditions as the independent variables and the ice cream characteristics as the dependent variables. The coefficients of determination (R²) were greater than 0.9 for all 3 responses, but Q², the index used here for the capability of the model for predicting future observed values of the responses, was negative for both the mean ice crystal diameter and the mean air cell diameter. Therefore, pruned models were constructed by removing terms that had contributed little to the prediction in the original model and by refitting the regression model. It was demonstrated that these pruned models provided good fits to the data in terms of R², Q², and ANOVA. The effects of freezer conditions were expressed quantitatively in terms of the 3 responses. The drawing temperature (°C) was found to have a greater effect on ice cream characteristics than any of the other factors.
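A minimal sketch of the modeling step (not the study's data or full five-factor design): fit a full quadratic response-surface model by ordinary least squares and then a pruned version with the weakest terms dropped. Only two factors are shown and all numbers are invented.

```python
# Quadratic response-surface fit and a pruned refit, on synthetic freezer data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 31                                     # number of design runs in the abstract
df = pd.DataFrame({
    "temp":  rng.uniform(-6.5, -3.0, n),   # drawing temperature (deg C), assumed range
    "speed": rng.uniform(100, 400, n),     # dasher speed (rpm), assumed range
})
df["fat_destab"] = (60 + 4*df["temp"] + 0.05*df["speed"]
                    - 0.3*df["temp"]**2 + rng.normal(0, 2, n))

full = smf.ols("fat_destab ~ temp + speed + I(temp**2) + I(speed**2) + temp:speed",
               data=df).fit()
pruned = smf.ols("fat_destab ~ temp + I(temp**2) + speed", data=df).fit()  # drop weak terms
print(round(full.rsquared, 3), round(pruned.rsquared, 3))
```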
NASA Astrophysics Data System (ADS)
Qie, G.; Wang, G.; Wang, M.
2016-12-01
Mixed pixels and shadows due to buildings in urban areas impede accurate estimation and mapping of city vegetation carbon density. In most previous studies, these factors are ignored, which results in underestimation of city vegetation carbon density. In this study we present an integrated methodology to improve the accuracy of mapping city vegetation carbon density. Firstly, we applied a linear shadow removal analysis (LSRA) to remotely sensed Landsat 8 images to reduce the shadow effects on carbon estimation. Secondly, we integrated a linear spectral unmixing analysis (LSUA) with a linear stepwise regression (LSR), a logistic model-based stepwise regression (LMSR) and k-Nearest Neighbors (kNN), and applied and compared the integrated models on shadow-removed images to map vegetation carbon density. This methodology was examined in Shenzhen City of Southeast China. A data set from a total of 175 sample plots measured in 2013 and 2014 was used to train the models. The independent variables that contributed statistically significantly to improving the fit of the models to the data and reducing the sum of squared errors were selected from a total of 608 variables derived from different image band combinations and transformations. The vegetation fraction from LSUA was then added into the models as an important independent variable. The estimates obtained were evaluated using a cross-validation method. Our results showed that higher accuracies were obtained from the integrated models compared with the ones using traditional methods that ignore the effects of mixed pixels and shadows. This study indicates that the integrated method has great potential for improving the accuracy of urban vegetation carbon density estimation. Keywords: urban vegetation carbon, shadow, spectral unmixing, spatial modeling, Landsat 8 images
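The fraction-estimation step can be pictured with the toy sketch below: a fully constrained linear spectral unmixing of one pixel against assumed endmember spectra (non-negative fractions summing to one). The endmember values, band count and class names are invented, not the study's.

```python
# Toy fully constrained linear spectral unmixing of a single pixel.
import numpy as np
from scipy.optimize import nnls

# endmember matrix E: rows = bands, columns = endmembers (invented 6-band spectra)
E = np.array([
    [0.04, 0.10, 0.12],
    [0.06, 0.12, 0.15],
    [0.05, 0.11, 0.18],
    [0.45, 0.15, 0.25],   # NIR band: vegetation is bright here
    [0.25, 0.20, 0.30],
    [0.15, 0.22, 0.32],
])
pixel = np.array([0.08, 0.09, 0.09, 0.33, 0.24, 0.20])   # observed reflectance

# enforce the sum-to-one constraint with a heavily weighted extra equation
w = 100.0
E_aug = np.vstack([E, w * np.ones(E.shape[1])])
y_aug = np.append(pixel, w)
fractions, _ = nnls(E_aug, y_aug)   # non-negative least squares
print(dict(zip(["vegetation", "impervious", "soil"], fractions.round(3))))
```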
NASA Astrophysics Data System (ADS)
Boeke, R.; Taylor, P. C.; Li, Y.
2017-12-01
Arctic cloud amount as simulated in CMIP5 models displays large intermodel spread: models disagree on the processes important for cloud formation as well as on the radiative impact of clouds. The radiative response to cloud forcing can be better assessed when the drivers of Arctic cloud formation are known. Arctic cloud amount (CA) is a function of both atmospheric and surface conditions, and it is crucial to separate the influences of unique processes to understand why the models differ. This study uses a multilinear regression methodology to determine cloud changes using 3 variables as predictors: lower tropospheric stability (LTS), 500-hPa vertical velocity (ω500), and sea ice concentration (SIC). These three explanatory variables were chosen because their effects on clouds can be attributed to unique climate processes: LTS is a thermodynamic indicator of the relationship between clouds and atmospheric stability, SIC determines the interaction between clouds and the surface, and ω500 is a metric for dynamical change. Vertical, seasonal profiles of the necessary variables are obtained from the Coupled Model Intercomparison Project 5 (CMIP5) historical simulation, a coupled ocean-atmosphere experiment forced with the best-estimate natural and anthropogenic radiative forcing from 1850 to 2005, and statistical significance tests are used to confirm the regression equation. A unique heuristic model will be constructed for each climate model and for observations, and models will be tested by their ability to capture the observed cloud amount and behavior. Lastly, the intermodel spread in Arctic cloud amount will be attributed to individual processes, ranking the relative contributions of each factor to shed light on emergent constraints in the Arctic cloud radiative effect.
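A hedged sketch of the regression step follows, using synthetic monthly fields in place of CMIP5 output; the predictor names follow the abstract, while the ranges and coefficients are invented.

```python
# Multilinear regression of cloud amount on LTS, omega500 and SIC (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
lts = rng.normal(18, 4, n)          # lower-tropospheric stability (K), assumed
w500 = rng.normal(0, 0.03, n)       # 500-hPa vertical velocity (Pa/s), assumed
sic = rng.uniform(0, 100, n)        # sea-ice concentration (%), assumed
cloud = 70 - 0.8*lts - 120*w500 - 0.05*sic + rng.normal(0, 4, n)

X = sm.add_constant(np.column_stack([lts, w500, sic]))
fit = sm.OLS(cloud, X).fit()
print(fit.params)    # regression coefficients per predictor
print(fit.pvalues)   # significance tests mentioned in the abstract

# relative contribution of each predictor to the simulated cloud variability
contrib = np.abs(fit.params[1:]) * np.std(np.column_stack([lts, w500, sic]), axis=0)
print((contrib / contrib.sum()).round(2))
```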
The Educational Situation Quality Model: Recent Advances
Doménech-Betoret, Fernando
2018-01-01
The purpose of this work was to present an educational model developed in recent years entitled “The Educational Situation Quality Model” (MOCSE, acronym in Spanish). MOCSE can be defined as an instructional model that simultaneously considers the teaching-learning process, in which motivation plays a central role. It explains the functioning of an educational setting by organizing and relating the most important variables which, according to the literature, contribute to student learning. Besides being a conceptual framework, this model also provides a methodological procedure to guide research and to promote reflection in the classroom. It allows teachers to implement effective action-research programs to improve teacher and student satisfaction and learning outcomes in the classroom context. This work explains the model’s characteristics and functioning, recent advances, and how teachers can use it in an educational setting with a specific subject. This proposal integrates approaches from several relevant psycho-educational theories and introduces a new perspective into the existing literature that will allow researchers to make progress in studying the functioning of educational settings. The initial MOCSE configuration has been refined over time in accordance with the empirical results obtained from previous research carried out within the MOCSE framework and with the subsequent reflections derived from these results. Finally, the contribution of the model to improving learning outcomes and satisfaction, and its applicability in the classroom, are also discussed. PMID:29593623
Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza
2016-12-01
Risk assessment can be classified into two broad categories: traditional and modern. This paper aims to contrast the functional resonance analysis method (FRAM), as a modern approach, with fault tree analysis (FTA), as a traditional method, for assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. Also, a FRAM network is executed with regard to the nonlinear interaction of human and organizational levels to assess the safety of technological systems. The methodology is implemented for the lifting of structures in deep offshore conditions. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.
Collins, A.L; Pulley, S.; Foster, I.D.L; Gellis, Allen; Porto, P.; Horowitz, A.J.
2017-01-01
The growing awareness of the environmental significance of fine-grained sediment fluxes through catchment systems continues to underscore the need for reliable information on the principal sources of this material. Source estimates are difficult to obtain using traditional monitoring techniques, but sediment source fingerprinting or tracing procedures have emerged as a potentially valuable alternative. Despite the rapidly increasing number of studies reporting the use of sediment source fingerprinting, several key challenges and uncertainties continue to hamper consensus among the international scientific community on key components of the existing methodological procedures. Accordingly, this contribution reviews and presents recent developments for several key aspects of fingerprinting, namely: sediment source classification, catchment source and target sediment sampling, tracer selection, grain size issues, tracer conservatism, source apportionment modelling, and assessment of source predictions using artificial mixtures. Finally, a decision tree representing the current state of knowledge is presented to guide end-users in applying the fingerprinting approach.
Pages, Gaël; Ramdani, Nacim; Fraisse, Philippe; Guiraud, David
2009-06-01
This paper presents a contribution for restoring standing in paraplegia while using functional electrical stimulation (FES). Movement generation induced by FES remains mostly open looped and stimulus intensities are tuned empirically. To design an efficient closed-loop control, a preliminary study has been carried out to investigate the relationship between body posture and voluntary upper body movements. A methodology is proposed to estimate body posture in the sagittal plane using force measurements exerted on supporting handles during standing. This is done by setting up constraints related to the geometric equations of a two-dimensional closed chain model and the hand-handle interactions. All measured quantities are subject to an uncertainty assumed unknown but bounded. The set membership estimation problem is solved via interval analysis. Guaranteed uncertainty bounds are computed for the estimated postures. In order to test the feasibility of our methodology, experiments were carried out with complete spinal cord injured patients.
A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification
NASA Astrophysics Data System (ADS)
Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.
MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks already discussed by the authors. On the other hand, preliminary application of Genetic Algorithms (GA) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
Kim, Eunkyoung; Panzella, Lucia; Micillo, Raffaella; Bentley, William E.; Napolitano, Alessandra; Payne, Gregory F.
2015-01-01
Pheomelanin has been implicated in the increased susceptibility to UV-induced melanoma for people with light skin and red hair. Recent studies identified a UV-independent pathway to melanoma carcinogenesis and implicated pheomelanin’s pro-oxidant properties that act through the generation of reactive oxygen species and/or the depletion of cellular antioxidants. Here, we applied an electrochemically-based reverse engineering methodology to compare the redox properties of human hair pheomelanin with model synthetic pigments and natural eumelanin. This methodology exposes the insoluble melanin samples to complex potential (voltage) inputs and measures output response characteristics to assess redox activities. The results demonstrate that both eumelanin and pheomelanin are redox-active, they can rapidly (sec-min) and repeatedly redox-cycle between oxidized and reduced states, and pheomelanin possesses a more oxidative redox potential. This study suggests that pheomelanin’s redox-based pro-oxidant activity may contribute to sustaining a chronic oxidative stress condition through a redox-buffering mechanism. PMID:26669666
Current issues relating to psychosocial job strain and cardiovascular disease research.
Theorell, T; Karasek, R A
1996-01-01
The authors comment on recent reviews of cardiovascular job strain research by P. L. Schnall and P. A. Landsbergis (1994), and by T. S. Kristensen (1995), which conclude that job strain as defined by the demand-control model (the combination of low job decision latitude and high psychological job demands) is confirmed as a risk factor for cardiovascular mortality in a large majority of studies. Lack of social support at work appears to further increase risk. Several still-unresolved research questions are examined in light of recent studies: (a) methodological issues related to the use of occupational aggregate estimations and occupational career aggregate assessments, the use of standard scales for job analysis, and recall bias in self-reporting; (b) confounding factors and differential strengths of association by subgroup in job strain-cardiovascular disease analyses with respect to social class, gender, and working hours; and (c) a review of results of monitoring job strain-blood pressure associations and associated methodological issues.
Geometric stiffening in multibody dynamics formulations
NASA Technical Reports Server (NTRS)
Sharf, Inna
1993-01-01
In this paper we discuss the issue of geometric stiffening as it arises in the context of multibody dynamics. This topic has been treated in a number of previous publications in this journal and appears to be a debated subject. The controversy revolves primarily around the 'correct' methodology for incorporating the stiffening effect into dynamics formulations. The main goal of this work is to present the different approaches that have been developed for this problem through an in-depth review of several publications dealing with this subject. This is done with the goal of contributing to a precise understanding of the existing methodologies for modelling the stiffening effects in multibody systems. Thus, in presenting the material we attempt to illuminate the key characteristics of the various methods as well as show how they relate to each other. In addition, we offer a number of novel insights and clarifying interpretations of these schemes. The paper is completed with a general classification and comparison of the different approaches.
Can there be a physics of financial markets? Methodological reflections on econophysics
NASA Astrophysics Data System (ADS)
Huber, Tobias A.; Sornette, Didier
2016-12-01
We address the question whether there can be a physical science of financial markets. In particular, we examine the argument that, given the reflexivity of financial markets (i.e., the feedback mechanism between expectations and prices), there is a fundamental difference between social and physical systems, which demands a new scientific method. By providing a selective history of the mutual cross-fertilization between physics and economics, we reflect on the methodological differences of how models and theories get constructed in these fields. We argue that the novel conception of financial markets as complex adaptive systems is one of the most important contributions of econophysics and show that this field of research provides the methods, concepts, and tools to scientifically account for reflexivity. We conclude by arguing that a new science of economic and financial systems should not only be physics-based, but needs to integrate findings from other scientific fields, so that a truly multi-disciplinary complex systems science of financial markets can be built.
Background sampling and transferability of species distribution model ensembles under climate change
NASA Astrophysics Data System (ADS)
Iturbide, Maialen; Bedia, Joaquín; Gutiérrez, José Manuel
2018-07-01
Species Distribution Models (SDMs) constitute an important tool to assist decision-making in environmental conservation and planning. A popular application of these models is the projection of species distributions under climate change conditions. Yet a range of methodological SDM factors still limit the transferability of these models, contributing significantly to the overall uncertainty of the resulting projections. An important source of uncertainty often neglected in climate change studies comes from the use of background data (a.k.a. pseudo-absences) for model calibration. Here, we study the sensitivity to pseudo-absence sampling as a determinant factor for SDM stability and transferability under climate change conditions, focusing on European-wide projections of Quercus robur as an illustrative case study. We explore the uncertainty in future projections derived from ten pseudo-absence realizations and three popular SDMs (GLM, Random Forest and MARS). The contribution of the pseudo-absence realization to the uncertainty was higher in peripheral regions and clearly differed among the tested SDMs across the whole study domain, with MARS the most sensitive (projections differing by up to 40% across realizations) and GLM the most stable. As a result, we conclude that parsimonious SDMs are preferable in this context, avoiding complex methods (such as MARS) which may exhibit poor model transferability. Accounting for this new source of SDM-dependent uncertainty is crucial when forming multi-model ensembles to undertake climate change projections.
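The sketch below mimics the sensitivity experiment at toy scale: draw several pseudo-absence realizations, fit a simple logistic-regression SDM to each, and measure the spread of the predicted suitabilities. The data are synthetic, and the study's three SDM algorithms and European Quercus robur records are not reproduced.

```python
# Toy pseudo-absence sensitivity experiment with a logistic-regression SDM.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
clim = rng.uniform(0, 1, (5000, 2))                  # candidate cells on 2 climate axes
presence = clim[(clim[:, 0] > 0.4) & (clim[:, 1] < 0.7)][:300]   # invented occurrences

preds = []
for realization in range(10):                        # ten pseudo-absence realizations
    background = clim[rng.choice(len(clim), size=300, replace=False)]
    X = np.vstack([presence, background])
    y = np.r_[np.ones(len(presence)), np.zeros(len(background))]
    sdm = LogisticRegression(max_iter=1000).fit(X, y)
    preds.append(sdm.predict_proba(clim)[:, 1])      # suitability over the whole domain

preds = np.array(preds)
print("mean per-cell spread across realizations:", preds.std(axis=0).mean().round(3))
```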
Fellinger, Michael R.; Hector, Louis G.; Trinkle, Dallas R.
2016-10-28
Here, we present an efficient methodology for computing solute-induced changes in lattice parameters and elastic stiffness coefficients Cij of single crystals using density functional theory. We also introduce a solute strain misfit tensor that quantifies how solutes change lattice parameters due to the stress they induce in the host crystal. Solutes modify the elastic stiffness coefficients through volumetric changes and by altering chemical bonds. We compute each of these contributions to the elastic stiffness coefficients separately, and verify that their sum agrees with changes in the elastic stiffness coefficients computed directly using fully optimized supercells containing solutes. Computing the two elastic stiffness contributions separately is more computationally efficient and provides more information on solute effects than the direct calculations. We compute the solute dependence of polycrystalline averaged shear and Young's moduli from the solute dependence of the single-crystal Cij. We then apply this methodology to substitutional Al, B, Cu, Mn, Si solutes and octahedral interstitial C and N solutes in bcc Fe. Comparison with experimental data indicates that our approach accurately predicts solute-induced changes in the lattice parameter and elastic coefficients. The computed data can be used to quantify solute-induced changes in mechanical properties such as strength and ductility, and can be incorporated into mesoscale models to improve their predictive capabilities.
2016-06-01
characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency ... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA ... rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira ...
Gray, Kathleen; Sockolow, Paulina
2016-02-24
Contributing to health informatics research means using conceptual models that are integrative and explain the research in terms of the two broad domains of health science and information science. However, it can be hard for novice health informatics researchers to find exemplars and guidelines in working with integrative conceptual models. The aim of this paper is to support the use of integrative conceptual models in research on information and communication technologies in the health sector, and to encourage discussion of these conceptual models in scholarly forums. A two-part method was used to summarize and structure ideas about how to work effectively with conceptual models in health informatics research that included (1) a selective review and summary of the literature of conceptual models; and (2) the construction of a step-by-step approach to developing a conceptual model. The seven-step methodology for developing conceptual models in health informatics research explained in this paper involves (1) acknowledging the limitations of health science and information science conceptual models; (2) giving a rationale for one's choice of integrative conceptual model; (3) explicating a conceptual model verbally and graphically; (4) seeking feedback about the conceptual model from stakeholders in both the health science and information science domains; (5) aligning a conceptual model with an appropriate research plan; (6) adapting a conceptual model in response to new knowledge over time; and (7) disseminating conceptual models in scholarly and scientific forums. Making explicit the conceptual model that underpins a health informatics research project can contribute to increasing the number of well-formed and strongly grounded health informatics research projects. This explication has distinct benefits for researchers in training, research teams, and researchers and practitioners in information, health, and other disciplines.
The impact of airport characteristics on airport surface accidents and incidents.
Wilke, Sabine; Majumdar, Arnab; Ochieng, Washington Y
2015-06-01
Airport surface safety, and in particular runway and taxiway safety, is acknowledged globally as one of aviation's greatest challenges. To improve this key area of aviation safety, it is necessary to identify and understand the causal and contributing factors in safety occurrences. While the contribution of human factors, operations, and procedures has been researched extensively, the impact of the airport itself and its associated characteristics has received little or no attention. This paper introduces a novel methodology for risk and hazard assessment of airport surface operations, and models the relationships between airport characteristics and (a) the rate of occurrences, (b) the severity of occurrences, and (c) the causal factors underlying occurrences. The results show for the first time how the characteristics of airports, and in particular their infrastructure and operations, influence the safety of surface operations. Copyright © 2015 Elsevier Ltd. and National Safety Council. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Palazón, Leticia; Gaspar, Leticia; Latorre, Borja; Blake, Will; Navas, Ana
2014-05-01
Spanish Pyrenean reservoirs are under pressure from high sediment yields in contributing catchments. Sediment fingerprinting approaches offer potential to quantify the contribution of different sediment sources, evaluate catchment erosion dynamics and develop management plans to tackle reservoir siltation problems. The drainage basin of the Barasona reservoir (1509 km2), located in the Central Spanish Pyrenees, is an alpine-prealpine agroforest basin supplying sediments to the reservoir at an annual rate of around 350 t km-2, with implications for reservoir longevity. The climate is mountain type, wet and cold, with both Atlantic and Mediterranean influences. Steep slopes and the presence of deep and narrow gorges favour rapid runoff and large floods. The ability of geochemical fingerprint properties to discriminate between the sediment sources was investigated by conducting the nonparametric Kruskal-Wallis H-test and a stepwise discriminant function analysis (minimization of Wilks' lambda). This standard procedure selects potential fingerprinting properties as an optimum composite fingerprint to characterize and discriminate between sediment sources to the reservoir. The contribution of each potential sediment source was then assessed by applying a Monte Carlo mixing model to obtain source proportions for the Barasona reservoir sediment samples. The Monte Carlo mixing model was written in the C programming language and designed to deliver a user-defined number of possible solutions. A combinatorial principles method was used to identify the most probable solution, with associated uncertainty based on source variability. The unique solution for each sample was characterized by the mean value and the standard deviation of the generated solutions, with the lowest goodness-of-fit value applied. This method is argued to guarantee a similar set of representative solutions in all unmixing cases based on likelihood of occurrence. Soil samples for the different potential sediment sources of the drainage basin were compared with samples from the reservoir using a range of different fingerprinting properties (i.e. mass activities of environmental radionuclides, elemental composition and magnetic susceptibility) analyzed in the < 63 μm sediment fraction. In this case, the 100 best results from 10^6 generated iterations were selected, obtaining a goodness of fit higher than 0.76. The preliminary results using this new data processing methodology for samples collected in the reservoir allowed us to identify cultivated fields and badlands as the main potential sources of sediments to the reservoir. These findings support the appropriate use of the fingerprinting methodology in a Spanish Pyrenees basin, which will enable us to better understand the sediment production of the Barasona reservoir basin.
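A simplified, assumption-laden sketch of the Monte Carlo mixing step is given below: random source proportions are drawn, the predicted mixture signature is compared with the target sediment, and the best-scoring iterations are summarized. Tracer values, source names and the goodness-of-fit definition are invented, and the study's C implementation is not reproduced.

```python
# Toy Monte Carlo mixing model for sediment source apportionment.
import numpy as np

rng = np.random.default_rng(5)
# mean tracer signatures (rows = sources, cols = tracers), plus the mixture sample
sources = np.array([[12.0, 340.0, 0.80],    # cultivated fields (invented values)
                    [ 4.0, 510.0, 0.35],    # badlands
                    [ 8.0, 420.0, 0.55]])   # forest/scrub
mixture = np.array([7.5, 455.0, 0.52])

solutions = []
for _ in range(100_000):
    p = rng.dirichlet(np.ones(len(sources)))            # random proportions summing to 1
    predicted = p @ sources                              # predicted mixture signature
    gof = np.mean(np.abs(mixture - predicted) / mixture) # relative mixing error (toy metric)
    solutions.append((gof, p))

# keep the 100 best iterations, as in the abstract, and report mean +/- sd proportions
top = sorted(solutions, key=lambda s: s[0])[:100]
top_p = np.array([p for _, p in top])
print("proportions:", top_p.mean(axis=0).round(3), "+/-", top_p.std(axis=0).round(3))
```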
Applying data mining techniques to improve diagnosis in neonatal jaundice.
Ferreira, Duarte; Oliveira, Abílio; Freitas, Alberto
2012-12-07
Hyperbilirubinemia is emerging as an increasingly common problem in newborns due to a decreasing hospital length of stay after birth. Jaundice is the most common disease of the newborn and, although benign in most cases, it can lead to severe neurological consequences if poorly evaluated. In different areas of medicine, data mining has contributed to improving the results obtained with other methodologies. Hence, the aim of this study was to improve the diagnosis of neonatal jaundice with the application of data mining techniques. This study followed the different phases of the Cross Industry Standard Process for Data Mining model as its methodology. This observational study was performed at the Obstetrics Department of a central hospital (Centro Hospitalar Tâmega e Sousa--EPE), from February to March of 2011. A total of 227 healthy newborn infants with 35 or more weeks of gestation were enrolled in the study. Over 70 variables were collected and analyzed. Also, transcutaneous bilirubin levels were measured from birth to hospital discharge with maximum time intervals of 8 hours between measurements, using a noninvasive bilirubinometer. Different attribute subsets were used to train and test classification models using algorithms included in the Weka data mining software, such as decision trees (J48) and neural networks (multilayer perceptron). The accuracy results were compared with the traditional methods for prediction of hyperbilirubinemia. The application of different classification algorithms to the collected data allowed predicting subsequent hyperbilirubinemia with high accuracy. In particular, at 24 hours of life of the newborns, the accuracy for the prediction of hyperbilirubinemia was 89%. The best results were obtained using the following algorithms: naive Bayes, multilayer perceptron and simple logistic. The findings of our study sustain that new approaches, such as data mining, may support medical decision-making, contributing to improved diagnosis in neonatal jaundice.
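As a hedged stand-in for the classification step (the study used Weka's J48 and a multilayer perceptron), the sketch below cross-validates a scikit-learn decision tree on synthetic newborn records; all variables and thresholds are invented.

```python
# Toy decision-tree prediction of subsequent hyperbilirubinemia on synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 227
X = np.column_stack([
    rng.uniform(35, 41, n),          # gestational age (weeks), assumed
    rng.uniform(2.0, 4.5, n),        # birth weight (kg), assumed
    rng.uniform(2, 12, n),           # transcutaneous bilirubin at 24 h (mg/dL), assumed
])
# synthetic label driven mostly by the 24-h bilirubin level
y = (X[:, 2] + rng.normal(0, 1.5, n) > 9).astype(int)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))
```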
NASA Astrophysics Data System (ADS)
Fontaine, Alain; Sauvage, Bastien; Pétetin, Hervé; Auby, Antoine; Boulanger, Damien; Thouret, Valerie
2016-04-01
Since 1994, the IAGOS program (In-Service Aircraft for a Global Observing System, http://www.iagos.org) and its predecessor MOZAIC have produced in-situ measurements of atmospheric composition during more than 46000 commercial aircraft flights. In order to help analyze these observations and further understand the processes driving their evolution, we developed SOFT-IO, a modelling tool that quantifies their source/receptor link. We improved the methodology used by Stohl et al. (2003), based on the FLEXPART plume dispersion model, to simulate the contributions of anthropogenic and biomass burning emissions from the ECCAD database (http://eccad.aeris-data.fr) to the measured carbon monoxide mixing ratio along each IAGOS flight. Thanks to automated processes, contributions are simulated for the last 20 days before observation, separating individual contributions from the different source regions. The main goal is to supply added-value products to the IAGOS database showing the geographical origin and emission type of pollutants. Using this information, it may be possible to link trends in atmospheric composition to changes in transport pathways and to the evolution of emissions. This tool could be used for statistical validation as well as for inter-comparisons of emission inventories using large amounts of data, as Lagrangian models are able to bring global-scale emissions down to a smaller scale, where they can be directly compared to the in-situ observations from the IAGOS database.
Delineating baseflow contribution areas for streams - A model and methods comparison.
Chow, Reynold; Frind, Michael E; Frind, Emil O; Jones, Jon P; Sousa, Marcelo R; Rudolph, David L; Molson, John W; Nowak, Wolfgang
2016-12-01
This study addresses the delineation of areas that contribute baseflow to a stream reach, also known as stream capture zones. Such areas can be delineated using standard well capture zone delineation methods, with three important differences: (1) natural gradients are smaller compared to those produced by supply wells and are therefore subject to greater numerical errors, (2) stream discharge varies seasonally, and (3) stream discharge varies spatially. This study focuses on model-related uncertainties due to model characteristics, discretization schemes, delineation methods, and particle tracking algorithms. The methodology is applied to the Alder Creek watershed in southwestern Ontario. Four different model codes are compared: HydroGeoSphere, WATFLOW, MODFLOW, and FEFLOW. In addition, two delineation methods are compared: reverse particle tracking and reverse transport, where the latter considers local-scale parameter uncertainty by using a macrodispersion term to produce a capture probability plume. The results from this study indicate that different models can calibrate acceptably well to the same data and produce very similar distributions of hydraulic head, but can produce different capture zones. The stream capture zone is found to be highly sensitive to the particle tracking algorithm. It was also found that particle tracking by itself, if applied to complex systems such as the Alder Creek watershed, would require considerable subjective judgement in the delineation of stream capture zones. Reverse transport is an alternative and more reliable approach that provides probability intervals for the baseflow contribution areas, taking uncertainty into account. The two approaches can be used together to enhance the confidence in the final outcome. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sangermano, Florencia
2009-12-01
The world is suffering from rapid changes in both climate and land cover, which are the main factors affecting global biodiversity. These changes may affect ecosystems by altering species distributions, population sizes, and community compositions, which emphasizes the need for a rapid assessment of biodiversity status for conservation and management purposes. Current approaches to monitoring biodiversity rely mainly on long-term observations of predetermined sites, which require large amounts of time, money and personnel to be executed. In order to overcome problems associated with current field monitoring methods, the main objective of this dissertation is the development of a framework for inferential monitoring of the impact of global change on biodiversity based on remotely sensed data coupled with species distribution modeling techniques. Several research pieces were performed independently in order to fulfill this goal. First, species distribution modeling was used to identify the ranges of 6362 birds, mammals and amphibians in South America. Chapter 1 compares the power of different presence-only species distribution methods for modeling distributions of species with different response curves to environmental gradients and sample sizes. It was found that there is large variability in the power of the methods for modeling habitat suitability and species ranges, showing the importance of performing, when possible, a preliminary gradient analysis of the species distribution before selecting the method to be used. Chapter 2 presents a new methodology for the redefinition of species range polygons. Using a method capable of establishing the uncertainty in the definition of existing range polygons, the automated procedure identifies the relative importance of bioclimatic variables for the species, predicts their ranges and generates a quality assessment report to explore prediction errors. Analysis using independent validation data shows the power of this methodology to redefine species ranges in a more biophysically reasonable way. If a specific variable is important for a species, a change in that variable is likely to impact the species. Chapter 3 presents a methodology to identify the impact of environmental changes on 6362 species of mammals, amphibians and birds of South America, based on per-species measures of sensitivity, marginality, range restriction and trends in remotely sensed bioclimatic variables. Maps of the impact of environmental changes on the vertebrates of South America were generated, with the Andes, Patagonia and the Atlantic Forest experiencing the strongest impact of environmental change over the past quarter century. Contributions of this dissertation include the development of new range polygons for all mammals, amphibians and birds of South America, as well as a methodology to re-draw the polygons in any other region of the world. This dataset is essential for both biodiversity analysis and conservation prioritization. Other contributions are the generation of maps of the impact of global change on biodiversity, together with a framework for the development and updating of those maps. Conservation and monitoring agencies will find this research useful not only for the selection of new conservation areas but also for prioritizing areas for field monitoring.
NASA Astrophysics Data System (ADS)
Yang, T.; Lee, C.
2017-12-01
The biases in General Circulation Models (GCMs) are crucial for understanding future climate changes. Currently, most bias correction methodologies suffer from the assumption that model bias is stationary. This paper provides a non-stationary bias correction model, termed the Residual-based Bagging Tree (RBT) model, to reduce simulation biases and to quantify the contributions of single models. Specifically, the proposed model estimates the residuals between individual models and observations, and takes the differences between observations and the ensemble mean into consideration during the model training process. A case study is conducted for 10 major river basins in Mainland China during different seasons. Results show that the proposed model is capable of providing accurate and stable predictions while incorporating non-stationarity into the modeling framework. Significant reductions in both bias and root mean squared error are achieved with the proposed RBT model, especially for the central and western parts of China. The proposed RBT model consistently performs better in reducing biases than the raw ensemble mean, the ensemble mean with simple additive bias correction, and the single best model for different seasons. Furthermore, the contribution of each single GCM to reducing the overall bias is quantified. The single-model importance varies between 3.1% and 7.2%. For the future scenarios RCP 2.6, RCP 4.5, and RCP 8.5, the results from the RBT model suggest temperature increases of 1.44 °C, 2.59 °C, and 4.71 °C by the end of the century, respectively, when compared to the average temperature during 1970-1999.
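The residual-correction idea can be sketched as follows: a bagged-tree ensemble is trained to predict the observation-minus-ensemble-mean residual from the individual model simulations, and its prediction is added back to the ensemble mean. The data are synthetic and the exact RBT design follows the paper, not this toy.

```python
# Toy residual-based bagging-tree correction of a multi-model ensemble mean.
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(7)
n_months, n_models = 360, 10
truth = 15 + 10 * np.sin(np.linspace(0, 30 * np.pi, n_months))        # "observed" series
gcms = truth[:, None] + rng.normal(2.0, 1.5, (n_months, n_models))    # biased single models
ens_mean = gcms.mean(axis=1)

# learn the residual between observations and the ensemble mean from the single models
rbt = BaggingRegressor(n_estimators=200, random_state=0).fit(gcms, truth - ens_mean)
corrected = ens_mean + rbt.predict(gcms)

rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
print("raw ensemble-mean RMSE:", round(rmse(ens_mean, truth), 2))
print("RBT-corrected RMSE:   ", round(rmse(corrected, truth), 2))
```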
Grau, P; Vanrolleghem, P; Ayesa, E
2007-01-01
In this paper, a new methodology for integrated modelling of the WWTP has been used for the construction of the Benchmark Simulation Model No. 2 (BSM2). The transformations approach proposed in this methodology does not require the development of specific transformers to interface unit process models, and allows the construction of tailored models for a particular WWTP while guaranteeing mass and charge continuity for the whole model. The BSM2 PWM, constructed as a case study, is evaluated by means of simulations under different scenarios, and its validity in reproducing the water and sludge lines of a WWTP is demonstrated. Furthermore, the advantages that this methodology presents compared to other approaches for integrated modelling are verified in terms of flexibility and coherence.
Force 2025 and Beyond Strategic Force Design Analytic Model
2017-01-12
depiction of the core ideas of our force design model. Figure 1: Description of Force Design Model. Figure 2 shows an overview of our methodology ... the F2025B Force Design Analytic Model research conducted by TRAC-MTRY and the Naval Postgraduate School. Our research develops a methodology for ... designs. We describe a data development methodology that characterizes the data required to construct a force design model using our approach.
Future directions for LDEF ionizing radiation modeling and assessments
NASA Technical Reports Server (NTRS)
Armstrong, T. W.; Colborn, B. L.
1992-01-01
Data from the ionizing radiation dosimetry aboard LDEF provide a unique opportunity for assessing the accuracy of current space radiation models and for identifying needed improvements for future mission applications. Details are given of the LDEF data available for radiation model evaluations. The status of model comparisons with LDEF data is given, along with future directions of planned modeling efforts and data comparison assessments. The modeling methodology being used to help ensure that the LDEF ionizing radiation results can address ionizing radiation issues for future missions is outlined. In general, the LDEF radiation modeling has emphasized quick-look predictions using simplified methods to make comparisons with absorbed dose measurements and induced radioactivity measurements of emissions. Modeling and LDEF data comparisons related to linear energy transfer (LET) spectra are important for several reasons, which are outlined. The planned modeling and LDEF data comparisons for LET spectra are discussed, including components of the LET spectra due to different environment sources, contributions from different production mechanisms, and spectra in plastic detectors versus silicon.
Peasura, Prachya
2015-01-01
This research studied the application of response surface methodology (RSM) and a central composite design (CCD) experiment to mathematically model and optimize postweld heat treatment (PWHT). The material studied is a pressure vessel steel, ASTM A516 grade 70, joined by gas metal arc welding. The PWHT parameters examined in this study were PWHT temperature and time. The resulting materials were examined using the CCD experiment and RSM to determine tensile strength, and were observed with optical microscopy and scanning electron microscopy. The experimental results show that, using a full quadratic model, the proposed mathematical model is Y_TS = -285.521 + 15.706*X1 + 2.514*X2 - 0.004*X1^2 - 0.001*X2^2 - 0.029*X1*X2. The optimum PWHT parameters for tensile strength were a PWHT time of 5.00 h and a PWHT temperature of 645.75°C. The results show that PWHT time is the dominant mechanism for modifying tensile strength compared to PWHT temperature. This phenomenon can be explained by the fact that pearlite contributes to higher tensile strength; increased pearlite content results in increased material tensile strength. The research described here can be used as material data on PWHT parameters for an ASTM A516 grade 70 weld. PMID:26550602
Minjares-Fuentes, R; Femenia, A; Garau, M C; Meza-Velázquez, J A; Simal, S; Rosselló, C
2014-06-15
An ultrasound-assisted procedure for the extraction of pectins from grape pomace with citric acid as the extracting agent was established. A Box-Behnken design (BBD) was employed to optimize the extraction temperature (X1: 35-75°C), extraction time (X2: 20-60 min) and pH (X3: 1.0-2.0) to obtain a high yield of pectins with high average molecular weight (MW) and degree of esterification (DE) from grape pomace. Analysis of variance showed that the contribution of a quadratic model was significant for the pectin extraction yield and for pectin MW, whereas the DE of pectins was more influenced by a linear model. An optimization study using response surface methodology was performed and 3D response surfaces were plotted from the mathematical model. According to the RSM model, the highest pectin yield (∼32.3%) can be achieved when the UAE process is carried out at 75°C for 60 min using a citric acid solution of pH 2.0. These pectic polysaccharides, composed mainly of galacturonic acid units (<97% of total sugars), have an average MW of 163.9 kDa and a DE of 55.2%. Close agreement between experimental and predicted values was found. These results suggest that ultrasound-assisted extraction could be a good option for the extraction of functional pectins with citric acid from grape pomace at an industrial level. Copyright © 2014 Elsevier Ltd. All rights reserved.
Building on Our Teaching Assets: The Unique Pedagogical Contributions of Bilingual Educators
ERIC Educational Resources Information Center
Hopkins, Megan
2013-01-01
This article examines the unique contributions that bilingual and bilingually credentialed teachers make to the instruction of emergent bilinguals in the United States. This mixed methodological study involved 474 teachers in Arizona, California, and Texas, which represent distinct language policy contexts. Results revealed that, irrespective of…
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.
Variability of Short-term Precipitation and Runoff in Small Czech Drainage Basins
NASA Astrophysics Data System (ADS)
Kavka, Petr; Strouhal, Luděk; Landa, Martin; Neuman, Martin; Kožant, Petr; Muller, Miloslav
2016-04-01
The aim of this contribution is to introduce the recently started three-year project named "Variability of Short-term Precipitation and Runoff in Small Czech Drainage Basins and its Influence on Water Resources Management". Its main goal is to develop a methodology and an online utility for deriving short-term design precipitation series, which could be utilized by a broad community of scientists, state administration and design planners. The outcomes of the project will be especially helpful in modelling hydrological or soil erosion problems when designing common measures for promoting water retention or landscape drainage systems, whether or not within the scope of land consolidation projects. The precipitation scenarios will be derived from 10 years of observed data from point gauging stations and radar data. The analysis is focused on event return periods, total rainfall amounts, internal intensity distribution and spatial distribution over the area of the Czech Republic. The methodology will account for the choice of the simulation model. Several representative, practically oriented models will be tested for the sensitivity of their outputs to the selected precipitation scenario, compared with the variability associated with uncertainty in other inputs. The variability of the outputs will also be assessed in the context of the economic impacts of designing landscape water structures or mitigation measures. The research was supported by grant QJ1520265 of the Czech Ministry of Agriculture, using data provided by the Czech Hydrometeorological Institute.
McCarthy-Jones, Simon; Krueger, Joel; Larøi, Frank; Broome, Matthew; Fernyhough, Charles
2013-01-01
One of the leading cognitive models of auditory verbal hallucinations (AVHs) proposes such experiences result from a disturbance in the process by which inner speech is attributed to the self. Research in this area has, however, proceeded in the absence of thorough cognitive and phenomenological investigations of the nature of inner speech, against which AVHs are implicitly or explicitly defined. In this paper we begin by introducing philosophical phenomenology and highlighting its relevance to AVHs, before briefly examining the evolving literature on the relation between inner experiences and AVHs. We then argue for the need for philosophical phenomenology (Phenomenology) and the traditional empirical methods of psychology for studying inner experience (phenomenology) to mutually inform each other to provide a richer and more nuanced picture of both inner experience and AVHs than either could on its own. A critical examination is undertaken of the leading model of AVHs derived from phenomenological philosophy, the ipseity disturbance model. From this we suggest issues that future work in this vein will need to consider, and examine how interdisciplinary methodologies may contribute to advances in our understanding of AVHs. Detailed suggestions are made for the direction and methodology of future work into AVHs, which we suggest should be undertaken in a context where phenomenology and physiology are both necessary, but neither sufficient. PMID:23576974
Jaén, Sebastian; Dyner, Isaac
2014-03-01
A large-scale expansion of Colombian coca cultivation is one of the most revealing signs of a structural change in the illegal cocaine market in the Andean region. In the space of five years, Colombian coca cultivation went from a modest, domestic production to supplying a competitive market, capable of almost completely substituting for foreign sources of supply. The purpose of this work is to explore the role and potential of system dynamics (SD) as a modeling methodology to better understand the consequences of drug policy. As a case study, this work tests the hypothesis that the outbreak of Colombian coca cultivation is a consequence of the takedown of the large cartels, leading to the surge of small drug-trafficking firms called "cartelitos." Using an SD model and elements from the economic theory of the criminal firm, our work shows how the formation of these small firms might significantly contribute to configuring a more competitive domestic coca industry (and hence a more efficient crime industry). We conclude that SD is an appropriate dynamic modeling approach for addressing policy issues regarding drug markets. The methodology takes into account the dynamic nature of drug markets and their multi-dimensional responses to policy interventions. Copyright © 2014 Elsevier B.V. All rights reserved.
What We Know About the Brain Structure-Function Relationship.
Batista-García-Ramó, Karla; Fernández-Verdecia, Caridad Ivette
2018-04-18
How the human brain works is still an open question, as is its relation to brain architecture: the non-trivial structure–function relationship. The main hypothesis is that the anatomical architecture conditions, but does not determine, the dynamics of the neural network. Functional connectivity cannot be explained by considering the anatomical substrate alone. This involves complex and controversial aspects of neuroscience, and the methods and methodologies used to obtain structural and functional connectivity are not always rigorously applied. The goal of the present article is to discuss the progress made in elucidating the structure–function relationship of the Central Nervous System, particularly at the brain level, based on results from human and animal studies. Current systems and neuroimaging techniques with high physio-structural resolution have brought about the development of an integral framework of structural and morphometric tools such as image processing, computational modeling and graph theory. Different laboratories have contributed in vivo, in vitro and computational/mathematical models to study intrinsic neural activity patterns based on anatomical connections. We conclude that multi-modal neuroimaging techniques are required, as well as improved methodologies for obtaining structural and functional connectivity. Even though simulations of intrinsic neural activity based on anatomical connectivity can reproduce much of the observed pattern of empirical functional connectivity, future models should be multifactorial in order to elucidate multi-scale relationships and to infer disorder mechanisms.
Smith, Timothy W.; Uchino, Bert N.; MacKenzie, Justin; Hicks, Angela; Campo, Rebecca A.; Reblin, Maija; Grewen, Karen; Amico, Janet A.; Light, Kathleen C.
2016-01-01
Cardiovascular reactivity is a potential mechanism underlying associations of close relationship quality with cardiovascular disease. Two models describe oxytocin as another mechanism. The “calm and connect” model posits an association between positive relationship experiences and oxytocin levels and responses, whereas the “tend and befriend” model emphasizes the effects of negative relationship experiences in evoking oxytocin release. In this study of 180 younger couples, relationship quality had a small, marginally significant inverse association with plasma oxytocin levels, and neither positive nor negative couple interactions evoked change in plasma oxytocin. Negative couple interactions evoked significant cardiovascular reactivity, especially among women. Hence, in the largest study of these issues to date, there was little support for key tenets of the “calm and connect” model, and only very modest support for the “tend and befriend” model. However, findings were consistent with the view that CVR contributes to the effects of relationship difficulties on health. PMID:22543270
Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.
Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V
2016-01-01
Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
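The abstract does not reproduce the model equations, so as a rough, hedged illustration of the kind of mechanism an ACT-R instance-based learning model relies on, the sketch below implements the standard base-level activation equation and a Boltzmann retrieval rule in Python; the presentation times, decay parameter and noise scale are invented for illustration and are not taken from the paper.

    import numpy as np

    def base_level_activation(presentation_times, now, d=0.5):
        # ACT-R base-level learning: A = ln(sum_j (now - t_j)^(-d))
        lags = now - np.asarray(presentation_times, dtype=float)
        return np.log(np.sum(lags ** (-d)))

    def retrieval_probabilities(activations, s=0.25):
        # Boltzmann (softmax) rule over instance activations with noise scale s
        a = np.asarray(activations, dtype=float)
        w = np.exp(a / s)
        return w / w.sum()

    # Toy instance memory: three stored instances with different usage histories
    now = 100.0
    history = {"rare": [10.0], "occasional": [20.0, 60.0], "frequent": [30.0, 70.0, 95.0]}
    acts = {k: base_level_activation(v, now) for k, v in history.items()}
    probs = retrieval_probabilities(list(acts.values()))
    for (name, a), p in zip(acts.items(), probs):
        print(f"{name:<10s} activation={a:+.3f}  retrieval probability={p:.3f}")

In the study itself, parameters of this kind become decision variables of a mathematical optimization; a derivative-free search over the decay and noise parameters against behavioural data would be the analogous step in this toy setting.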
Daniel R. Williams
2014-01-01
The year 1992 was a watershed for research on place attachment. Not only was the landmark book Place Attachment (Altman & Low, 1992) published, in that same year some colleagues and I published "Beyond the Commodity Metaphor" in the journal Leisure Sciences (Williams et al., 1992). Our paper was not intended as a methodological contribution to place...
ERIC Educational Resources Information Center
Munyai, Keneilwe
2016-01-01
This short paper explores the potential contribution of design thinking methodology to the education and training system in South Africa. Design thinking is slowly gaining traction in South Africa. It is offered by the Hasso Plattner Institute of Design Thinking at the University of Cape Town…
ERIC Educational Resources Information Center
Wildsmith-Cromarty, Rosemary
2015-01-01
This report describes ongoing research on reading in African languages. It draws mainly on contributions from two British Association for Applied Linguistics (BAAL) "Language in Africa" (LiA) Special Interest Group (SIG) meetings: the LiA SIG strand at BAAL 2013 and the seminar on "Reading Methodologies in African Languages"…
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
NASA Astrophysics Data System (ADS)
Pujol, Meritxell Cortada; Quintana, Maria Graciela Badilla; Romaní, Jordi Riera
With the incorporation of Information and Communication Technologies (ICT) in education, especially the Interactive Whiteboard (IWB), the need emerges for a proper teacher training process to ensure adequate integration and didactic use of this tool in the classroom. This article discusses teachers' perceptions of the training process for ICT integration. Its main aim is to contribute to the unification of minimum criteria for effective ICT implementation in any training process for active teachers. This case study starts from the development of a training model called Eduticom, which was put into practice in 4 schools in Catalonia, Spain. Findings indicated different teachers' needs, such as an appropriate infrastructure, proper management and a flexible training model that essentially addresses methodological and didactic aspects of IWB use in the classroom.
NASA Astrophysics Data System (ADS)
Telesca, V.; Copertino, V. A.; Scavone, G.; Pastore, V.; Dal Sasso, S.
2009-04-01
Most hydrological models are by now founded on the integration of field and satellite data. The use of remote sensing techniques compensates for the frequent lack of field-measured variables and parameters required to apply models of the hydrological cycle components at a regional scale. These components are very sensitive to climatic and surface features and conditions. Remote sensing represents a complementary contribution to in situ investigation methodologies, furnishing repeated, real-time observations. Naturally, the value of these techniques depends on the existence of a solid correlation between the quantity to be evaluated and the remote sensing information obtainable from the images. In this context, satellite remote sensing has become a basic tool, since it allows regular monitoring of extensive areas. Different surface variables and parameters can be extracted from the combination of the multi-spectral information contained in a satellite image. Land Surface Temperature (LST) is a fundamental parameter for estimating most components of the hydrological cycle and the soil-atmosphere energy balance, such as the net radiation, the sensible heat flux and the actual evapotranspiration. Moreover, LST maps can be used in models for fire monitoring and prevention. The aim of this work is to produce Land Surface Temperature maps, exploiting the contribution of remote sensing, by applying different "split-window" algorithms, and to compare them with the "Day/Night" MODIS LST product in order to select the best algorithm to apply in a Two-Source Energy Balance model (STSEB). Integrated into a rainfall/runoff model, it can help to cope with problems of land management and protection from natural hazards. In particular, the energy balance procedure will be included in a model for continuous simulation and forecasting of floods. Another important application of the model is the forecast of scenarios connected to drought problems; in this context, it can contribute to the planning and realization of mitigation interventions against desertification risk.
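The abstract names the split-window family of algorithms without giving their coefficients. Purely as a hedged illustration of the general functional form, and not of any specific published algorithm, a linear split-window estimate might look like the following sketch; the coefficients a0, a1, a2 are placeholders that would in practice be calibrated against surface emissivity and atmospheric water vapour.

    import numpy as np

    def split_window_lst(t11, t12, a0=1.0, a1=1.0, a2=2.0):
        # Generic linear split-window form: LST ~ a0 + a1*T11 + a2*(T11 - T12)
        # a0, a1, a2 are placeholder values, not coefficients of a published algorithm
        return a0 + a1 * t11 + a2 * (t11 - t12)

    # Brightness temperatures (K) in the ~11 and ~12 micrometre channels for a few pixels
    t11 = np.array([295.0, 300.5, 288.2])
    t12 = np.array([293.8, 299.0, 287.5])
    print(split_window_lst(t11, t12))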
NASA Astrophysics Data System (ADS)
Ayatollahy Tafti, Tayeb
We develop a new method for integrating information and data from different sources. We also construct a comprehensive workflow for characterizing and modeling a fracture network in unconventional reservoirs using microseismic data. The methodology is based on a combination of several mathematical and artificial intelligence techniques, including geostatistics, fractal analysis, fuzzy logic, and neural networks. The study contributes to the scholarly knowledge base on the characterization and modeling of fractured reservoirs in several ways, including a versatile workflow with novel objective functions. Some of the characteristics of the methods are listed below: 1. The new method is an effective fracture characterization procedure that estimates different fracture properties. Unlike existing methods, the new approach is not dependent on the location of events. It is able to integrate all multi-scaled and diverse fracture information from different methodologies. 2. It offers an improved procedure to create compressional and shear velocity models as a preamble for delineating anomalies, mapping structures of interest, and correlating velocity anomalies with fracture swarms and other reservoir properties of interest. 3. It offers an effective way to obtain the fractal dimension of microseismic events and identify the pattern complexity, connectivity, and mechanism of the created fracture network. 4. It offers an innovative method for monitoring fracture movement at different stages of stimulation, which can be used to optimize the process. 5. Our newly developed MDFN approach allows the creation of a discrete fracture network model using only microseismic data, with potential cost reduction. It also imposes the fractal dimension as a constraint on other fracture modeling approaches, which increases the visual similarity between the modeled networks and the real network over the simulated volume.
The rise of machine consciousness: studying consciousness with computational models.
Reggia, James A
2013-08-01
Efforts to create computational models of consciousness have accelerated over the last two decades, creating a field that has become known as artificial consciousness. There have been two main motivations for this controversial work: to develop a better scientific understanding of the nature of human/animal consciousness and to produce machines that genuinely exhibit conscious awareness. This review begins by briefly explaining some of the concepts and terminology used by investigators working on machine consciousness, and summarizes key neurobiological correlates of human consciousness that are particularly relevant to past computational studies. Models of consciousness developed over the last twenty years are then surveyed. These models are largely found to fall into five categories based on the fundamental issue that their developers have selected as being most central to consciousness: a global workspace, information integration, an internal self-model, higher-level representations, or attention mechanisms. For each of these five categories, an overview of past work is given, a representative example is presented in some detail to illustrate the approach, and comments are provided on the contributions and limitations of the methodology. Three conclusions are offered about the state of the field based on this review: (1) computational modeling has become an effective and accepted methodology for the scientific study of consciousness, (2) existing computational models have successfully captured a number of neurobiological, cognitive, and behavioral correlates of conscious information processing as machine simulations, and (3) no existing approach to artificial consciousness has presented a compelling demonstration of phenomenal machine consciousness, or even clear evidence that artificial phenomenal consciousness will eventually be possible. The paper concludes by discussing the importance of continuing work in this area, considering the ethical issues it raises, and making predictions concerning future developments. Copyright © 2013 Elsevier Ltd. All rights reserved.
Schirrmann, Michael; Joschko, Monika; Gebbers, Robin; Kramer, Eckart; Zörner, Mirjam; Barkusky, Dietmar; Timmer, Jens
2016-01-01
Background Earthworms are important for maintaining soil ecosystem functioning and serve as indicators of soil fertility. However, detection of earthworms is time-consuming, which hinders the assessment of earthworm abundances with high sampling density over entire fields. Recent developments of mobile terrestrial sensor platforms for proximal soil sensing (PSS) provided new tools for collecting dense spatial information of soils using various sensing principles. Yet, the potential of PSS for assessing earthworm habitats is largely unexplored. This study investigates whether PSS data contribute to the spatial prediction of earthworm abundances in species distribution models of agricultural soils. Methodology/Principal Findings Proximal soil sensing data, e.g., soil electrical conductivity (EC), pH, and near infrared absorbance (NIR), were collected in real-time in a field with two management strategies (reduced tillage / conventional tillage) and sandy to loam soils. PSS was related to observations from a long-term (11 years) earthworm observation study conducted at 42 plots. Earthworms were sampled from 0.5 x 0.5 x 0.2 m soil blocks (0.05 m³) and identified to species level. Sensor data were highly correlated with earthworm abundances observed in reduced tillage but less correlated with earthworm abundances observed in conventional tillage. This may indicate that management influences the sensor-earthworm relationship. Generalized additive models and state-space models showed that modelling based on data fusion from EC, pH, and NIR sensors produced better results than modelling without sensor data or data from just a single sensor. Regarding the individual earthworm species, particular sensor combinations were more appropriate than others due to the different habitat requirements of the earthworms. Earthworm species with soil-specific habitat preferences were spatially predicted with higher accuracy by PSS than more ubiquitous species. Conclusions/Significance Our findings suggest that PSS contributes to the spatial modelling of earthworm abundances at field scale and that it will support species distribution modelling in the attempt to understand the soil-earthworm relationships in agroecosystems. PMID:27355340
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
Life cycle assessment part 2: current impact assessment practice.
Pennington, D W; Potting, J; Finnveden, G; Lindeijer, E; Jolliet, O; Rydberg, T; Rebitzer, G
2004-07-01
Providing our society with goods and services contributes to a wide range of environmental impacts. Waste generation, emissions and the consumption of resources occur at many stages in a product's life cycle: from raw material extraction, energy acquisition, production and manufacturing, use, reuse, recycling, through to ultimate disposal. These all contribute to impacts such as climate change, stratospheric ozone depletion, photooxidant formation (smog), eutrophication, acidification, toxicological stress on human health and ecosystems, the depletion of resources and noise, among others. The need exists to address these product-related contributions more holistically and in an integrated manner, providing complementary insights to those of regulatory/process-oriented methodologies. A previous article (Part 1, Rebitzer et al., 2004) outlined how to define and model a product's life cycle in current practice, as well as the methods and tools that are available for compiling the associated waste, emissions and resource consumption data into a life cycle inventory. This article highlights how practitioners and researchers from many domains have come together to provide indicators for the different impacts attributable to products in the life cycle impact assessment (LCIA) phase of life cycle assessment (LCA).
Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes
Zhang, Hong; Pei, Yun
2016-01-01
Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions. PMID:27529266
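As a minimal sketch of the equivalent-continuous-level calculation that such a simulation feeds into, the following Python fragment combines a toy discrete-event trace of machine activity into Leq; the equipment levels, durations and their randomness are invented, while the formula itself is the standard energy-average definition.

    import numpy as np

    def equivalent_continuous_level(levels_db, durations_s):
        # Leq = 10*log10( (1/T) * sum_i t_i * 10^(L_i/10) )
        levels_db = np.asarray(levels_db, dtype=float)
        durations_s = np.asarray(durations_s, dtype=float)
        energy = np.sum(durations_s * 10.0 ** (levels_db / 10.0))
        return 10.0 * np.log10(energy / durations_s.sum())

    # Toy event trace: alternating work cycles and idle periods of a single machine,
    # with stochastic durations standing in for the uncertainties the paper simulates
    rng = np.random.default_rng(1)
    levels, durations = [], []
    for _ in range(50):
        durations.append(rng.uniform(40, 80)); levels.append(85.0)   # working, ~85 dB(A)
        durations.append(rng.uniform(10, 30)); levels.append(60.0)   # idle,    ~60 dB(A)
    print(f"Leq over the simulated period: {equivalent_continuous_level(levels, durations):.1f} dB(A)")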
NASA Astrophysics Data System (ADS)
Kumar, Girish; Jain, Vipul; Gandhi, O. P.
2018-03-01
Maintenance helps to extend equipment life by improving its condition and avoiding catastrophic failures. An appropriate model or mechanism is thus needed to quantify system availability vis-à-vis a given maintenance strategy, which will assist in decision-making for optimal utilization of maintenance resources. This paper deals with semi-Markov process (SMP) modeling for steady-state availability analysis of mechanical systems that follow condition-based maintenance (CBM), and with the evaluation of the optimal condition monitoring interval. The developed SMP model is solved using a two-stage analytical approach for steady-state availability analysis of the system. In addition, the CBM interval is chosen to maximize system availability using a genetic algorithm approach. The main contribution of the paper is a predictive tool for system availability that will help in deciding the optimum CBM policy. The proposed methodology is demonstrated for a centrifugal pump.
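The paper's SMP model and genetic algorithm are not reproduced here; the sketch below only illustrates the general recipe on an invented three-state example: the steady-state probabilities of a semi-Markov process are obtained from the embedded chain's stationary distribution weighted by mean sojourn times, and a plain grid search stands in for the genetic algorithm that selects the condition-monitoring interval.

    import numpy as np

    def smp_state_probabilities(P, mean_sojourn):
        # Steady state of a semi-Markov process: solve pi = pi P for the embedded
        # chain, then weight the stationary distribution by the mean sojourn times
        n = P.shape[0]
        A = np.vstack([P.T - np.eye(n), np.ones(n)])
        b = np.zeros(n + 1); b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        w = pi * mean_sojourn
        return w / w.sum()

    def availability(t_insp):
        # Invented 3-state CBM cycle: 0 = healthy, 1 = degraded, 2 = down for maintenance.
        # Long intervals miss degradation (longer corrective downtime); short intervals
        # accumulate inspection downtime, crudely lumped into state 2 for illustration.
        p_detect = np.exp(-t_insp / 200.0)
        P = np.array([[0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0],
                      [1.0, 0.0, 0.0]])
        inspection_downtime = 0.5 * (500.0 + 150.0) / t_insp
        mean_sojourn = np.array([500.0, 150.0,
                                 20.0 + 80.0 * (1 - p_detect) + inspection_downtime])
        probs = smp_state_probabilities(P, mean_sojourn)
        return probs[0] + probs[1]   # fraction of time in the two up states

    intervals = np.arange(10, 400, 5)
    best = max(intervals, key=availability)
    print(f"best monitoring interval ~ {best} h, availability = {availability(best):.4f}")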
Numerical Simulation and Quantitative Uncertainty Assessment of Microchannel Flow
NASA Astrophysics Data System (ADS)
Debusschere, Bert; Najm, Habib; Knio, Omar; Matta, Alain; Ghanem, Roger; Le Maitre, Olivier
2002-11-01
This study investigates the effect of uncertainty in physical model parameters on computed electrokinetic flow of proteins in a microchannel with a potassium phosphate buffer. The coupled momentum, species transport, and electrostatic field equations give a detailed representation of electroosmotic and pressure-driven flow, including sample dispersion mechanisms. The chemistry model accounts for pH-dependent protein labeling reactions as well as detailed buffer electrochemistry in a mixed finite-rate/equilibrium formulation. To quantify uncertainty, the governing equations are reformulated using a pseudo-spectral stochastic methodology, which uses polynomial chaos expansions to describe uncertain/stochastic model parameters, boundary conditions, and flow quantities. Integration of the resulting equations for the spectral mode strengths gives the evolution of all stochastic modes for all variables. Results show the spatiotemporal evolution of uncertainties in predicted quantities and highlight the dominant parameters contributing to these uncertainties during various flow phases. This work is supported by DARPA.
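The governing equations and parameters of the study are not repeated here; the sketch below only shows, for a scalar toy function of one standard-normal uncertain parameter, the non-intrusive polynomial chaos projection that underlies such pseudo-spectral formulations: coefficients (mode strengths) from Gauss-Hermite quadrature, and mean and variance recovered from them.

    import numpy as np
    from numpy.polynomial import hermite_e as He
    from math import factorial, sqrt, pi

    def pce_coefficients(model, order, nquad=40):
        # Non-intrusive projection onto probabilists' Hermite polynomials:
        # c_n = E[model(xi) * He_n(xi)] / n!, xi ~ N(0, 1), via Gauss-Hermite quadrature
        x, w = He.hermegauss(nquad)            # nodes/weights for the weight exp(-x^2/2)
        w = w / sqrt(2.0 * pi)                 # normalise to the standard normal density
        fx = model(x)
        return np.array([np.sum(w * fx * He.hermeval(x, [0] * n + [1])) / factorial(n)
                         for n in range(order + 1)])

    # Toy nonlinear response of one uncertain parameter (purely illustrative)
    model = lambda xi: np.exp(0.3 * xi) + 0.1 * xi ** 2
    c = pce_coefficients(model, order=6)
    mean = c[0]
    var = sum(c[n] ** 2 * factorial(n) for n in range(1, len(c)))
    xi = np.random.default_rng(0).standard_normal(200_000)        # Monte Carlo cross-check
    print(f"PCE mean {mean:.4f}, variance {var:.4f}")
    print(f"MC  mean {model(xi).mean():.4f}, variance {model(xi).var():.4f}")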
A fuzzy hill-climbing algorithm for the development of a compact associative classifier
NASA Astrophysics Data System (ADS)
Mitra, Soumyaroop; Lam, Sarah S.
2012-02-01
Classification, a data mining technique, has widespread applications including medical diagnosis, targeted marketing, and others. Knowledge discovery from databases in the form of association rules is one of the important data mining tasks. An integrated approach, classification based on association rules, has drawn the attention of the data mining community over the last decade. While attention has been mainly focused on increasing classifier accuracies, not much effort has been devoted to building interpretable and less complex models. This paper discusses the development of a compact associative classification model using a hill-climbing approach and fuzzy sets. The proposed methodology builds the rule base by selecting rules that contribute towards increasing training accuracy, thus balancing classification accuracy with the number of classification association rules. The results indicated that the proposed associative classification model can achieve competitive accuracies on benchmark datasets with continuous attributes and offers better interpretability when compared with other rule-based systems.
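The fuzzy-set machinery of the proposed classifier is not reproduced here; the sketch below only illustrates the crisp core of the hill-climbing idea on invented toy data: candidate class-association rules are added greedily as long as each addition improves training accuracy, which keeps the rule base compact.

    def rule_accuracy(rules, X, y, default_class):
        # Classify each instance with the first matching rule in the ordered list,
        # fall back to the default class, and return the training accuracy
        correct = 0
        for items, label in zip(X, y):
            prediction = default_class
            for antecedent, consequent in rules:
                if antecedent <= items:          # all antecedent items are present
                    prediction = consequent
                    break
            correct += (prediction == label)
        return correct / len(y)

    def hill_climb(candidate_rules, X, y, default_class):
        # Greedily add the candidate rule that gives the largest accuracy gain
        selected, best = [], rule_accuracy([], X, y, default_class)
        improved = True
        while improved:
            improved = False
            for rule in candidate_rules:
                if rule in selected:
                    continue
                acc = rule_accuracy(selected + [rule], X, y, default_class)
                if acc > best:
                    best, best_rule, improved = acc, rule, True
            if improved:
                selected.append(best_rule)
        return selected, best

    # Toy transactional data: item sets per instance and a class label
    X = [{"a", "b"}, {"a"}, {"b", "c"}, {"c"}, {"a", "c"}]
    y = [1, 1, 0, 0, 1]
    candidates = [({"a"}, 1), ({"b", "c"}, 0), ({"c"}, 0), ({"b"}, 1)]
    print(hill_climb(candidates, X, y, default_class=0))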
NASA Astrophysics Data System (ADS)
Alifanov, O. M.; Paleshkin, A. V.; Terent‧ev, V. V.; Firsyuk, S. O.
2016-01-01
A methodological approach to determination of the thermal state at a point on the surface of an isothermal element of a small spacecraft has been developed. A mathematical model of heat transfer between surfaces of intricate geometric configuration has been described. In this model, account was taken of the external field of radiant fluxes and of the differentiated mutual influence of the surfaces. An algorithm for calculation of the distribution of the density of the radiation absorbed by surface elements of the object under study has been proposed. The temperature field on the lateral surface of the spacecraft exposed to sunlight and on its shady side has been calculated. Taking the determination of the thermal state of the magnetic controls of the orientation system as an example, the authors have assessed the contribution of the radiation coming from the solar-cell panels and from the spacecraft surface.
NASA Astrophysics Data System (ADS)
Bednarek, Tomasz; Tsotridis, Georgios
2017-03-01
The objective of the current study is to highlight possible limitations and difficulties associated with Computational Fluid Dynamics in PEM single fuel cell modelling. It is shown that an appropriate convergence methodology should be applied for steady-state solutions, due to inherent numerical instabilities. A single-channel fuel cell model has been taken as a numerical example. Results are evaluated from quantitative as well as qualitative points of view. The contribution to the polarization curve of the different fuel cell components, such as bi-polar plates, gas diffusion layers, catalyst layers and membrane, was investigated via their effects on the overpotentials. Furthermore, the potential losses corresponding to reaction kinetics and to ohmic and mass transport limitations, and the effect of the exchange current density and open circuit voltage, were also investigated. It is highlighted that the lack of reliable and robust input data is one of the issues for obtaining accurate results.
Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes.
Zhang, Hong; Pei, Yun
2016-08-12
Quantitative prediction of construction noise is crucial to evaluate construction plans to help make decisions to address noise levels. Considering limitations of existing methods for measuring or predicting the construction noise and particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes to provision of a simulation methodology to quantitatively predict the equivalent continuous noise of construction by considering the relevant uncertainties, dynamics and interactions.
Statistical analysis and yield management in LED design through TCAD device simulation
NASA Astrophysics Data System (ADS)
Létay, Gergö; Ng, Wei-Choon; Schneider, Lutz; Bregy, Adrian; Pfeiffer, Michael
2007-02-01
This paper illustrates how technology computer-aided design (TCAD), which nowadays is an essential part of CMOS technology, can be applied to LED development and manufacturing. In the first part, the essential electrical and optical models inherent to LED modeling are reviewed. The second part of the work describes a methodology to improve the efficiency of the simulation procedure by using the concept of process compact models (PCMs). The last part demonstrates the capabilities of PCMs using the example of a blue InGaN LED. In particular, a parameter screening is performed to find the most important parameters, an optimization task incorporating the robustness of the design is carried out, and finally the impact of manufacturing tolerances on yield is investigated. It is indicated how the concept of PCMs can contribute to efficient design-for-manufacturing (DFM)-aware development.
A hybrid modeling with data assimilation to evaluate human exposure level
NASA Astrophysics Data System (ADS)
Koo, Y. S.; Cheong, H. K.; Choi, D.; Kim, A. L.; Yun, H. Y.
2015-12-01
Exposure models are designed to better represent human contact with PM (Particulate Matter) and other air pollutants such as CO, SO2, O3, and NO2. Human exposure concentrations of these air pollutants are determined by long-range transport at global and regional scales from Europe and China as well as by local emissions from urban and road vehicle sources. To assess the exposure level in detail, the multiple-scale influence from background to local sources should be considered. A hybrid air quality modeling methodology combining a grid-based chemical transport model with a local plume dispersion model was used to provide spatially and temporally resolved air quality concentrations for human exposure levels in Korea. In the hybrid modeling approach, concentrations from the grid-based chemical transport model and the local plume dispersion model are added to provide contributions from photochemical interactions, long-range (regional) transport and local-scale dispersion. CAMx (Comprehensive Air quality Model with eXtensions) was used for the background concentrations from anthropogenic and natural emissions in East Asia including Korea, while the road-level dispersion of vehicle emissions was calculated with the CALPUFF model. The total exposure level of the pollutants was finally assessed by summing the background and road contributions. In the hybrid modeling, a data assimilation method based on optimal interpolation was applied to overcome the discrepancies between the model-predicted concentrations and observations from the air quality monitoring stations in Korea. The spatial resolution of the hybrid model was 50 m for the Seoul Metropolitan Area. This example clearly demonstrates that the exposure level can be estimated at fine scale for exposure assessment by using the hybrid modeling approach with data assimilation.
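The following sketch illustrates the optimal-interpolation update the abstract refers to, on an invented one-dimensional grid with two stations; the Gaussian covariance model, error variances and correlation length are placeholders rather than values from the study.

    import numpy as np

    def optimal_interpolation(xb, grid_xy, obs, obs_xy, sigma_b=8.0, sigma_o=4.0, L=5.0):
        # Analysis = background + K (obs - H xb), with K = B H^T (H B H^T + R)^(-1)
        # and a Gaussian background-error covariance of length scale L
        def gauss_cov(a, b):
            d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
            return sigma_b ** 2 * np.exp(-d2 / (2.0 * L ** 2))
        B_HT = gauss_cov(grid_xy, obs_xy)                 # grid x obs
        HBHT = gauss_cov(obs_xy, obs_xy)                  # obs x obs
        R = sigma_o ** 2 * np.eye(len(obs))
        nearest = ((obs_xy[:, None, :] - grid_xy[None, :, :]) ** 2).sum(-1).argmin(axis=1)
        innovation = obs - xb[nearest]                    # observation minus background
        K = B_HT @ np.linalg.inv(HBHT + R)
        return xb + K @ innovation

    # Toy road corridor: a flat background PM field and two monitoring stations
    grid_xy = np.column_stack([np.linspace(0.0, 50.0, 51), np.zeros(51)])
    xb = np.full(51, 30.0)                                # background field, ug/m3
    obs_xy = np.array([[10.0, 0.0], [35.0, 0.0]])
    obs = np.array([45.0, 25.0])
    xa = optimal_interpolation(xb, grid_xy, obs, obs_xy)
    print(xa[[10, 35]])                                   # analysis pulled towards the stations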
Plants and microorganisms as drivers of mineral weathering
NASA Astrophysics Data System (ADS)
Dontsova, K.; Chorover, J.; Maier, R.; Hunt, E.; Zaharescu, D. G.
2011-12-01
Plants and microorganisms play an important role in mineral weathering and soil formation, modifying their environment to make it more hospitable for life. This presentation summarizes several collaborative studies that focused on understanding how interactions between plants and microorganisms, where plants provide the energy through photosynthesis, drive mineral weathering and result in soil formation. Plants influence weathering through multiple mechanisms that have been previously established, such as the increase in CO2 concentration in the soil through root respiration and degradation of plant residues and exudates by heterotrophic microorganisms, the release of organic acids that promote mineral dissolution, the removal of weathering products from soil solution through uptake, and water redistribution. Weathering processes result in nutrient release that satisfies immediate needs of the plants and microorganisms, as well as precipitation of secondary phases that provide surfaces for retention of nutrients and organic carbon accumulation. What makes understanding the contribution of plants and microorganisms, such as bacteria and fungi, to mineral weathering challenging is the fact that they closely interact, enhancing and amplifying each other's contributions. In order to address the multiple processes that contribute to and result from biological weathering, a combination of chemical, biological, mineralogical, and computational techniques and methodologies is needed. This complex array of methodologies includes bulk techniques, such as determination of total dissolved organic and inorganic carbon and nitrogen, ion chromatography and high performance liquid chromatography to characterize the amount and composition of exuded organic acids, inductively coupled plasma mass spectrometry to determine concentrations of lithogenic elements in solution, X-ray diffraction to characterize changes in mineral composition of the material, and DNA extraction to characterize community structure, as well as microscopic techniques. These techniques, in combination with numerical geochemical modeling, are being employed to improve our understanding of biological weathering.
'Setting the guinea pigs free': towards a new model of community-led social marketing.
Smith, A J; Henry, L
2009-09-01
To offer the opportunity to discuss the positive contribution of co-production approaches in the field of social marketing. Recognizing the ever-evolving theoretical base for social marketing, this article offers a brief commentary on the positive contribution of co-production approaches in this field. The authors outline their own move towards conceptualizing a community-led social marketing approach and describe some key features. This developing framework has been influenced by, and tested through, the Early Presentation of Cancer Symptoms Programme, a community-led social marketing approach to tackle health inequalities across priority neighbourhoods in North East Lincolnshire, UK. A blend of social marketing, community involvement and rapid improvement science methodologies is drawn upon. The approach involves not just a strong focus on involving communities in insight and consultation, but also adopts methods where they are in charge of the process of generating solutions. A series of monthly and pre/post measures have demonstrated improvements in awareness of symptoms, reported willingness to act and increases in presentation measured through service referrals. Key features of the approach involve shared ownership and a shift away from service-instigated change by enabling communities 'to do' through developing skills and confidence and the conditions to 'try out'. The approach highlights the contribution that co-production approaches have to offer social marketing activity. In order to maximize potential, it is important to consider ways of engaging communities effectively. Successful approaches include translating social marketing methodology into easy-to-use frameworks, involving communities in gathering and interpreting local data, and supporting communities to act as change agents by planning and carrying out activity. The range of impacts across organisational, health and social capital measures demonstrates that multiple and longer-lasting improvements can be achieved with successful approaches.
Verification of Agricultural Methane Emission Inventories
NASA Astrophysics Data System (ADS)
Desjardins, R. L.; Pattey, E.; Worth, D. E.; VanderZaag, A.; Mauder, M.; Srinivasan, R.; Worthy, D.; Sweeney, C.; Metzger, S.
2017-12-01
It is estimated that agriculture contributes more than 40% of anthropogenic methane (CH4) emissions in North America. However, these estimates, which are either based on the Intergovernmental Panel on Climate Change (IPCC) methodology or inverse modeling techniques, are poorly validated due to the challenges of separating interspersed CH4 sources within agroecosystems. A flux aircraft, instrumented with a fast-response Picarro CH4 analyzer for the eddy covariance (EC) technique and a sampling system for the relaxed eddy accumulation technique (REA), was flown at an altitude of about 150 m along several 20-km transects over an agricultural region in Eastern Canada. For all flight days, the top-down CH4 flux density measurements were compared to the footprint adjusted bottom-up estimates based on an IPCC Tier II methodology. Information on the animal population, land use type and atmospheric and surface variables were available for each transect. Top-down and bottom-up estimates of CH4 emissions were found to be poorly correlated, and wetlands were the most frequent confounding source of CH4; however, there were other sources such as waste treatment plants and biodigesters. Spatially resolved wavelet covariance estimates of CH4 emissions helped identify the contribution of wetlands to the overall CH4 flux, and the dependence of these emissions on temperature. When wetland contribution in the flux footprint was minimized, top-down and bottom-up estimates agreed to within measurement error. This research demonstrates that although existing aircraft-based technology can be used to verify regional (~100 km2) agricultural CH4 emissions, it remains challenging due to diverse sources of CH4 present in many regions. The use of wavelet covariance to generate spatially-resolved flux estimates was found to be the best way to separate interspersed sources of CH4.
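As a minimal sketch of the flux calculation behind the airborne eddy-covariance estimates (synthetic numbers; no coordinate rotation, detrending or density corrections), the flux is simply the covariance of vertical-wind and concentration fluctuations over an averaging window.

    import numpy as np

    def eddy_covariance_flux(w, c):
        # Flux = mean(w' c'): covariance of vertical wind and scalar fluctuations
        return np.mean((w - w.mean()) * (c - c.mean()))

    rng = np.random.default_rng(0)
    n = 20 * 60 * 10                                   # 20 minutes of 10 Hz data
    w = rng.normal(0.0, 0.4, n)                        # vertical wind (m/s)
    c = 1.2 + 0.05 * w + rng.normal(0.0, 0.02, n)      # CH4 (mg/m3), correlated with updrafts
    print(f"CH4 flux ~ {eddy_covariance_flux(w, c):.4f} mg m-2 s-1")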
Wild, Verina; Carina, Fourie; Frouzakis, Regula; Clarinval, Caroline; Fässler, Margrit; Elger, Bernice; Gächter, Thomas; Leu, Agnes; Spirig, Rebecca; Kleinknecht, Michael; Radovanovic, Dragana; Mouton Dorey, Corine; Burnand, Bernard; Vader, John-Paul; Januel, Jean-Marie; Biller-Andorno, Nikola; The IDoC Group
2015-01-01
The starting point of the interdisciplinary project "Assessing the impact of diagnosis related groups (DRGs) on patient care and professional practice" (IDoC) was the lack of a systematic ethical assessment for the introduction of cost containment measures in healthcare. Our aim was to contribute to the methodological and empirical basis of such an assessment. Five sub-groups conducted separate but related research within the fields of biomedical ethics, law, nursing sciences and health services, applying a number of complementary methodological approaches. The individual research projects were framed within an overall ethical matrix. Workshops and bilateral meetings were held to identify and elaborate joint research themes. Four common, ethically relevant themes emerged in the results of the studies across sub-groups: (1.) the quality and safety of patient care, (2.) the state of professional practice of physicians and nurses, (3.) changes in incentives structure, (4.) vulnerable groups and access to healthcare services. Furthermore, much-needed data for future comparative research has been collected and some early insights into the potential impact of DRGs are outlined. Based on the joint results we developed preliminary recommendations related to conceptual analysis, methodological refinement, monitoring and implementation.
Durif-Bruckert, C; Roux, P; Morelle, M; Mignotte, H; Faure, C; Moumjid-Ferdjaoui, N
2015-07-01
The aim of this study on shared decision-making in the doctor-patient encounter about surgical treatment for early-stage breast cancer, conducted in a regional cancer centre in France, was to further the understanding of patient perceptions on shared decision-making. The study used methodological triangulation to collect data (both quantitative and qualitative) about patient preferences in the context of a clinical consultation in which surgeons followed a shared decision-making protocol. Data were analysed from a multi-disciplinary research perspective (social psychology and health economics). The triangulated data collection methods were questionnaires (n = 132), longitudinal interviews (n = 47) and observations of consultations (n = 26). Methodological triangulation revealed levels of divergence and complementarity between qualitative and quantitative results that suggest new perspectives on the three inter-related notions of decision-making, participation and information. Patients' responses revealed important differences between shared decision-making and participation per se. The authors note that subjecting patients to a normative behavioural model of shared decision-making in an era when paradigms of medical authority are shifting may undermine the patient's quest for what he or she believes is a more important right: a guarantee of the best care available. © 2014 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Marti, Joan; Bartolini, Stefania; Becerril, Laura
2016-04-01
VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO) that aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection Agencies in order to share, unify, and exchange procedures, methodologies and technologies to effectively reduce the impacts of volcanic disasters. The project aims at 1) improving and developing volcanic risk assessment and management capacities in active volcanic regions; 2) developing universal methodologies, scenario definitions, response strategies and alert protocols to cope with the full range of volcanic threats; 4) improving quantitative methods and tools for vulnerability and risk assessment; and 5) defining thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the Sendai Framework resolutions for its implementation: i) provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; ii) promote and support the availability and application of science and technology to decision-making. It thereby offers a good example of how close collaboration between science and civil protection is an effective way to contribute to DRR. European Commission ECHO Grant SI2.695524
Murray, Christian J; Lipfert, Frederick W
2012-01-01
Many publications estimate short-term air pollution-mortality risks, but few estimate the associated changes in life-expectancies. We present a new methodology for analyzing time series of health effects, in which prior frailty is assumed to precede short-term elderly nontraumatic mortality. The model is based on a subpopulation of frail individuals whose entries and exits (deaths) are functions of daily and lagged environmental conditions: ambient temperature/season, airborne particles, and ozone. This frail susceptible population is unknown; its fluctuations cannot be observed but are estimated using maximum-likelihood methods with the Kalman filter. We used an existing 14-y set of daily data to illustrate the model and then tested the assumption of prior frailty with a new generalized model that estimates the portion of the daily death count allocated to nonfrail individuals. In this demonstration dataset, new entries into the high-risk pool are associated with lower ambient temperatures and higher concentrations of particulate matter and ozone. Accounting for these effects on antecedent frailty reduces this at-risk population, yielding frail life expectancies of 5-7 days. Associations between environmental factors and entries to the at-risk pool are about twice as strong as for mortality. Nonfrail elderly deaths are seen to make only small contributions. This new model predicts a small short-lived frail population-at-risk that is stable over a wide range of environmental conditions. The predicted effects of pollution on new entries and deaths are robust and consistent with conventional morbidity/mortality times-series studies. We recommend model verification using other suitable datasets.
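The authors' frailty model is not reproduced here; as a minimal sketch of the underlying machinery they describe, a linear-Gaussian state-space model whose log-likelihood is evaluated with the Kalman filter and maximized numerically, the following fits a simple local-level model to a synthetic daily series.

    import numpy as np
    from scipy.optimize import minimize

    def kalman_neg_loglik(params, y):
        # Local-level model: x_t = x_{t-1} + w_t, y_t = x_t + v_t
        # Returns the negative Gaussian log-likelihood from the Kalman filter
        q, r = np.exp(params)                # variances kept positive via log-parametrisation
        x, P = y[0], 1e4                     # diffuse-ish initialisation
        nll = 0.0
        for obs in y[1:]:
            P = P + q                        # predict
            S = P + r                        # innovation variance
            nll += 0.5 * (np.log(2 * np.pi * S) + (obs - x) ** 2 / S)
            K = P / S                        # update
            x = x + K * (obs - x)
            P = (1 - K) * P
        return nll

    rng = np.random.default_rng(2)
    latent = np.cumsum(rng.normal(0, 0.5, 500)) + 50      # unobserved daily "at-risk" level
    y = latent + rng.normal(0, 2.0, 500)                  # noisy observed daily counts (toy)
    fit = minimize(kalman_neg_loglik, x0=np.log([1.0, 1.0]), args=(y,), method="Nelder-Mead")
    print("estimated state and observation variances:", np.exp(fit.x))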
2D Inversion of Transient Electromagnetic Method (TEM)
NASA Astrophysics Data System (ADS)
Bortolozo, Cassiano Antonio; Luís Porsani, Jorge; Acácio Monteiro dos Santos, Fernando
2017-04-01
A new methodology was developed for 2D inversion of the Transient Electromagnetic Method (TEM). The methodology consists in the elaboration of a set of routines in Matlab code for modeling and inversion of TEM data, and in the determination of the most efficient field array for the problem. In this research, the 2D TEM modeling uses a finite-difference discretization. To solve the inversion problem, an algorithm based on the Marquardt technique, also known as Ridge Regression, was applied. The algorithm is stable and efficient and is widely used in geoelectrical inversion problems. The main advantage of a 1D survey is rapid data acquisition over a large area, but in regions with two-dimensional structures, or where more detail is needed, it is essential to use two-dimensional interpretation methodologies. For efficient field acquisition we used the fixed-loop array in an innovative way, with a square transmitter loop (200 m x 200 m) and 25 m spacing between the sounding points. The TEM soundings were conducted only inside the transmitter loop, in order not to deal with negative apparent resistivity values. Although it is possible to model the negative values, they make the inversion convergence more difficult. The methodology described above was therefore developed to achieve maximum optimization of data acquisition, since only one transmitter loop layout on the surface is needed for each series of soundings inside the loop. The algorithms were tested with synthetic data, and the results were essential to the interpretation of the real data and will be useful in future situations. The 2D TEM inversion of real data acquired over the Paraná Sedimentary Basin (PSB) was successfully carried out. The results indicate a robust geoelectrical characterization of the sedimentary and crystalline aquifers in the PSB. Therefore, using a new and relevant approach for 2D TEM inversion, this research effectively contributed to mapping the most promising regions for groundwater exploration. In addition, new geophysical software was developed that can be applied as an important tool for many geological/hydrogeological applications and educational purposes.
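The Matlab routines of the study are not reproduced here; the sketch below illustrates, on an invented toy forward model with a numerical Jacobian, the Marquardt (damped least-squares, or ridge-regression) iteration that is the algorithmic core the abstract describes.

    import numpy as np

    def numerical_jacobian(f, m, eps=1e-6):
        f0 = f(m)
        J = np.zeros((f0.size, m.size))
        for j in range(m.size):
            dm = m.copy(); dm[j] += eps
            J[:, j] = (f(dm) - f0) / eps
        return J

    def marquardt_invert(f, d_obs, m0, lam=1.0, n_iter=30):
        # Damped least squares: m <- m + (J^T J + lam*I)^(-1) J^T (d_obs - f(m));
        # the damping lam is halved after a successful step and doubled otherwise
        m = m0.copy()
        misfit = np.sum((d_obs - f(m)) ** 2)
        for _ in range(n_iter):
            J = numerical_jacobian(f, m)
            r = d_obs - f(m)
            step = np.linalg.solve(J.T @ J + lam * np.eye(m.size), J.T @ r)
            m_new = m + step
            new_misfit = np.sum((d_obs - f(m_new)) ** 2)
            if new_misfit < misfit:
                m, misfit, lam = m_new, new_misfit, lam * 0.5
            else:
                lam *= 2.0
        return m, misfit

    # Toy stand-in for a TEM decay response: a sum of exponentials whose
    # parameters are log decay rates (not a real TEM forward model)
    times = np.linspace(0.01, 1.0, 40)
    forward = lambda m: np.exp(-times * np.exp(m[0])) + 0.5 * np.exp(-times * np.exp(m[1]))
    m_true = np.array([np.log(3.0), np.log(15.0)])
    d_obs = forward(m_true) + np.random.default_rng(3).normal(0, 0.005, times.size)
    m_est, chi2 = marquardt_invert(forward, d_obs, m0=np.array([0.0, 1.0]))
    print("recovered decay rates:", np.exp(m_est))   # should be roughly [3, 15]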
Riley, Richard D; Ensor, Joie; Jackson, Dan; Burke, Danielle L
2017-01-01
Many meta-analysis models contain multiple parameters, for example due to multiple outcomes, multiple treatments or multiple regression coefficients. In particular, meta-regression models may contain multiple study-level covariates, and one-stage individual participant data meta-analysis models may contain multiple patient-level covariates and interactions. Here, we propose how to derive percentage study weights for such situations, in order to reveal the (otherwise hidden) contribution of each study toward the parameter estimates of interest. We assume that studies are independent, and utilise a decomposition of Fisher's information matrix to decompose the total variance matrix of parameter estimates into study-specific contributions, from which percentage weights are derived. This approach generalises how percentage weights are calculated in a traditional, single parameter meta-analysis model. Application is made to one- and two-stage individual participant data meta-analyses, meta-regression and network (multivariate) meta-analysis of multiple treatments. These reveal percentage study weights toward clinically important estimates, such as summary treatment effects and treatment-covariate interactions, and are especially useful when some studies are potential outliers or at high risk of bias. We also derive percentage study weights toward methodologically interesting measures, such as the magnitude of ecological bias (difference between within-study and across-study associations) and the amount of inconsistency (difference between direct and indirect evidence in a network meta-analysis).
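The sketch below illustrates the decomposition on invented numbers for a fixed-effect, two-parameter multivariate meta-analysis: each study's information matrix contributes a slice of the total variance matrix, and the diagonal of that slice, relative to the total variance, gives the study's percentage weight for each parameter; the toy covariance matrices are not taken from the paper.

    import numpy as np

    def percentage_weights(study_covs):
        # Percentage study weights for each parameter, via decomposition of the
        # Fisher information: Var = S^(-1) = sum_i S^(-1) S_i S^(-1), S_i = V_i^(-1)
        S_i = [np.linalg.inv(V) for V in study_covs]
        S = sum(S_i)
        V_total = np.linalg.inv(S)
        contribs = np.array([np.diag(V_total @ Si @ V_total) for Si in S_i])
        return 100.0 * contribs / np.diag(V_total)

    # Toy example: three studies each reporting two correlated estimates
    # (say, a treatment effect and a treatment-covariate interaction)
    study_covs = [
        np.array([[0.04, 0.01], [0.01, 0.09]]),
        np.array([[0.10, 0.02], [0.02, 0.06]]),
        np.array([[0.02, 0.00], [0.00, 0.20]]),
    ]
    weights = percentage_weights(study_covs)
    print(np.round(weights, 1))   # rows = studies, columns = parameters; each column sums to 100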
[Evaluation of the first training on clinical research methodology in Chile].
Espinoza, Manuel; Cabieses, Báltica; Pedreros, César; Zitko, Pedro
2011-03-01
This paper describes the evaluation of the first training on clinical research methodology in Chile (EMIC-Chile) 12 months after its completion. An online survey was conducted for students and the Delphi method was used for the teaching team. Among the students, the majority reported that the program had contributed to their professional development and that they had shared some of the knowledge acquired with colleagues in their workplace. Forty-one percent submitted a project to obtain research funding through a competitive grants process once they had completed the course. Among the teachers, the areas of greatest interest were the communication strategy, teaching methods, the characteristics of the teaching team, and potential strategies for making the EMIC-Chile permanent in the future. This experience could contribute to future research training initiatives for health professionals. Recognized challenges are the involvement of nonmedical professions in clinical research, the complexities associated with the distance learning methodology, and the continued presence of initiatives of this importance at the national and regional level.
Mauricio-Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane L; Sin, Gürkan
2015-05-15
The design of sewer system control is a complex task given the large size of sewer networks, the transient dynamics of the water flow and the stochastic nature of rainfall. This contribution presents a generic methodology for the design of a self-optimising controller in sewer systems. Such a controller aims to keep the system close to optimal performance through an optimal selection of controlled variables. Optimal performance was defined by a two-stage optimisation (stochastic and deterministic) that takes into account both the overflow during the current rain event and the expected overflow given the probability of a future rain event. The methodology is successfully applied to design an optimising control strategy for a subcatchment area in Copenhagen. The results are promising and are expected to contribute to advances in the operation and control of sewer systems. Copyright © 2015 Elsevier Ltd. All rights reserved.
What has the study of digital games contributed to the science of expert behavior?
Charness, Neil
2017-01-01
I review the historical context for modeling skilled performance in games. Using Newell’s (1990) concept of time bands for explaining cognitive behavior, I categorize the current papers in terms of time scales, type of data, and analysis methodologies. I discuss strengths and weaknesses of these approaches for describing skill acquisition and why the study of digital games can address the challenges of replication and generalizability. Cognitive science needs to pay closer attention to population representativeness to enhance generalizability of findings, and to the social band of explanation, in order to explain why so few individuals reach expert levels of performance. PMID:28176450
Community health nursing advocacy: a concept analysis.
Ezeonwu, Mabel C
2015-01-01
The purpose of this article is to present an in-depth analysis of the concept of community health nursing (CHN) advocacy. Walker and Avant's (2010) 8-step concept analysis methodology was used. A broad inquiry into the literature between 1994 and 2014 resulted in the identification of the uses, defining attributes, empirical referents, antecedents, and consequences, as well as the articulation of an operational definition of CHN advocacy. Model and contrary cases were identified to demonstrate the concept's application and to clarify its meaning. This analysis contributes to the advancement of knowledge of CHN advocacy and provides nurse clinicians, educators, and researchers with some conceptual clarity to help improve community health outcomes.
Characterization of autoregressive processes using entropic quantifiers
NASA Astrophysics Data System (ADS)
Traversaro, Francisco; Redelico, Francisco O.
2018-01-01
The aim of the contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
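As a minimal sketch of the Bandt-Pompe ordinal-pattern entropy the abstract builds on (the embedding dimension, delay and example series are illustrative choices), the following compares white noise with a strongly autocorrelated AR(1) series.

    import numpy as np
    from math import factorial
    from collections import Counter

    def permutation_entropy(x, m=3, tau=1):
        # Normalised Bandt-Pompe permutation entropy of a 1-D series
        x = np.asarray(x, dtype=float)
        patterns = [tuple(np.argsort(x[i:i + m * tau:tau]))
                    for i in range(len(x) - (m - 1) * tau)]
        counts = np.array(list(Counter(patterns).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p)) / np.log(factorial(m))

    rng = np.random.default_rng(4)
    white = rng.standard_normal(5000)
    ar1 = np.zeros(5000)
    for t in range(1, 5000):                       # AR(1) process with strong autocorrelation
        ar1[t] = 0.9 * ar1[t - 1] + rng.standard_normal()
    print("white noise:", round(permutation_entropy(white), 3))   # close to 1
    print("AR(1), 0.9 :", round(permutation_entropy(ar1), 3))     # lower than white noise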
System Dynamics Modeling for Proactive Intelligence
2010-01-01
Contents and figures excerpt: 4. Modeling Resources as Part of an Integrated Multi-Methodology System; 5. Formalizing Pro-Active...; Observable Data With and Without Simulation Analysis; Figure 13. Summary of Probe Methodology and Results...Strategy; Figure 22. Overview of Methodology
DOT National Transportation Integrated Search
2001-03-05
A systems modeling approach is presented for assessment of harm in the automotive accident environment. The methodology is presented in general form and then applied to evaluate vehicle aggressivity in frontal crashes. The methodology consists of par...
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.