USDA-ARS's Scientific Manuscript database
Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...
Integrated Environmental Modeling: Quantitative Microbial Risk Assessment
The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...
Wu, Zujian; Pang, Wei; Coghill, George M
2015-01-01
Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can feasibly learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
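To make the quantitative stage concrete, here is a minimal Python sketch of simulated annealing applied to the kinetic rates of a toy mass-action model; the two-reaction system, rate names, and target data are invented for illustration and are not taken from the study above.

```python
# Illustrative sketch (assumed toy system, not the paper's models): simulated
# annealing of the kinetic rates of A -> B -> C fitted to target time courses.
import numpy as np
from scipy.integrate import solve_ivp

def simulate(rates, t_eval):
    k1, k2 = rates
    rhs = lambda t, y: [-k1 * y[0], k1 * y[0] - k2 * y[1], k2 * y[1]]
    return solve_ivp(rhs, (t_eval[0], t_eval[-1]), [1.0, 0.0, 0.0], t_eval=t_eval).y

def cost(rates, t_eval, target):
    return float(np.sum((simulate(rates, t_eval) - target) ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
target = simulate([0.8, 0.3], t)            # synthetic "experimental" data

current = np.array([0.1, 0.1])              # initial guess for (k1, k2)
current_cost = cost(current, t, target)
temperature = 1.0
for _ in range(2000):
    candidate = np.abs(current + rng.normal(scale=0.05, size=2))
    candidate_cost = cost(candidate, t, target)
    # Metropolis criterion: accept improvements, occasionally accept worse moves.
    if candidate_cost < current_cost or rng.random() < np.exp((current_cost - candidate_cost) / temperature):
        current, current_cost = candidate, candidate_cost
    temperature *= 0.995                    # geometric cooling schedule
print("estimated kinetic rates:", current)
```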
The new AP Physics exams: Integrating qualitative and quantitative reasoning
NASA Astrophysics Data System (ADS)
Elby, Andrew
2015-04-01
When physics instructors and education researchers emphasize the importance of integrating qualitative and quantitative reasoning in problem solving, they usually mean using those types of reasoning serially and separately: first students should analyze the physical situation qualitatively/conceptually to figure out the relevant equations, then they should process those equations quantitatively to generate a solution, and finally they should use qualitative reasoning to check that answer for plausibility (Heller, Keith, & Anderson, 1992). The new AP Physics 1 and 2 exams will, of course, reward this approach to problem solving. But one kind of free response question will demand and reward a further integration of qualitative and quantitative reasoning, namely mathematical modeling and sense-making--inventing new equations to capture a physical situation and focusing on proportionalities, inverse proportionalities, and other functional relations to infer what the equation ``says'' about the physical world. In this talk, I discuss examples of these qualitative-quantitative translation questions, highlighting how they differ from both standard quantitative and standard qualitative questions. I then discuss the kinds of modeling activities that can help AP and college students develop these skills and habits of mind.
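As an illustration of the qualitative-quantitative translation being described (my own example rather than an actual exam item), consider how proportional reasoning extracts physical meaning from the simple-pendulum relation:

```latex
% Illustrative proportionality reasoning (not an actual AP exam item):
\[
T = 2\pi\sqrt{\frac{L}{g}}
\quad\Longrightarrow\quad
T \propto \sqrt{L},
\qquad
\frac{T_2}{T_1} = \sqrt{\frac{L_2}{L_1}} .
\]
```

Reading the equation this way, quadrupling the length doubles the period, and the absence of the bob's mass from the formula says the period does not depend on it.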
The mathematics of cancer: integrating quantitative models.
Altrock, Philipp M; Liu, Lin L; Michor, Franziska
2015-12-01
Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.
The Mapping Model: A Cognitive Theory of Quantitative Estimation
ERIC Educational Resources Information Center
von Helversen, Bettina; Rieskamp, Jorg
2008-01-01
How do people make quantitative estimations, such as estimating a car's selling price? Traditionally, linear-regression-type models have been used to answer this question. These models assume that people weight and integrate all information available to estimate a criterion. The authors propose an alternative cognitive theory for quantitative…
3D Printed Organ Models with Physical Properties of Tissue and Integrated Sensors.
Qiu, Kaiyan; Zhao, Zichen; Haghiashtiani, Ghazaleh; Guo, Shuang-Zhuang; He, Mingyu; Su, Ruitao; Zhu, Zhijie; Bhuiyan, Didarul B; Murugan, Paari; Meng, Fanben; Park, Sung Hyun; Chu, Chih-Chang; Ogle, Brenda M; Saltzman, Daniel A; Konety, Badrinath R; Sweet, Robert M; McAlpine, Michael C
2018-03-01
The design and development of novel methodologies and customized materials to fabricate patient-specific 3D printed organ models with integrated sensing capabilities could yield advances in smart surgical aids for preoperative planning and rehearsal. Here, we demonstrate 3D printed prostate models with physical properties of tissue and integrated soft electronic sensors using custom-formulated polymeric inks. The models show high quantitative fidelity in static and dynamic mechanical properties, optical characteristics, and anatomical geometries to patient tissues and organs. The models offer tissue-mimicking tactile sensation and behavior and thus can be used for the prediction of organ physical behavior under deformation. The prediction results show good agreement with values obtained from simulations. The models also allow the application of surgical and diagnostic tools to their surface and inner channels. Finally, via the conformal integration of 3D printed soft electronic sensors, pressure applied to the models with surgical tools can be quantitatively measured.
Modeling of Receptor Tyrosine Kinase Signaling: Computational and Experimental Protocols.
Fey, Dirk; Aksamitiene, Edita; Kiyatkin, Anatoly; Kholodenko, Boris N
2017-01-01
The advent of systems biology has convincingly demonstrated that the integration of experiments and dynamic modelling is a powerful approach to understanding cellular network biology. Here we present experimental and computational protocols that are necessary for applying this integrative approach to the quantitative studies of receptor tyrosine kinase (RTK) signaling networks. Signaling by RTKs controls multiple cellular processes, including the regulation of cell survival, motility, proliferation, differentiation, glucose metabolism, and apoptosis. We describe methods of model building and training on experimentally obtained quantitative datasets, as well as experimental methods of obtaining quantitative dose-response and temporal dependencies of protein phosphorylation and activities. The presented methods make possible (1) the fine-grained modeling of complex signaling dynamics and the identification of salient, coarse-grained network structures (such as feedback loops) that bring about intricate dynamics, and (2) the experimental validation of dynamic models.
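As a hedged illustration of the kind of quantitative dose-response fitting such protocols rely on, the sketch below fits a Hill equation to invented phospho-protein readouts with SciPy; the data values and parameter choices are placeholders, not part of the published protocol.

```python
# Illustrative sketch (invented data): fitting a Hill-type dose-response curve
# to normalized phosphorylation readouts.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, emax, ec50, n):
    return emax * dose**n / (ec50**n + dose**n)

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])          # ligand dose (arbitrary units)
response = np.array([0.02, 0.05, 0.18, 0.45, 0.74, 0.91, 0.97])  # normalized signal

(emax, ec50, n), _ = curve_fit(hill, dose, response, p0=[1.0, 0.5, 1.0])
print(f"Emax={emax:.2f}, EC50={ec50:.2f}, Hill coefficient={n:.2f}")
```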
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites...
Integrated computational model of the bioenergetics of isolated lung mitochondria.
Zhang, Xiao; Dash, Ranjan K; Jacobs, Elizabeth R; Camara, Amadou K S; Clough, Anne V; Audi, Said H
2018-01-01
Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria.
ERIC Educational Resources Information Center
Rodriguez-Barbero, A.; Lopez-Novoa, J. M.
2008-01-01
One of the problems that we have found when teaching human physiology in a Spanish medical school is that the degree of understanding by the students of the integration between organs and systems is rather poor. We attempted to remedy this problem by using a case discussion method together with the Quantitative Circulatory Physiology (QCP)…
Integration of Social Sciences in Terrorism Modelling: Issues, Problems and Recommendations
2007-02-01
qualitative social research: empirical data, patterns, regularities and case studies; Terrorism emergence: causes...quantitative and qualitative methods in studies of terrorism, mass violence and conflicts; suggested models of human behaviour response to the threat of...epistemology of social research, demographics, quantitative sociological research, qualitative social research, cultural studies, etc.) can contribute
Chirumbolo, Antonio; Urbini, Flavio; Callea, Antonino; Lo Presti, Alessandro; Talamo, Alessandra
2017-01-01
One of the more visible effects of societal change is an increased feeling of uncertainty in the workforce. In fact, job insecurity represents a crucial occupational risk factor and a major job stressor that has negative consequences on both organizational well-being and individual health. Many studies have focused on the consequences of the fear and the perception of losing the job as a whole (called quantitative job insecurity), while more recently research has begun to examine more extensively the worries and the perceptions of losing valued job features (called qualitative job insecurity). The vast majority of the studies, however, have investigated the effects of quantitative and qualitative job insecurity separately. In this paper, we proposed the Job Insecurity Integrated Model, aimed at examining the effects of quantitative job insecurity and qualitative job insecurity on their short-term and long-term outcomes. This model was empirically tested in two independent studies, hypothesizing that qualitative job insecurity mediated the effects of quantitative job insecurity on different outcomes, such as work engagement and organizational identification (Study 1), and job satisfaction, commitment, psychological stress and turnover intention (Study 2). Study 1 was conducted on 329 employees in private firms, while Study 2 on 278 employees in both public sector and private firms. Results robustly showed that qualitative job insecurity totally mediated the effects of quantitative job insecurity on all the considered outcomes. By showing that the effects of quantitative job insecurity on its outcomes passed through qualitative job insecurity, the Job Insecurity Integrated Model contributes to clarifying previous findings in job insecurity research and puts forward a framework that could profitably produce new investigations with important theoretical and practical implications. PMID:29250013
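The mediation structure tested above (quantitative job insecurity acting through qualitative job insecurity) can be sketched with a simple regression-based product-of-coefficients illustration on simulated data; the actual studies used their own measures and estimation strategy, so this is only a schematic of the logic.

```python
# Schematic mediation sketch X -> M -> Y on simulated data (not the studies'
# actual measures or estimator).
import numpy as np

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)                                    # quantitative job insecurity
m = 0.6 * x + rng.normal(scale=0.8, size=n)               # qualitative job insecurity (mediator)
y = 0.5 * m + 0.05 * x + rng.normal(scale=0.8, size=n)    # outcome, e.g. turnover intention

def ols(design, target):
    """Least-squares coefficients for a design matrix with an added intercept."""
    X = np.column_stack([np.ones(len(target)), design])
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols(x, m)[1]                                          # path X -> M
b, c_direct = ols(np.column_stack([m, x]), y)[1:3]        # paths M -> Y and direct X -> Y
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_direct:.3f}")
```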
NASA Astrophysics Data System (ADS)
Mo, Yunjeong
The purpose of this research is to support the development of an intelligent Decision Support System (DSS) by integrating quantitative information with expert knowledge in order to facilitate effective retrofit decision-making. To achieve this goal, the Energy Retrofit Decision Process Framework is analyzed. Expert system shell software, a retrofit measure cost database, and energy simulation software are needed for developing the DSS; Exsys Corvid, the NREM database and BEopt were chosen for implementing an integration model. This integration model demonstrates the holistic function of a residential energy retrofit system for existing homes, by providing a prioritized list of retrofit measures with cost information, energy simulation and expert advice. The users, such as homeowners and energy auditors, can acquire all of the necessary retrofit information from this unified system without having to explore several separate systems. The integration model plays the role of a prototype for the finalized intelligent decision support system. It implements all of the necessary functions for the finalized DSS, including integration of the database, energy simulation and expert knowledge.
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
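Since the abstract names Dempster's rule of combination, a minimal sketch of that rule for two mass functions over a small frame of fault hypotheses may be helpful; the hypotheses and mass values below are hypothetical and unrelated to the paper's diagnostic models.

```python
# Minimal sketch of Dempster's rule of combination (hypothetical fault
# hypotheses and mass values; not the paper's diagnostic models).
def combine(m1, m2):
    """Combine two basic probability assignments keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: the mass functions cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

frame = frozenset({"sensor fault", "actuator fault", "plant fault"})
m_model1 = {frozenset({"sensor fault"}): 0.6, frame: 0.4}
m_model2 = {frozenset({"sensor fault", "actuator fault"}): 0.7, frame: 0.3}
print(combine(m_model1, m_model2))
```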
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pietzcker, Robert C.; Ueckerdt, Falko; Carrara, Samuel
Mitigation-Process Integrated Assessment Models (MP-IAMs) are used to analyze long-term transformation pathways of the energy system required to achieve stringent climate change mitigation targets. Due to their substantial temporal and spatial aggregation, IAMs cannot explicitly represent all detailed challenges of integrating the variable renewable energies (VRE) wind and solar in power systems, but rather rely on parameterized modeling approaches. In the ADVANCE project, six international modeling teams have developed new approaches to improve the representation of power sector dynamics and VRE integration in IAMs. In this study, we qualitatively and quantitatively evaluate the last years' modeling progress and study the impact of VRE integration modeling on VRE deployment in IAM scenarios. For a comprehensive and transparent qualitative evaluation, we first develop a framework of 18 features of power sector dynamics and VRE integration. We then apply this framework to the newly-developed modeling approaches to derive a detailed map of strengths and limitations of the different approaches. For the quantitative evaluation, we compare the IAMs to the detailed hourly-resolution power sector model REMIX. We find that the new modeling approaches manage to represent a large number of features of the power sector, and the numerical results are in reasonable agreement with those derived from the detailed power sector model. Updating the power sector representation and the cost and resources of wind and solar substantially increased wind and solar shares across models: under a carbon price of $30/tCO2 in 2020 (increasing by 5% per year), the model-average cost-minimizing VRE share over the period 2050-2100 is 62% of electricity generation, 24 percentage points higher than with the old model version.
An overview of quantitative approaches in Gestalt perception.
Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H
2016-09-01
Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Knight Commission on Intercollegiate Athletics, 2009
2009-01-01
The Knight Commission's landmark 1991 report, "Keeping Faith with the Student-Athlete: A New Model for Intercollegiate Athletics," proposed a new "one-plus-three" model for intercollegiate athletics--presidential control directed toward academic integrity, fiscal integrity, and an independent certification process to verify that integrity. Indeed,…
A quantitative visual dashboard to explore exposures to consumer product ingredients
The Exposure Prioritization (Ex Priori) model features a simplified, quantitative visual dashboard to explore exposures across chemical space. Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “pop...
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Kopasakis, George
2010-01-01
This paper covers the propulsion system component modeling and controls development of an integrated mixed compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.
NASA Astrophysics Data System (ADS)
Wang, Q. J.; Robertson, D. E.; Haines, C. L.
2009-02-01
Irrigation is important to many agricultural businesses but also has implications for catchment health. A considerable body of knowledge exists on how irrigation management affects farm business and catchment health. However, this knowledge is fragmentary; is available in many forms such as qualitative and quantitative; is dispersed in scientific literature, technical reports, and the minds of individuals; and is of varying degrees of certainty. Bayesian networks allow the integration of dispersed knowledge into quantitative systems models. This study describes the development, validation, and application of a Bayesian network model of farm irrigation in the Shepparton Irrigation Region of northern Victoria, Australia. In this first paper we describe the process used to integrate a range of sources of knowledge to develop a model of farm irrigation. We describe the principal model components and summarize the reaction to the model and its development process by local stakeholders. Subsequent papers in this series describe model validation and the application of the model to assess the regional impact of historical and future management intervention.
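To show the kind of quantitative object a Bayesian network is, here is a toy discrete network queried by enumeration; the variables, states, and probabilities are invented and are not those of the Shepparton farm irrigation model.

```python
# Toy Bayesian network sketch (invented variables and probabilities, not the
# Shepparton model): Irrigation -> Drainage -> WaterTable.
p_irrigation = {"high": 0.3, "low": 0.7}
p_drainage = {                      # P(Drainage | Irrigation)
    "high": {"high": 0.8, "low": 0.2},
    "low": {"high": 0.2, "low": 0.8},
}
p_rising_given_drainage = {"high": 0.7, "low": 0.1}   # P(WaterTable = rising | Drainage)

# Marginal P(WaterTable = rising), summing over the parent variables.
p_rising = sum(p_irrigation[i] * p_drainage[i][d] * p_rising_given_drainage[d]
               for i in p_irrigation for d in ("high", "low"))

# Posterior P(Irrigation = high | WaterTable = rising) via Bayes' rule.
joint = sum(p_irrigation["high"] * p_drainage["high"][d] * p_rising_given_drainage[d]
            for d in ("high", "low"))
print(f"P(rising) = {p_rising:.3f}, P(irrigation high | rising) = {joint / p_rising:.3f}")
```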
Designing automation for human use: empirical studies and quantitative models.
Parasuraman, R
2000-07-01
An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.
ERIC Educational Resources Information Center
van Zomeren, Martijn; Postmes, Tom; Spears, Russell
2008-01-01
An integrative social identity model of collective action (SIMCA) is developed that incorporates 3 socio-psychological perspectives on collective action. Three meta-analyses synthesized a total of 182 effects of perceived injustice, efficacy, and identity on collective action (corresponding to these socio-psychological perspectives). Results…
Planner-Based Control of Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott
2005-01-01
The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This will allow for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.
Integrated urban systems model with multiple transportation supply agents.
DOT National Transportation Integrated Search
2012-10-01
This project demonstrates the feasibility of developing quantitative models that can forecast future networks under current and alternative transportation planning processes. The current transportation planning process is modeled based on empiric...
"Could I return to my life?" Integrated Narrative Nursing Model in Education (INNE).
Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sulla, Francesco; Sollami, Alfonso; Taffurelli, Chiara
2018-03-28
The Integrated Narrative Nursing Model (INNM) is an approach that integrates the qualitative methodology typical of the human sciences with the quantitative methodology more often associated with the natural sciences. This complex model, which combines a focus on narrative with quantitative measures, has recently been effectively applied to the assessment of chronic patients. In this study, the model is applied to the planning phase of education (Integrated Narrative Nursing Education, INNE), and proves to be a valid instrument for the promotion of the current educational paradigm that is centered on the engagement of both the patient and the caregiver in their own path of care. The aim of this study is therefore to describe the nurse's strategy in the planning of an educational intervention by using the INNE model. The case of a 70-year-old woman with pulmonary neoplasm is described at her first admission to hospice. Each step conducted by the reference nurse, who uses INNE to record the nurse-patient narrative and collect subsequent questionnaires in order to create a shared educational plan, is also described. The information collected was subjected, using a grounded methodology, to the following four levels of analysis: I. Needs Assessment, II. Narrative Diagnosis, III. Quantitative Outcome, IV. Integrated Outcome. Step IV, which is derived from the integration of all levels of analysis, allows a nurse to define, even graphically, the conceptual map of a patient's needs, resources and perspectives in a completely tailored manner. The INNE model offers valid methodological support for the professional who intends to educate the patient through an inter-subjective and engaged pathway between the professional, their patient and the socio-relational context. It is a matter of adopting a complex vision that combines processes and methods requiring a solid scientific basis and advanced methodological expertise with active listening and empathy - skills which require emotional intelligence.
Behavioral Assembly Required: Particularly for Quantitative Courses
ERIC Educational Resources Information Center
Mazen, Abdelmagid
2008-01-01
This article integrates behavioral approaches into the teaching and learning of quantitative subjects with application to statistics. Focusing on the emotional component of learning, the article presents a system dynamic model that provides descriptive and prescriptive accounts of learners' anxiety. Metaphors and the metaphorizing process are…
Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...
USDA-ARS's Scientific Manuscript database
Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...
Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff
2016-01-01
Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161
ERIC Educational Resources Information Center
Sun, Yan; Strobel, Johannes; Newby, Timothy J.
2017-01-01
Adopting a two-phase explanatory sequential mixed methods research design, the current study examined the impact of student teaching experiences on pre-service teachers' readiness for technology integration. In phase-1 of quantitative investigation, 2-level growth curve models were fitted using online repeated measures survey data collected from…
Source-to-Outcome Microbial Exposure and Risk Modeling Framework
A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...
ERIC Educational Resources Information Center
Owens, Susan T.
2017-01-01
Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistic students comparing Educational Testing Service (ETS) College Board AP Statistic examination scores…
Quantitative 13C NMR characterization of fast pyrolysis oils
Happs, Renee M.; Lisa, Kristina; Ferrell, III, Jack R.
2016-10-20
Quantitative 13C NMR analysis of model catalytic fast pyrolysis (CFP) oils following literature procedures showed poor agreement for aromatic hydrocarbons between NMR measured concentrations and actual composition. Furthermore, modifying integration regions based on DEPT analysis for aromatic carbons resulted in better agreement. Solvent effects were also investigated for hydrotreated CFP oil.
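For orientation, quantitative 13C NMR composition estimates generally rest on carbon-normalized integrals; the relation below is the generic quantitation formula, not one quoted from the paper.

```latex
% Generic quantitative-NMR relation (illustrative; I_i is the integrated area
% assigned to component i and n_i the number of carbons producing that signal):
\[
x_i \;=\; \frac{I_i / n_i}{\sum_j I_j / n_j}
\]
```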
ERIC Educational Resources Information Center
Brown, Aaron D.
2016-01-01
The intent of this research is to offer a quantitative analysis of self-determined faculty motivation within the current corporate model of higher education across public and private research universities. With such a heightened integration of accountability structures, external reward systems, and the ongoing drive for more money and…
Do, Jun-Hyeong; Jang, Eunsu; Ku, Boncho; Jang, Jun-Su; Kim, Honggie; Kim, Jong Yeol
2012-07-04
Sasang constitutional medicine (SCM) is a unique form of traditional Korean medicine that divides human beings into four constitutional types (Tae-Yang: TY, Tae-Eum: TE, So-Yang: SY, and So-Eum: SE), which differ in inherited characteristics, such as external appearance, personality traits, susceptibility to particular diseases, drug responses, and equilibrium among internal organ functions. According to SCM, herbs that belong to a certain constitution cannot be used in patients with other constitutions; otherwise, this practice may result in no effect or in an adverse effect. Thus, the diagnosis of SC type is the most crucial step in SCM practice. The diagnosis, however, tends to be subjective due to a lack of quantitative standards for SC diagnosis. We have attempted to make the diagnosis method as objective as possible by basing it on an analysis of quantitative data from various Oriental medical clinics. Four individual diagnostic models were developed with multinomial logistic regression based on face, body shape, voice, and questionnaire responses. Inspired by SCM practitioners' holistic diagnostic processes, an integrated diagnostic model was then proposed by combining the four individual models. The diagnostic accuracies in the test set, after the four individual models had been integrated into a single model, improved to 64.0% and 55.2% in the male and female patient groups, respectively. Using a cut-off value for the integrated SC score, such as 1.6, the accuracies increased by 14.7% in male patients and by 4.6% in female patients, which showed that a higher integrated SC score corresponded to a higher diagnostic accuracy. This study represents the first trial of integrating the objectification of SC diagnosis based on quantitative data and SCM practitioners' holistic diagnostic processes. Although the diagnostic accuracy was not great, it is noted that the proposed diagnostic model represents common rules among practitioners who have various points of view. Our results are expected to contribute as a desirable research guide for objective diagnosis in traditional medicine, as well as to contribute to the precise diagnosis of SC types in an objective manner in clinical practice.
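A minimal sketch of the fusion idea (four multinomial logistic models, one per modality, whose class probabilities are averaged into a single constitution call) is given below with random stand-in features; the paper's integrated SC score and cut-off rule are more elaborate and are not reproduced here.

```python
# Minimal sketch of fusing four per-modality multinomial logistic models into
# one constitution call (random stand-in features; not the paper's score/cut-off).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_subjects, n_types = 200, 4                       # four SC types: TY, TE, SY, SE
labels = rng.integers(0, n_types, size=n_subjects)
modalities = {name: rng.normal(size=(n_subjects, 5)) + 0.3 * labels[:, None]
              for name in ("face", "body", "voice", "questionnaire")}

models = {name: LogisticRegression(max_iter=1000).fit(X, labels)
          for name, X in modalities.items()}

# Average the per-modality class probabilities and pick the most probable type.
probs = np.mean([models[name].predict_proba(modalities[name]) for name in modalities],
                axis=0)
print("fused training accuracy:", float((probs.argmax(axis=1) == labels).mean()))
```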
Multidisciplinary, interdisciplinary, or dysfunctional? Team working in mixed-methods research.
O'Cathain, Alicia; Murphy, Elizabeth; Nicholl, Jon
2008-11-01
Combining qualitative and quantitative methods in a single study-otherwise known as mixed-methods research-is common. In health research these projects can be delivered by research teams. A typical scenario, for example, involves medical sociologists delivering qualitative components and researchers from medicine or health economics delivering quantitative components. We undertook semistructured interviews with 20 researchers who had worked on mixed-methods studies in health services research to explore the facilitators of and barriers to exploiting the potential of this approach. Team working emerged as a key issue, with three models of team working apparent: multidisciplinary, interdisciplinary, and dysfunctional. Interdisciplinary research was associated with integration of data or findings from the qualitative and quantitative components in both the final reports and the peer-reviewed publications. Methodological respect between team members and a principal investigator who valued integration emerged as essential to achieving integrated research outcomes.
Competitive forces in the medical group industry: a stakeholder perspective.
Blair, J D; Buesseler, J A
1998-01-01
Applying Porter's model of competitive forces to health care, stakeholder concepts are integrated to analyze the future of medical groups. Using both quantitative survey and qualitative observational data, competitors, physicians as suppliers, integrated systems as new entrants, patients and managed care as buyers, and hospitals as substitutes are examined.
Using an Integrated, Multi-disciplinary Framework to Support Quantitative Microbial Risk Assessments
The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) provides the infrastructure to link disparate models and databases seamlessly, giving an assessor the ability to construct an appropriate conceptual site model from a host of modeling choices, so a numbe...
Quantitative systems toxicology
Bloomingdale, Peter; Housand, Conrad; Apgar, Joshua F.; Millard, Bjorn L.; Mager, Donald E.; Burke, John M.; Shah, Dhaval K.
2017-01-01
The overarching goal of modern drug development is to optimize therapeutic benefits while minimizing adverse effects. However, inadequate efficacy and safety concerns remain the major causes of drug attrition in clinical development. For the past 80 years, toxicity testing has consisted of evaluating the adverse effects of drugs in animals to predict human health risks. The U.S. Environmental Protection Agency recognized the need to develop innovative toxicity testing strategies and asked the National Research Council to develop a long-range vision and strategy for toxicity testing in the 21st century. The vision aims to reduce the use of animals and drug development costs through the integration of computational modeling and in vitro experimental methods that evaluate the perturbation of toxicity-related pathways. Towards this vision, collaborative quantitative systems pharmacology and toxicology modeling endeavors (QSP/QST) have been initiated amongst numerous organizations worldwide. In this article, we discuss how quantitative structure-activity relationship (QSAR), network-based, and pharmacokinetic/pharmacodynamic modeling approaches can be integrated into the framework of QST models. Additionally, we review the application of QST models to predict cardiotoxicity and hepatotoxicity of drugs throughout their development. Cell- and organ-specific QST models are likely to become an essential component of modern toxicity testing, and provide a solid foundation for determining individualized therapeutic windows to improve patient safety. PMID:29308440
USDA-ARS's Scientific Manuscript database
Multi-locus genome-wide association studies have become the state-of-the-art procedure for identifying quantitative trait loci (QTL) associated with traits simultaneously. However, implementation of multi-locus models is still difficult. In this study, we integrated least angle regression with empirical B...
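As a hedged sketch of the regression step named above (least angle regression only, on simulated genotypes; the study's empirical Bayes stage and real data are not reproduced):

```python
# Illustrative sketch: least angle regression on simulated 0/1/2 genotype data
# (the empirical Bayes step and the actual GWAS data are not reproduced).
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(42)
n_individuals, n_markers = 200, 500
genotypes = rng.integers(0, 3, size=(n_individuals, n_markers)).astype(float)
true_effects = np.zeros(n_markers)
true_effects[[10, 50, 200]] = [0.8, -0.6, 0.5]            # three causal markers
phenotype = genotypes @ true_effects + rng.normal(scale=1.0, size=n_individuals)

model = Lars(n_nonzero_coefs=10).fit(genotypes, phenotype)
print("markers selected by LARS:", np.flatnonzero(model.coef_))
```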
Calibration of a COTS Integration Cost Model Using Local Project Data
NASA Technical Reports Server (NTRS)
Boland, Dillard; Coon, Richard; Byers, Kathryn; Levitt, David
1997-01-01
The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Spaceflight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
George A. Beitel
2004-02-01
In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.
A collaborative molecular modeling environment using a virtual tunneling service.
Lee, Jun; Kim, Jee-In; Kang, Lin-Woo
2012-01-01
Collaborative research on three-dimensional molecular modeling can be limited by different time zones and locations. A networked virtual environment can be utilized to overcome the problem caused by the temporal and spatial differences. However, traditional approaches did not sufficiently consider integration of different computing environments, which were characterized by types of applications, roles of users, and so on. We propose a collaborative molecular modeling environment to integrate different molecule modeling systems using a virtual tunneling service. We integrated Co-Coot, which is a collaborative crystallographic object-oriented toolkit, with VRMMS, which is a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results through pilot experiments.
AI/OR computational model for integrating qualitative and quantitative design methods
NASA Technical Reports Server (NTRS)
Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor
1990-01-01
A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.
Calabrese, Edward J
2013-11-01
The most common quantitative feature of the hormetic-biphasic dose response is its modest stimulatory response, which at maximum is only 30-60% greater than control values, an observation that is consistently independent of biological model, level of organization (i.e., cell, organ or individual), endpoint measured, chemical/physical agent studied, or mechanism. This quantitative feature suggests an underlying "upstream" mechanism that is common across biological systems and therefore basic and general. Hormetic dose response relationships represent an estimate of the peak performance of integrative biological processes that are allometrically based. Hormetic responses reflect either direct stimulatory responses or overcompensation responses to damage induced by relatively low doses of chemical or physical agents. The integration of the hormetic dose response within an allometric framework provides, for the first time, an explanation for both the generality and the quantitative features of the hormetic dose response. Copyright © 2013 Elsevier Ltd. All rights reserved.
Quantitative reconstructions in multi-modal photoacoustic and optical coherence tomography imaging
NASA Astrophysics Data System (ADS)
Elbau, P.; Mindrinos, L.; Scherzer, O.
2018-01-01
In this paper we perform quantitative reconstruction of the electric susceptibility and the Grüneisen parameter of a non-magnetic linear dielectric medium using measurements from a multi-modal photoacoustic and optical coherence tomography system. We consider the mathematical model presented in Elbau et al (2015 Handbook of Mathematical Methods in Imaging ed O Scherzer (New York: Springer) pp 1169-204), where a Fredholm integral equation of the first kind for the Grüneisen parameter was derived. For the numerical solution of the integral equation we consider a Galerkin-type method.
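For readers unfamiliar with the object mentioned above, a first-kind Fredholm equation for an unknown function such as the Grüneisen parameter has the generic form below; this is the textbook form, not the specific kernel derived in the cited work.

```latex
% Generic Fredholm integral equation of the first kind (illustrative form only):
\[
\int_{\Omega} K(x, y)\,\gamma(y)\,\mathrm{d}y \;=\; f(x), \qquad x \in \Omega',
\]
% with K a known kernel from the forward model and f obtained from the data;
% discretization (e.g. by a Galerkin method) yields an ill-conditioned linear
% system that typically requires regularization.
```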
Integration of Environmental Education and Environmental Law Enforcement for Police Officers
ERIC Educational Resources Information Center
Bovornkijprasert, Sravoot; Rawang, Wee
2016-01-01
The purpose of this research was to establish an integrated model of environmental education (EE) and environmental law enforcement (ELE) to improve the efficiency of functional competency for police officers in Bangkok Metropolitan Police Division 9 (MBP Div. 9). The research design was mixed methods of quantitative and qualitative approaches…
Physiologically based pharmacokinetic (PBPK) modeling considering methylated trivalent arsenicals
PBPK modeling provides a quantitative biologically-based framework to integrate diverse types of information for application to risk analysis. For example, genetic polymorphisms in arsenic metabolizing enzymes (AS3MT) can lead to differences in target tissue dosimetry for key tri...
ERIC Educational Resources Information Center
Brückner, Sebastian; Pellegrino, James W.
2016-01-01
The Standards for Educational and Psychological Testing indicate that validation of assessments should include analyses of participants' response processes. However, such analyses typically are conducted only to supplement quantitative field studies with qualitative data, and seldom are such data connected to quantitative data on student or item…
Pargett, Michael; Umulis, David M
2013-07-15
Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
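One concrete instance of the optimal-scaling idea mentioned above is comparing relative (arbitrary-units) measurements to a model prediction through the analytic least-squares scale factor; the numbers in this sketch are made up.

```python
# Minimal sketch of optimal scaling: a relative data set is matched to a model
# prediction by the analytic least-squares scale factor (made-up numbers).
import numpy as np

model_prediction = np.array([0.2, 0.5, 1.0, 1.6, 2.1])          # model output, absolute units
relative_data = np.array([110.0, 260.0, 540.0, 830.0, 1080.0])  # e.g. fluorescence, a.u.

# s* minimizing sum((s*model - data)^2) is (model . data) / (model . model).
s = np.dot(model_prediction, relative_data) / np.dot(model_prediction, model_prediction)
residuals = s * model_prediction - relative_data
print(f"optimal scale factor: {s:.1f}, residual sum of squares: {np.sum(residuals**2):.1f}")
```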
DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1989-01-01
This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.
A Collaborative Molecular Modeling Environment Using a Virtual Tunneling Service
Lee, Jun; Kim, Jee-In; Kang, Lin-Woo
2012-01-01
Collaborative research on three-dimensional molecular modeling can be limited by differences in time zones and locations. A networked virtual environment can be used to overcome the problems caused by these temporal and spatial differences. However, traditional approaches did not sufficiently consider the integration of different computing environments, which are characterized by the types of applications, the roles of users, and so on. We propose a collaborative molecular modeling environment that integrates different molecular modeling systems using a virtual tunneling service. We integrated Co-Coot, a collaborative crystallographic object-oriented toolkit, with VRMMS, a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results in pilot experiments. PMID:22927721
NASA Astrophysics Data System (ADS)
Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.
2015-12-01
A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.
NASA Astrophysics Data System (ADS)
Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila
2016-04-01
A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.
Technical manual for basic version of the Markov chain nest productivity model (MCnest)
The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...
User’s manual for basic version of MCnest Markov chain nest productivity model
The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...
USDA-ARS?s Scientific Manuscript database
Agricultural research increasingly is expected to provide precise, quantitative information with an explicit geographic coverage. Limited availability of continuous daily meteorological records often constrains efforts to provide such information through integrated use of simulation models, spatial ...
Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic met...
Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.
2017-01-01
Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.
Unlocking the relationship of biotic integrity of impaired waters to anthropogenic stresses.
Novotny, Vladimir; Bartosová, Alena; O'Reilly, Neal; Ehlinger, Timothy
2005-01-01
The Clean Water Act expressed its goals in terms of restoring and preserving the physical, chemical and biological integrity of the Nation's waters. Integrity has been defined as the ability of the water body's ecological system to support and maintain a balanced, integrated, adaptive community of organisms comparable to that of a natural biota of the region. Several indices of biotic integrity (IBIs) have been developed to measure quantitatively the biotic composition and, hence, the integrity. Integrity can be impaired by discharges of pollutants from point and nonpoint sources and by other pollution related to watershed/landscape and channel stresses, including channel and riparian zone modifications and habitat impairment. Various models that link the stressors to the biotic assessment endpoints, i.e., the IBIs, have been presented and discussed. Simple models that link IBIs directly to single or multiple surrogate stressors such as percent imperviousness are inadequate because they may not represent a true cause-effect proximate relationship. Furthermore, some surrogate landscape parameters are irreversible and the relationships cannot be used for development of plans for restoration of the water body integrity. A concept of a layered hierarchical model that will link the watershed, landscape and stream morphology pollution stressors to the biotic assessment endpoints (IBIs) is described. The key groups of structural components of the model are: IBIs and their metrics in the top layer, chemical water and sediment risks and a habitat quality index in the layer below, in-stream concentrations in water and sediments and channel/habitat impairment parameters in the third layer, and watershed/landscape pollution-generating stressors, land use change rates, and hydrology in the lowest layer of stressors. A modified and expanded Maximum Species Richness concept is developed and used to reveal quantitatively the functional relationships between the top two layers of the structural components and parameters of the model.
Linking short-term responses to ecologically-relevant outcomes
Opportunity to participate in the conduct of collaborative integrative lab, field and modelling efforts to characterize molecular-to-organismal level responses and make quantitative testable predictions of population level outcomes
The Integration of Evaluation Paradigms Through Metaphor.
ERIC Educational Resources Information Center
Felker, Roberta M.
The point of view is presented that evaluation projects can be enriched by not using either an exclusively quantitative model or an exclusively qualitative model but by combining both models in one project. The concept of metaphor is used to clarify the usefulness of the combination. Iconic or holistic metaphors describe an object or event as…
ERIC Educational Resources Information Center
Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas
2004-01-01
The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…
Empirical methods for modeling landscape change, ecosystem services, and biodiversity
David Lewis; Ralph Alig
2009-01-01
The purpose of this paper is to synthesize recent economics research aimed at integrating discrete-choice econometric models of land-use change with spatially-explicit landscape simulations and quantitative ecology. This research explicitly models changes in the spatial pattern of landscapes in two steps: 1) econometric estimation of parcel-scale transition...
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will be different depending on both the location of the site and the type of landslide considered. Indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method is developed and integrated in a program named ALICE. This program integrates mechanical stability analysis within GIS software, taking data uncertainty into account. This method proposes a quantitative classification of landslide hazard and offers a useful tool to save time and improve efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps. Indeed, it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of geological formations or the effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones according to geotechnical properties and hydrological contexts varying in time. This communication, carried out within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
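The abstract does not spell out the stability formulation used inside ALICE, so the following is only a generic illustration of "mechanical stability analysis taking data uncertainty into account": an infinite-slope factor of safety evaluated by Monte Carlo sampling of uncertain geotechnical and hydrological inputs, with the failure probability serving as a crude hazard indicator. The infinite-slope assumption and all parameter values are mine, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs for one map cell (illustrative ranges only).
c = rng.normal(8.0, 2.0, n)                    # effective cohesion, kPa
phi = np.radians(rng.normal(30.0, 3.0, n))     # friction angle
gamma, gamma_w = 19.0, 9.81                    # unit weights, kN/m^3
z = 2.0                                        # failure-surface depth, m
beta = np.radians(25.0)                        # slope angle
m = rng.uniform(0.2, 1.0, n)                   # water-table ratio (hydrological context)

# Infinite-slope factor of safety with a slope-parallel water table.
fos = (c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)) / (
    gamma * z * np.sin(beta) * np.cos(beta))

p_failure = np.mean(fos < 1.0)
print(f"P(FoS < 1) = {p_failure:.2%}")         # could be binned into hazard classes
```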
Systems Toxicology: From Basic Research to Risk Assessment
2014-01-01
Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777
Systems toxicology: from basic research to risk assessment.
Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C
2014-03-17
Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.
ERIC Educational Resources Information Center
Schuchardt, Anita M.; Schunn, Christian D.
2016-01-01
Amid calls for integrating science, technology, engineering, and mathematics (iSTEM) in K-12 education, there is a pressing need to uncover productive methods of integration. Prior research has shown that increasing contextual linkages between science and mathematics is associated with student problem solving and conceptual understanding. However,…
ERIC Educational Resources Information Center
Ho, Hsuan-Fu; Hung, Chia-Chi
2008-01-01
Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
From themes to hypotheses: following up with quantitative methods.
Morgan, David L
2015-06-01
One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.
Miyamoto, Tadayoshi; Manabe, Kou; Ueda, Shinya; Nakahara, Hidehiro
2018-05-01
What is the central question of this study? The lack of useful small-animal models for studying exercise hyperpnoea makes it difficult to investigate the underlying mechanisms of exercise-induced ventilatory abnormalities in various disease states. What is the main finding and its importance? We developed an anaesthetized-rat model for studying exercise hyperpnoea, using a respiratory equilibrium diagram for quantitative characterization of the respiratory chemoreflex feedback system. This experimental model will provide an opportunity to clarify the major determinant mechanisms of exercise hyperpnoea, and will be useful for understanding the mechanisms responsible for abnormal ventilatory responses to exercise in disease models. Exercise-induced ventilatory abnormalities in various disease states seem to arise from pathological changes of respiratory regulation. Although experimental studies in small animals are essential to investigate the pathophysiological basis of various disease models, the lack of an integrated framework for quantitatively characterizing respiratory regulation during exercise prevents us from resolving these problems. The purpose of this study was to develop an anaesthetized-rat model for studying exercise hyperpnoea for quantitative characterization of the respiratory chemoreflex feedback system. In 24 anaesthetized rats, we induced muscle contraction by stimulating bilateral distal sciatic nerves at low and high voltage to mimic exercise. We recorded breath-by-breath respiratory gas analysis data and cardiorespiratory responses while running two protocols to characterize the controller and plant of the respiratory chemoreflex. The controller was characterized by determining the linear relationship between end-tidal CO2 pressure (PETCO2) and minute ventilation (V̇E), and the plant by the hyperbolic relationship between V̇E and PETCO2. During exercise, the controller curve shifted upward without change in controller gain, accompanying increased oxygen uptake. The hyperbolic plant curve shifted rightward and downward depending on exercise intensity as predicted by increased metabolism. Exercise intensity-dependent changes in operating points (V̇E and PETCO2) were estimated by integrating the controller and plant curves in a respiratory equilibrium diagram. In conclusion, we developed an anaesthetized-rat model for studying exercise hyperpnoea, using systems analysis for quantitative characterization of the respiratory system. This novel experimental model will be useful for understanding the mechanisms responsible for abnormal ventilatory responses to exercise in disease models. © 2018 Morinomiya University of Medical Sciences. Experimental Physiology © 2018 The Physiological Society.
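The operating-point construction described here is straightforward to reproduce numerically: the linear controller and the hyperbolic metabolic plant are intersected, and shifting either curve moves the operating point. The sketch below uses illustrative human-scale parameter values and a simple additive exercise drive, not the rat data or fitted parameters from this study.

```python
import numpy as np

# Respiratory equilibrium diagram (simplified sketch):
#   controller (chemoreflex): VE = S * (PETCO2 - B) + drive
#   plant (metabolic hyperbola): PETCO2 = K * VCO2 / VE
# The operating point is their intersection; during mimicked exercise the
# controller shifts upward (extra drive) and the plant shifts with metabolism.
def operating_point(S, B, drive, K, vco2):
    # Substitute the controller into the plant and solve the quadratic in PETCO2.
    a, b, c = S, drive - S * B, -K * vco2
    P = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return P, S * (P - B) + drive

S, B, K = 2.0, 35.0, 863.0       # illustrative values: L/min/mmHg, mmHg, mmHg*L/min
for drive, vco2, label in [(0.0, 0.25, "rest"), (10.0, 0.75, "exercise")]:
    P, VE = operating_point(S, B, drive, K, vco2)
    print(f"{label:9s}: PETCO2 = {P:4.1f} mmHg, VE = {VE:4.1f} L/min")
```

With these placeholder numbers the operating point shows the expected pattern: ventilation rises severalfold during the mimicked exercise while PETCO2 stays nearly constant.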
van den Berg, Ronald; Roerdink, Jos B T M; Cornelissen, Frans W
2010-01-22
An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called "crowding". Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, "compulsory averaging", and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality.
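As a loose illustration of the population-coding idea (not the authors' actual model), the toy sketch below pools orientation-tuned responses to a target and a flanker over an integration field and decodes the result with a population vector; as the flanker's weight grows, the decoded orientation is pulled toward the average of the two, the "compulsory averaging" signature mentioned in the abstract. Tuning widths, stimulus orientations, and weights are invented for the example.

```python
import numpy as np

# Minimal population-coding sketch of "compulsory averaging" in crowding.
prefs = np.linspace(-90, 90, 181)                 # preferred orientations (deg)

def tuning(theta, width=20.0):
    # Gaussian orientation tuning curves across the population.
    return np.exp(-0.5 * ((prefs - theta) / width) ** 2)

def decode(pop):
    # Population-vector decoding on the 180-degree orientation circle.
    ang = np.deg2rad(2 * prefs)
    return 0.5 * np.rad2deg(np.arctan2((pop * np.sin(ang)).sum(),
                                       (pop * np.cos(ang)).sum()))

target, flanker = -10.0, 30.0
for w_flanker in (0.0, 0.5, 1.0):                 # how much of the flanker falls
    pop = tuning(target) + w_flanker * tuning(flanker)   # inside the integration field
    print(f"flanker weight {w_flanker:.1f}: decoded orientation = {decode(pop):6.1f} deg")
```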
Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic me...
Pathways to Mathematics: Longitudinal Predictors of Performance
ERIC Educational Resources Information Center
LeFevre, Jo-Anne; Fast, Lisa; Skwarchuk, Sheri-Lynn; Smith-Chant, Brenda L.; Bisanz, Jeffrey; Kamawar, Deepthi; Penner-Wilger, Marcie
2010-01-01
A model of the relations among cognitive precursors, early numeracy skill, and mathematical outcomes was tested for 182 children from 4.5 to 7.5 years of age. The model integrates research from neuroimaging, clinical populations, and normal development in children and adults. It includes 3 precursor pathways: quantitative, linguistic, and spatial…
A Holistic Approach to Evaluating Vocational Education: Traditional Chinese Physicians (TCP) Model.
ERIC Educational Resources Information Center
Lee, Lung-Sheng; Chang, Liang-Te
Conventional approaches to evaluating vocational education have often been criticized for failing to deal holistically with the institution or program being evaluated. Integrated quantitative and qualitative evaluation methods have documented benefits; therefore, it would be useful to consider possibility of developing a model for evaluating…
Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero
2011-03-24
High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we here carried out a systematic comparative evaluation of these quantitative maps of genetic interactions in yeast. The relatively low agreement among the original interaction measurements, or their customized scores, could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
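The published matrix approximation procedure is not reproduced here, but the underlying idea, separating a double-mutant fitness matrix into a smooth "no interaction" background plus interaction residuals, can be sketched with a plain rank-1 SVD on synthetic data. The multiplicative-background assumption, matrix sizes, and noise levels are my own choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_query, n_array = 60, 80

# Synthetic single-mutant fitnesses and a sparse set of genetic interactions.
f_q = rng.uniform(0.6, 1.0, n_query)
f_a = rng.uniform(0.6, 1.0, n_array)
eps_true = np.zeros((n_query, n_array))
idx = rng.random(eps_true.shape) < 0.03
eps_true[idx] = rng.normal(0.0, 0.15, idx.sum())

# Observed double-mutant fitness: multiplicative expectation + interaction + noise.
W = np.outer(f_q, f_a) + eps_true + rng.normal(0, 0.05, eps_true.shape)

# A rank-1 approximation of W recovers the "no interaction" background;
# the residuals serve as interaction scores (epsilon).
U, s, Vt = np.linalg.svd(W, full_matrices=False)
background = s[0] * np.outer(U[:, 0], Vt[0])
eps_hat = W - background

true_hits = np.abs(eps_true) > 0.1
print("mean |score| at true interactions:", round(float(np.abs(eps_hat[true_hits]).mean()), 3))
print("mean |score| elsewhere:           ", round(float(np.abs(eps_hat[~true_hits]).mean()), 3))
```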
ERIC Educational Resources Information Center
Gülpinar, Mehmet Ali; Isoglu-Alkaç, Ümmühan; Yegen, Berrak Çaglayan
2015-01-01
Recently, integrated and contextual learning models such as problem-based learning (PBL) and brain/mind learning (BML) have become prominent. The present study aimed to develop and evaluate a PBL program enriched with BML principles. In this study, participants were 295 first-year medical students. The study used both quantitative and qualitative…
ERIC Educational Resources Information Center
Noble, Dorottya B.; Mochrie, Simon G. J.; O'Hern, Corey S.; Pollard, Thomas D.; Regan, Lynne
2016-01-01
In 2008, we established the Integrated Graduate Program in Physical and Engineering Biology (IGPPEB) at Yale University. Our goal was to create a comprehensive graduate program to train a new generation of scientists who possess a sophisticated understanding of biology and who are capable of applying physical and quantitative methodologies to…
The integrated effect of moderate exercise on coronary heart disease.
Mathews, Marc J; Mathews, Edward H; Mathews, George E
Moderate exercise is associated with a lower risk for coronary heart disease (CHD). A suitable integrated model of the CHD pathogenetic pathways relevant to moderate exercise may help to elucidate this association. Such a model is currently not available in the literature. An integrated model of CHD was developed and used to investigate pathogenetic pathways of importance between exercise and CHD. Using biomarker relative-risk data, the pathogenetic effects are representable as measurable effects based on changes in biomarkers. The integrated model provides insight into higher-order interactions underlying the associations between CHD and moderate exercise. A novel 'connection graph' was developed, which simplifies these interactions. It quantitatively illustrates the relationship between moderate exercise and various serological biomarkers of CHD. The connection graph of moderate exercise elucidates all the possible integrated actions through which risk reduction may occur. An integrated model of CHD provides a summary of the effects of moderate exercise on CHD. It also shows the importance of each CHD pathway that moderate exercise influences. The CHD risk-reducing effects of exercise appear to be primarily driven by decreased inflammation and altered metabolism.
Wires in the soup: quantitative models of cell signaling
Cheong, Raymond; Levchenko, Andre
2014-01-01
Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, necessitating for their unraveling by sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655
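At its core, the NF-κB system highlighted here is a negative feedback loop: nuclear NF-κB induces its inhibitor IκB, which removes it from the nucleus. A deliberately tiny two-variable caricature already produces the overshoot and damped oscillation that the detailed published models capture quantitatively; the equations and parameter values below are illustrative only, not those of any published NF-κB model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-variable caricature of NF-kB/IkB negative feedback (illustrative only):
#   N: nuclear NF-kB fraction, I: IkB level.
# Free NF-kB enters the nucleus; nuclear NF-kB drives IkB synthesis, and
# IkB in turn removes NF-kB from the nucleus.
def rhs(t, y, k_in=1.0, k_out=2.0, k_syn=1.5, k_deg=1.0):
    N, I = y
    dN = k_in * (1.0 - N) - k_out * I * N
    dI = k_syn * N - k_deg * I
    return [dN, dI]

sol = solve_ivp(rhs, (0, 30), [0.01, 0.0], dense_output=True, max_step=0.05)
t = np.linspace(0, 30, 7)
for ti, Ni in zip(t, sol.sol(t)[0]):
    print(f"t = {ti:4.1f}  nuclear NF-kB = {Ni:.2f}")
```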
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Peter Andrew
The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC workflow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.
Quantitative modeling of failure propagation in intelligent transportation systems.
DOT National Transportation Integrated Search
2014-08-01
Unmanned vehicles are projected to reach consumer use within this decade; related legislation has already passed in California. The most significant technical challenge associated with these vehicles is their integration in transportation environm...
DigitalHuman (DH): An Integrative Mathematical Model of Human Physiology
NASA Technical Reports Server (NTRS)
Hester, Robert L.; Summers, Richard L.; lIescu, Radu; Esters, Joyee; Coleman, Thomas G.
2010-01-01
Mathematical models and simulation are important tools in discovering the key causal relationships governing physiological processes and improving medical intervention when physiological complexity is a central issue. We have developed a model of integrative human physiology called DigitalHuman (DH) consisting of ~5,000 variables describing cardiovascular, renal, respiratory, endocrine, neural and metabolic physiology. Users can view time-dependent solutions and interactively introduce perturbations by altering numerical parameters to investigate new hypotheses. The variables, parameters and quantitative relationships as well as all other model details are described in XML text files. All aspects of the model, including the mathematical equations describing the physiological processes, are written in open-source, text-readable XML files. Model structure is based upon empirical data of physiological responses documented within the peer-reviewed literature. The model can be used to understand proposed physiological mechanisms and physiological interactions that may not be otherwise intuitively evident. Some of the current uses of this model include the analyses of renal control of blood pressure, the central role of the liver in creating and maintaining insulin resistance, and the mechanisms causing orthostatic hypotension in astronauts. Additionally, the open-source aspect of the modeling environment allows any investigator to add detailed descriptions of human physiology to test new concepts. The model accurately predicts both qualitative and, more importantly, quantitative changes in clinically and experimentally observed responses. DigitalHuman provides scientists with a modeling environment to understand the complex interactions of integrative physiology. This research was supported by NIH HL 51971, NSF EPSCoR, and NASA
NASA Astrophysics Data System (ADS)
Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia
2014-02-01
How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both observed growth behaviors. The linear theory, by contrast, which ignores the possibility of interaction effects, would underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.
Saint: a lightweight integration environment for model annotation.
Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil
2009-11-15
Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).
A Method for Label-Free, Differential Top-Down Proteomics.
Ntai, Ioanna; Toby, Timothy K; LeDuc, Richard D; Kelleher, Neil L
2016-01-01
Biomarker discovery in the translational research has heavily relied on labeled and label-free quantitative bottom-up proteomics. Here, we describe a new approach to biomarker studies that utilizes high-throughput top-down proteomics and is the first to offer whole protein characterization and relative quantitation within the same experiment. Using yeast as a model, we report procedures for a label-free approach to quantify the relative abundance of intact proteins ranging from 0 to 30 kDa in two different states. In this chapter, we describe the integrated methodology for the large-scale profiling and quantitation of the intact proteome by liquid chromatography-mass spectrometry (LC-MS) without the need for metabolic or chemical labeling. This recent advance for quantitative top-down proteomics is best implemented with a robust and highly controlled sample preparation workflow before data acquisition on a high-resolution mass spectrometer, and the application of a hierarchical linear statistical model to account for the multiple levels of variance contained in quantitative proteomic comparisons of samples for basic and clinical research.
Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is a significant challenge in the early steps of computational modeling, as it involves a huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes normalization of textual extractions to a standard representation. This may render textual extractions less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining to a formal representation that may help in constructing an ontology for ion channel events, using a rule-based approach. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), with the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions from an independently annotated blind data set. We further made an initial attempt at formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that offers the potential to facilitate the integration of text mining into an ontological workflow, a novel aspect of this study. This work is a case study where we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
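To make the flavor of rule-based extraction concrete, here is a toy pattern for one kind of quantitative assertion (a named electrophysiological parameter with its value and unit). The rule, parameter list, and example sentence are invented for illustration; they are not the actual rules behind the reported F-measure.

```python
import re

# Toy rule for one kind of quantitative assertion found in ion-channel
# electrophysiology text: a named parameter followed by a value and unit.
PATTERN = re.compile(
    r"(?P<param>half-(?:in)?activation voltage|time constant|conductance)"
    r"\D{0,40}?"
    r"(?P<value>-?\d+(?:\.\d+)?)\s*(?P<unit>mV|ms|pS|nS)",
    re.IGNORECASE,
)

sentence = ("The mutant channel showed a half-activation voltage of -34.5 mV "
            "and a deactivation time constant near 12 ms at -80 mV.")

for m in PATTERN.finditer(sentence):
    print({"parameter": m.group("param").lower(),
           "value": float(m.group("value")),
           "unit": m.group("unit")})
```

Each extracted tuple could then be mapped onto ontology classes (here, hypothetically, ICEPO terms for the parameter and unit), which is the normalization step the abstract describes.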
Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff
2016-01-01
Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. © 2016 K. Hoffman, S. Leupen, et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Modeling and Mapping Personal Learning Environment of Thai International Higher Education Students
ERIC Educational Resources Information Center
Sharafuddin, Mohamed Ali; Sawad, Buncha Panacharoen; Wongwai, Sarun
2018-01-01
This research article is part of a periodic study conducted to understand, model, map, and develop an integrated approach for effective and interactive self-learning phases of Thai International Hospitality and Tourism higher education students. A questionnaire containing both qualitative and quantitative questions was distributed at the beginning…
Bifurcations of large networks of two-dimensional integrate and fire neurons.
Nicola, Wilten; Campbell, Sue Ann
2013-08-01
Recently, a class of two-dimensional integrate and fire models has been used to faithfully model spiking neurons. This class includes the Izhikevich model, the adaptive exponential integrate and fire model, and the quartic integrate and fire model. The bifurcation types for the individual neurons have been thoroughly analyzed by Touboul (SIAM J Appl Math 68(4):1045-1079, 2008). However, when the models are coupled together to form networks, the networks can display bifurcations that an uncoupled oscillator cannot. For example, the networks can transition from firing with a constant rate to burst firing. This paper introduces a technique to reduce a full network of this class of neurons to a mean field model, in the form of a system of switching ordinary differential equations. The reduction uses population density methods and a quasi-steady state approximation to arrive at the mean field system. Reduced models are derived for networks with different topologies and different model neurons with biologically derived parameters. The mean field equations are able to qualitatively and quantitatively describe the bifurcations that the full networks display. Extensions and higher order approximations are discussed.
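The two-dimensional class in question augments the membrane voltage with an adaptation variable; the adaptive exponential integrate-and-fire model is one member. The forward-Euler sketch below simulates a single such neuron with generic textbook-style parameters (my own choices, not the network setup used in the paper) to show the spike-and-reset and adaptation mechanics that the mean-field reduction has to capture.

```python
import numpy as np

# Adaptive exponential integrate-and-fire neuron, simulated with forward Euler.
# Parameters are generic illustrative values.
C, gL, EL, VT, DT = 200.0, 10.0, -70.0, -50.0, 2.0   # pF, nS, mV, mV, mV
a, b, tau_w = 2.0, 60.0, 100.0                        # nS, pA, ms
V_reset, V_peak = -58.0, 0.0                          # mV

dt, T, I = 0.1, 500.0, 500.0                          # ms, ms, pA input current
V, w, spikes = EL, 0.0, []
for step in range(int(T / dt)):
    dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
    dw = (a * (V - EL) - w) / tau_w
    V, w = V + dt * dV, w + dt * dw
    if V >= V_peak:                                   # spike: reset and adapt
        V = V_reset
        w += b
        spikes.append(step * dt)

print(f"{len(spikes)} spikes in {T:.0f} ms; first ISIs:",
      np.round(np.diff(spikes[:5]), 1))
```

Because the adaptation variable w accumulates with each spike, the inter-spike intervals lengthen over time, the single-cell behavior whose network-level consequences the mean-field equations describe.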
NASA Astrophysics Data System (ADS)
Muzy, Jean-François; Baïle, Rachel; Bacry, Emmanuel
2013-04-01
In this paper we propose a new model for volatility fluctuations in financial time series. This model relies on a nonstationary Gaussian process that exhibits aging behavior. It turns out that its properties, over any finite time interval, are very close to those of continuous cascade models. These latter models are indeed well known to reproduce faithfully the main stylized facts of financial time series. However, they involve a large-scale parameter (the so-called “integral scale”, where the cascade is initiated) that is hard to interpret in finance. Moreover, the empirical value of the integral scale is in general strongly correlated with the overall length of the sample. This feature is precisely predicted by our model, which, as illustrated by various examples from daily stock index data, quantitatively reproduces the empirical observations.
ERIC Educational Resources Information Center
Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff
2016-01-01
Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory…
Differential memory in the trilinear model magnetotail
NASA Technical Reports Server (NTRS)
Chen, James; Mitchell, Horace G.; Palmadesso, Peter J.
1990-01-01
The previously proposed concept of 'differential memory' is quantitatively demonstrated using an idealized analytical model of particle dynamics in the magnetotail geometry. In this model (the 'trilinear' tail model) the magnetotail is divided into three regions. The particle orbits are solved exactly in each region, thus reducing the orbit integration to an analytical mapping. It is shown that the trilinear model reproduces the essential phase space features of the earlier model (Chen and Palmadesso, 1986), possessing well-defined entry and exit regions, and stochastic, integrable (regular), and transient orbits, occupying disjoint phase space regions. Different regions have widely separated characteristic time scales corresponding to different types of particle motion. Using the analytical model, the evolution of single-particle distribution functions is calculated.
van den Berg, Ronald; Roerdink, Jos B. T. M.; Cornelissen, Frans W.
2010-01-01
An object in the peripheral visual field is more difficult to recognize when surrounded by other objects. This phenomenon is called “crowding”. Crowding places a fundamental constraint on human vision that limits performance on numerous tasks. It has been suggested that crowding results from spatial feature integration necessary for object recognition. However, in the absence of convincing models, this theory has remained controversial. Here, we present a quantitative and physiologically plausible model for spatial integration of orientation signals, based on the principles of population coding. Using simulations, we demonstrate that this model coherently accounts for fundamental properties of crowding, including critical spacing, “compulsory averaging”, and a foveal-peripheral anisotropy. Moreover, we show that the model predicts increased responses to correlated visual stimuli. Altogether, these results suggest that crowding has little immediate bearing on object recognition but is a by-product of a general, elementary integration mechanism in early vision aimed at improving signal quality. PMID:20098499
Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?
Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C
2017-09-01
Integrated care is the combination of different healthcare services with the goal to provide comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate if quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not routinely being surveyed, including Workload, Clear Leadership/Decision-Making, Management, Flexibility of Integrated Care Model, Engagement, Usefulness of Integrated Care and Collaboration, and Positive Impact/Clinical Benefits/Practice Level Benefits. There were several domains identified from qualitative research that are not routinely included in quantitative surveys to assess health professionals' experiences of integrated care. In addition, the qualitative findings suggest that the experiences of HCPs are often impacted by deeper aspects than those measured by existing surveys. Incorporation of targeted items within these domains in the design of surveys should enhance the capture of data that are relevant to the experiences of HCPs with integrated care, which may assist in more comprehensive evaluation and subsequent improvement of integrated care programs.
Integrating real-time GIS and social media for qualitative transportation data collection.
DOT National Transportation Integrated Search
2016-12-26
New technologies such as the global positioning system, smartphones, and social media are changing the way we move around. Traditional transportation research has overwhelmingly emphasized the collection of quantitative data for modeling, without much c...
Getting quantitative about consequences of cross-ecosystem resource subsidies on recipient consumers
Richardson, John S.; Wipfli, Mark S.
2016-01-01
Most studies of cross-ecosystem resource subsidies have demonstrated positive effects on recipient consumer populations, often with very large effect sizes. However, it is important to move beyond these initial addition–exclusion experiments to consider the quantitative consequences for populations across gradients in the rates and quality of resource inputs. In our introduction to this special issue, we describe at least four potential models of the functional relationship between subsidy input rates and consumer responses, most of them asymptotic. Here we aim to advance our quantitative understanding of how subsidy inputs influence recipient consumers and their communities. In the papers that follow, fish are either the recipient consumers or, as carcasses of anadromous species, the subsidy itself. Advancing general, predictive models will enable us to further consider what other factors are potentially co-limiting (e.g., nutrients, other population interactions, physical habitat, etc.) and better integrate resource subsidies into consumer–resource, biophysical dynamics models.
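The four candidate forms are not spelled out in this abstract, so the sketch below simply encodes four shapes commonly fitted to consumer responses across a subsidy-input gradient (linear, saturating, sigmoid, hump-shaped). Treat the functional forms and numbers as illustrative assumptions, not the special issue's models.

```python
import numpy as np

# Four candidate forms for consumer response R as a function of subsidy
# input rate S (generic shapes, chosen for illustration).
def linear(S, a):             return a * S
def saturating(S, rmax, k):   return rmax * S / (k + S)            # asymptotic
def sigmoid(S, rmax, k, n):   return rmax * S**n / (k**n + S**n)   # asymptotic
def hump(S, a, b):            return a * S * np.exp(-b * S)        # declines at high S

S = np.linspace(0, 20, 5)
print("S          :", S)
print("linear     :", linear(S, 0.5).round(2))
print("saturating :", saturating(S, 10, 4).round(2))
print("sigmoid    :", sigmoid(S, 10, 6, 3).round(2))
print("hump-shaped:", hump(S, 2.0, 0.15).round(2))
```

Fitting each form to field data and comparing them (for example by AIC) is one simple way to ask which quantitative relationship a given subsidy–consumer system actually follows.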
Stenner, A Jackson; Fisher, William P; Stone, Mark H; Burdick, Donald S
2013-01-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained.
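The trade-off described here is easy to demonstrate with the dichotomous Rasch model, in which the success probability depends only on the difference between reader ability and item (text) difficulty: raising both by the same amount leaves the predicted count correct unchanged. The item difficulties and shift size below are illustrative values, not data from the article.

```python
import numpy as np

def expected_count_correct(ability, difficulties):
    """Dichotomous Rasch model: P(correct) = 1 / (1 + exp(-(ability - difficulty)))."""
    return np.sum(1.0 / (1.0 + np.exp(-(ability - difficulties))))

items = np.linspace(-2.0, 2.0, 40)          # cloze item difficulties (logits)
reader = 0.5                                 # reader ability (logits)

base = expected_count_correct(reader, items)
# Trade-off: raise text complexity by 0.8 logits and reader ability by the
# same amount; the Rasch-predicted count correct is unchanged.
traded = expected_count_correct(reader + 0.8, items + 0.8)
harder_only = expected_count_correct(reader, items + 0.8)

print(f"baseline: {base:.1f}, matched trade-off: {traded:.1f}, "
      f"harder text only: {harder_only:.1f} (out of {items.size})")
```

Testing whether observed outcomes actually follow such theory-based trade-offs, rather than merely fitting the algebraic model to data, is the experimental manipulation the authors argue is needed to sustain the quantitative hypothesis.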
Stenner, A. Jackson; Fisher, William P.; Stone, Mark H.; Burdick, Donald S.
2013-01-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained. PMID:23986726
Hallow, K M; Gebremichael, Y
2017-06-01
Renal function plays a central role in cardiovascular, kidney, and multiple other diseases, and many existing and novel therapies act through renal mechanisms. Even with decades of accumulated knowledge of renal physiology, pathophysiology, and pharmacology, the dynamics of renal function remain difficult to understand and predict, often resulting in unexpected or counterintuitive therapy responses. Quantitative systems pharmacology modeling of renal function integrates this accumulated knowledge into a quantitative framework, allowing evaluation of competing hypotheses, identification of knowledge gaps, and generation of new experimentally testable hypotheses. Here we present a model of renal physiology and control mechanisms involved in maintaining sodium and water homeostasis. This model represents the core renal physiological processes involved in many research questions in drug development. The model runs in R and the code is made available. In a companion article, we present a case study using the model to explore mechanisms and pharmacology of salt-sensitive hypertension. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
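The published model is distributed as R code and is far richer than anything shown here. Purely to illustrate the kind of feedback loop such a model encodes, below is a minimal sodium-balance / pressure-natriuresis sketch: blood pressure rises with extracellular volume, sodium excretion rises with pressure, and the system settles where excretion matches intake. The equations and parameter values are my own illustrative choices, not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal sodium-balance / pressure-natriuresis loop (illustrative only).
Na_conc = 140.0          # plasma [Na+], mmol/L (assumed tightly regulated)
V0, MAP0 = 15.0, 100.0   # reference ECF volume (L) and mean arterial pressure (mmHg)
k_p = 4.0                # mmHg rise per litre of ECF expansion
k_e = 0.02               # excretion sensitivity, (mmol/min) per mmHg

def sodium_balance(t, y, intake):
    total_na = y[0]                      # total exchangeable sodium, mmol
    ecf = total_na / Na_conc             # ECF volume tracks sodium (isotonicity)
    map_ = MAP0 + k_p * (ecf - V0)
    excretion = max(0.0, 0.1 + k_e * (map_ - MAP0))   # pressure natriuresis
    return [intake - excretion]

for intake in (0.1, 0.2):                # mmol/min (~ normal vs doubled salt intake)
    sol = solve_ivp(sodium_balance, (0, 7 * 24 * 60), [Na_conc * V0], args=(intake,))
    ecf = sol.y[0, -1] / Na_conc
    print(f"intake {intake:.2f} mmol/min -> steady MAP ~ {MAP0 + k_p * (ecf - V0):.1f} mmHg")
```

Even this caricature shows the qualitative behavior of interest in salt-sensitive hypertension: with a stiffer pressure-natriuresis relationship (smaller k_e), the same increase in intake produces a larger steady-state rise in pressure.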
Calcium as a signal integrator in developing epithelial tissues.
Brodskiy, Pavel A; Zartman, Jeremiah J
2018-05-16
Decoding how tissue properties emerge across multiple spatial and temporal scales from the integration of local signals is a grand challenge in quantitative biology. For example, the collective behavior of epithelial cells is critical for shaping developing embryos. Understanding how epithelial cells interpret a diverse range of local signals to coordinate tissue-level processes requires a systems-level understanding of development. Integration of multiple signaling pathways that specify cell signaling information requires second messengers such as calcium ions. Increasingly, specific roles have been uncovered for calcium signaling throughout development. Calcium signaling regulates many processes including division, migration, death, and differentiation. However, the pleiotropic and ubiquitous nature of calcium signaling implies that many additional functions remain to be discovered. Here we review a selection of recent studies to highlight important insights into how multiple signals are transduced by calcium transients in developing epithelial tissues. Quantitative imaging and computational modeling have provided important insights into how calcium signaling integration occurs. Reverse-engineering the conserved features of signal integration mediated by calcium signaling will enable novel approaches in regenerative medicine and synthetic control of morphogenesis.
Ishikawa, Akira
2017-11-27
Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.
Krueger, Robert F.; Markon, Kristian E.; Patrick, Christopher J.; Benning, Stephen D.; Kramer, Mark D.
2008-01-01
Antisocial behavior, substance use, and impulsive and aggressive personality traits often co-occur, forming a coherent spectrum of personality and psychopathology. In the current research, the authors developed a novel quantitative model of this spectrum. Over 3 waves of iterative data collection, 1,787 adult participants selected to represent a range across the externalizing spectrum provided extensive data about specific externalizing behaviors. Statistical methods such as item response theory and semiparametric factor analysis were used to model these data. The model and assessment instrument that emerged from the research shows how externalizing phenomena are organized hierarchically and cover a wide range of individual differences. The authors discuss the utility of this model for framing research on the correlates and the etiology of externalizing phenomena. PMID:18020714
A semantic web framework to integrate cancer omics data with biological knowledge.
Holford, Matthew E; McCusker, James P; Cheung, Kei-Hoi; Krauthammer, Michael
2012-01-25
The RDF triple provides a simple linguistic means of describing limitless types of information. Triples can be flexibly combined into a unified data source we call a semantic model. Semantic models open new possibilities for the integration of variegated biological data. We use Semantic Web technology to explicate high throughput clinical data in the context of fundamental biological knowledge. We have extended Corvus, a data warehouse which provides a uniform interface to various forms of Omics data, by providing a SPARQL endpoint. With the querying and reasoning tools made possible by the Semantic Web, we were able to explore quantitative semantic models retrieved from Corvus in the light of systematic biological knowledge. For this paper, we merged semantic models containing genomic, transcriptomic and epigenomic data from melanoma samples with two semantic models of functional data - one containing Gene Ontology (GO) data, the other, regulatory networks constructed from transcription factor binding information. These two semantic models were created in an ad hoc manner but support a common interface for integration with the quantitative semantic models. Such combined semantic models allow us to pose significant translational medicine questions. Here, we study the interplay between a cell's molecular state and its response to anti-cancer therapy by exploring the resistance of cancer cells to Decitabine, a demethylating agent. We were able to generate a testable hypothesis to explain how Decitabine fights cancer - namely, that it targets apoptosis-related gene promoters predominantly in Decitabine-sensitive cell lines, thus conveying its cytotoxic effect by activating the apoptosis pathway. Our research provides a framework whereby similar hypotheses can be developed easily.
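A minimal sketch of the kind of query workflow such a SPARQL endpoint enables is shown below using the rdflib Python library; the file name, namespace, and predicate names are hypothetical placeholders rather than the actual Corvus schema.

```python
from rdflib import Graph

# Illustrative only: the file name and predicate URIs below are hypothetical
# placeholders, not the Corvus schema described in the paper.
g = Graph()
g.parse("melanoma_semantic_model.ttl", format="turtle")

query = """
PREFIX ex: <http://example.org/omics#>
SELECT ?gene ?expression
WHERE {
    ?sample ex:hasMeasurement ?m .
    ?m ex:gene ?gene ;
       ex:expressionLevel ?expression .
}
LIMIT 10
"""

for row in g.query(query):
    print(row.gene, row.expression)
```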
Meinherz, Franziska; Videira, Nuno
2018-04-10
The aim of this paper is to contribute to the exploration of environmental modeling methods based on the elicitation of stakeholders' mental models. This aim is motivated by the necessity to understand the dilemmas and behavioral rationales of individuals for supporting the management of environmental problems. The methodology developed for this paper integrates qualitative and quantitative methods by deploying focus groups for the elicitation of the behavioral rationales of the target population, and grounded theory to code the information gained in the focus groups and to guide the development of a dynamic simulation model. The approach is applied to a case of urban air pollution caused by residential heating with wood in central Chile. The results show how the households' behavior interrelates with the governmental management strategies and provide valuable and novel insights into potential challenges to the implementation of policies to manage the local air pollution problem. The experience further shows that the developed participatory modeling approach helps overcome some of the issues currently encountered in the elicitation of individuals' behavioral rationales and in the quantification of qualitative information.
Systems microscopy: an emerging strategy for the life sciences.
Lock, John G; Strömblad, Staffan
2010-05-01
Dynamic cellular processes occurring in time and space are fundamental to all physiology and disease. To understand complex and dynamic cellular processes therefore demands the capacity to record and integrate quantitative multiparametric data from the four spatiotemporal dimensions within which living cells self-organize, and to subsequently use these data for the mathematical modeling of cellular systems. To this end, a raft of complementary developments in automated fluorescence microscopy, cell microarray platforms, quantitative image analysis and data mining, combined with multivariate statistics and computational modeling, now coalesce to produce a new research strategy, "systems microscopy", which facilitates systems biology analyses of living cells. Systems microscopy provides the crucial capacities to simultaneously extract and interrogate multiparametric quantitative data at resolution levels ranging from the molecular to the cellular, thereby elucidating a more comprehensive and richly integrated understanding of complex and dynamic cellular systems. The unique capacities of systems microscopy suggest that it will become a vital cornerstone of systems biology, and here we describe the current status and future prospects of this emerging field, as well as outlining some of the key challenges that remain to be overcome. Copyright 2010 Elsevier Inc. All rights reserved.
General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.
de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael
2016-11-01
Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
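Schematically, and assuming a Gaussian distribution of latent values, the observed-scale population mean described above is obtained by integrating the inverse link function over that latent distribution (the expressions in the paper cover (co)variances and heritabilities in the same spirit):

```latex
\mathbb{E}[z] \;=\; \int g^{-1}(\ell)\,
  \frac{1}{\sqrt{2\pi\sigma^{2}_{\mathrm{lat}}}}
  \exp\!\left(-\frac{(\ell-\mu)^{2}}{2\sigma^{2}_{\mathrm{lat}}}\right) d\ell
```

Here g^{-1} is the inverse link, mu collects the fixed effects, and sigma^2_lat is the total latent variance; in general the integral must be evaluated numerically, as implemented in the QGglmm package described above.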
Gender Integration on U.S. Navy Submarines: Views of the First Wave
2015-06-01
Previous studies have attempted to build statistical models based on surface fleet data to forecast female sustainability in the submarine fleet, yet ... their integration? Such questions cannot be answered by collecting the type of quantitative data that can be analyzed using statistical methods. Complex...
ERIC Educational Resources Information Center
Elander, Kelly; Cronje, Johannes C.
2016-01-01
While learning interventions were traditionally classified as either objectivist or constructivist, there has been an increasing tendency for practitioners to use elements of both paradigms in a consolidated fashion. This has prompted a re-think of the two perspectives as diametrically opposite. A four-quadrant model, first proposed in this journal…
ERIC Educational Resources Information Center
Collins, Cyleste C.; Dressler, William W.
2008-01-01
This study uses mixed methods and theory from cognitive anthropology to examine the cultural models of domestic violence among domestic violence agency workers, welfare workers, nurses, and a general population comparison group. Data collection and analysis uses quantitative and qualitative techniques, and the findings are integrated for…
Quantitative assessment of human exposures and health effects due to air pollution involve detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into acco...
Prediction of tautomer ratios by embedded-cluster integral equation theory
NASA Astrophysics Data System (ADS)
Kast, Stefan M.; Heil, Jochen; Güssregen, Stefan; Schmidt, K. Friedemann
2010-04-01
The "embedded cluster reference interaction site model" (EC-RISM) approach combines statistical-mechanical integral equation theory and quantum-chemical calculations for predicting thermodynamic data for chemical reactions in solution. The electronic structure of the solute is determined self-consistently with the structure of the solvent that is described by 3D RISM integral equation theory. The continuous solvent-site distribution is mapped onto a set of discrete background charges ("embedded cluster") that represent an additional contribution to the molecular Hamiltonian. The EC-RISM analysis of the SAMPL2 challenge set of tautomers proceeds in three stages. Firstly, the group of compounds for which quantitative experimental free energy data was provided was taken to determine appropriate levels of quantum-chemical theory for geometry optimization and free energy prediction. Secondly, the resulting workflow was applied to the full set, allowing for chemical interpretations of the results. Thirdly, disclosure of experimental data for parts of the compounds facilitated a detailed analysis of methodical issues and suggestions for future improvements of the model. Without specifically adjusting parameters, the EC-RISM model yields the smallest value of the root mean square error for the first set (0.6 kcal mol-1) as well as for the full set of quantitative reaction data (2.0 kcal mol-1) among the SAMPL2 participants.
Cost-effectiveness analysis of microdose clinical trials in drug development.
Yamane, Naoe; Igarashi, Ataru; Kusama, Makiko; Maeda, Kazuya; Ikeda, Toshihiko; Sugiyama, Yuichi
2013-01-01
Microdose (MD) clinical trials have been introduced to obtain human pharmacokinetic data early in drug development. Here we assessed the cost-effectiveness of microdose integrated drug development in a hypothetical model, as there was no such quantitative research that weighed the additional effectiveness against the additional time and/or cost. First, we calculated the cost and effectiveness (i.e., success rate) of 3 types of MD integrated drug development strategies: liquid chromatography-tandem mass spectrometry, accelerator mass spectrometry, and positron emission tomography. Then, we analyzed the cost-effectiveness of 9 hypothetical scenarios where 100 drug candidates entering into a non-clinical toxicity study were selected by different methods as the conventional scenario without MD. In the base-case, where 70 drug candidates were selected without MD and 30 selected evenly by one of the three MD methods, incremental cost-effectiveness ratio per one additional drug approved was JPY 12.7 billion (US$ 0.159 billion), whereas the average cost-effectiveness ratio of the conventional strategy was JPY 24.4 billion, which we set as a threshold. Integrating MD in the conventional drug development was cost-effective in this model. This quantitative analytical model which allows various modifications according to each company's conditions, would be helpful for guiding decisions early in clinical development.
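A stripped-down version of the incremental cost-effectiveness calculation reads as follows; the costs and approval counts are hypothetical placeholders, not the figures from the study.

```python
# Sketch of the incremental cost-effectiveness ratio (ICER) used to compare a
# microdose-integrated strategy with the conventional one. All numbers below
# are hypothetical, not the study's JPY 12.7 billion result.
cost_conventional = 244.0   # billion JPY spent to approve the drugs below
drugs_conventional = 10
cost_md = 280.0             # billion JPY with microdose integration
drugs_md = 12

icer = (cost_md - cost_conventional) / (drugs_md - drugs_conventional)
acer_conventional = cost_conventional / drugs_conventional  # used as threshold

print(f"ICER per additional approved drug: {icer:.1f} billion JPY")
print(f"Conventional ACER (threshold):     {acer_conventional:.1f} billion JPY")
```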
Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol
2011-02-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.
Parallel labeling experiments for pathway elucidation and (13)C metabolic flux analysis.
Antoniewicz, Maciek R
2015-12-01
Metabolic pathway models provide the foundation for quantitative studies of cellular physiology through the measurement of intracellular metabolic fluxes. For model organisms metabolic models are well established, with many manually curated genome-scale model reconstructions, gene knockout studies and stable-isotope tracing studies. However, for non-model organisms a similar level of knowledge is often lacking. Compartmentation of cellular metabolism in eukaryotic systems also presents significant challenges for quantitative (13)C-metabolic flux analysis ((13)C-MFA). Recently, innovative (13)C-MFA approaches have been developed based on parallel labeling experiments, the use of multiple isotopic tracers and integrated data analysis, that allow more rigorous validation of pathway models and improved quantification of metabolic fluxes. Applications of these approaches open new research directions in metabolic engineering, biotechnology and medicine. Copyright © 2015 Elsevier Ltd. All rights reserved.
Quantitative Investigation of the Role of Intra-/Intercellular Dynamics in Bacterial Quorum Sensing.
Leaman, Eric J; Geuther, Brian Q; Behkam, Bahareh
2018-04-20
Bacteria utilize diffusible signals to regulate population density-dependent coordinated gene expression in a process called quorum sensing (QS). While the intracellular regulatory mechanisms of QS are well-understood, the effect of spatiotemporal changes in the population configuration on the sensitivity and robustness of the QS response remains largely unexplored. Using a microfluidic device, we quantitatively characterized the emergent behavior of a population of swimming E. coli bacteria engineered with the lux QS system and a GFP reporter. We show that the QS activation time follows a power law with respect to bacterial population density, but this trend is disrupted significantly by microscale variations in population configuration and genetic circuit noise. We then developed a computational model that integrates population dynamics with genetic circuit dynamics to enable accurate (less than 7% error) quantitation of the bacterial QS activation time. Through modeling and experimental analyses, we show that changes in spatial configuration of swimming bacteria can drastically alter the QS activation time, by up to 22%. The integrative model developed herein also enables examination of the performance robustness of synthetic circuits with respect to growth rate, circuit sensitivity, and the population's initial size and spatial structure. Our framework facilitates quantitative tuning of microbial systems performance through rational engineering of synthetic ribosomal binding sites. We have demonstrated this through modulation of QS activation time over an order of magnitude. Altogether, we conclude that predictive engineering of QS-based bacterial systems requires not only the precise temporal modulation of gene expression (intracellular dynamics) but also accounting for the spatiotemporal changes in population configuration (intercellular dynamics).
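The reported power-law dependence can be recovered from density/activation-time data by a straight-line fit on log-log axes, as in the sketch below; the data points and the exponent are synthetic, chosen only to illustrate the procedure.

```python
import numpy as np

# Illustrative fit of a power-law trend between population density and QS
# activation time; the data points are synthetic, not measurements from the
# paper.
density = np.array([1e7, 3e7, 1e8, 3e8, 1e9])          # cells/mL (synthetic)
t_activation = 600.0 * (density / 1e8) ** -0.35         # minutes (synthetic)
t_activation *= np.random.default_rng(0).normal(1.0, 0.05, density.size)

# Fit t = a * density**b on log-log axes.
b, log_a = np.polyfit(np.log(density), np.log(t_activation), 1)
print(f"fitted exponent b = {b:.2f}, prefactor a = {np.exp(log_a):.3g}")
```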
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study provides a detailed gap analysis of the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, identifies gaps that exist in the literature, and compares the SEI's "healthy ingredients" of a process performance model against the courses taught in those programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
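A minimal Monte Carlo sketch of the kind of simulation described, predicting a satisfaction index from uncertain driver scores, might look like the following; the drivers, weights, and distributions are hypothetical and are not drawn from the ACSI model or the dissertation.

```python
import numpy as np

# Minimal Monte Carlo sketch of predicting a customer satisfaction index from
# uncertain driver scores. Weights and distributions are hypothetical.
rng = np.random.default_rng(42)
n_trials = 100_000

# Hypothetical driver scores (0-100 scale) with uncertainty.
quality = rng.normal(82, 4, n_trials)
expectations = rng.normal(75, 5, n_trials)
value = rng.normal(78, 6, n_trials)

weights = np.array([0.5, 0.2, 0.3])   # hypothetical weights summing to 1
index = weights @ np.vstack([quality, expectations, value])

print(f"predicted index: mean {index.mean():.1f}, "
      f"90% interval [{np.percentile(index, 5):.1f}, {np.percentile(index, 95):.1f}]")
```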
A road map for integrating eco-evolutionary processes into biodiversity models.
Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique
2013-05-01
The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.
Computerized image analysis for quantitative neuronal phenotyping in zebrafish.
Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C
2006-06-15
An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.
The Technology Acceptance of Mobile Applications in Education
ERIC Educational Resources Information Center
Camilleri, Mark Anthony; Camilleri, Adriana Caterina
2017-01-01
This research explores the educators' attitudes and behavioural intention toward mobile applications. The methodology integrates measures from "the pace of technological innovativeness" and the "technology acceptance model" to understand the rationale for further investment in mobile learning (m-learning). A quantitative study…
Wikswo, J P; Prokop, A; Baudenbacher, F; Cliffel, D; Csukas, B; Velkovsky, M
2006-08-01
Systems biology, i.e. quantitative, postgenomic, postproteomic, dynamic, multiscale physiology, addresses in an integrative, quantitative manner the shockwave of genetic and proteomic information using computer models that may eventually have 10^6 dynamic variables with non-linear interactions. Historically, single biological measurements are made over minutes, suggesting the challenge of specifying 10^6 model parameters. Except for fluorescence and micro-electrode recordings, most cellular measurements have inadequate bandwidth to discern the time course of critical intracellular biochemical events. Micro-array expression profiles of thousands of genes cannot determine quantitative dynamic cellular signalling and metabolic variables. Major gaps must be bridged between the computational vision and experimental reality. The analysis of cellular signalling dynamics and control requires, first, micro- and nano-instruments that measure simultaneously multiple extracellular and intracellular variables with sufficient bandwidth; secondly, the ability to open existing internal control and signalling loops; thirdly, external BioMEMS micro-actuators that provide high bandwidth feedback and externally addressable intracellular nano-actuators; and, fourthly, real-time, closed-loop, single-cell control algorithms. The unravelling of the nested and coupled nature of cellular control loops requires simultaneous recording of multiple single-cell signatures. Externally controlled nano-actuators, needed to effect changes in the biochemical, mechanical and electrical environment both outside and inside the cell, will provide a major impetus for nanoscience.
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
NASA Astrophysics Data System (ADS)
Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.
2017-04-01
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
Label-free hyperspectral dark-field microscopy for quantitative scatter imaging
NASA Astrophysics Data System (ADS)
Cheney, Philip; McClatchy, David; Kanick, Stephen; Lemaillet, Paul; Allen, David; Samarov, Daniel; Pogue, Brian; Hwang, Jeeseong
2017-03-01
A hyperspectral dark-field microscope has been developed for imaging spatially distributed diffuse reflectance spectra from light-scattering samples. In this report, quantitative scatter spectroscopy is demonstrated with a uniform scattering phantom, namely a solution of polystyrene microspheres. A Monte Carlo-based inverse model was used to calculate the reduced scattering coefficients of samples of different microsphere concentrations from wavelength-dependent backscattered signal measured by the dark-field microscope. The results are compared to the measurement results from a NIST double-integrating sphere system for validation. Ongoing efforts involve quantitative mapping of scattering and absorption coefficients in samples with spatially heterogeneous optical properties.
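Once wavelength-resolved reduced scattering coefficients have been recovered by the inverse model, they are commonly summarized by a scattering power law; the sketch below fits such a law to synthetic values and is not based on the paper's measurements.

```python
import numpy as np

# Sketch of extracting scattering power-law parameters from wavelength-
# resolved reduced scattering coefficients. The mu_s' values are synthetic.
wavelengths = np.array([500.0, 550.0, 600.0, 650.0, 700.0])   # nm
mus_prime = 1.8 * (wavelengths / 500.0) ** -1.2                # mm^-1, synthetic
mus_prime *= np.random.default_rng(1).normal(1.0, 0.02, wavelengths.size)

# Fit mu_s'(lambda) = a * (lambda / 500 nm)**(-b) on log-log axes.
neg_b, log_a = np.polyfit(np.log(wavelengths / 500.0), np.log(mus_prime), 1)
print(f"a = {np.exp(log_a):.2f} mm^-1, scattering power b = {-neg_b:.2f}")
```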
NASA Astrophysics Data System (ADS)
Lawrenz, Frances; McCreath, Heather
Qualitative and quantitative evaluation procedures were used to compare two physical-science teacher inservice training programs. The two programs followed the master teacher training model espoused by NSF but used different types of master teachers and types of activities. The two evaluation procedures produced different results and together they provided a much clearer picture of the strengths and weaknesses of the two programs. Using only one approach or the other would have substantially altered the conclusions.
Oxidative DNA damage background estimated by a system model of base excision repair
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokhansanj, B A; Wilson, III, D M
Human DNA can be damaged by natural metabolism through free radical production. It has been suggested that the equilibrium between innate damage and cellular DNA repair results in an oxidative DNA damage background that potentially contributes to disease and aging. Efforts to quantitatively characterize the human oxidative DNA damage background level based on measuring 8-oxoguanine lesions as a biomarker have led to estimates varying over 3-4 orders of magnitude, depending on the method of measurement. We applied a previously developed and validated quantitative pathway model of human DNA base excision repair, integrating experimentally determined endogenous damage rates and model parameters from multiple sources. Our estimates of at most 100 8-oxoguanine lesions per cell are consistent with the low end of data from biochemical and cell biology experiments, a result robust to model limitations and parameter variation. Our results show the power of quantitative system modeling to interpret composite experimental data and make biologically and physiologically relevant predictions for complex human DNA repair pathway mechanisms and capacity.
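The order of magnitude of such a background can be appreciated with a back-of-the-envelope steady-state balance between damage formation and first-order repair; both rates below are hypothetical round numbers, not the model's fitted parameters.

```python
import math

# Hypothetical round numbers, not the pathway model's parameters.
damage_rate = 1000.0       # 8-oxoguanine lesions formed per cell per day
repair_half_life_h = 1.0   # effective repair half-life, in hours

k_repair = math.log(2) / (repair_half_life_h / 24.0)    # per day
steady_state_lesions = damage_rate / k_repair            # formation / removal

print(f"steady-state lesions per cell: {steady_state_lesions:.0f}")
```

With these illustrative rates the balance lands in the tens of lesions per cell, the same low range as the estimate quoted above.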
Computational modeling of brain tumors: discrete, continuum or hybrid?
NASA Astrophysics Data System (ADS)
Wang, Zhihui; Deisboeck, Thomas S.
2008-04-01
In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling, are discussed.
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.
2011-03-01
To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendents representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.
NASA Astrophysics Data System (ADS)
Chen, Ying; Yuan, Jianghong; Zhang, Yingchao; Huang, Yonggang; Feng, Xue
2017-10-01
The interfacial failure of integrated circuit (IC) chips integrated on flexible substrates under bending deformation has been studied theoretically and experimentally. A compressive buckling test is used to impose the bending deformation onto the interface between the IC chip and the flexible substrate quantitatively, after which the failed interface is investigated using scanning electron microscopy. A theoretical model is established based on the beam theory and a bi-layer interface model, from which an analytical expression of the critical curvature in relation to the interfacial failure is obtained. The relationships between the critical curvature, the material, and the geometric parameters of the device are discussed in detail, providing guidance for the future optimization of flexible circuits based on IC chips.
Integrated narrative assessment exemplification: a leukaemia case history.
Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sollami, Alfonso; Taffurelli, Chiara
2017-07-18
In the Integrated Narrative Nursing Assessment (INNA), the Evidence-Based Nursing Model is integrated with the Narrative-Based Nursing Model. The INNA makes use of quantitative instruments, arising from the natural sciences, as well as of qualitative ones, arising from the human sciences, achieving results of standardization and reproducibility as well as of customization and uniqueness. Accordingly, the purpose of this work is to exemplify the thinking process of and the method adopted by a nurse performing an integrated narrative assessment in the evaluation of a patient. The patient suffered from acute myeloid leukaemia, treated with chemotherapy. Her nurse worked in a haematology ward in a hospital in northern Italy. The nurse had previous experience in conducting the assessment according to the INNA. Based on the patient's characteristics, the nurse chose to use narration (to explore needs from the patient's subjective perception) and scales (to measure them objectively) among the various assessment instruments provided by the INNA. The resultant integrated outcomes helped the nurse to gain a comprehensive overview of the person's health-care needs and their connections. These outcomes derive from the integration of narrative information with that obtained from the scales, which in this case showed consistent results. It is very difficult to reach this complexity by considering qualitative and quantitative assessment strategies as mutually foreclosing, given that both emerged as being very useful in identifying, understanding and measuring the needs of the assisted person. Both could then be used to design a customized intervention, encouraging new connections between disease, illness, sickness and everyday life.
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Chiu, Grace S; Wu, Margaret A; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.
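The latent-factor idea can be sketched with simulated data: observed metrics load on an unobserved site health score that is in turn driven by abiotic covariates. The code below is a toy least-squares stand-in for the full hierarchical Bayesian analysis, with all loadings and effects invented.

```python
import numpy as np

# Toy sketch of the latent-factor idea behind the LHFI. Values are simulated;
# the estimation step is a crude stand-in for the hierarchical analysis.
rng = np.random.default_rng(7)
n_sites, n_metrics = 40, 4

abiotic = rng.normal(size=(n_sites, 3))                  # e.g. salinity, depth, silt-clay
beta = np.array([0.8, -0.5, 0.3])                        # hypothetical driver effects
health = abiotic @ beta + rng.normal(0, 0.3, n_sites)    # latent site health

loadings = np.array([1.0, 0.7, 0.9, 0.5])                # hypothetical metric loadings
metrics = np.outer(health, loadings) + rng.normal(0, 0.4, (n_sites, n_metrics))

# Crude recovery of the latent score and of the driver effects.
health_hat = (metrics @ loadings) / (loadings @ loadings)
beta_hat, *_ = np.linalg.lstsq(abiotic, health_hat, rcond=None)
print("recovered driver effects:", np.round(beta_hat, 2))
```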
Integrating art into science education: a survey of science teachers' practices
NASA Astrophysics Data System (ADS)
Turkka, Jaakko; Haatainen, Outi; Aksela, Maija
2017-07-01
Numerous case studies suggest that integrating art and science education could engage students with creative projects and encourage students to express science in a multitude of ways. However, little is known about art integration practices in everyday science teaching. With a qualitative e-survey, this study explores the art integration of science teachers (n = 66). A pedagogical model for science teachers' art integration emerged from a qualitative content analysis conducted on examples of art integration. In the model, art integration is characterised as integration through content and activities. Whilst the links in the content were facilitated either directly between concepts and ideas or indirectly through themes or artefacts, the integration through activity often connected an activity in one domain and a concept, idea or artefact in the other domain, with the exception of some activities that could belong to both domains. Moreover, the examples of art integration in the everyday classroom did not include the expression of emotions often associated with art. In addition, the quantitative part of the survey confirmed that integration is infrequent in all mapped areas. The findings of this study have implications for science teacher education, which should offer opportunities for more consistent art integration.
Parker, Stephen; Dark, Frances; Newman, Ellie; Korman, Nicole; Meurk, Carla; Siskind, Dan; Harris, Meredith
2016-06-02
A novel staffing model integrating peer support workers and clinical staff within a unified team is being trialled at community-based residential rehabilitation units in Australia. This mixed-methods protocol describes the longitudinal evaluation of the outcomes, expectations and experiences of care of consumers and staff at two units operating this staffing model, compared with one unit operating a traditional clinical staffing model. The study is unique with regard to the context, the longitudinal approach and the consideration of multiple stakeholder perspectives. The longitudinal mixed-methods design integrates a quantitative evaluation of the outcomes of care for consumers at three residential rehabilitation units with an applied qualitative research methodology. The quantitative component utilizes a prospective cohort design to explore whether equivalent outcomes are achieved through engagement at residential rehabilitation units operating integrated and clinical staffing models. Comparative data will be available from the time of admission, discharge and the 12-month period post-discharge from the units. Additionally, retrospective data for the 12-month period prior to admission will be utilized to consider changes in functioning before and after engagement with residential rehabilitation care. The primary outcome will be change in psychosocial functioning, assessed using the total score on the Health of the Nation Outcome Scales (HoNOS). Planned secondary outcomes will include changes in symptomatology, disability, recovery orientation, carer quality of life, emergency department presentations, psychiatric inpatient bed days, and psychological distress and wellbeing. Planned analyses will include: cohort description; hierarchical linear regression modelling of the predictors of change in HoNOS following CCU care; and descriptive comparisons of the costs associated with the two staffing models. The qualitative component utilizes a pragmatic approach to grounded theory, with collection of data from consumers and staff at multiple time points exploring their expectations, experiences and reflections on the care provided by these services. It is expected that the new knowledge gained through this study will guide the adaptation of these and similar services. For example, if differential outcomes are achieved for consumers under the integrated and clinical staffing models, this may inform staffing guidelines.
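As a rough illustration of the planned regression step, the snippet below fits change in HoNOS total score on staffing model while adjusting for baseline score, using statsmodels; the tiny data frame and column names are hypothetical, and the study's actual analysis plan is considerably richer.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame and column names, purely for illustration of the
# regression structure (change in HoNOS ~ baseline + staffing model).
df = pd.DataFrame({
    "honos_change": [-4, -6, -2, -5, -3, -1, -7, -2],
    "baseline":     [18, 22, 15, 20, 17, 14, 25, 16],
    "staffing":     ["integrated", "integrated", "clinical", "integrated",
                     "clinical", "clinical", "integrated", "clinical"],
})

model = smf.ols("honos_change ~ baseline + C(staffing)", data=df).fit()
print(model.summary())
```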
Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making
Williams, B.K.; Nichols, J.D.; Conroy, M.J.
2002-01-01
This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. Key features: integrates population modeling, parameter estimation and decision-theoretic approaches to management in a single, cohesive framework; provides authoritative, state-of-the-art descriptions of quantitative approaches to modeling, estimation and decision-making; emphasizes the role of mathematical modeling in the conduct of science and management; and utilizes a unifying biological context, consistent mathematical notation, and numerous biological examples.
Veltman, Karin; Huijbregts, Mark A J; Hendriks, A Jan
2010-07-01
Both biotic ligand models (BLM) and bioaccumulation models aim to quantify metal exposure based on mechanistic knowledge, but key factors included in the description of metal uptake differ between the two approaches. Here, we present a quantitative comparison of both approaches and show that BLM and bioaccumulation kinetics can be merged into a common mechanistic framework for metal uptake in aquatic organisms. Our results show that metal-specific absorption efficiencies calculated from BLM parameters for freshwater fish are highly comparable, i.e. within a factor of 2.4 for silver, cadmium, copper, and zinc, to bioaccumulation absorption efficiencies for predominantly marine fish. Conditional affinity constants are significantly related to the metal-specific covalent index. Additionally, the affinity constants of calcium, cadmium, copper, sodium, and zinc are significantly comparable across aquatic species, including molluscs, daphnids, and fish. This suggests that affinity constants can be estimated from the covalent index, and that constants can be extrapolated across species. A new model is proposed that integrates the combined effect on metal uptake by aquatic organisms of metal chemodynamics (such as speciation, competition, and ligand affinity) and species characteristics (such as size). An important direction for further research is the quantitative comparison of the proposed model with acute toxicity values for organisms belonging to different size classes.
Introductory Life Science Mathematics and Quantitative Neuroscience Courses
ERIC Educational Resources Information Center
Duffus, Dwight; Olifer, Andrei
2010-01-01
We describe two sets of courses designed to enhance the mathematical, statistical, and computational training of life science undergraduates at Emory College. The first course is an introductory sequence in differential and integral calculus, modeling with differential equations, probability, and inferential statistics. The second is an…
A quantitative visual dashboard to explore exposures to ...
The Exposure Prioritization (Ex Priori) model features a simplified, quantitative visual dashboard to explore exposures across chemical space. Diverse data streams are integrated within the interface such that different exposure scenarios for "individual," "population," or "professional" time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori will quantitatively extrapolate single-point estimates of both exposure and internal dose for multiple exposure scenarios, factors, products, and pathways. Currently, EPA is investigating its usefulness in life cycle analysis, specifically its ability to enhance the exposure factors used in calculating characterization factors for human health. Presented at the 2016 Annual ISES Meeting held in Utrecht, The Netherlands, from 9-13 October 2016.
Helbling, Ignacio M; Ibarra, Juan C D; Luna, Julio A
2012-02-28
A mathematical model of the controlled release of drug from one-layer torus-shaped devices is presented. Analytical solutions based on the Refined Integral Method (RIM) are derived. The validity and utility of the model are ascertained by comparison of the simulation results with matrix-type vaginal ring experimental release data reported in the literature. For the comparisons, the pair-wise procedure is used to measure quantitatively the fit of the theoretical predictions to the experimental data. A good agreement between the model prediction and the experimental data is observed. A comparison with a previously reported model is also presented. More accurate results are achieved for small A/C(s) ratios. Copyright © 2011 Elsevier B.V. All rights reserved.
Microcirculation and the physiome projects.
Bassingthwaighte, James B
2008-11-01
The Physiome projects comprise a loosely knit worldwide effort to define the Physiome through databases and theoretical models, with the goal of better understanding the integrative functions of cells, organs, and organisms. The projects involve developing and archiving models, providing centralized databases, and linking experimental information and models from many laboratories into self-consistent frameworks. Increasingly accurate and complete models that embody quantitative biological hypotheses, adhere to high standards, and are publicly available and reproducible, together with refined and curated data, will enable biological scientists to advance integrative, analytical, and predictive approaches to the study of medicine and physiology. This review discusses the rationale and history of the Physiome projects, the role of theoretical models in the development of the Physiome, and the current status of efforts in this area addressing the microcirculation.
Tetko, Igor V; Maran, Uko; Tropsha, Alexander
2017-03-01
Thousands of (Quantitative) Structure-Activity Relationship (Q)SAR models have been described in peer-reviewed publications; however, this way of sharing seldom makes models available for use by the research community outside of the developer's laboratory. Conversely, on-line models allow broad dissemination and application, representing the most effective way of sharing scientific knowledge. Approaches for sharing and providing on-line access to models range from web services created by individual users and laboratories to integrated modeling environments and model repositories. This emerging transition from the descriptive and informative, but "static" and, for the most part, non-executable print format to interactive, transparent and functional delivery of "living" models is expected to have a transformative effect on modern experimental research in areas of scientific and regulatory use of (Q)SAR models. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Towards A Topological Framework for Integrating Semantic Information Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael
2014-09-07
In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While used successfully for standard (quantitative) sensors, we are developing this methodology in new directions to make it appropriate specifically for semantic information sources, including keyterms, ontology terms, and other general Boolean, categorical, ordinal, and partially-ordered data types. We illustrate the basics of the methodology in an extended use case/example, and discuss the path forward.
Shah, Anup D; Inder, Kerry L; Shah, Alok K; Cristino, Alexandre S; McKie, Arthur B; Gabra, Hani; Davis, Melissa J; Hill, Michelle M
2016-10-07
Lipid rafts are dynamic membrane microdomains that orchestrate molecular interactions and are implicated in cancer development. To understand the functions of lipid rafts in cancer, we performed an integrated analysis of quantitative lipid raft proteomics data sets modeling progression in breast cancer, melanoma, and renal cell carcinoma. This analysis revealed that cancer development is associated with increased membrane raft-cytoskeleton interactions, with ∼40% of elevated lipid raft proteins being cytoskeletal components. Previous studies suggest a potential functional role for the raft-cytoskeleton in the action of the putative tumor suppressors PTRF/Cavin-1 and Merlin. To extend the observation, we examined lipid raft proteome modulation by an unrelated tumor suppressor opioid binding protein cell-adhesion molecule (OPCML) in ovarian cancer SKOV3 cells. In agreement with the other model systems, quantitative proteomics revealed that 39% of OPCML-depleted lipid raft proteins are cytoskeletal components, with microfilaments and intermediate filaments specifically down-regulated. Furthermore, protein-protein interaction network and simulation analysis showed significantly higher interactions among cancer raft proteins compared with general human raft proteins. Collectively, these results suggest increased cytoskeleton-mediated stabilization of lipid raft domains with greater molecular interactions as a common, functional, and reversible feature of cancer cells.
Multiscale digital Arabidopsis predicts individual organ and whole-organism growth.
Chew, Yin Hoon; Wenden, Bénédicte; Flis, Anna; Mengin, Virginie; Taylor, Jasper; Davey, Christopher L; Tindal, Christopher; Thomas, Howard; Ougham, Helen J; de Reffye, Philippe; Stitt, Mark; Williams, Mathew; Muetzelfeldt, Robert; Halliday, Karen J; Millar, Andrew J
2014-09-30
Understanding how dynamic molecular networks affect whole-organism physiology, analogous to mapping genotype to phenotype, remains a key challenge in biology. Quantitative models that represent processes at multiple scales and link understanding from several research domains can help to tackle this problem. Such integrated models are more common in crop science and ecophysiology than in the research communities that elucidate molecular networks. Several laboratories have modeled particular aspects of growth in Arabidopsis thaliana, but it was unclear whether these existing models could productively be combined. We test this approach by constructing a multiscale model of Arabidopsis rosette growth. Four existing models were integrated with minimal parameter modification (leaf water content and one flowering parameter used measured data). The resulting framework model links genetic regulation and biochemical dynamics to events at the organ and whole-plant levels, helping to understand the combined effects of endogenous and environmental regulators on Arabidopsis growth. The framework model was validated and tested with metabolic, physiological, and biomass data from two laboratories, for five photoperiods, three accessions, and a transgenic line, highlighting the plasticity of plant growth strategies. The model was extended to include stochastic development. Model simulations gave insight into the developmental control of leaf production and provided a quantitative explanation for the pleiotropic developmental phenotype caused by overexpression of miR156, which was an open question. Modular, multiscale models, assembling knowledge from systems biology to ecophysiology, will help to understand and to engineer plant behavior from the genome to the field.
QUANTITATIVE ASSESSMENT OF INTEGRATED PHRENIC NERVE ACTIVITY
Nichols, Nicole L.; Mitchell, Gordon S.
2016-01-01
Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1G93A Taconic rat groups (an ALS model). Meta-analysis results indicate: 1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; 2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ~1.0; and 3) consistently reduced activity in end-stage SOD1G93A rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. PMID:26724605
The integrated process rates (IPR) estimated by the Eta-CMAQ model at grid cells along the trajectory of the air mass transport path were analyzed to quantitatively investigate the relative importance of physical and chemical processes for O3 formation and evolution ov...
Brouwer, Andrew F; Masters, Nina B; Eisenberg, Joseph N S
2018-04-20
Waterborne enteric pathogens remain a global health threat. Increasingly, quantitative microbial risk assessment (QMRA) and infectious disease transmission modeling (IDTM) are used to assess waterborne pathogen risks and evaluate mitigation. These modeling efforts, however, have largely been conducted independently for different purposes and in different settings. In this review, we examine the settings where each modeling strategy is employed. QMRA research has focused on food contamination and recreational water in high-income countries (HICs) and drinking water and wastewater in low- and middle-income countries (LMICs). IDTM research has focused on large outbreaks (predominately LMICs) and vaccine-preventable diseases (LMICs and HICs). Human ecology determines the niches that pathogens exploit, leading researchers to focus on different risk assessment research strategies in different settings. To enhance risk modeling, QMRA and IDTM approaches should be integrated to include dynamics of pathogens in the environment and pathogen transmission through populations.
In silico method for modelling metabolism and gene product expression at genome scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lerman, Joshua A.; Hyduke, Daniel R.; Latif, Haythem
2012-07-03
Transcription and translation use raw materials and energy generated metabolically to create the macromolecular machinery responsible for all cellular functions, including metabolism. A biochemically accurate model of molecular biology and metabolism will facilitate comprehensive and quantitative computations of an organism's molecular constitution as a function of genetic and environmental parameters. Here we formulate a model of metabolism and macromolecular expression. Prototyping it using the simple microorganism Thermotoga maritima, we show our model accurately simulates variations in cellular composition and gene expression. Moreover, through in silico comparative transcriptomics, the model allows the discovery of new regulons and the improvement of the genome and transcription unit annotations. Our method presents a framework for investigating molecular biology and cellular physiology in silico and may allow quantitative interpretation of multi-omics data sets in the context of an integrated biochemical description of an organism.
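The metabolism-and-expression formulation itself is far richer than can be shown here; the minimal flux-balance sketch below, on an invented three-reaction network, only illustrates the kind of constraint-based linear-programming computation that underlies such genome-scale models.

```python
# Toy flux-balance analysis (not the ME-model): maximize the growth flux subject
# to steady-state mass balance S v = 0 and flux bounds. The network is invented.
import numpy as np
from scipy.optimize import linprog

# Metabolites: A, B. Reactions: uptake (-> A), conversion (A -> B), growth (B ->)
S = np.array([[ 1, -1,  0],    # balance of A
              [ 0,  1, -1]])   # balance of B
bounds = [(0, 10), (0, 10), (0, 10)]   # lower/upper flux bounds per reaction
c = [0, 0, -1]                         # maximize growth flux (minimize its negative)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)        # expected: [10, 10, 10]
```

An ME-type model adds coupling constraints between these fluxes and the fluxes of transcription and translation reactions, which is what allows gene-expression states to be computed alongside metabolic fluxes.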
Integration of PKPD relationships into benefit–risk analysis
Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar
2015-01-01
Aim: Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods: A comprehensive literature search has been performed using MeSH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results: A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions: Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimisation of protocol design in effectiveness trials. PMID:25940398
Integration of PKPD relationships into benefit-risk analysis.
Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar
2015-11-01
Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MeSH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimisation of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.
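The review does not prescribe a specific computation, so the sketch below only illustrates the general idea of coupling an exposure-response (Emax-type) PKPD relationship to a weighted, MCDA-style benefit-risk score. The model form, parameter values, and weights are hypothetical.

```python
# Illustrative sketch, not a method from the review: Emax exposure-response
# curves for one favourable and one unfavourable effect, combined into a
# weighted MCDA-style benefit-risk score. All numbers are hypothetical.
def emax(conc, emax_val, ec50):
    return emax_val * conc / (ec50 + conc)

def benefit_risk_score(conc, weights=(0.7, 0.3)):
    benefit = emax(conc, emax_val=1.0, ec50=5.0)    # efficacy, scaled 0-1
    risk    = emax(conc, emax_val=1.0, ec50=50.0)   # adverse effect, scaled 0-1
    w_b, w_r = weights
    return w_b * benefit - w_r * risk

for concentration in (1.0, 10.0, 100.0):
    print(concentration, round(benefit_risk_score(concentration), 3))
```

Sweeping the concentration (or dose) shows how a model-informed score can locate the exposure range where the weighted benefit most exceeds the weighted risk, which is the kind of evidence synthesis the authors argue for.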
Contemporary Quantitative Methods and "Slow" Causal Inference: Response to Palinkas
ERIC Educational Resources Information Center
Stone, Susan
2014-01-01
This response considers together simultaneously occurring discussions about causal inference in social work and allied health and social science disciplines. It places emphasis on scholarship that integrates the potential outcomes model with directed acyclic graphing techniques to extract core steps in causal inference. Although this scholarship…
The nuclear receptor, PXR, is an integral part of the regulation of hepatic metabolism. It has been shown to regulate specific CYPs (phase I drug-metabolizing enzymes) as well as certain phase II drug metabolism activities, including UDP-glucuronosyl transferase (UGT), sulfotran...
USDA-ARS?s Scientific Manuscript database
Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source (NPS) pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals...
NASA Astrophysics Data System (ADS)
Merlin, Thibaut; Visvikis, Dimitris; Fernandez, Philippe; Lamare, Frédéric
2018-02-01
Respiratory motion reduces both the qualitative and quantitative accuracy of PET images in oncology. This impact is more significant for quantitative applications based on kinetic modeling, where dynamic acquisitions are associated with limited statistics due to the necessity of enhanced temporal resolution. The aim of this study is to address these drawbacks, by combining a respiratory motion correction approach with temporal regularization in a unique reconstruction algorithm for dynamic PET imaging. Elastic transformation parameters for the motion correction are estimated from the non-attenuation-corrected PET images. The derived displacement matrices are subsequently used in a list-mode based OSEM reconstruction algorithm integrating a temporal regularization between the 3D dynamic PET frames, based on temporal basis functions. These functions are simultaneously estimated at each iteration, along with their relative coefficients for each image voxel. Quantitative evaluation has been performed using dynamic FDG PET/CT acquisitions of lung cancer patients acquired on a GE DRX system. The performance of the proposed method is compared with that of a standard multi-frame OSEM reconstruction algorithm. The proposed method achieved substantial improvements in terms of noise reduction while accounting for loss of contrast due to respiratory motion. Results on simulated data showed that the proposed 4D algorithms led to bias reduction values up to 40% in both tumor and blood regions for similar standard deviation levels, in comparison with a standard 3D reconstruction. Patlak parameter estimations on reconstructed images with the proposed reconstruction methods resulted in 30% and 40% bias reduction in the tumor and lung region respectively for the Patlak slope, and a 30% bias reduction for the intercept in the tumor region (a similar Patlak intercept was achieved in the lung area). Incorporation of the respiratory motion correction using an elastic model along with a temporal regularization in the reconstruction process of the PET dynamic series led to substantial quantitative improvements and motion artifact reduction. Future work will include the integration of a linear FDG kinetic model, in order to directly reconstruct parametric images.
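The 4D reconstruction algorithm cannot be condensed into a short example, but the Patlak graphical analysis step mentioned above can: the snippet below estimates the Patlak slope (Ki) and intercept by linear regression on synthetic plasma and tissue time-activity curves generated from known values, so the fit can be checked. It is an illustration of the analysis, not of the authors' reconstruction.

```python
# Sketch of Patlak graphical analysis on synthetic curves: C_t/C_p is regressed
# against the running integral of C_p divided by C_p, after pseudo-equilibrium.
import numpy as np

t = np.linspace(0.5, 60, 120)                  # minutes
C_p = 100 * np.exp(-0.1 * t) + 5               # synthetic plasma input function
int_Cp = np.cumsum(C_p) * (t[1] - t[0])        # running integral of C_p

K_i_true, V_true = 0.05, 0.3
C_t = K_i_true * int_Cp + V_true * C_p         # irreversible-uptake tissue curve

x = int_Cp / C_p                               # "normalized time"
y = C_t / C_p
mask = t > 20                                  # use late, quasi-linear points
slope, intercept = np.polyfit(x[mask], y[mask], 1)
print(f"Ki = {slope:.3f} /min, intercept = {intercept:.3f}")
```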
NASA Astrophysics Data System (ADS)
Arnold, J.; Gutmann, E. D.; Clark, M. P.; Nijssen, B.; Vano, J. A.; Addor, N.; Wood, A.; Newman, A. J.; Mizukami, N.; Brekke, L. D.; Rasmussen, R.; Mendoza, P. A.
2016-12-01
Climate change narratives for water-resource applications must represent the change signals contextualized by hydroclimatic process variability and uncertainty at multiple scales. Building narratives of plausible change includes assessing uncertainties across GCM structure, internal climate variability, climate downscaling methods, and hydrologic models. Work with this linked modeling chain has dealt mostly with GCM sampling directed separately to either model fidelity (does the model correctly reproduce the physical processes in the world?) or sensitivity (of different model responses to CO2 forcings) or diversity (of model type, structure, and complexity). This leaves unaddressed any interactions among those measures and with other components in the modeling chain used to identify water-resource vulnerabilities to specific climate threats. However, time-sensitive, real-world vulnerability studies typically cannot accommodate a full uncertainty ensemble across the whole modeling chain, so a gap has opened between current scientific knowledge and most routine applications for climate-changed hydrology. To close that gap, the US Army Corps of Engineers, the Bureau of Reclamation, and the National Center for Atmospheric Research are working on techniques to subsample uncertainties objectively across modeling chain components and to integrate results into quantitative hydrologic storylines of climate-changed futures. Importantly, these quantitative storylines are not drawn from a small sample of models or components. Rather, they stem from the more comprehensive characterization of the full uncertainty space for each component. Equally important from the perspective of water-resource practitioners, these quantitative hydrologic storylines are anchored in actual design and operations decisions potentially affected by climate change. This talk will describe part of our work characterizing variability and uncertainty across modeling chain components and their interactions using newly developed observational data, models and model outputs, and post-processing tools for making the resulting quantitative storylines most useful in practical hydrology applications.
Toward a quantitative approach to migrants integration
NASA Astrophysics Data System (ADS)
Barra, A.; Contucci, P.
2010-03-01
Migration phenomena and all the related issues, like the integration of different social groups, are intrinsically complex problems since they strongly depend on several competing mechanisms such as economic factors, cultural differences and many others. By identifying a few essential assumptions, and using the statistical mechanics of complex systems, we propose a novel quantitative approach that provides a minimal theory for those phenomena. We show that the competitive interactions in decision making between a population of N host citizens and P immigrants, a bi-partite spin glass, give rise to a social consciousness inside the host community in the sense of the associative memory of neural networks. The theory leads to a natural quantitative definition of the migrants' "integration" inside the community. From the technical point of view, this minimal picture assumes as control parameters only general notions like the strength of the random interactions, the ratio between the sizes of the two parties and the cultural influence. A few steps toward more refined models, which include a digression on the kinds of experiences felt, some structure on the random interaction topology (such as dilution, to avoid the plain mean-field approach) and correlations between the experiences felt by the two parties (biasing the distribution of the couplings), are discussed at the end, where we show the robustness of our approach.
Integrative strategies to identify candidate genes in rodent models of human alcoholism.
Treadwell, Julie A
2006-01-01
The search for genes underlying alcohol-related behaviours in rodent models of human alcoholism has been ongoing for many years with only limited success. Recently, new strategies that integrate several of the traditional approaches have provided new insights into the molecular mechanisms underlying ethanol's actions in the brain. We have used alcohol-preferring C57BL/6J (B6) and alcohol-avoiding DBA/2J (D2) genetic strains of mice in an integrative strategy combining high-throughput gene expression screening, genetic segregation analysis, and mapping to previously published quantitative trait loci to uncover candidate genes for the ethanol-preference phenotype. In our study, 2 genes, retinaldehyde binding protein 1 (Rlbp1) and syntaxin 12 (Stx12), were found to be strong candidates for ethanol preference. Such experimental approaches have the power and the potential to greatly speed up the laborious process of identifying candidate genes for the animal models of human alcoholism.
Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya
2010-04-01
Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data on signaling events with high spatio-temporal resolution. A novel technique which allows us to obtain such data is needed for the systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique appears to be a highly quantitative and versatile technique, which can be a convenient replacement for most conventional techniques, including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.
[Indicators of communication and degree of professional integration in healthcare].
Mola, Ernesto; Maggio, Anna; Vantaggiato, Lucia
2009-01-01
According to the chronic care model, improving the management of chronic illness requires efficient communication between healthcare professionals and the creation of a web of integrated healthcare. The aim of this study was to identify an efficient methodology for evaluating the degree of professional integration through indicators related to communication between healthcare professionals. The following types of indicators were identified: structure indicators, to evaluate the presence of prerequisites necessary for implementing the procedures; functional indicators, to quantitatively evaluate the use of communication instruments; and performance indicators. Defining specific indicators may be an appropriate methodology for evaluating the degree of integration and communication between health professionals, and may support an incentive-based bargaining system.
Integrating FMEA in a Model-Driven Methodology
NASA Astrophysics Data System (ADS)
Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno
2016-08-01
Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.
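The quantitative part of a generated FMEA worksheet typically reduces to computing and ranking Risk Priority Numbers; the sketch below shows that step on invented failure modes and scores, not on data from the paper.

```python
# Minimal FMEA worksheet sketch: Risk Priority Number = severity x occurrence x
# detection, then rank failure modes. Components and scores are invented.
failure_modes = [
    # (component, failure mode, severity, occurrence, detection) on 1-10 scales
    ("sensor",   "stuck-at value",   8, 3, 4),
    ("actuator", "delayed response", 6, 5, 3),
    ("bus",      "message loss",     9, 2, 6),
]

worksheet = [(comp, mode, s * o * d) for comp, mode, s, o, d in failure_modes]
for comp, mode, rpn in sorted(worksheet, key=lambda row: -row[-1]):
    print(f"{comp:10s} {mode:18s} RPN={rpn}")
```

In a model-driven setting such as the one described, the tuples would be extracted automatically from the FMEA Diagram elements rather than written by hand.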
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartmann, Anja, E-mail: hartmann@ipk-gatersleben.de; Schreiber, Falk; Martin-Luther-University Halle-Wittenberg, Halle
The characterization of biological systems with respect to their behavior and functionality based on versatile biochemical interactions is a major challenge. To understand these complex mechanisms at the systems level, modeling approaches are investigated. Different modeling formalisms allow metabolic models to be analyzed depending on the question to be solved, the biochemical knowledge and the availability of experimental data. Here, we describe a method for an integrative analysis of the structure and dynamics represented by qualitative and quantitative metabolic models. Using various formalisms, the metabolic model is analyzed from different perspectives. Determined structural and dynamic properties are visualized in the context of the metabolic model. Interaction techniques allow the exploration and visual analysis, thereby leading to a broader understanding of the behavior and functionality of the underlying biological system. The System Biology Metabolic Model Framework (SBM² Framework) implements the developed method and, as an example, is applied for the integrative analysis of the crop plant potato.
System-level modeling of acetone-butanol-ethanol fermentation.
Liao, Chen; Seo, Seung-Oh; Lu, Ting
2016-05-01
Acetone-butanol-ethanol (ABE) fermentation is a metabolic process of clostridia that produces bio-based solvents including butanol. It is enabled by an underlying metabolic reaction network and modulated by cellular gene regulation and environmental cues. Mathematical modeling has served as a valuable strategy to facilitate the understanding, characterization and optimization of this process. In this review, we highlight recent advances in system-level, quantitative modeling of ABE fermentation. We begin with an overview of integrative processes underlying the fermentation. Next we survey modeling efforts including early simple models, models with a systematic metabolic description, and those incorporating metabolism through simple gene regulation. Particular focus is given to a recent system-level model that integrates the metabolic reactions, gene regulation and environmental cues. We conclude by discussing the remaining challenges and future directions towards predictive understanding of ABE fermentation. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
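To make the kind of kinetic description discussed in this review concrete, the sketch below integrates a minimal Monod-type ODE system for biomass, substrate, and butanol. It is an illustration of the modeling style, not one of the reviewed models; all parameters are hypothetical.

```python
# Minimal Monod-type fermentation sketch (biomass X, substrate S, butanol B).
# Parameters are hypothetical, not from any reviewed ABE model.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_s, Y_xs, Y_bs = 0.3, 2.0, 0.4, 0.2   # 1/h, g/L, g/g, g/g

def rhs(t, y):
    X, S, B = y
    mu = mu_max * S / (K_s + S)          # Monod specific growth rate
    return [mu * X,                      # dX/dt
            -mu * X / Y_xs,              # dS/dt (substrate consumption)
            (Y_bs / Y_xs) * mu * X]      # dB/dt (growth-associated butanol)

sol = solve_ivp(rhs, (0, 48), [0.1, 60.0, 0.0], t_eval=np.linspace(0, 48, 7))
print(sol.t)
print(sol.y.round(2))
```

The system-level models highlighted in the review layer gene regulation (e.g., the acidogenesis-to-solventogenesis switch) and environmental cues such as pH on top of this metabolic core.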
Quantitative prediction of drug side effects based on drug-related features.
Niu, Yanqing; Zhang, Wen
2017-09-01
Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
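The two ingredients described above, a weighted-sum score over a side-effect profile and an average-scoring ensemble over feature-specific predictions, can be sketched in a few lines. The weights, profile, and per-feature predictions below are invented for illustration only.

```python
# Sketch of the scoring idea: a binary side-effect profile is collapsed into a
# quantitative score via a weighted sum, and predictions from feature-specific
# models are combined by averaging. All numbers are invented.
import numpy as np

rng = np.random.default_rng(1)
n_side_effects = 10
weights = rng.uniform(0.1, 1.0, n_side_effects)    # empirical weights (random here)
profile = rng.integers(0, 2, n_side_effects)       # observed side-effect profile

true_score = float(weights @ profile)              # quantitative score of the drug

# Hypothetical predictions from three feature-specific models
pred_substructure, pred_target, pred_indication = 3.1, 2.7, 3.4
ensemble_prediction = np.mean([pred_substructure, pred_target, pred_indication])
print(true_score, ensemble_prediction)
```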
Huang, Yongzhi; Green, Alexander L; Hyam, Jonathan; Fitzgerald, James; Aziz, Tipu Z; Wang, Shouyan
2018-01-01
Understanding the function of sensory thalamic neural activity is essential for developing and improving interventions for neuropathic pain. However, the relationship between sensory thalamic oscillations and pain relief in patients with neuropathic pain has received little investigation. This study aims to identify the oscillatory neural characteristics correlated with pain relief induced by deep brain stimulation (DBS), and to develop a quantitative model to predict pain relief by integrating characteristic measures of the neural oscillations. Measures of sensory thalamic local field potentials (LFPs) in thirteen patients with neuropathic pain were screened in a three-dimensional feature space according to the rhythm, balancing, and coupling neural behaviours, and correlated with pain relief. An integrated approach based on principal component analysis (PCA) and multiple regression analysis is proposed to integrate the multiple measures and provide a predictive model. This study reveals distinct thalamic rhythms of theta, alpha, high beta and high gamma oscillations correlating with pain relief. The balancing and coupling measures between these neural oscillations were also significantly correlated with pain relief. The study extends research on the function of thalamic neural oscillations in neuropathic pain and its relief, and provides a quantitative approach for predicting pain relief by DBS using thalamic neural oscillations. Copyright © 2017 Elsevier Inc. All rights reserved.
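The PCA-plus-multiple-regression pipeline named above can be sketched on simulated data as follows; the feature matrix, outcome, and component count are stand-ins, not the study's data or settings.

```python
# Sketch of the PCA + multiple regression pipeline on synthetic data:
# rows = patients, columns = LFP-derived oscillation measures.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_patients, n_measures = 13, 9
X = rng.standard_normal((n_patients, n_measures))      # rhythm/balancing/coupling measures
pain_relief = 0.8 * X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.standard_normal(n_patients)

pcs = PCA(n_components=3).fit_transform(X)             # dimension reduction
model = LinearRegression().fit(pcs, pain_relief)       # predictive model
print("R^2 on training data:", round(model.score(pcs, pain_relief), 2))
```

With only thirteen patients, a real analysis would also need cross-validation or permutation testing to guard against overfitting; that step is omitted here.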
2011-06-17
structure through quantitative assessment of stiffness and modal parameter changes resulting from modifications to the beam geometries and positions...power transmission assembly. If the power limit at a wheel exceeds the traction limit, then, depending on the type of differential placed on the axle...components with appropriate model connectivity instead to determine the free modal response of powertrain-type components, without abstraction
2011-01-01
refinement of the vehicle body structure through quantitative assessment of stiffness and modal parameter changes resulting from modifications to the beam...differential placed on the axle, adjustment of the torque output to the opposite wheel may be required to obtain the correct solution. Thus...represented by simple inertial components with appropriate model connectivity instead to determine the free modal response of powertrain-type
Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2014-01-01
NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.
Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data
Kümmel, Anne; Panke, Sven; Heinemann, Matthias
2006-01-01
As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
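At its core, the consistency check in NET analysis asks whether measured metabolite concentrations make each reaction's transformed Gibbs energy negative in the direction of net flux; the sketch below evaluates that criterion for a single invented reaction, using standard thermodynamics rather than anything specific to the paper.

```python
# Sketch of the thermodynamic feasibility check underlying NET analysis:
# dG' = dG0 + RT ln(Q) must be negative for net forward flux. The reaction,
# standard Gibbs energy, and concentrations are illustrative only.
import math

R, T = 8.314e-3, 298.15          # kJ/(mol K), K

def reaction_gibbs(dG0, products, substrates):
    """Transformed reaction Gibbs energy (kJ/mol) at given concentrations (mol/L)."""
    lnQ = sum(math.log(c) for c in products) - sum(math.log(c) for c in substrates)
    return dG0 + R * T * lnQ

# Hypothetical reaction A -> B with dG0 = +5 kJ/mol
dG = reaction_gibbs(5.0, products=[1e-5], substrates=[1e-3])
print(f"dG' = {dG:.1f} kJ/mol -> {'feasible' if dG < 0 else 'infeasible'} forward")
```

The full NET analysis formulates these constraints for the whole network simultaneously and optimizes over unmeasured concentrations, which is what lets it flag reactions far from equilibrium as candidate regulation sites.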
Toward Integration: From Quantitative Biology to Mathbio-Biomath?
ERIC Educational Resources Information Center
Marsteller, Pat; de Pillis, Lisette; Findley, Ann; Joplin, Karl; Pelesko, John; Nelson, Karen; Thompson, Katerina; Usher, David; Watkins, Joseph
2010-01-01
In response to the call of "BIO2010" for integrating quantitative skills into undergraduate biology education, 30 Howard Hughes Medical Institute (HHMI) Program Directors at the 2006 HHMI Program Directors Meeting established a consortium to investigate, implement, develop, and disseminate best practices resulting from the integration of math and…
Wolf, Lisa
2013-02-01
To explore the relationship between multiple variables within a model of critical thinking and moral reasoning. A quantitative descriptive correlational design using a purposive sample of 200 emergency nurses. Measured variables were accuracy in clinical decision-making, moral reasoning, perceived care environment, and demographics. Analysis was by bivariate correlation using Pearson's product-moment correlation coefficients, chi square and multiple linear regression analysis. The elements identified in the integrated ethically-driven environmental model of clinical decision-making (IEDEM-CD) correctly depict moral reasoning and environment of care as factors significantly affecting accuracy in decision-making. The integrated, ethically driven environmental model of clinical decision making is a framework useful for predicting clinical decision-making accuracy for emergency nurses in practice, with further implications in education, research and policy. The model offers a diagnostic and therapeutic framework for identifying and remediating individual and environmental challenges to accurate clinical decision making. © 2012, The Author. International Journal of Nursing Knowledge © 2012, NANDA International.
Nguyen, Huu-Tho; Dawal, Siti Zawiah Md; Nukman, Yusoff; Rifai, Achmad P; Aoyama, Hideki
2016-01-01
The conveyor system plays a vital role in improving the performance of flexible manufacturing cells (FMCs). The conveyor selection problem involves the evaluation of a set of potential alternatives based on qualitative and quantitative criteria. This paper presents an integrated multi-criteria decision making (MCDM) model of a fuzzy AHP (analytic hierarchy process) and fuzzy ARAS (additive ratio assessment) for conveyor evaluation and selection. In this model, linguistic terms represented as triangular fuzzy numbers are used to quantify experts' uncertain assessments of alternatives with respect to the criteria. The fuzzy set is then integrated into the AHP to determine the weights of the criteria. Finally, a fuzzy ARAS is used to calculate the weights of the alternatives. To demonstrate the effectiveness of the proposed model, a case study is performed of a practical example, and the results obtained demonstrate practical potential for the implementation of FMCs.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; P. Rifai, Achmad; Aoyama, Hideki
2016-01-01
The conveyor system plays a vital role in improving the performance of flexible manufacturing cells (FMCs). The conveyor selection problem involves the evaluation of a set of potential alternatives based on qualitative and quantitative criteria. This paper presents an integrated multi-criteria decision making (MCDM) model of a fuzzy AHP (analytic hierarchy process) and fuzzy ARAS (additive ratio assessment) for conveyor evaluation and selection. In this model, linguistic terms represented as triangular fuzzy numbers are used to quantify experts’ uncertain assessments of alternatives with respect to the criteria. The fuzzy set is then integrated into the AHP to determine the weights of the criteria. Finally, a fuzzy ARAS is used to calculate the weights of the alternatives. To demonstrate the effectiveness of the proposed model, a case study is performed of a practical example, and the results obtained demonstrate practical potential for the implementation of FMCs. PMID:27070543
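The fuzzy arithmetic that both fuzzy AHP and fuzzy ARAS rely on can be illustrated with triangular fuzzy numbers (l, m, u) and a simple centroid defuzzification, as in the sketch below. The criteria, alternatives, weights, and ratings are invented and the sketch omits the pairwise-comparison and normalization machinery of the full methods.

```python
# Sketch of triangular-fuzzy-number (TFN) weighted scoring, in the spirit of
# fuzzy AHP / fuzzy ARAS. All judgements below are invented.
def tfn_mul(a, b):
    return tuple(x * y for x, y in zip(a, b))

def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(tfn):
    return sum(tfn) / 3.0                     # centroid of a triangular number

criteria_weights = [(0.2, 0.3, 0.4), (0.5, 0.6, 0.7)]        # e.g. cost, flexibility
ratings = {"conveyor_A": [(3, 4, 5), (6, 7, 8)],
           "conveyor_B": [(5, 6, 7), (4, 5, 6)]}

for name, r in ratings.items():
    total = (0.0, 0.0, 0.0)
    for w, x in zip(criteria_weights, r):
        total = tfn_add(total, tfn_mul(w, x))
    print(name, round(defuzzify(total), 2))
```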
Hofman, Abe D.; Visser, Ingmar; Jansen, Brenda R. J.; van der Maas, Han L. J.
2015-01-01
We propose and test three statistical models for the analysis of children’s responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and information-integration perspective. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online data set in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development. PMID:26505905
Incorporating learning goals about modeling into an upper-division physics laboratory experiment
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin M.; Finkelstein, Noah; Lewandowski, H. J.
2014-09-01
Implementing a laboratory activity involves a complex interplay among learning goals, available resources, feedback about the existing course, best practices for teaching, and an overall philosophy about teaching labs. Building on our previous work, which described a process of transforming an entire lab course, we now turn our attention to how an individual lab activity on the polarization of light was redesigned to include a renewed emphasis on one broad learning goal: modeling. By using this common optics lab as a concrete case study of a broadly applicable approach, we highlight many aspects of the activity development and show how modeling is used to integrate sophisticated conceptual and quantitative reasoning into the experimental process through the various aspects of modeling: constructing models, making predictions, interpreting data, comparing measurements with predictions, and refining models. One significant outcome is a natural way to integrate an analysis and discussion of systematic error into a lab activity.
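One concrete way the "compare measurements with predictions, then refine the model" cycle shows up in a polarization lab is fitting Malus's law to photodiode readings; the sketch below does this with synthetic stand-in data, since the actual lab data are not given here.

```python
# Sketch of model-data comparison for the polarization activity: fit
# Malus's law I(theta) = I0 * cos^2(theta) to simulated detector readings.
import numpy as np

theta = np.deg2rad(np.arange(0, 181, 15))
rng = np.random.default_rng(2)
measured = 2.5 * np.cos(theta) ** 2 + 0.02 * rng.standard_normal(theta.size)

# Least-squares estimate of I0 (the model is linear in I0)
basis = np.cos(theta) ** 2
I0_hat = float(basis @ measured / (basis @ basis))
residuals = measured - I0_hat * basis
print(f"I0 = {I0_hat:.3f}, RMS residual = {np.sqrt(np.mean(residuals**2)):.3f}")
```

Structured residuals (for example, a constant offset from background light) are exactly the cue students use to refine the measurement model, e.g., to I0 cos^2(theta) + I_background.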
Integrating Science and Management to Assess Forest Ecosystem Vulnerability to Climate Change
Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; Maria K. Janowiak; P. Danielle Shannon; Christopher W. Swanston
2017-01-01
We developed the ecosystem vulnerability assessment approach (EVAA) to help inform potential adaptation actions in response to a changing climate. EVAA combines multiple quantitative models and expert elicitation from scientists and land managers. In each of eight assessment areas, a panel of local experts determined potential vulnerability of forest ecosystems to...
ERIC Educational Resources Information Center
Bain, Kinsey; Rodriguez, Jon-Marc G.; Moon, Alena; Towns, Marcy H.
2018-01-01
Chemical kinetics is a highly quantitative content area that involves the use of multiple mathematical representations to model processes and is a context that is under-investigated in the literature. This qualitative study explored undergraduate student integration of chemistry and mathematics during problem solving in the context of chemical…
A Quantitative Assessment of the Factors that Influence Technology Acceptance in Emergency Response
ERIC Educational Resources Information Center
Seiter, Thomas C.
2012-01-01
Traditional models for studying user acceptance and adoption of technology focused on the factors that identify and tested the relationships forged between the user and the technology in question. In emergency response, implementing technology without user acceptance may affect the safety of the responders and citizenry. Integrating the factors…
Stress and Grief of a Perinatal Loss: Integrating Qualitative and Quantitative Methods.
ERIC Educational Resources Information Center
Thomas, Volker; Striegel, Phil
1995-01-01
Examined how parents grieve loss of a baby through miscarriage or stillbirth. Interviewed 26 couples 2 weeks after a perinatal loss. Ethnographic content analysis revealed 12 themes closely related to resources, meaning of the stressful event, and coping strategies, all of which are elements, of Hill's ABC-X stress model. (JBJ)
2012-09-01
Fractography ... Fractography Results ... Fatigue Crack Growth Images ... quantitative fractography [17, 18]. The determination of the ECS is achieved by a trial-and-error calculation with the aim of matching the experimental
The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate...
Channel morphology investigations using Geographic Information Systems and field research
Scott N. Miller; Ann Youberg; D. Phillip Guertin; David C. Goodrich
2000-01-01
Stream channels are integral to watershed function and are affected by watershed management decisions. Given an understanding of the relationships among channel and watershed variables, they may serve as indicators of upland condition or used in distributed rainfall-runoff models. This paper presents a quantitative analysis of fluvial morphology as related to watershed...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, J.H.; Roy, D.M.; Mann, B.
1995-12-31
This paper describes an integrated approach to developing a predictive computer model for long-term performance of concrete engineered barriers utilized in LLRW and ILRW disposal facilities. The model development concept consists of three major modeling schemes: hydration modeling of the binder phase, pore solution speciation, and transport modeling in the concrete barrier and service environment. Although still in its inception, the model development approach demonstrated that the chemical and physical properties of complex cementitious materials and their interactions with service environments can be described quantitatively. Applying the integrated model development approach to modeling alkali (Na and K) leaching from a concrete pad barrier in an above-grade tumulus disposal unit, it is predicted that, in a near-surface land disposal facility where water infiltration through the facility is normally minimal, the alkalis control the pore solution pH of the concrete barriers for much longer than most previous concrete barrier degradation studies assumed. The results also imply that a highly alkaline condition created by the alkali leaching will result in alteration of the soil mineralogy in the vicinity of the disposal facility.
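The transport scheme in such a model is much richer than a single equation, but a first-order feel for diffusion-controlled alkali release can be had from the classical semi-infinite-slab result, sketched below. The diffusivity and pore-solution concentration are hypothetical, not values from the study.

```python
# Illustrative diffusion-controlled leaching estimate (not the integrated model
# described above): cumulative alkali release per unit area from a semi-infinite
# slab with a fixed zero surface concentration, M(t) = 2 * C0 * sqrt(D * t / pi).
import math

C0 = 100.0          # leachable alkali in pore solution, mol/m^3 (hypothetical)
D = 1e-12           # effective diffusion coefficient, m^2/s (hypothetical)

for years in (1, 10, 100, 1000):
    t = years * 365.25 * 24 * 3600
    M = 2.0 * C0 * math.sqrt(D * t / math.pi)     # mol/m^2 released
    print(f"{years:5d} yr: {M:.2f} mol/m^2")
```

The square-root-of-time behaviour is what makes minimal water infiltration so consequential: with slow transport, the alkali reservoir, and hence the high-pH buffering it provides, persists for very long times.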
Quantitative assessment of integrated phrenic nerve activity.
Nichols, Nicole L; Mitchell, Gordon S
2016-06-01
Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dietrich, Jörg; Funke, Markus
Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.
NASA Astrophysics Data System (ADS)
Wang, Lin; Cao, Xin; Ren, Qingyun; Chen, Xueli; He, Xiaowei
2018-05-01
Cerenkov luminescence imaging (CLI) is an imaging method that uses an optical imaging scheme to probe a radioactive tracer. Application of CLI with clinically approved radioactive tracers has opened an opportunity for translating optical imaging from preclinical to clinical applications. Such translation was further improved by developing an endoscopic CLI system. However, two-dimensional endoscopic imaging cannot identify accurate depth or obtain quantitative information. Here, we present an imaging scheme to retrieve the depth and quantitative information from endoscopic Cerenkov luminescence tomography, which can also be applied to endoscopic radio-luminescence tomography. In the scheme, we first constructed a physical model for image collection, and then a mathematical model for characterizing the luminescent light propagation from the tracer to the endoscopic detector. The mathematical model is a hybrid light transport model combining the third-order simplified spherical harmonics approximation, diffusion, and radiosity equations to balance accuracy and speed. The mathematical model integrates finite element discretization, regularization, and primal-dual interior-point optimization to retrieve the depth and the quantitative information of the tracer. A heterogeneous-geometry-based numerical simulation was used to explore the feasibility of the unified scheme, which demonstrated that it can provide a satisfactory balance between imaging accuracy and computational burden.
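The regularized inversion at the heart of such tomographic schemes can be illustrated with a tiny Tikhonov-regularized least-squares problem; this is a generic sketch on a synthetic system matrix, not the authors' primal-dual interior-point solver or FEM discretization.

```python
# Sketch of regularized source reconstruction: x = argmin ||Ax - b||^2 + lam ||x||^2
# on a small synthetic system (A maps internal source voxels to surface readings).
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 20))        # synthetic sensitivity matrix
x_true = np.zeros(20); x_true[7] = 1.0   # a single luminescent source voxel
b = A @ x_true + 0.01 * rng.standard_normal(40)

lam = 0.1
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(20), A.T @ b)
print("recovered peak at voxel", int(np.argmax(x_hat)))
```

In the real problem A comes from the hybrid light-transport model on a finite element mesh and is severely ill-conditioned, which is why the choice of regularization and optimizer matters for depth and quantitation accuracy.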
Ramsey, Simeon J; Attkins, Neil J; Fish, Rebecca; van der Graaf, Piet H
2011-01-01
BACKGROUND AND PURPOSE A series of novel non-peptide corticotropin releasing factor type-1 receptor (CRF1) antagonists were found to display varying degrees of insurmountable and non-competitive behaviour in functional in vitro assays. We describe how we attempted to relate this behaviour to ligand receptor-binding kinetics in a quantitative manner and how this resulted in the development and implementation of an efficient pharmacological screening method based on principles described by Motulsky and Mahan. EXPERIMENTAL APPROACH A non-equilibrium binding kinetic assay was developed to determine the receptor binding kinetics of non-peptide CRF1 antagonists. Nonlinear, mixed-effects modelling was used to obtain estimates of the compounds association and dissociation rates. We present an integrated pharmacokinetic–pharmacodynamic (PKPD) approach, whereby the time course of in vivo CRF1 receptor binding of novel compounds can be predicted on the basis of in vitro assays. KEY RESULTS The non-competitive antagonist behaviour appeared to be correlated to the CRF1 receptor off-rate kinetics. The integrated PKPD model suggested that, at least in a qualitative manner, the in vitro assay can be used to triage and select compounds for further in vivo investigations. CONCLUSIONS AND IMPLICATIONS This study provides evidence for a link between ligand offset kinetics and insurmountable/non-competitive antagonism at the CRF1 receptor. The exact molecular pharmacological nature of this association remains to be determined. In addition, we have developed a quantitative framework to study and integrate in vitro and in vivo receptor binding kinetic behaviour of CRF1 receptor antagonists in an efficient manner in a drug discovery setting. PMID:21449919
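To make the link between receptor binding kinetics and insurmountable antagonism more concrete, the sketch below integrates a generic two-ligand competition ODE model (a labelled tracer and a slow-offset antagonist competing for the same receptor). This is not the closed-form Motulsky-Mahan solution used in the assay, and all rate constants and concentrations are hypothetical.

```python
# Generic competition-binding kinetics sketch: tracer L and antagonist I compete
# for receptor R; slow antagonist dissociation (koff_I) delays tracer binding.
import numpy as np
from scipy.integrate import solve_ivp

kon_L, koff_L = 1e8, 0.1          # tracer association (1/M/min), dissociation (1/min)
kon_I, koff_I = 1e8, 0.01         # slow-offset antagonist (hypothetical values)
L, I, R_tot = 1e-9, 1e-8, 1.0     # free concentrations (M), total receptor (a.u.)

def rhs(t, y):
    RL, RI = y
    R_free = R_tot - RL - RI
    return [kon_L * L * R_free - koff_L * RL,
            kon_I * I * R_free - koff_I * RI]

sol = solve_ivp(rhs, (0, 120), [0.0, 0.0], t_eval=[0, 10, 30, 60, 120])
print(sol.y.round(3))              # occupancy of tracer- and antagonist-bound receptor
```

Fitting such time courses (or the equivalent closed-form expressions) to non-equilibrium binding data is what yields the association and dissociation rate estimates that the in vivo PKPD predictions build on.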
Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun
2017-01-01
Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
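The final "rank all variables" step of such a framework can be sketched with a tree-based model and its impurity-based feature importances; the snippet below uses simulated genetic and environmental covariates predicting a shape feature, and is not the authors' pipeline.

```python
# Sketch of variable-importance ranking with a tree-based model on simulated
# data: genetic/environmental covariates predict one extracted shape feature.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
n = 200
X = rng.standard_normal((n, 4))                 # e.g. genotype1, genotype2, site, year
shape_pc1 = 2.0 * X[:, 0] + 0.5 * X[:, 2] + 0.2 * rng.standard_normal(n)

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, shape_pc1)
for name, imp in zip(["genotype1", "genotype2", "site", "year"],
                     forest.feature_importances_):
    print(f"{name:10s} {imp:.2f}")
```

Because the framework is data-driven, the same ranking step applies unchanged whichever shape features the extraction and dimension-reduction stages produce.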
van Dijk, Aalt D J; Molenaar, Jaap
2017-01-01
The appropriate timing of flowering is crucial for the reproductive success of plants. Hence, intricate genetic networks integrate various environmental and endogenous cues such as temperature or hormonal status. These signals integrate into a network of floral pathway integrator genes. At a quantitative level, it is currently unclear how the impact of genetic variation in signaling pathways on flowering time is mediated by floral pathway integrator genes. Here, using datasets available from the literature, we connect Arabidopsis thaliana flowering time in genetic backgrounds varying in upstream signalling components with the expression levels of floral pathway integrator genes in these genetic backgrounds. Our modelling results indicate that flowering time depends in a quite linear way on expression levels of floral pathway integrator genes. This gradual, proportional response of flowering time to upstream changes enables a gradual adaptation to changing environmental factors such as temperature and light.
Integrating knowledge representation and quantitative modelling in physiology.
de Bono, Bernard; Hunter, Peter
2012-08-01
A wealth of potentially shareable resources, such as data and models, is being generated through the study of physiology by computational means. Although in principle the resources generated are reusable, in practice, few can currently be shared. A key reason for this disparity stems from the lack of consistent cataloguing and annotation of these resources in a standardised manner. Here, we outline our vision for applying community-based modelling standards in support of an automated integration of models across physiological systems and scales. Two key initiatives, the Physiome Project and the European contribution - the Virtual Physiological Human Project, have emerged to support this multiscale model integration, and we focus on the role played by two key components of these frameworks, model encoding and semantic metadata annotation. We present examples of biomedical modelling scenarios (the endocrine effect of atrial natriuretic peptide, and the implications of alcohol and glucose toxicity) to illustrate the role that encoding standards and knowledge representation approaches, such as ontologies, could play in the management, searching and visualisation of physiology models, and thus in providing a rational basis for healthcare decisions and contributing towards realising the goal of personalized medicine. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A novel integrated framework and improved methodology of computer-aided drug design.
Chen, Calvin Yu-Chian
2013-01-01
Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects has yet to be established. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian network, pharmacophore modeling, and structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. As a validation example, the proposed protocol was applied to identify HER2 inhibitors from traditional Chinese medicine (TCM). Eight potent leads were identified from six TCM sources. A joint validation system comprising comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. Ligand pathway analysis was also performed to predict how the ligand enters and exits the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.
Quantitative workflow based on NN for weighting criteria in landfill suitability mapping
NASA Astrophysics Data System (ADS)
Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul
2017-10-01
Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to form suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network was employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network was used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training and testing datasets, respectively. The workflow was found to be capable of reducing human interference to generate highly reliable maps. The proposed workflow reveals the applicability of NN in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
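One common way to turn a trained feed-forward network into criterion weights, as the workflow above does, is to aggregate its connection weights (a Garson-style calculation). The sketch below only illustrates that idea on synthetic data with 34 hypothetical criteria; it is not the authors' exact weighting procedure.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Hypothetical training data: rows are candidate sites, columns are 34 criteria,
# y is a binary suitability label (e.g. derived from known suitable locations).
X = rng.uniform(size=(500, 34))
y = (X[:, :5].sum(axis=1) > 2.5).astype(int)

mlp = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=1).fit(X, y)

def garson_weights(mlp):
    # Relative importance from |input-hidden| x |hidden-output| weight products
    w_ih = np.abs(mlp.coefs_[0])            # (n_inputs, n_hidden)
    w_ho = np.abs(mlp.coefs_[1]).ravel()    # (n_hidden,)
    contrib = w_ih * w_ho
    contrib /= contrib.sum(axis=0, keepdims=True)   # normalise per hidden unit
    importance = contrib.sum(axis=1)
    return importance / importance.sum()    # criterion weights summing to 1

print(garson_weights(mlp).round(3))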
Gittelsohn, Joel; Evans, Marguerite; Story, Mary; Davis, Sally M; Metcalfe, Lauve; Helitzer, Deborah L; Clay, Theresa E
2016-01-01
We describe the formative assessment process, using an approach based on social learning theory, for the development of a school-based obesity-prevention intervention into which cultural perspectives are integrated. The feasibility phase of the Pathways study was conducted in multiple settings in 6 American Indian nations. The Pathways formative assessment collected both qualitative and quantitative data. The qualitative data identified key social and environmental issues and enabled local people to express their own needs and views. The quantitative, structured data permitted comparison across sites. Both types of data were integrated by using a conceptual and procedural model. The formative assessment results were used to identify and rank the behavioral risk factors that were to become the focus of the Pathways intervention and to provide guidance on developing common intervention strategies that would be culturally appropriate and acceptable to all sites. PMID:10195601
[Landscape classification: research progress and development trend].
Liang, Fa-Chao; Liu, Li-Ming
2011-06-01
Landscape classification is the basis of research on landscape structure, process, and function, and also the prerequisite for landscape evaluation, planning, protection, and management, directly affecting the precision and practicability of landscape research. This paper reviewed the research progress on landscape classification systems, theory, and methodology, and summarized the key problems and deficiencies of current research. Some major landscape classification systems, e.g., LANMAP and MUFIC, were introduced and discussed. It was suggested that a qualitative and quantitative comprehensive classification based on the ideology of functional structure shape and on the integral consideration of landscape classification utility, landscape function, landscape structure, physiogeographical factors, and human disturbance intensity should be the major research direction in the future. The integration of mapping, 3S technology, quantitative mathematical modeling, computer artificial intelligence, and professional knowledge to enhance the precision of landscape classification would be the key issue and the development trend in landscape classification research.
Quantification of Microbial Phenotypes
Martínez, Verónica S.; Krömer, Jens O.
2016-01-01
Metabolite profiling technologies have improved to generate close to quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and discuss the current challenges in generating fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. Here we explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and describe the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
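The Gibbs-energy estimation mentioned above rests on the transformed-reaction relation dG' = dG'0 + RT ln Q, which determines the feasible direction of each reaction under physiological metabolite concentrations. A minimal sketch with hypothetical standard Gibbs energies and concentrations:

import numpy as np

R = 8.314e-3   # kJ mol^-1 K^-1
T = 310.15     # K, physiological temperature

def reaction_gibbs(drg0_prime, stoich, conc):
    # dG' = dG'0 + RT * ln(Q); Q built from concentrations (M) raised to stoichiometric
    # coefficients (negative for substrates, positive for products)
    ln_q = sum(n * np.log(c) for n, c in zip(stoich, conc))
    return drg0_prime + R * T * ln_q

# Hypothetical reaction A + B -> C with dG'0 = -5 kJ/mol and millimolar concentrations
dg = reaction_gibbs(-5.0, stoich=[-1, -1, +1], conc=[1e-3, 2e-3, 0.5e-3])
print(round(dg, 1), "kJ/mol,", "feasible in this direction" if dg < 0 else "infeasible in this direction")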
The Dilution Effect and Information Integration in Perceptual Decision Making
Hotaling, Jared M.; Cohen, Andrew L.; Shiffrin, Richard M.; Busemeyer, Jerome R.
2015-01-01
In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced behavior farther from optimal and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects. PMID:26406323
Mudaliar, Manikhandan; Tassi, Riccardo; Thomas, Funmilola C; McNeilly, Tom N; Weidt, Stefan K; McLaughlin, Mark; Wilson, David; Burchmore, Richard; Herzyk, Pawel; Eckersall, P David; Zadoks, Ruth N
2016-08-16
Mastitis, inflammation of the mammary gland, is the most common and costly disease of dairy cattle in the western world. It is primarily caused by bacteria, with Streptococcus uberis as one of the most prevalent causative agents. To characterize the proteome during Streptococcus uberis mastitis, an experimentally induced model of intramammary infection was used. Milk whey samples obtained from 6 cows at 6 time points were processed using label-free relative quantitative proteomics. This proteomic analysis complements clinical, bacteriological and immunological studies as well as peptidomic and metabolomic analysis of the same challenge model. A total of 2552 non-redundant bovine peptides were identified, and from these, 570 bovine proteins were quantified. Hierarchical cluster analysis and principal component analysis showed clear clustering of results by stage of infection, with similarities between pre-infection and resolution stages (0 and 312 h post challenge), early infection stages (36 and 42 h post challenge) and late infection stages (57 and 81 h post challenge). Ingenuity pathway analysis identified upregulation of acute phase protein pathways over the course of infection, with dominance of different acute phase proteins at different time points based on differential expression analysis. Antimicrobial peptides, notably cathelicidins and peptidoglycan recognition protein, were upregulated at all time points post challenge and peaked at 57 h, which coincided with 10 000-fold decrease in average bacterial counts. The integration of clinical, bacteriological, immunological and quantitative proteomics and other-omic data provides a more detailed systems level view of the host response to mastitis than has been achieved previously.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babendreier, Justin E.; Castleton, Karl J.
2005-08-01
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMUs).
Health impact assessment – A survey on quantifying tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org
Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.
Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?
Happ, Mary Beth
2010-01-01
This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973
Size effects on insect hovering aerodynamics: an integrated computational study.
Liu, H; Aono, H
2009-03-01
Hovering is a remarkable ability observed in flying insects of all sizes. The effect of size on flapping-wing aerodynamics in insect hovering is of interest to the micro-air-vehicle (MAV) community and of importance to comparative morphologists. In this study, we present an integrated computational study of such size effects on insect hovering aerodynamics, performed using a biology-inspired dynamic flight simulator that integrates the modelling of realistic wing-body morphology, the modelling of flapping-wing and body kinematics, and an in-house Navier-Stokes solver. Results for four typical insect hovering flights, including a hawkmoth, a honeybee, a fruit fly and a thrips, over a wide range of Reynolds numbers from O(10^4) to O(10^1), are presented, demonstrating the feasibility of the present integrated computational methods in quantitatively modelling and evaluating the unsteady aerodynamics of insect flapping flight. Our results, based on realistic modelling of insect hovering, therefore offer an integrated understanding of the near-field vortex dynamics, the far-field wake and downwash structures, and their correlation with force production in terms of size and Reynolds number as well as wing kinematics. Our results not only give an integrated interpretation of the similarities and discrepancies of the near- and far-field vortex structures in insect hovering but also demonstrate that our methods can be an effective tool in MAV design.
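The Reynolds-number range quoted above can be approximated from wing morphology and kinematics with the usual hovering-flight estimate Re = U*c/nu, where U is roughly 2*Phi*f*R (mean wing-tip speed over a stroke). The parameter values below are rough orders of magnitude for illustration only, not the study's actual inputs.

def hovering_reynolds(phi, f, R, c_mean, nu=1.5e-5):
    # phi: stroke amplitude (rad), f: wingbeat frequency (Hz), R: wing length (m),
    # c_mean: mean chord (m), nu: kinematic viscosity of air (m^2/s)
    u_ref = 2.0 * phi * f * R           # mean wing-tip speed
    return u_ref * c_mean / nu

print(round(hovering_reynolds(2.0, 26, 0.05, 0.018)))     # hawkmoth-like, O(10^3-10^4)
print(round(hovering_reynolds(2.4, 200, 0.003, 0.0009)))  # fruit-fly-like, O(10^2)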
Genome Scale Modeling in Systems Biology: Algorithms and Resources
Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali
2014-01-01
In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks in five sections. Also, we illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031
Developing a model for effective leadership in healthcare: a concept mapping approach.
Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison Mb; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C
2017-01-01
Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group's ideas) to identify stakeholders' mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were "Acting with Personal Integrity", "Communicating Effectively", "Acting with Professional Ethical Values", "Pursuing Excellence", "Building and Maintaining Relationships", and "Thinking Critically". Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research.
NASA Astrophysics Data System (ADS)
Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea
2017-12-01
Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from the HG. 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories with a high risk to present geomorphologic processes that have a high degree of occurrence and represent a useful tool in the process of spatial planning.
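The USLE component of the vulnerability model described above multiplies five empirical factors, A = R * K * LS * C * P (rainfall erosivity, soil erodibility, slope length-steepness, cover, and support practice). A hedged sketch with hypothetical factor grids standing in for the GIS raster layers:

import numpy as np

def usle_soil_loss(R, K, LS, C, P):
    # Universal Soil Loss Equation: mean annual soil loss per unit area
    return R * K * LS * C * P

# Hypothetical 3x3 factor grids (real workflows use co-registered raster layers)
R  = np.full((3, 3), 80.0)
K  = np.array([[0.20, 0.30, 0.30], [0.25, 0.30, 0.35], [0.20, 0.25, 0.30]])
LS = np.array([[0.5, 1.2, 2.0], [0.8, 1.5, 2.5], [0.4, 1.0, 1.8]])
C  = np.full((3, 3), 0.15)
P  = np.full((3, 3), 1.0)

print(usle_soil_loss(R, K, LS, C, P).round(2))   # per-cell soil loss map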
Chiu, Grace S.; Wu, Margaret A.; Lu, Lin
2013-01-01
The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt–clay content–all regarded a priori as qualitatively important abiotic drivers–towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443
Integrated Lunar Information Architecture for Decision Support Version 3.0 (ILIADS 3.0)
NASA Technical Reports Server (NTRS)
Talabac, Stephen; Ames, Troy; Blank, Karin; Hostetter, Carl; Brandt, Matthew
2013-01-01
ILIADS 3.0 provides the data management capabilities to access CxP-vetted lunar data sets from the LMMP-provided Data Portal and the LMMP-provided On-Moon lunar data product server. (LMMP stands for Lunar Mapping and Modeling Project.) It also provides specific quantitative analysis functions to meet the stated LMMP Level 3 functional and performance requirements specifications that were approved by the CxP. The purpose of ILIADS 3.0 is to provide an integrated, rich client lunar GIS software application
Automated detection of arterial input function in DSC perfusion MRI in a stroke rat model
NASA Astrophysics Data System (ADS)
Yeh, M.-Y.; Lee, T.-H.; Yang, S.-T.; Kuo, H.-H.; Chyi, T.-K.; Liu, H.-L.
2009-05-01
Quantitative cerebral blood flow (CBF) estimation requires deconvolution of the tissue concentration time curves with an arterial input function (AIF). However, image-based determination of the AIF in rodents is challenging due to limited spatial resolution. We evaluated the feasibility of quantitative analysis using automated AIF detection and compared the results with commonly applied semi-quantitative analysis. Permanent occlusion of the bilateral or unilateral common carotid arteries was used to induce cerebral ischemia in rats. Imaging using the dynamic susceptibility contrast method was performed on a 3-T magnetic resonance scanner with a spin-echo echo-planar-image sequence (TR/TE = 700/80 ms, FOV = 41 mm, matrix = 64, 3 slices, SW = 2 mm), starting from 7 s prior to contrast injection (1.2 ml/kg), at four different time points. For quantitative analysis, CBF was calculated by deconvolution with an AIF obtained from the 10 voxels showing the greatest contrast enhancement. For semi-quantitative analysis, relative CBF was estimated by the integral divided by the first moment of the relaxivity time curves. We observed that when the AIFs obtained in the three different ROIs (whole brain, hemisphere without lesion, and hemisphere with lesion) were similar, the CBF ratios (lesion/normal) from the quantitative and semi-quantitative analyses showed a similar trend at the different operative time points; when the AIFs differed, the CBF ratios differed as well. We concluded that using local maxima one can define a proper AIF without knowing the anatomical location of arteries in a stroke rat model.
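The semi-quantitative estimate described above (the integral of the relaxivity curve divided by its first moment) can be sketched as follows; the signal-to-relaxivity conversion uses deltaR2(t) = -ln(S(t)/S0)/TE, and the bolus curve below is synthetic, not data from the study.

import numpy as np

def relative_cbf(signal, te, t, baseline_pts=10):
    # Semi-quantitative rCBF: integral of the relaxivity curve divided by its first moment
    dt = t[1] - t[0]
    s0 = signal[:baseline_pts].mean()          # pre-bolus baseline signal
    delta_r2 = -np.log(signal / s0) / te       # relaxivity time curve
    integral = (delta_r2 * dt).sum()
    first_moment = (t * delta_r2 * dt).sum() / integral
    return integral / first_moment

# Synthetic bolus passage sampled at TR = 0.7 s with TE = 80 ms (illustrative only)
t = np.arange(0.0, 60.0, 0.7)
signal = 100.0 * np.exp(-0.08 * np.exp(-0.5 * ((t - 20.0) / 4.0) ** 2))
print(round(relative_cbf(signal, te=0.08, t=t), 3))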
Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina
2018-01-01
The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K
2018-04-01
Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
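A hedged sketch of the kind of preoperative model described above, a Cox regression on CA19-9 plus CT texture features evaluated by concordance index, can be written with the lifelines library; the variable names, effect sizes, and data below are entirely synthetic.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 160
df = pd.DataFrame({
    "ca19_9": rng.lognormal(mean=4.0, sigma=1.0, size=n),   # synthetic CA19-9 values
    "texture_entropy": rng.normal(size=n),                  # synthetic CT texture features
    "texture_uniformity": rng.normal(size=n),
})
risk = 0.4 * np.log(df["ca19_9"]) + 0.6 * df["texture_entropy"]
df["time"] = rng.exponential(scale=np.exp(-risk) * 24)      # survival time (months)
df["event"] = rng.integers(0, 2, size=n)                    # 1 = death observed

train, test = df.iloc[:112], df.iloc[112:]                  # roughly a 70/30 split
cph = CoxPHFitter().fit(train, duration_col="time", event_col="event")
c = concordance_index(test["time"], -cph.predict_partial_hazard(test), test["event"])
print(round(c, 2))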
The US EPA Virtual Liver (v-Liver™) is developing an approach to predict dose-dependent hepatotoxicity as an in vivo tissue level response using in vitro data. The v-Liver accomplishes this using an in silico agent-based systems model that dynamically integrates environmental exp...
ERIC Educational Resources Information Center
Abdullah, Melissa Ng Lee Yen
2016-01-01
Purpose: This study aimed to examine the interaction effects of gender and motivational beliefs on students' self-regulated learning. Specifically, three types of motivational beliefs under the Expectancy-Value Model were examined, namely self-efficacy, control beliefs and anxiety. Methodology: A quantitative correlational research design was used…
Radiometric calibration of spacecraft using small lunar images
Kieffer, Hugh H.; Anderson, James M.; Becker, Kris J.
1999-01-01
In this study, the data reduction steps that can be used to extract the lunar irradiance from low resolution images of the Moon are examined and the attendant uncertainties are quantitatively assessed. The response integrated over an image is compared to a lunar irradiance model being developed from terrestrial multi-band photometric observations over the 350-2500 nm range.
Physics-based simulations of the impacts forest management practices have on hydrologic response
Adrianne Carr; Keith Loague
2012-01-01
The impacts of logging on near-surface hydrologic response at the catchment and watershed scales were examined quantitatively using numerical simulation. The simulations were conducted with the Integrated Hydrology Model (InHM) for the North Fork of Caspar Creek Experimental Watershed, located near Fort Bragg, California. InHM is a comprehensive physics-based...
Quinn, T. Alexander; Kohl, Peter
2013-01-01
Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215
The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework
Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob
2014-01-01
The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316
A Simulation and Modeling Framework for Space Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S
This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
Lehnert, Teresa; Figge, Marc Thilo
2017-01-01
Mathematical modeling and computer simulations have become an integral part of modern biological research. The strength of theoretical approaches is in the simplification of complex biological systems. We here consider the general problem of receptor-ligand binding in the context of antibody-antigen binding. On the one hand, we establish a quantitative mapping between macroscopic binding rates of a deterministic differential equation model and their microscopic equivalents as obtained from simulating the spatiotemporal binding kinetics by stochastic agent-based models. On the other hand, we investigate the impact of various properties of B cell-derived receptors-such as their dimensionality of motion, morphology, and binding valency-on the receptor-ligand binding kinetics. To this end, we implemented an algorithm that simulates antigen binding by B cell-derived receptors with a Y-shaped morphology that can move in different dimensionalities, i.e., either as membrane-anchored receptors or as soluble receptors. The mapping of the macroscopic and microscopic binding rates allowed us to quantitatively compare different agent-based model variants for the different types of B cell-derived receptors. Our results indicate that the dimensionality of motion governs the binding kinetics and that this predominant impact is quantitatively compensated by the bivalency of these receptors.
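The macroscopic (deterministic) counterpart referred to above is the standard mass-action binding equation dC/dt = kon*R*L - koff*C for R + L <-> C. A minimal sketch with hypothetical rate constants and initial concentrations:

import numpy as np
from scipy.integrate import solve_ivp

def binding_ode(t, y, kon, koff):
    # Mass-action receptor-ligand binding: R + L <-> C
    r, l, c = y
    rate = kon * r * l - koff * c
    return [-rate, -rate, rate]

# Hypothetical macroscopic rates and initial concentrations (arbitrary units)
sol = solve_ivp(binding_ode, (0.0, 50.0), y0=[1.0, 0.5, 0.0], args=(0.8, 0.05))
print(sol.y[2, -1])   # complex concentration at the end of the simulation

Comparing such macroscopic rates with those measured in spatially resolved agent-based simulations is the kind of mapping the study above establishes.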
Multi-scale Multi-mechanism Toughening of Hydrogels
NASA Astrophysics Data System (ADS)
Zhao, Xuanhe
Hydrogels are widely used as scaffolds for tissue engineering, vehicles for drug delivery, actuators for optics and fluidics, and model extracellular matrices for biological studies. The scope of hydrogel applications, however, is often severely limited by their mechanical properties. Inspired by the mechanics and hierarchical structures of tough biological tissues, we propose that a general principle for the design of tough hydrogels is to implement two mechanisms for dissipating mechanical energy and maintaining high elasticity in hydrogels. A particularly promising strategy for the design is to integrate multiple pairs of mechanisms across multiple length scales into a hydrogel. We develop a multiscale theoretical framework to quantitatively guide the design of tough hydrogels. On the network level, we have developed micro-physical models to characterize the evolution of polymer networks under deformation. On the continuum level, we have implemented constitutive laws formulated from the network-level models into a coupled cohesive-zone and Mullins-effect model to quantitatively predict crack propagation and fracture toughness of hydrogels. Guided by the design principle and quantitative model, we will demonstrate a set of new hydrogels, based on diverse types of polymers, yet can achieve extremely high toughness superior to their natural counterparts such as cartilages. The work was supported by NSF(No. CMMI- 1253495) and ONR (No. N00014-14-1-0528).
A Unified Framework Integrating Parent-of-Origin Effects for Association Study
Xiao, Feifei; Ma, Jianzhong; Amos, Christopher I.
2013-01-01
Genetic imprinting is the most well-known cause for parent-of-origin effect (POE) whereby a gene is differentially expressed depending on the parental origin of the same alleles. Genetic imprinting is related to several human disorders, including diabetes, breast cancer, alcoholism, and obesity. This phenomenon has been shown to be important for normal embryonic development in mammals. Traditional association approaches ignore this important genetic phenomenon. In this study, we generalize the natural and orthogonal interactions (NOIA) framework to allow for estimation of both main allelic effects and POEs. We develop a statistical (Stat-POE) model that has the orthogonal estimates of parameters including the POEs. We conducted simulation studies for both quantitative and qualitative traits to evaluate the performance of the statistical and functional models with different levels of POEs. Our results showed that the newly proposed Stat-POE model, which ensures orthogonality of variance components if Hardy-Weinberg Equilibrium (HWE) or equal minor and major allele frequencies is satisfied, had greater power for detecting the main allelic additive effect than a Func-POE model, which codes according to allelic substitutions, for both quantitative and qualitative traits. The power for detecting the POE was the same for the Stat-POE and Func-POE models under HWE for quantitative traits. PMID:23991061
Integrated stoichiometric, thermodynamic and kinetic modelling of steady state metabolism
Fleming, R.M.T.; Thiele, I.; Provan, G.; Nasheuer, H.P.
2010-01-01
The quantitative analysis of biochemical reactions and metabolites is at the frontier of the biological sciences. The recent availability of high-throughput technology data sets in biology has paved the way for new modelling approaches at various levels of complexity including the metabolome of a cell or an organism. Understanding the metabolism of single-cell and multi-cell organisms will provide the knowledge for the rational design of growth conditions to produce commercially valuable reagents in biotechnology. Here, we demonstrate how equations representing steady state mass conservation, energy conservation, the second law of thermodynamics, and reversible enzyme kinetics can be formulated as a single system of linear equalities and inequalities, in addition to linear equalities on exponential variables. Even though the feasible set is non-convex, the reformulation is exact and amenable to large-scale numerical analysis, a prerequisite for computationally feasible genome scale modelling. Integrating flux, concentration and kinetic variables in a unified constraint-based formulation is aimed at increasing the quantitative predictive capacity of flux balance analysis. Incorporation of experimental and theoretical bounds on thermodynamic and kinetic variables ensures that the predicted steady state fluxes are both thermodynamically and biochemically feasible. The resulting in silico predictions are tested against fluxomic data for central metabolism in E. coli and compare favourably with in silico prediction by flux balance analysis. PMID:20230840
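Plain flux balance analysis, which the constraint-based formulation above extends with thermodynamic and kinetic constraints, reduces to a linear program: maximise an objective flux subject to steady-state mass balance S*v = 0 and flux bounds. A toy sketch on a hypothetical three-reaction chain:

import numpy as np
from scipy.optimize import linprog

# Toy network: v1 (uptake) -> A, A -> B (v2), B -> biomass (v3)
S = np.array([
    [1, -1,  0],   # metabolite A balance
    [0,  1, -1],   # metabolite B balance
])
bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds
c = np.array([0.0, 0.0, -1.0])         # maximise v3 (linprog minimises)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print(res.x)   # expected optimum: all fluxes at their upper bound of 10

Thermodynamic feasibility can then be imposed by forcing each nonzero flux to run in the direction of negative reaction Gibbs energy, which is the kind of additional constraint the formulation above integrates.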
NASA Astrophysics Data System (ADS)
Galanzha, Ekaterina I.; Tuchin, Valery V.; Chowdhury, Parimal; Zharov, Vladimir P.
2004-08-01
Digital transmission microscopy is an informative, noninvasive, simple, and readily available method for studying and measuring lymph microvessel function in vivo. The rat mesentery can be used as a promising in vivo animal model of lymph microvessels. Such an imaging system allowed visualizing the entire lymphangion (with input and output valves), its wall, the lymphatic valves, lymph flow, and single cells in flow; it also provided new basic information on lymph microcirculation and quantitative data on lymphatic function, including indexes of phasic contractions and valve function and quantitative parameters of lymph-flow velocity. The rat mesentery is a good model for creating different types of lymphedema in acute and chronic experiments. The obtained data revealed that significant edema started immediately after lymph node dissection in one-half of cases and was accompanied by lymphatic disturbances. The greatest degree of edema was found after 1 week. After 4 weeks, the degree of edema sometimes decreased, but functional lymphatic disturbances progressed. Nicotine had a significant, direct, dose-dependent effect on microlymphatic function upon acute local application, but the same dose of this drug had no effect on microcirculation during chronic intoxication. Despite yielding interesting data, transmittance microscopy had some limitations when applied to microcirculation studies. These problems could be solved by applying an integrated measuring technique.
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Lindgren, Eric A.
2018-04-01
This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. Additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and how these results can be used to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.
Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism
Chang, Roger L; Ghamsari, Lila; Manichaikul, Ani; Hom, Erik F Y; Balaji, Santhanam; Fu, Weiqi; Shen, Yun; Hao, Tong; Palsson, Bernhard Ø; Salehi-Ashtiani, Kourosh; Papin, Jason A
2011-01-01
Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. PMID:21811229
Fostering a Healing Presence and Investigating Its Mediators
KREITZER, MARY JO; BELL, IRIS R.
2008-01-01
The purpose of this paper is the exploration and explication of the complex phenomena of “healing presence” and of appropriately supportive theoretical approaches to integrate emerging models for research design. Healing presence is described as an interpersonal, intrapersonal, and transpersonal to transcendent phenomenon that leads to a beneficial, therapeutic, and/or positive spiritual change within another individual (healee) and also within the healer. An integrated framework merging knowledge from diverse fields of research develops the multiple elements of healing presence, the healer, the healee's capacity for response and the healing effect as an entangled phenomenon. A conceptual systemic model is presented, and questions and dilemmas that emerge are delineated. An integrated qualitative-quantitative research design is proposed. A systemic relationship model, which includes the healer, the healee, and persons within the healee's environment is presented. The challenges are substantial, but the research questions are meaningful and worthwhile. The goal is to foster healing at bio-psycho-social-spiritual levels of the human being. PMID:15630820
Ortel, Terry W.; Spies, Ryan R.
2015-11-19
Next-Generation Radar (NEXRAD) has become an integral component in the estimation of precipitation (Kitzmiller and others, 2013). The high spatial and temporal resolution of NEXRAD has revolutionized the ability to estimate precipitation across vast regions, which is especially beneficial in areas without a dense rain-gage network. With the improved precipitation estimates, hydrologic models can produce reliable streamflow forecasts for areas across the United States. NEXRAD data from the National Weather Service (NWS) has been an invaluable tool used by the U.S. Geological Survey (USGS) for numerous projects and studies; NEXRAD data processing techniques similar to those discussed in this Fact Sheet have been developed within the USGS, including the NWS Quantitative Precipitation Estimates archive developed by Blodgett (2013).
NASA Astrophysics Data System (ADS)
Wörz, Stefan; Hoegen, Philipp; Liao, Wei; Müller-Eschner, Matthias; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik; Rohr, Karl
2016-03-01
We introduce a framework for quantitative evaluation of 3D vessel segmentation approaches using vascular phantoms. Phantoms are designed using a CAD system and created with a 3D printer, and comprise realistic shapes including branches and pathologies such as abdominal aortic aneurysms (AAA). To transfer ground truth information to the 3D image coordinate system, we use a landmark-based registration scheme utilizing fiducial markers integrated in the phantom design. For accurate 3D localization of the markers we developed a novel 3D parametric intensity model that is directly fitted to the markers in the images. We also performed a quantitative evaluation of different vessel segmentation approaches for a phantom of an AAA.
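The registration step described above, transferring CAD ground truth into image coordinates via fiducial markers, is in essence a least-squares rigid alignment of corresponding point sets. Below is a hedged sketch of such a landmark-based registration using the standard SVD (Kabsch) solution; the marker coordinates and the applied transform are invented, and this is not the authors' code or their 3D parametric intensity model for marker localization.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping src -> dst.

    src, dst: (N, 3) arrays of corresponding fiducial-marker coordinates.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical marker positions: CAD (ground-truth) space vs. localized image positions.
cad = np.array([[0, 0, 0], [40, 0, 0], [0, 40, 0], [0, 0, 40]], dtype=float)
img = (cad @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T) + np.array([5.0, 2.0, -3.0])
R, t = rigid_register(cad, img)
print("RMS fiducial error:", np.sqrt(((cad @ R.T + t - img) ** 2).sum(axis=1)).mean())
```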
Evaluating the Impact of Integrated Care on Service Utilization in Serious Mental Illness.
Waters, Heidi C; Furukawa, Michael F; Jorissen, Shari L
2018-06-14
Serious mental illness (SMI) affects 5% of the United States population and is associated with increased morbidity and mortality, and use of high-cost healthcare services including hospitalizations and emergency department visits. Integrating behavioral and physical healthcare may improve care for consumers with SMI, but prior research findings have been mixed. This quantitative retrospective cohort study assessed whether there was a predictive relationship between integrated healthcare clinic enrollment and inpatient and emergency department utilization for consumers with SMI when controlling for demographic characteristics and disease severity. While findings indicated no statistically significant impact of integrated care clinic enrollment on utilization, the sample had lower levels of utilization than would have been expected. Since policy and payment structures continue to support integrated care models, further research on different programs is encouraged, as each setting and practice pattern is unique.
Adamusiak, Tomasz; Parkinson, Helen; Muilu, Juha; Roos, Erik; van der Velde, Kasper Joeri; Thorisson, Gudmundur A; Byrne, Myles; Pang, Chao; Gollapudi, Sirisha; Ferretti, Vincent; Hillege, Hans; Brookes, Anthony J; Swertz, Morris A
2012-05-01
Genetic and epidemiological research increasingly employs large collections of phenotypic and molecular observation data from high quality human and model organism samples. Standardization efforts have produced a few simple formats for exchange of these various data, but a lightweight and convenient data representation scheme for all data modalities does not exist, hindering successful data integration, such as assignment of mouse models to orphan diseases and phenotypic clustering for pathways. We report a unified system to integrate and compare observation data across experimental projects, disease databases, and clinical biobanks. The core object model (Observ-OM) comprises only four basic concepts to represent any kind of observation: Targets, Features, Protocols (and their Applications), and Values. An easy-to-use file format (Observ-TAB) employs Excel to represent individual and aggregate data in straightforward spreadsheets. The systems have been tested successfully on human biobank, genome-wide association studies, quantitative trait loci, model organism, and patient registry data using the MOLGENIS platform to quickly set up custom data portals. Our system will dramatically lower the barrier for future data sharing and facilitate integrated search across panels and species. All models, formats, documentation, and software are available for free and open source (LGPLv3) at http://www.observ-om.org. © 2012 Wiley Periodicals, Inc.
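To make the four-concept object model concrete, here is a speculative sketch of how Targets, Features, Protocols, and Values might be represented as plain Python dataclasses. The class and field names are assumptions for illustration only and do not reproduce the published Observ-OM schema or the Observ-TAB spreadsheet format.

```python
from dataclasses import dataclass, field
from typing import Any, List

@dataclass
class Target:            # the entity being observed (individual, sample, panel, ...)
    name: str

@dataclass
class Feature:           # the property being observed (e.g. height, SNP genotype)
    name: str
    unit: str = ""

@dataclass
class Protocol:          # how observations are made; an "application" is one use of it
    name: str
    features: List[Feature] = field(default_factory=list)

@dataclass
class ObservedValue:     # one value linking a target and a feature
    target: Target
    feature: Feature
    value: Any

height = Feature("body height", "cm")
survey = Protocol("baseline visit", [height])
v = ObservedValue(Target("participant-001"), height, 172.5)
print(v)
```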
Alford, Lea M; Stoddard, Daniel; Li, Jennifer H; Hunter, Emily L; Tritschler, Douglas; Bower, Raqual; Nicastro, Daniela; Porter, Mary E; Sale, Winfield S
2016-06-01
We developed quantitative assays to test the hypothesis that the N-DRC is required for integrity of the ciliary axoneme. We examined reactivated motility of demembranated drc cells, commonly termed "reactivated cell models." ATP-induced reactivation of wild-type cells resulted in the forward swimming of ∼90% of cell models. ATP-induced reactivation failed in a subset of drc cell models, despite forward motility in live drc cells. Dark-field light microscopic observations of drc cell models revealed various degrees of axonemal splaying. In contrast, >98% of axonemes from wild-type reactivated cell models remained intact. The sup-pf4 and drc3 mutants, unlike other drc mutants, retain most of the N-DRC linker that interconnects outer doublet microtubules. Reactivated sup-pf4 and drc3 cell models displayed nearly wild-type levels of forward motility. Thus, the N-DRC linker is required for axonemal integrity. We also examined reactivated motility and axoneme integrity in mutants defective in tubulin polyglutamylation. ATP-induced reactivation resulted in forward swimming of >75% of tpg cell models. Analysis of double mutants defective in tubulin polyglutamylation and different regions of the N-DRC indicates that B-tubule polyglutamylation and the distal lobe of the linker region are both important for axonemal integrity and normal N-DRC function. © 2016 Wiley Periodicals, Inc.
ROOHOLAMINI, AZADEH; AMINI, MITRA; BAZRAFKAN, LEILA; DEHGHANI, MOHAMMAD REZA; ESMAEILZADEH, ZOHREH; NABEIEI, PARISA; REZAEE, RITA; KOJURI, JAVAD
2017-01-01
Introduction: In recent years, curriculum reform and integration have been carried out in many medical schools, and the integrated curriculum is a popular concept all over the world. In Shiraz medical school, the reform was initiated by establishing the horizontal basic science integration model and Early Clinical Exposure (ECE) for undergraduate medical education. The purpose of this study was to provide the data required for the program evaluation of this curriculum for undergraduate medical students, using the CIPP program evaluation model. Methods: This analytic descriptive, triangulation mixed-method study was carried out in Shiraz Medical School in 2012, based on the views of professors of basic sciences courses and first- and second-year medical students. The study evaluated the quality of the relationship between basic sciences and clinical courses and the method of presenting such courses based on the Context, Input, Process and Product (CIPP) model. The tools for collecting data, both quantitative and qualitative, were questionnaires, content analysis of portfolios, semi-structured interviews and brainstorming sessions. SPSS software, version 14, was used for quantitative data analysis. Results: In the context evaluation with a modified DREEM questionnaire, 77.75% of the students believed that this educational system encourages them to participate actively in classes. The course schedule and class atmosphere were reported as suitable by 87.81% and 83.86% of students, respectively. In the input domain, measured by a researcher-made questionnaire, the facilities for education were acceptable except for a shortage of cadavers. In the process evaluation, the quality of integrated module presentation and Early Clinical Exposure (ECE) was good from the students' viewpoint. In the product evaluation, student brainstorming, student portfolios and semi-structured interviews with faculty were used, showing some positive aspects of integration and some areas that need improvement. Conclusion: The main advantage of assessing an educational program with the CIPP evaluation model is that the context, input, process and product of the program are viewed and evaluated systematically. This helps educational authorities make proper decisions on the program's continuation, cessation or revision based on its weaknesses and strengths. Based on the results of this study, the integrated basic sciences course for undergraduate medical students in Shiraz Medical School is at a desirable level; however, attempts to improve or reform some sections, together with continual evaluation of the program and its accreditation, seem necessary. PMID:28761888
Developing entrepreneurial competencies for successful business model canvas
NASA Astrophysics Data System (ADS)
Sundah, D. I. E.; Langi, C.; Maramis, D. R. S.; Tawalujan, L. dan
2018-01-01
We explore the entrepreneurial competencies that contribute to the business model canvas. The research was conducted in smoked-fish enterprises in the Province of North Sulawesi, Indonesia, using a mixed method that integrates quantitative and qualitative approaches in a sequential design. Snowball sampling and a questionnaire were used to collect data from 44 entrepreneurs, and structural equation modeling with the SmartPLS application program was used to analyze these data and determine the effect of entrepreneurial competencies on the business model canvas. We also investigated 3 entrepreneurs who run smoked-fish businesses and analyzed their businesses using the business model canvas, and Focus Group Discussions were used to collect data from 2 groups of entrepreneurs in 2 different locations. The empirical results show that entrepreneurial competencies, which consist of managerial, technical, marketing, financial, and human-relations competencies and the specific working attitude of the entrepreneur, have a positive and significant effect on the business model canvas. Additionally, the empirical cases and the discussions with the 2 groups of entrepreneurs support the quantitative result and indicate that human-relations competencies have the greatest influence on achieving a successful business model canvas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bandaru, Varaprasad; Izaurralde, Roberto C.; Manowitz, David H.
2013-12-01
The use of marginal lands (MLs) for biofuel production has been contemplated as a promising solution for meeting biofuel demands. However, there have been concerns about the spatial location of MLs, their inherent biofuel potential, and the possible environmental consequences of cultivating energy crops. Here, we developed a new quantitative approach that integrates high-resolution land cover and land productivity maps and uses conditional probability density functions for analyzing land use patterns as a function of land productivity to classify the agricultural lands. We subsequently applied this method to determine available productive croplands (P-CLs) and non-crop marginal lands (NC-MLs) in a nine-county region of southern Michigan. Furthermore, a Spatially Explicit Integrated Modeling Framework (SEIMF) using EPIC (Environmental Policy Integrated Climate) was used to understand the net energy (NE) and soil organic carbon (SOC) implications of cultivating different annual and perennial production systems.
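The classification idea above, land use analyzed through conditional probability density functions of land productivity, can be illustrated with a toy Bayes-rule classifier built from kernel density estimates. The productivity values, class priors, and sample sizes below are invented, and scipy.stats.gaussian_kde is used only as a convenient density estimator; the published approach works on high-resolution land cover and productivity maps rather than simulated draws.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical land-productivity index for pixels of known cropland and non-crop class.
crop = rng.normal(0.70, 0.10, 500)
noncrop = rng.normal(0.45, 0.12, 500)

pdf_crop, pdf_non = gaussian_kde(crop), gaussian_kde(noncrop)
prior_crop = len(crop) / (len(crop) + len(noncrop))

def p_crop(x):
    """Posterior probability that a pixel with productivity x is productive cropland."""
    num = pdf_crop(x) * prior_crop
    return num / (num + pdf_non(x) * (1 - prior_crop))

for x in (0.40, 0.55, 0.70):
    print(f"productivity {x:.2f} -> P(cropland) = {p_crop(x)[0]:.2f}")
```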
Integrated Medical Model Overview
NASA Technical Reports Server (NTRS)
Myers, J.; Boley, L.; Foy, M.; Goodenow, D.; Griffin, D.; Keenan, A.; Kerstman, E.; Melton, S.; McGuire, K.; Saile, L.;
2015-01-01
The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project.
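The probabilistic bookkeeping behind a model like the IMM can be illustrated with a toy Monte Carlo simulation that samples medical-event counts from assumed per-mission incidence rates. The condition names and rates below are placeholders, not values from the iMED evidence base, and the sketch ignores severity, treatment, and mission-capability logic entirely.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-person-per-mission incidence rates -- illustrative placeholders only.
conditions = {"back pain": 0.30, "skin rash": 0.15, "urinary tract infection": 0.05}
crew_size, n_missions = 4, 100_000

total_events = np.zeros(n_missions)
for rate in conditions.values():
    # Poisson-distributed event counts, summed over the crew for each simulated mission.
    total_events += rng.poisson(rate * crew_size, size=n_missions)

print("mean events per mission:", total_events.mean())
print("P(at least one medical event):", (total_events > 0).mean())
print("95th percentile of event count:", np.percentile(total_events, 95))
```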
James, Susan; Harris, Sara; Foster, Gary; Clarke, Juanne; Gadermann, Anne; Morrison, Marie; Bezanson, Birdie Jane
2013-01-01
This article outlines a model for conducting psychotherapy with people of diverse cultural backgrounds. The theoretical foundation for the model is based on clinical and cultural psychology. Cultural psychology integrates psychology and anthropology in order to provide a complex understanding of both culture and the individual within his or her cultural context. The model proposed in this article is also based on our clinical experience and mixed-method research with the Portuguese community. The model demonstrates its value with ethnic minority clients by situating the clients within the context of their multi-layered social reality. The individual, familial, socio-cultural, and religio-moral domains are explored in two research projects, revealing the interrelation of these levels/contexts. The article is structured according to these domains. Study 1 is a quantitative study that validates the Agonias Questionnaire in Ontario. The results of this study are used to illustrate the individual domain of our proposed model. Study 2 is an ethnography conducted in the Azorean Islands, and the results of this study are integrated to illustrate the other three levels of the model, namely family, socio-cultural, and the religio-moral levels. PMID:23720642
Modeling methodology for supply chain synthesis and disruption analysis
NASA Astrophysics Data System (ADS)
Wu, Teresa; Blackhurst, Jennifer
2004-11-01
The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically, (2) the performance of the synthesized supply chain system can be evaluated quantitatively, and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
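Reachability analysis, the third contribution named above, asks whether a given system state can be reached from an initial state in a discrete state-transition model. A minimal sketch over an invented disruption scenario is shown below; real supply chain reachability analyses are typically performed on Petri-net or similarly formal models rather than a hand-written adjacency list.

```python
from collections import deque

# Hypothetical state-transition model of a small supply chain; each state names a
# system condition and the edges are feasible transitions between conditions.
transitions = {
    "all-normal":        ["supplier-down"],
    "supplier-down":     ["plant-starved", "all-normal"],
    "plant-starved":     ["retailer-stockout", "all-normal"],
    "retailer-stockout": [],
}

def reachable(start):
    """Return every state reachable from `start` (breadth-first search)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in transitions[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Can a single supplier disruption propagate all the way to a retailer stockout?
print("retailer-stockout" in reachable("supplier-down"))   # True in this toy model
```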
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
Low, Yen S.; Sedykh, Alexander; Rusyn, Ivan; Tropsha, Alexander
2017-01-01
Cheminformatics approaches such as Quantitative Structure Activity Relationship (QSAR) modeling have been used traditionally for predicting chemical toxicity. In recent years, high throughput biological assays have been increasingly employed to elucidate mechanisms of chemical toxicity and predict toxic effects of chemicals in vivo. The data generated in such assays can be considered as biological descriptors of chemicals that can be combined with molecular descriptors and employed in QSAR modeling to improve the accuracy of toxicity prediction. In this review, we discuss several approaches for integrating chemical and biological data for predicting biological effects of chemicals in vivo and compare their performance across several data sets. We conclude that while no method consistently shows superior performance, the integrative approaches rank consistently among the best yet offer enriched interpretation of models over those built with either chemical or biological data alone. We discuss the outlook for such interdisciplinary methods and offer recommendations to further improve the accuracy and interpretability of computational models that predict chemical toxicity. PMID:24805064
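The integrative idea in this review, concatenating chemical descriptors with biological assay readouts before model fitting, can be sketched with scikit-learn on synthetic data. Everything below (descriptor counts, the random-forest learner, the simulated toxicity labels) is an assumption chosen for illustration, not a reproduction of any data set or method ranking discussed in the review.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
chem = rng.normal(size=(n, 50))          # molecular descriptors (placeholder values)
bio = rng.normal(size=(n, 20))           # in vitro assay readouts (placeholder values)
toxic = (chem[:, 0] + bio[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

for name, X in {"chemical only": chem,
                "biological only": bio,
                "hybrid (chem + bio)": np.hstack([chem, bio])}.items():
    acc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                          X, toxic, cv=5).mean()
    print(f"{name:22s} CV accuracy: {acc:.2f}")
```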
From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses
Zenker, Sven; Rubin, Jonathan; Clermont, Gilles
2007-01-01
The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses. We outline possible steps toward translating this computational approach to the bedside, to supplement today's evidence-based medicine with a quantitatively founded model-based medicine that integrates mechanistic knowledge with patient-specific information. PMID:17997590
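The central point above, that the inverse problem can yield multimodal or ridge-shaped posteriors whose peaks map onto differential diagnoses, can be demonstrated with a deliberately crude forward model. In the sketch below, mean arterial pressure is modeled as cardiac output times resistance, so a single hypotensive measurement is equally consistent with a "low output" and a "low resistance" explanation; the model form, units, flat priors, grid, and noise level are all invented and bear no relation to the cardiovascular model used in the paper.

```python
import numpy as np

# Toy "hypotension" forward model: mean arterial pressure ~ cardiac output x resistance.
def map_model(co, svr):
    return co * svr                      # mmHg, toy units

observed_map, noise_sd = 55.0, 5.0       # a single hypotensive measurement

co = np.linspace(1.0, 8.0, 300)          # cardiac output grid (L/min)
svr = np.linspace(5.0, 30.0, 300)        # systemic resistance grid (toy units)
CO, SVR = np.meshgrid(co, svr, indexing="ij")

log_like = -0.5 * ((map_model(CO, SVR) - observed_map) / noise_sd) ** 2
posterior = np.exp(log_like)             # flat priors on the grid
posterior /= posterior.sum()

# Marginal over cardiac output: a broad, ridge-shaped posterior, i.e. the same blood
# pressure is explained well by both "low output" and "low resistance" hypotheses.
marginal_co = posterior.sum(axis=1)
print(f"posterior mass with CO < 3 L/min: {marginal_co[co < 3].sum():.2f}")
print(f"posterior mass with CO > 5 L/min: {marginal_co[co > 5].sum():.2f}")
```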
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barani, T.; Bruschi, E.; Pizzocri, D.
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of burst release in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of diffusion-based models to allow for the burst release effect. The concept and governing equations of the model are presented, and the effect of the newly introduced parameters is evaluated through an analytic sensitivity analysis. Then, the model is assessed for application to integral fuel rod analysis. The approach that we take for model assessment involves implementation in two structurally different fuel performance codes, namely, BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D semi-analytic code). The model is validated against 19 Light Water Reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the qualitative representation of the FGR kinetics and the quantitative predictions of integral fuel rod FGR, relative to the canonical, purely diffusion-based models, with both codes. The overall quantitative improvement of the FGR predictions in the two codes is comparable. Furthermore, calculated radial profiles of xenon concentration are investigated and compared to experimental data, demonstrating the representation of the underlying mechanisms of burst release by the new model.
Smith, Amber R.; Williams, Paul H.; McGee, Seth A.; Dósa, Katalin; Pfammatter, Jesse
2014-01-01
Genetics instruction in introductory biology is often confined to Mendelian genetics and avoids the complexities of variation in quantitative traits. Given the driving question “What determines variation in phenotype (Pv)? (Pv=Genotypic variation Gv + environmental variation Ev),” we developed a 4-wk unit for an inquiry-based laboratory course focused on the inheritance and expression of a quantitative trait in varying environments. We utilized Brassica rapa Fast Plants as a model organism to study variation in the phenotype anthocyanin pigment intensity. As an initial curriculum assessment, we used free word association to examine students’ cognitive structures before and after the unit and explanations in students’ final research posters with particular focus on variation (Pv = Gv + Ev). Comparison of pre- and postunit word frequency revealed a shift in words and a pattern of co-occurring concepts indicative of change in cognitive structure, with particular focus on “variation” as a proposed threshold concept and primary goal for students’ explanations. Given review of 53 posters, we found ∼50% of students capable of intermediate to high-level explanations combining both Gv and Ev influence on expression of anthocyanin intensity (Pv). While far from “plug and play,” this conceptually rich, inquiry-based unit holds promise for effective integration of quantitative and Mendelian genetics. PMID:25185225
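The driving relation Pv = Gv + Ev can be made concrete with a small simulation in which phenotype scores are generated from genotype effects, environment effects, and noise, and the variance components are then recovered. The sketch below uses a rough method-of-moments partition rather than a formal ANOVA, and the pigment-intensity values are simulated, not Fast Plants data; all effect sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated anthocyanin-intensity scores: 10 genotypes x 2 light environments x 12 plants.
genotype_effect = rng.normal(0, 1.0, size=(10, 1, 1))       # Gv component (simulated)
environment_effect = np.array([0.0, 1.5]).reshape(1, 2, 1)  # Ev component (simulated)
noise = rng.normal(0, 0.5, size=(10, 2, 12))
phenotype = 5 + genotype_effect + environment_effect + noise

# Rough, illustration-only partition (a formal analysis would use ANOVA mean squares).
grand = phenotype.mean()
gv = ((phenotype.mean(axis=(1, 2)) - grand) ** 2).mean()     # between-genotype variance
ev = ((phenotype.mean(axis=(0, 2)) - grand) ** 2).mean()     # between-environment variance
pv = phenotype.var()

print(f"Pv = {pv:.2f}, Gv = {gv:.2f}, Ev = {ev:.2f}, residual = {pv - gv - ev:.2f}")
```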
Integrating Multiscale Modeling with Drug Effects for Cancer Treatment.
Li, Xiangfang L; Oduola, Wasiu O; Qian, Lijun; Dougherty, Edward R
2015-01-01
In this paper, we review multiscale modeling for cancer treatment with the incorporation of drug effects from an applied system's pharmacology perspective. Both the classical pharmacology and systems biology are inherently quantitative; however, systems biology focuses more on networks and multi factorial controls over biological processes rather than on drugs and targets in isolation, whereas systems pharmacology has a strong focus on studying drugs with regard to the pharmacokinetic (PK) and pharmacodynamic (PD) relations accompanying drug interactions with multiscale physiology as well as the prediction of dosage-exposure responses and economic potentials of drugs. Thus, it requires multiscale methods to address the need for integrating models from the molecular levels to the cellular, tissue, and organism levels. It is a common belief that tumorigenesis and tumor growth can be best understood and tackled by employing and integrating a multifaceted approach that includes in vivo and in vitro experiments, in silico models, multiscale tumor modeling, continuous/discrete modeling, agent-based modeling, and multiscale modeling with PK/PD drug effect inputs. We provide an example application of multiscale modeling employing stochastic hybrid system for a colon cancer cell line HCT-116 with the application of Lapatinib drug. It is observed that the simulation results are similar to those observed from the setup of the wet-lab experiments at the Translational Genomics Research Institute.
NASA Astrophysics Data System (ADS)
Guldner, Ian H.; Yang, Lin; Cowdrick, Kyle R.; Wang, Qingfei; Alvarez Barrios, Wendy V.; Zellmer, Victoria R.; Zhang, Yizhe; Host, Misha; Liu, Fang; Chen, Danny Z.; Zhang, Siyuan
2016-04-01
Metastatic microenvironments are spatially and compositionally heterogeneous. This seemingly stochastic heterogeneity provides researchers great challenges in elucidating factors that determine metastatic outgrowth. Herein, we develop and implement an integrative platform that will enable researchers to obtain novel insights from intricate metastatic landscapes. Our two-segment platform begins with whole tissue clearing, staining, and imaging to globally delineate metastatic landscape heterogeneity with spatial and molecular resolution. The second segment of our platform applies our custom-developed SMART 3D (Spatial filtering-based background removal and Multi-chAnnel forest classifiers-based 3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous metastatic landscape constituents, from subcellular features to multicellular structures, within our large three-dimensional (3D) image datasets. Coupling whole tissue imaging of brain metastasis animal models with SMART 3D, we demonstrate the capability of our integrative pipeline to reveal and quantify volumetric and spatial aspects of brain metastasis landscapes, including diverse tumor morphology, heterogeneous proliferative indices, metastasis-associated astrogliosis, and vasculature spatial distribution. Collectively, our study demonstrates the utility of our novel integrative platform to reveal and quantify the global spatial and volumetric characteristics of the 3D metastatic landscape with unparalleled accuracy, opening new opportunities for unbiased investigation of novel biological phenomena in situ.
Huntley, Alyson L; King, Anna J L; Moore, Theresa H M; Paterson, Charlotte; Persad, Raj; Sharp, Debbie; Evans, Maggie
2017-01-01
The aim was to present a methodological exemplar of integrating findings from a quantitative and a qualitative review on the same topic, to provide insight into components of care that contribute to supportive care acceptable to men with prostate cancer. Men with prostate cancer are likely to live a long time with the disease and to experience side effects from treatment, and therefore have ongoing supportive care needs. Quantitative and qualitative reviews have been published, but the findings have yet to be integrated. The design was an integration of quantitative and qualitative synthesized evidence, using two previously published systematic reviews as data sources. Synthesized evidence on supportive care for men with prostate cancer was integrated from these two reviews: a narrative quantitative review and a qualitative review with thematic synthesis. The two streams of synthesized evidence were combined using concurrent narrative summary. Data from both reviews were used to develop a set of propositions from which a summary of components of care likely to contribute to supportive care acceptable to men with prostate cancer was identified. Nine propositions were developed, covering men's supportive care with a focus on the role of health professionals. These propositions were used to compose nine components of care likely to lead to supportive care that is acceptable to men with prostate cancer. Some of these components carry no or low cost, such as developing a more empathic, personalized approach, but more specific approaches, for example online support, need further investigation in randomized controlled trials. This methodological exemplar demonstrates the integration of quantitative and qualitative synthesized data to determine components of care likely to lead to the provision of supportive care acceptable to men with prostate cancer. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
Quantitative analysis of lentiviral transgene expression in mice over seven generations.
Wang, Yong; Song, Yong-tao; Liu, Qin; Liu, Cang'e; Wang, Lu-lu; Liu, Yu; Zhou, Xiao-yang; Wu, Jun; Wei, Hong
2010-10-01
Lentiviral transgenesis is now recognized as an extremely efficient and cost-effective method to produce transgenic animals. Transgenes delivered by lentiviral vectors have shown heritable expression in many species, including those that are refractory to genetic modification, such as non-human primates. However, epigenetic modification is frequently observed in lentiviral integrants, and transgene expression is inversely correlated with methylation density. Recent data showed that about one-third of lentiviral integrants exhibit hypermethylation and low expression, but did not demonstrate whether integrants with high expression maintain constant expression and remain hypomethylated during long-term germline transmission. In this study, using lentiviral eGFP transgenic mice as the experimental animals, lentiviral eGFP expression levels and integrant numbers in the genome were quantitatively analyzed by fluorescent quantitative polymerase chain reaction (FQ-PCR), using the housekeeping gene ribosomal protein S18 (Rps18) and the single-copy gene fatty acid binding protein of the intestine (Fabpi) as internal controls, respectively. The methylation densities of the integrants were quantitatively analyzed by bisulfite sequencing. We found that lentiviral integrants with high expression exhibited a relatively constant expression level per integrant over at least seven generations. Moreover, individuals carrying these integrants exhibited eGFP expression levels that were positively and almost linearly correlated with the integrant numbers in their genomes, suggesting no remarkable position effect on transgene expression for the integrants analyzed. In addition, over seven generations the methylation density of these integrants did not increase but rather decreased remarkably, indicating that these highly expressing integrants were not subjected to de novo methylation during at least seven generations of germline transmission. Taken together, these data suggest that transgenic lines with long-term stable expression and no position effect can be established by lentiviral transgenesis.
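The FQ-PCR normalization described above (expression against Rps18, copy number against the single-copy gene Fabpi) follows the usual comparative-Ct logic. A hedged sketch is given below; the Ct values are invented, and the calculation assumes roughly 100% amplification efficiency, which a real analysis would verify or correct for.

```python
# Relative quantification from qPCR threshold cycles (Ct), assuming ~100% efficiency.
# All Ct values below are invented placeholders.

def relative_level(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
    """2^-ddCt: target level relative to a calibrator sample, normalized to a reference gene."""
    d_ct_sample = ct_target - ct_reference
    d_ct_calibrator = ct_target_cal - ct_reference_cal
    return 2.0 ** -(d_ct_sample - d_ct_calibrator)

# eGFP expression in a later-generation mouse, normalized to Rps18, relative to generation 1.
print("relative eGFP expression:",
      round(relative_level(ct_target=21.8, ct_reference=17.0,
                           ct_target_cal=22.0, ct_reference_cal=17.1), 2))

# Integrant copy number, normalized to the single-copy gene Fabpi, relative to a
# hypothetical one-copy control animal.
print("estimated integrant copies:",
      round(relative_level(ct_target=20.0, ct_reference=21.0,
                           ct_target_cal=22.0, ct_reference_cal=21.0), 1))
```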
The SAGE Model of Social Psychological Research.
Power, Séamus A; Velez, Gabriel; Qadafi, Ahmad; Tennant, Joseph
2018-05-01
We propose a SAGE model for social psychological research. Encapsulated in our acronym is a proposal to have a synthetic approach to social psychological research, in which qualitative methods are augmentative to quantitative ones, qualitative methods can be generative of new experimental hypotheses, and qualitative methods can capture experiences that evade experimental reductionism. We remind social psychological researchers that psychology was founded in multiple methods of investigation at multiple levels of analysis. We discuss historical examples and our own research as contemporary examples of how a SAGE model can operate in part or as an integrated whole. The implications of our model are discussed.
Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik
2016-01-01
We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.
[Watershed water environment pollution models and their applications: a review].
Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang
2013-10-01
Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, such models can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single and integrated models, the development trends and application prospects of watershed water environment pollution models were discussed.
Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.
Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen
2017-06-01
The article proposes a set of metrics for evaluating patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, according to whether the evaluation employs the raw measurements of patient-performed motions or is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, the Fugl-Meyer Assessment, and similar measures. The metrics are evaluated on a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessing the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity of human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.
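Two of the model-less metrics named above, root-mean-square distance and Kullback-Leibler divergence, are easy to illustrate on a pair of synthetic joint-angle trajectories. The sketch below assumes the patient and template motions have already been time-aligned and resampled to the same length; the signals, noise level, and histogram bins are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 100)
reference = np.sin(2 * np.pi * t)                               # template joint angle
patient = np.sin(2 * np.pi * t) + rng.normal(0, 0.15, t.size)   # one patient repetition

# Model-less metric 1: root-mean-square distance between aligned trajectories.
rms = np.sqrt(np.mean((patient - reference) ** 2))

# Model-less metric 2: Kullback-Leibler divergence between joint-angle histograms.
bins = np.linspace(-1.5, 1.5, 21)
p, _ = np.histogram(patient, bins=bins, density=True)
q, _ = np.histogram(reference, bins=bins, density=True)
p, q = p + 1e-9, q + 1e-9                                       # avoid division by zero
p, q = p / p.sum(), q / q.sum()
kl = np.sum(p * np.log(p / q))

print(f"RMS distance: {rms:.3f}   KL divergence: {kl:.3f}")
```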
Systems analysis of the single photon response in invertebrate photoreceptors.
Pumir, Alain; Graves, Jennifer; Ranganathan, Rama; Shraiman, Boris I
2008-07-29
Photoreceptors of the Drosophila compound eye employ a G protein-mediated signaling pathway that transduces single photons into transient electrical responses called "quantum bumps" (QB). Although most of the molecular components of this pathway are already known, the system-level understanding of the mechanism of QB generation has remained elusive. Here, we present a quantitative model explaining how QBs emerge from stochastic nonlinear dynamics of the signaling cascade. The model shows that the cascade acts as an "integrate and fire" device and explains how photoreceptors achieve reliable responses to light while keeping background low in the dark. The model predicts the nontrivial behavior of mutants that enhance or suppress signaling and explains the dependence on external calcium, which controls feedback regulation. The results provide insight into physiological questions such as single-photon response efficiency and the adaptation of the response to high incident-light levels. The system-level analysis enabled by modeling phototransduction provides a foundation for understanding G protein signaling pathways less amenable to quantitative approaches.
The fast and the slow of skilled bimanual rhythm production: parallel versus integrated timing.
Krampe, R T; Kliegl, R; Mayr, U; Engbert, R; Vorberg, D
2000-02-01
Professional pianists performed 2 bimanual rhythms at a wide range of different tempos. The polyrhythmic task required the combination of 2 isochronous sequences (3 against 4) between the hands; in the syncopated rhythm task successive keystrokes formed intervals of identical (isochronous) durations. At slower tempos, pianists relied on integrated timing control, merging successive intervals between the hands into a common reference frame. A timer-motor model, based on the concepts of rate fluctuation and the distinction between target specification and timekeeper execution processes, is proposed as a quantitative account of performance at slow tempos. At rapid rates, expert pianists used hand-independent, parallel timing control. As an alternative to a model based on a single central clock, the findings support a model of flexible control structures with multiple timekeepers that can work in parallel to accommodate specific task constraints.
ERIC Educational Resources Information Center
Viegas, Ricardo G.; Oliveira, Armando M.; Garriga-Trillo, Ana; Grieco, Alba
2012-01-01
In order to be treated quantitatively, subjective gains and losses (utilities/disutilities) must be psychologically measured. If legitimate comparisons are sought between them, measurement must be at least interval level, with a common unit. If comparisons of absolute magnitudes across gains and losses are further sought, as in standard…
Condensins under the microscope.
Maeshima, Kazuhiro; Hibino, Kayo; Hudson, Damien F
2018-04-30
Condensins are key players in mitotic chromosome condensation. Using an elegant combination of state-of-the-art imaging techniques, Walther et al. (2018. J. Cell Biol. https://doi.org/10.1083/jcb.201801048) counted the number of Condensins, examined their behaviors on human mitotic chromosomes, and integrated the quantitative data to propose a new mechanistic model for chromosome condensation. © 2018 Maeshima et al.
Transient deformation of a droplet near a microfluidic constriction: A quantitative analysis
NASA Astrophysics Data System (ADS)
Trégouët, Corentin; Salez, Thomas; Monteux, Cécile; Reyssat, Mathilde
2018-05-01
We report on experiments that consist of deforming a collection of monodisperse droplets produced by a microfluidic chip through a flow-focusing device. We show that a proper numerical modeling of the flow is necessary to access the stress applied by the latter on the droplet along its trajectory through the chip. This crucial step enables the full integration of the differential equation governing the dynamical deformation, and consequently the robust measurement of the interfacial tension by fitting the experiments with the calculated deformation. Our study thus demonstrates the feasibility of quantitative in situ rheology in microfluidic flows involving, e.g., droplets, capsules, or cells.
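The workflow described above, integrating a deformation equation under the stress history computed from the flow and fitting the interfacial tension to the measured deformation, can be sketched with a generic first-order relaxation model. The model form, Gaussian stress pulse, viscosity, radius, and noise level below are all invented placeholders; the published analysis obtains the stress from a numerical simulation of the flow-focusing device rather than from an analytic pulse.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Illustrative relaxation-type model: eta * R * dD/dt = R * stress(t) - gamma * D,
# where D is the droplet deformation, stress(t) the hydrodynamic stress along the
# trajectory, and gamma the interfacial tension. All values below are invented.
eta, R = 0.05, 20e-6                      # outer viscosity (Pa.s), droplet radius (m)
stress = lambda t: 30.0 * np.exp(-((t - 0.01) / 0.004) ** 2)   # Pa, peak near constriction

def deformation(t_eval, gamma):
    sol = solve_ivp(lambda t, D: (R * stress(t) - gamma * D) / (eta * R),
                    (t_eval[0], t_eval[-1]), [0.0], t_eval=t_eval, max_step=1e-4)
    return sol.y[0]

t = np.linspace(0, 0.03, 60)
gamma_true = 5e-3                          # N/m, plays the role of the unknown in experiments
measured = deformation(t, gamma_true) + np.random.default_rng(0).normal(0, 0.005, t.size)

fit = least_squares(lambda g: deformation(t, g[0]) - measured, x0=[1e-3])
print(f"fitted interfacial tension: {fit.x[0] * 1e3:.2f} mN/m (true 5.00 mN/m)")
```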
Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...
Unrean, Pornkamol; Khajeeram, Sutamat; Laoteng, Kobkul
2016-03-01
Integrative simultaneous saccharification and fermentation (SSF) modeling is a useful guiding tool for rapid process optimization to meet the techno-economic requirements of industrial-scale lignocellulosic ethanol production. In this work, we developed an SSF model composed of a metabolic network of a Saccharomyces cerevisiae cell coupled with fermentation kinetics and an enzyme hydrolysis model, to quantitatively capture the dynamic responses of yeast cell growth and fermentation during SSF. By using model-based design of the substrate and yeast-cell feeding profiles in the fed-batch SSF process, efficient ethanol production with a high titer of up to 65 g/L and a high yield of 85% of the theoretical yield was accomplished. The ethanol titer and productivity were increased by 47% and 41%, respectively, in the optimized fed-batch SSF as compared with the batch process. The developed integrative SSF model is therefore considered a promising approach for the systematic design of economical and sustainable SSF bioprocessing of lignocellulose.
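A drastically reduced version of an SSF kinetic model, first-order enzymatic hydrolysis of cellulose feeding Monod growth with yield-based ethanol formation, is sketched below. The rate constants, yields, and initial loadings are invented placeholders, and the sketch omits the metabolic-network component, inhibition terms, and the fed-batch feeding profiles that the abstract describes.

```python
from scipy.integrate import solve_ivp

# Much-reduced SSF kinetics: cellulose -> glucose (first-order hydrolysis), glucose ->
# biomass + ethanol (Monod growth, yield-based). Parameters are invented placeholders.
k_h, mu_max, Ks = 0.08, 0.25, 2.0        # 1/h, 1/h, g/L
Yxs, Yps = 0.10, 0.42                    # g biomass / g glucose, g ethanol / g glucose

def ssf(t, y):
    cellulose, glucose, biomass, ethanol = y
    hydrolysis = k_h * cellulose                     # glucose release rate, g/(L.h)
    mu = mu_max * glucose / (Ks + glucose)           # specific growth rate, 1/h
    uptake = mu * biomass / Yxs                      # glucose consumption rate, g/(L.h)
    return [-hydrolysis,
            hydrolysis - uptake,
            mu * biomass,
            Yps * uptake]

# Batch simulation: 150 g/L cellulose, 1 g/L yeast inoculum, 72 h.
sol = solve_ivp(ssf, (0, 72), [150.0, 0.0, 1.0, 0.0], max_step=0.5)
print(f"ethanol after 72 h: {sol.y[3, -1]:.1f} g/L")
```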
Parallel FEM Simulation of Electromechanics in the Heart
NASA Astrophysics Data System (ADS)
Xia, Henian; Wong, Kwai; Zhao, Xiaopeng
2011-11-01
Cardiovascular disease is the leading cause of death in America. Computer simulation of the complicated dynamics of the heart could provide valuable quantitative guidance for the diagnosis and treatment of heart problems. In this paper, we present an integrated numerical model which encompasses the interaction of cardiac electrophysiology, electromechanics, and mechanoelectrical feedback. The model is solved by the finite element method on a Linux cluster and the Cray XT5 supercomputer, Kraken. Dynamical influences between the effects of electromechanical coupling and mechanoelectrical feedback are shown.
ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling
Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf
2012-01-01
Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in the form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270
Jolivet, Renaud; Coggan, Jay S.; Allaman, Igor; Magistretti, Pierre J.
2015-01-01
Glucose is the main energy substrate in the adult brain under normal conditions. Accumulating evidence, however, indicates that lactate produced in astrocytes (a type of glial cell) can also fuel neuronal activity. The quantitative aspects of this so-called astrocyte-neuron lactate shuttle (ANLS) are still debated. To address this question, we developed a detailed biophysical model of the brain’s metabolic interactions. Our model integrates three modeling approaches, the Buxton-Wang model of vascular dynamics, the Hodgkin-Huxley formulation of neuronal membrane excitability and a biophysical model of metabolic pathways. This approach provides a template for large-scale simulations of the neuron-glia-vasculature (NGV) ensemble, and for the first time integrates the respective timescales at which energy metabolism and neuronal excitability occur. The model is constrained by relative neuronal and astrocytic oxygen and glucose utilization, by the concentration of metabolites at rest and by the temporal dynamics of NADH upon activation. These constraints produced four observations. First, a transfer of lactate from astrocytes to neurons emerged in response to activity. Second, constrained by activity-dependent NADH transients, neuronal oxidative metabolism increased first upon activation with a subsequent delayed astrocytic glycolysis increase. Third, the model correctly predicted the dynamics of extracellular lactate and oxygen as observed in vivo in rats. Fourth, the model correctly predicted the temporal dynamics of tissue lactate, of tissue glucose and oxygen consumption, and of the BOLD signal as reported in human studies. These findings not only support the ANLS hypothesis but also provide a quantitative mathematical description of the metabolic activation in neurons and glial cells, as well as of the macroscopic measurements obtained during brain imaging. PMID:25719367
Pargett, Michael; Rundell, Ann E.; Buzzard, Gregery T.; Umulis, David M.
2014-01-01
Discovery in developmental biology is often driven by intuition that relies on the integration of multiple types of data such as fluorescent images, phenotypes, and the outcomes of biochemical assays. Mathematical modeling helps elucidate the biological mechanisms at play as the networks become increasingly large and complex. However, the available data is frequently under-utilized due to incompatibility with quantitative model tuning techniques. This is the case for stem cell regulation mechanisms explored in the Drosophila germarium through fluorescent immunohistochemistry. To enable better integration of biological data with modeling in this and similar situations, we have developed a general parameter estimation process to quantitatively optimize models with qualitative data. The process employs a modified version of the Optimal Scaling method from social and behavioral sciences, and multi-objective optimization to evaluate the trade-off between fitting different datasets (e.g. wild type vs. mutant). Using only published imaging data in the germarium, we first evaluated support for a published intracellular regulatory network by considering alternative connections of the same regulatory players. Simply screening networks against wild type data identified hundreds of feasible alternatives. Of these, five parsimonious variants were found and compared by multi-objective analysis including mutant data and dynamic constraints. With these data, the current model is supported over the alternatives, but support for a biochemically observed feedback element is weak (i.e. these data do not measure the feedback effect well). When also comparing new hypothetical models, the available data do not discriminate. To begin addressing the limitations in data, we performed a model-based experiment design and provide recommendations for experiments to refine model parameters and discriminate increasingly complex hypotheses. PMID:24626201
Mudaliar, Manikhandan; Tassi, Riccardo; Thomas, Funmilola C.; McNeilly, Tom N.; Weidt, Stefan K.; McLaughlin, Mark; Wilson, David; Burchmore, Richard; Herzyk, Pawel; Eckersall, P. David
2016-01-01
Mastitis, inflammation of the mammary gland, is the most common and costly disease of dairy cattle in the western world. It is primarily caused by bacteria, with Streptococcus uberis as one of the most prevalent causative agents. To characterize the proteome during Streptococcus uberis mastitis, an experimentally induced model of intramammary infection was used. Milk whey samples obtained from 6 cows at 6 time points were processed using label-free relative quantitative proteomics. This proteomic analysis complements clinical, bacteriological and immunological studies as well as peptidomic and metabolomic analysis of the same challenge model. A total of 2552 non-redundant bovine peptides were identified, and from these, 570 bovine proteins were quantified. Hierarchical cluster analysis and principal component analysis showed clear clustering of results by stage of infection, with similarities between pre-infection and resolution stages (0 and 312 h post challenge), early infection stages (36 and 42 h post challenge) and late infection stages (57 and 81 h post challenge). Ingenuity pathway analysis identified upregulation of acute phase protein pathways over the course of infection, with dominance of different acute phase proteins at different time points based on differential expression analysis. Antimicrobial peptides, notably cathelicidins and peptidoglycan recognition protein, were upregulated at all time points post challenge and peaked at 57 h, which coincided with 10 000-fold decrease in average bacterial counts. The integration of clinical, bacteriological, immunological and quantitative proteomics and other-omic data provides a more detailed systems level view of the host response to mastitis than has been achieved previously. PMID:27412694
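The unsupervised step reported above, hierarchical clustering and principal component analysis of the label-free protein intensity matrix, can be sketched on simulated data. The sample layout, the "acute phase" block of proteins, and all intensities below are invented; they merely mimic the stage-wise clustering that the abstract describes.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(11)
# Simulated log2 label-free intensities: 18 milk whey samples (3 infection stages x 6
# cows) x 200 proteins; a block of "acute phase" proteins shifts upward at later stages.
stage = np.repeat([0, 1, 2], 6)                     # pre/resolution, early, late infection
X = rng.normal(20, 0.5, size=(18, 200))
X[:, :40] += stage[:, None] * 2.0

Xc = X - X.mean(axis=0)
scores = PCA(n_components=2).fit_transform(Xc)      # samples in principal-component space
clusters = fcluster(linkage(Xc, method="ward"), t=3, criterion="maxclust")

print("PC1 means by stage:", [round(scores[stage == s, 0].mean(), 1) for s in (0, 1, 2)])
for s, label in zip((0, 1, 2), ("pre/resolution", "early", "late")):
    print(label, "-> cluster membership counts:",
          np.bincount(clusters[stage == s], minlength=4)[1:])
```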
Burnout in Nurses Working With Youth With Chronic Pain: A Mixed-Methods Analysis.
Rodrigues, Nikita P; Cohen, Lindsey L; Swartout, Kevin M; Trotochaud, Karen; Murray, Eileen
2018-05-01
Nursing is a rewarding but also challenging profession. Nurses are at risk for burnout and premature exit from the profession, which is detrimental to them, their patients, and the healthcare system. There are few studies examining the unique correlates of burnout in nurses working with pediatric populations. The current 2-study project used mixed-methods (qualitative and then quantitative) analysis to explore burnout in nurses working in an inpatient unit with youth with chronic pain. Study I participants included all of the 32 nurses who worked in an inpatient pediatric unit, which admits patients with chronic pain. Qualitative analyses of focus groups were used to extract themes. These themes were examined via a quantitative battery completed by 41 nurses from 2 inpatient pediatric units with youth with chronic pain. The themes were burnout, moral distress, negative beliefs about chronic pain, barriers to pain management, fear of losing compassion, coworker support as a coping method, time worked in the unit, professional self-efficacy, and negative views of the hospital environment. Quantitative results supported most of the qualitative findings, and taken together, the findings supported a model of burnout in nurses working with youth with chronic pain. In conclusion, we integrated qualitative and quantitative findings to develop a model of nurse burnout. This model provides a framework for evaluating and targeting burnout in nurses working with pediatric patients with chronic pain.
Modeling and Diagnostic Software for Liquefying-Fuel Rockets
NASA Technical Reports Server (NTRS)
Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann
2005-01-01
A report presents a study of five modeling and diagnostic computer programs considered for use in an integrated vehicle health management (IVHM) system during testing of liquefying-fuel hybrid rocket engines in the Hybrid Combustion Facility (HCF) at NASA Ames Research Center. Three of the programs -- TEAMS, L2, and RODON -- are model-based reasoning (or diagnostic) programs. The other two programs -- ICS and IMS -- do not attempt to isolate the causes of failures but can be used for detecting faults. In the study, qualitative models (in TEAMS and L2) and quantitative models (in RODON) having varying scope and completeness were created. Each of the models captured the structure and behavior of the HCF as a physical system. It was noted that in the cases of the qualitative models, the temporal aspects of the behavior of the HCF and the abstraction of sensor data are handled outside of the models, and it is necessary to develop additional code for this purpose. A need for additional code was also noted in the case of the quantitative model, though the amount of development effort needed was found to be less than that for the qualitative models.
Bridging the divide: a model-data approach to Polar and Alpine microbiology.
Bradley, James A; Anesio, Alexandre M; Arndt, Sandra
2016-03-01
Advances in microbial ecology in the cryosphere continue to be driven by empirical approaches including field sampling and laboratory-based analyses. Although mathematical models are commonly used to investigate the physical dynamics of Polar and Alpine regions, they are rarely applied in microbial studies. Yet integrating modelling approaches with ongoing observational and laboratory-based work is ideally suited to Polar and Alpine microbial ecosystems given their harsh environmental and biogeochemical characteristics, simple trophic structures, distinct seasonality, often difficult accessibility, geographical expansiveness and susceptibility to accelerated climate changes. In this opinion paper, we explain how mathematical modelling ideally complements field and laboratory-based analyses. We thus argue that mathematical modelling is a powerful tool for the investigation of these extreme environments and that fully integrated, interdisciplinary model-data approaches could help the Polar and Alpine microbiology community address some of the great research challenges of the 21st century (e.g. assessing global significance and response to climate change). However, a better integration of field and laboratory work with model design and calibration/validation, as well as a stronger focus on quantitative information is required to advance models that can be used to make predictions and upscale processes and fluxes beyond what can be captured by observations alone. © FEMS 2016.
Cell Culture Systems To Study Human Herpesvirus 6A/B Chromosomal Integration.
Gravel, Annie; Dubuc, Isabelle; Wallaschek, Nina; Gilbert-Girard, Shella; Collin, Vanessa; Hall-Sedlak, Ruth; Jerome, Keith R; Mori, Yasuko; Carbonneau, Julie; Boivin, Guy; Kaufer, Benedikt B; Flamand, Louis
2017-07-15
Human herpesviruses 6A/B (HHV-6A/B) can integrate their viral genomes in the telomeres of human chromosomes. The viral and cellular factors contributing to HHV-6A/B integration remain largely unknown, mostly due to the lack of efficient and reproducible cell culture models to study HHV-6A/B integration. In this study, we characterized the HHV-6A/B integration efficiencies in several human cell lines using two different approaches. First, after a short-term infection (5 h), cells were processed for single-cell cloning and analyzed for chromosomally integrated HHV-6A/B (ciHHV-6A/B). Second, cells were infected with HHV-6A/B and allowed to grow in bulk for 4 weeks or longer and then analyzed for the presence of ciHHV-6. Using quantitative PCR (qPCR), droplet digital PCR, and fluorescent in situ hybridization, we could demonstrate that HHV-6A/B integrated in most human cell lines tested, including telomerase-positive (HeLa, MCF-7, HCT-116, and HEK293T) and telomerase-negative cell lines (U2OS and GM847). Our results also indicate that inhibition of DNA replication, using phosphonoacetic acid, did not affect HHV-6A/B integration. Certain clones harboring ciHHV-6A/B spontaneously express viral genes and proteins. Treatment of cells with phorbol ester or histone deacetylase inhibitors triggered the expression of many viral genes, including U39, U90, and U100, without the production of infectious virus, suggesting that the tested stimuli were not sufficient to trigger full reactivation. In summary, both integration models yielded comparable results and should enable the identification of viral and cellular factors contributing to HHV-6A/B integration and the screening of drugs influencing viral gene expression, as well as the release of infectious HHV-6A/B from the integrated state. IMPORTANCE: The analysis and understanding of HHV-6A/B genome integration into host DNA is currently limited due to the lack of reproducible and efficient viral integration systems. In the present study, we describe two quantitative cell culture viral integration systems. These systems can be used to define cellular and viral factors that play a role in HHV-6A/B integration. Furthermore, these systems will allow us to decipher the conditions resulting in virus gene expression and excision of the integrated viral genome resulting in reactivation. Copyright © 2017 American Society for Microbiology.
Modelling the co-evolution of indirect genetic effects and inherited variability.
Marjanovic, Jovana; Mulder, Han A; Rönnegård, Lars; Bijma, Piter
2018-03-28
When individuals interact, their phenotypes may be affected not only by their own genes but also by genes in their social partners. This phenomenon is known as Indirect Genetic Effects (IGEs). In aquaculture species and some plants, however, competition not only affects trait levels of individuals, but also inflates variability of trait values among individuals. In the field of quantitative genetics, the variability of trait values has been studied as a quantitative trait in itself, and is often referred to as inherited variability. Such studies, however, consider only the genetic effect of the focal individual on trait variability and do not make a connection to competition. Although the observed phenotypic relationship between competition and variability suggests an underlying genetic relationship, the current quantitative genetic models of IGE and inherited variability do not allow for such a relationship. The lack of quantitative genetic models that connect IGEs to inherited variability limits our understanding of the potential of variability to respond to selection, both in nature and agriculture. Models of trait levels, for example, show that IGEs may considerably change heritable variation in trait values. Currently, we lack the tools to investigate whether this result extends to variability of trait values. Here we present a model that integrates IGEs and inherited variability. In this model, the target phenotype, say growth rate, is a function of the genetic and environmental effects of the focal individual and of the difference in trait value between the social partner and the focal individual, multiplied by a regression coefficient. The regression coefficient is a genetic trait, which is a measure of cooperation; a negative value indicates competition, a positive value cooperation, and an increasing value due to selection indicates the evolution of cooperation. In contrast to the existing quantitative genetic models, our model allows for co-evolution of IGEs and variability, as the regression coefficient can respond to selection. Our simulations show that the model results in increased variability of body weight with increasing competition. When competition decreases, i.e., cooperation evolves, variability becomes significantly smaller. Hence, our model facilitates quantitative genetic studies on the relationship between IGEs and inherited variability. Moreover, our findings suggest that we may have been overlooking an entire level of genetic variation in variability, the one due to IGEs.
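A compact way to write the trait model described in this abstract (the notation below is assumed shorthand, not the authors'): the phenotype of a focal individual i interacting with social partner j is

\[
z_i \;=\; \mu + A_i + E_i + \psi_i\,\bigl(z_j - z_i\bigr),
\qquad
\psi_i \;=\; \mu_\psi + A_{\psi,i} + E_{\psi,i},
\]

where A and E denote additive genetic and environmental effects, and the regression coefficient psi_i is itself a heritable trait; psi_i < 0 corresponds to competition, psi_i > 0 to cooperation, and an increase in psi under selection represents the evolution of cooperation described above.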
Criteria for quantitative and qualitative data integration: mixed-methods research methodology.
Lee, Seonah; Smith, Carrol A M
2012-05-01
Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach argues; thus, we require criteria that provide the right direction to integrate quantitative and qualitative data. The first author used a set of criteria organized from a literature search for integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.
Wickman, Jonas; Diehl, Sebastian; Blasius, Bernd; Klausmeier, Christopher A; Ryabov, Alexey B; Brännström, Åke
2017-04-01
Spatial structure can decisively influence the way evolutionary processes unfold. To date, several methods have been used to study evolution in spatial systems, including population genetics, quantitative genetics, moment-closure approximations, and individual-based models. Here we extend the study of spatial evolutionary dynamics to eco-evolutionary models based on reaction-diffusion equations and adaptive dynamics. Specifically, we derive expressions for the strength of directional and stabilizing/disruptive selection that apply both in continuous space and to metacommunities with symmetrical dispersal between patches. For directional selection on a quantitative trait, this yields a way to integrate local directional selection across space and determine whether the trait value will increase or decrease. The robustness of this prediction is validated against quantitative genetics. For stabilizing/disruptive selection, we show that spatial heterogeneity always contributes to disruptive selection and hence always promotes evolutionary branching. The expression for directional selection is numerically very efficient and hence lends itself to simulation studies of evolutionary community assembly. We illustrate the application and utility of the expressions for this purpose with two examples of the evolution of resource utilization. Finally, we outline the domain of applicability of reaction-diffusion equations as a modeling framework and discuss their limitations.
Bian, Xihui; Li, Shujuan; Lin, Ligang; Tan, Xiaoyao; Fan, Qingjie; Li, Ming
2016-06-21
Accurate predictive modeling is fundamental to the successful analysis of complex samples. To utilize the abundant information embedded in the frequency and time domains, a novel regression model is presented for quantitative analysis of hydrocarbon contents in fuel oil samples. The proposed method, named high and low frequency unfolded PLSR (HLUPLSR), integrates empirical mode decomposition (EMD) and an unfolding strategy with partial least squares regression (PLSR). In the proposed method, the original signals are first decomposed by EMD into a finite number of intrinsic mode functions (IMFs) and a residue. Second, the earlier, high-frequency IMFs are summed into a high-frequency matrix, and the later IMFs and the residue are summed into a low-frequency matrix. Finally, the two matrices are unfolded into an extended matrix along the variable dimension, and the PLSR model is built between the extended matrix and the target values. Coupled with ultraviolet (UV) spectroscopy, HLUPLSR has been applied to determine hydrocarbon contents of light gas oil and diesel fuel samples. Compared with single PLSR and other signal processing techniques, the proposed method shows superior prediction ability and better model interpretation. Therefore, HLUPLSR provides a promising tool for quantitative analysis of complex samples. Copyright © 2016 Elsevier B.V. All rights reserved.
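A minimal sketch of the decomposition-and-unfolding idea described above, assuming the PyEMD and scikit-learn packages; the number of IMFs treated as "high frequency" (k_high), the component count, and the data arrays are placeholders, not values from the paper.

import numpy as np
from PyEMD import EMD
from sklearn.cross_decomposition import PLSRegression

def hluplsr_features(spectra, k_high=3):
    # Split each signal into summed high- and low-frequency parts via EMD,
    # then unfold the two parts side by side along the variable dimension.
    emd = EMD()
    high, low = [], []
    for s in spectra:
        imfs = emd(np.asarray(s, float))        # rows ordered from high to low frequency
        high.append(imfs[:k_high].sum(axis=0))
        low.append(imfs[k_high:].sum(axis=0))   # later IMFs plus the slow trend/residue
    return np.hstack([np.array(high), np.array(low)])

# usage sketch: X is an (n_samples, n_wavelengths) matrix of UV spectra,
# y the measured hydrocarbon contents
# X_ext = hluplsr_features(X, k_high=3)
# model = PLSRegression(n_components=5).fit(X_ext, y)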
Abidi, Samina
2017-10-26
Clinical management of comorbidities is a challenge, especially in a clinical decision support setting, as it requires the safe and efficient reconciliation of multiple disease-specific clinical procedures to formulate a comorbid therapeutic plan that is both effective and safe for the patient. In this paper we pursue the integration of multiple disease-specific Clinical Practice Guidelines (CPG) in order to manage co-morbidities within a computerized Clinical Decision Support System (CDSS). We present a CPG integration framework, termed COMET (Comorbidity Ontological Modeling & ExecuTion), that manifests a knowledge management approach to model, computerize and integrate multiple CPG to yield a comorbid CPG knowledge model that upon execution can provide evidence-based recommendations for handling comorbid patients. COMET exploits semantic web technologies to achieve (a) CPG knowledge synthesis to translate a paper-based CPG to disease-specific clinical pathways (CP) that include specialized co-morbidity management procedures based on input from domain experts; (b) CPG knowledge modeling to computerize the disease-specific CP using a Comorbidity CPG ontology; (c) CPG knowledge integration by aligning multiple ontologically-modeled CP to develop a unified comorbid CPG knowledge model; and (d) CPG knowledge execution using reasoning engines to derive CPG-mediated recommendations for managing patients with comorbidities. We present a web-accessible COMET CDSS that provides family physicians with CPG-mediated comorbidity decision support to manage Atrial Fibrillation and Chronic Heart Failure. We present our qualitative and quantitative analysis of the knowledge content and usability of COMET CDSS.
Comprehensive, Quantitative Risk Assessment of CO2 Geologic Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lepinski, James
2013-09-30
A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO2 capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments were conducted on three (3) sites using the QFMEA model: (1) SACROC Northern Platform CO2-EOR Site in the Permian Basin, Scurry County, TX, (2) Pump Canyon CO2-ECBM Site in the San Juan Basin, San Juan County, NM, and (3) Farnsworth Unit CO2-EOR Site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
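The ranking step described above can be illustrated with a conventional FMEA-style risk priority number; the scoring below (probability x severity x detectability, with a crude fatality escalation) is a hypothetical simplification for illustration, not the report's actual scheme.

from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    probability: int    # 1 (rare) .. 10 (frequent)
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    detectability: int  # 1 (detected early) .. 10 (hard to detect)
    fatality_possible: bool = False

def risk_priority(fm):
    # Classic RPN, doubled when fatalities are possible (illustrative weighting only).
    rpn = fm.probability * fm.severity * fm.detectability
    return rpn * 2 if fm.fatality_possible else rpn

modes = [
    FailureMode("Wellbore casing leak", 3, 8, 6, True),
    FailureMode("Pipeline corrosion", 5, 5, 3),
]
for fm in sorted(modes, key=risk_priority, reverse=True):
    print(fm.name, risk_priority(fm))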
Boyle, Kerry E.; Monaco, Hilary; van Ditmarsch, Dave; Deforet, Maxime; Xavier, Joao B.
2015-01-01
Many unicellular organisms live in multicellular communities that rely on cooperation between cells. However, cooperative traits are vulnerable to exploitation by non-cooperators (cheaters). We expand our understanding of the molecular mechanisms that allow multicellular systems to remain robust in the face of cheating by dissecting the dynamic regulation of cooperative rhamnolipids required for swarming in Pseudomonas aeruginosa. We combine mathematical modeling and experiments to quantitatively characterize the integration of metabolic and population density signals (quorum sensing) governing expression of the rhamnolipid synthesis operon rhlAB. The combined computational/experimental analysis reveals that when nutrients are abundant, rhlAB promoter activity increases gradually in a density-dependent way. When growth slows down due to nutrient limitation, rhlAB promoter activity can stop abruptly, decrease gradually or even increase depending on whether the growth-limiting nutrient is the carbon source, nitrogen source or iron. Starvation by specific nutrients drives growth on intracellular nutrient pools as well as the qualitative rhlAB promoter response, which itself is modulated by quorum sensing. Our quantitative analysis suggests a supply-driven activation that integrates metabolic prudence with quorum sensing in a non-digital manner and allows P. aeruginosa cells to invest in cooperation only when the population size is large enough (quorum sensing) and individual cells have enough metabolic resources to do so (metabolic prudence). Thus, the quantitative description of rhlAB regulatory dynamics brings a greater understanding of the regulation required to make swarming cooperation stable. PMID:26102206
NASA Astrophysics Data System (ADS)
Issaei, Ali; Szczygiel, Lukasz; Hossein-Javaheri, Nima; Young, Mei; Molday, L. L.; Molday, R. S.; Sarunic, M. V.
2011-03-01
Scanning Laser Ophthalmoscopy (SLO) and Optical Coherence Tomography (OCT) are complementary retinal imaging modalities. Integration of SLO and OCT allows both fluorescence detection and depth-resolved structural imaging of the retinal cell layers to be performed in vivo. System customization is required to image rodents used in medical research by vision scientists. We are investigating multimodal SLO/OCT imaging of a rodent model of Stargardt's Macular Dystrophy, which is characterized by retinal degeneration and accumulation of toxic autofluorescent lipofuscin deposits. Our new findings demonstrate the ability to track fundus autofluorescence and retinal degeneration concurrently.
Tensorial Minkowski functionals of triply periodic minimal surfaces
Mickel, Walter; Schröder-Turk, Gerd E.; Mecke, Klaus
2012-01-01
A fundamental understanding of the formation and properties of a complex spatial structure relies on robust quantitative tools to characterize morphology. A systematic approach to the characterization of average properties of anisotropic complex interfacial geometries is provided by integral geometry which furnishes a family of morphological descriptors known as tensorial Minkowski functionals. These functionals are curvature-weighted integrals of tensor products of position vectors and surface normal vectors over the interfacial surface. We here demonstrate their use by application to non-cubic triply periodic minimal surface model geometries, whose Weierstrass parametrizations allow for accurate numerical computation of the Minkowski tensors. PMID:24098847
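For reference, one common convention for the surface Minkowski tensors described above (notation and normalisation vary between papers, so treat this as an assumed form rather than the authors' exact definition):

\[
W_\nu^{r,s}(K) \;\propto\; \int_{\partial K} G_\nu(\mathbf{x})\,
\underbrace{\mathbf{x}\otimes\cdots\otimes\mathbf{x}}_{r}\;\otimes\;
\underbrace{\mathbf{n}\otimes\cdots\otimes\mathbf{n}}_{s}\;\mathrm{d}A,
\qquad G_1 = 1,\quad G_2 = H,\quad G_3 = G,
\]

where x is the position vector on the interface, n the unit surface normal, H the mean curvature and G the Gaussian curvature; setting r = s = 0 recovers the familiar scalar Minkowski functionals.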
System monitoring and diagnosis with qualitative models
NASA Technical Reports Server (NTRS)
Kuipers, Benjamin
1991-01-01
A substantial foundation of tools for model-based reasoning with incomplete knowledge was developed: QSIM (a qualitative simulation program) and its extensions for qualitative simulation; Q2, Q3 and their successors for quantitative reasoning on a qualitative framework; and the CC (component-connection) and QPC (Qualitative Process Theory) model compilers for building QSIM QDE (qualitative differential equation) models starting from different ontological assumptions. Other model-compilers for QDEs, e.g., using bond graphs or compartmental models, have been developed elsewhere. These model-building tools will support automatic construction of qualitative models from physical specifications, and further research into selection of appropriate modeling viewpoints. For monitoring and diagnosis, plausible hypotheses are unified against observations to strengthen or refute the predicted behaviors. In MIMIC (Model Integration via Mesh Interpolation Coefficients), multiple hypothesized models of the system are tracked in parallel in order to reduce the 'missing model' problem. Each model begins as a qualitative model, and is unified with a priori quantitative knowledge and with the stream of incoming observational data. When the model/data unification yields a contradiction, the model is refuted. When there is no contradiction, the predictions of the model are progressively strengthened, for use in procedure planning and differential diagnosis. Only under a qualitative level of description can a finite set of models guarantee the complete coverage necessary for this performance. The results of this research are presented in several publications. Abstracts of these published papers are presented along with abstracts of papers representing work that was synergistic with the NASA grant but funded otherwise. These 28 papers include but are not limited to: 'Combined qualitative and numerical simulation with Q3'; 'Comparative analysis and qualitative integral representations'; 'Model-based monitoring of dynamic systems'; 'Numerical behavior envelopes for qualitative models'; 'Higher-order derivative constraints in qualitative simulation'; and 'Non-intersection of trajectories in qualitative phase space: a global constraint for qualitative simulation.'
AlQuraishi, Mohammed; Tang, Shengdong; Xia, Xide
2015-11-19
Molecular interactions between proteins and DNA molecules underlie many cellular processes, including transcriptional regulation, chromosome replication, and nucleosome positioning. Computational analyses of protein-DNA interactions rely on experimental data characterizing known protein-DNA interactions structurally and biochemically. While many databases exist that contain either structural or biochemical data, few integrate these two data sources in a unified fashion. Such integration is becoming increasingly critical with the rapid growth of structural and biochemical data, and the emergence of algorithms that rely on the synthesis of multiple data types to derive computational models of molecular interactions. We have developed an integrated affinity-structure database in which the experimental and quantitative DNA binding affinities of helix-turn-helix proteins are mapped onto the crystal structures of the corresponding protein-DNA complexes. This database provides access to: (i) protein-DNA structures, (ii) quantitative summaries of protein-DNA binding affinities using position weight matrices, and (iii) raw experimental data of protein-DNA binding instances. Critically, this database establishes a correspondence between experimental structural data and quantitative binding affinity data at the single basepair level. Furthermore, we present a novel alignment algorithm that structurally aligns the protein-DNA complexes in the database and creates a unified residue-level coordinate system for comparing the physico-chemical environments at the interface between complexes. Using this unified coordinate system, we compute the statistics of atomic interactions at the protein-DNA interface of helix-turn-helix proteins. We provide an interactive website for visualization, querying, and analyzing this database, and a downloadable version to facilitate programmatic analysis. This database will facilitate the analysis of protein-DNA interactions and the development of programmatic computational methods that capitalize on integration of structural and biochemical datasets. The database can be accessed at http://ProteinDNA.hms.harvard.edu.
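As an illustration of mapping quantitative binding data to a position weight matrix (item (ii) above), a minimal sketch; the pseudo-count, background model, and example numbers are assumptions for illustration, not the database's actual processing.

import numpy as np

BASES = "ACGT"

def pwm_from_counts(counts, background=0.25, pseudo=1.0):
    # counts: (L, 4) array of per-position base preferences (e.g. scaled from
    # relative binding affinities). Returns a log2-odds position weight matrix.
    counts = np.asarray(counts, float) + pseudo
    probs = counts / counts.sum(axis=1, keepdims=True)
    return np.log2(probs / background)

def score(pwm, site):
    # Sum the log-odds contributions of each base in a candidate binding site.
    return sum(pwm[i, BASES.index(b)] for i, b in enumerate(site))

pwm = pwm_from_counts([[12, 1, 1, 2], [1, 1, 13, 1], [2, 10, 2, 2]])
print(score(pwm, "AGC"))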
NASA Astrophysics Data System (ADS)
Fanchiang, Christine
Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission from commercial space tourism, to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle into cutting edge rocketry with the assumption that the astronauts could be trained and will adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process, and ensure that the crewmembers are well-integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here is done at a very fundamental level starting with identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. While the model is the primary tangible product from this research, the more interesting outcome of this work is the structure of the framework and what it tells future researchers in terms of where the gaps and limitations exist for developing a better framework. It also identifies metrics that can now be collected as part of future validation efforts for the model.
NASA Technical Reports Server (NTRS)
Pepper, Stephen V.
1995-01-01
A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can straightforwardly be quantitatively integrated over the angular aperture without considering non-uniform incident intensity. This quantitative approach is applied to the thickness determination of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. Therefore, this paper illustrates a method for more quantitative use of a grazing angle objective for infrared reflectance microspectroscopy.
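The reflectance calculation described above can be sketched as follows; the aperture half-angles, the refractive index value, and the uniform angular weighting are placeholders for illustration, and the paper's exact weighting over the aperture may differ.

import numpy as np
from scipy.integrate import quad

def fresnel_R(theta, n, pol="p"):
    # Reflectance of a bare surface with (possibly complex) refractive index n
    # at incidence angle theta (radians), for s or p polarisation.
    s = np.sqrt(n**2 - np.sin(theta)**2 + 0j)
    if pol == "s":
        r = (np.cos(theta) - s) / (np.cos(theta) + s)
    else:
        r = (n**2 * np.cos(theta) - s) / (n**2 * np.cos(theta) + s)
    return abs(r)**2

def aperture_averaged_R(n, theta_min, theta_max, pol="p"):
    # Average Fresnel reflectance over the angular aperture, assuming uniform
    # incident intensity across angles (the assumption tested in the paper).
    num, _ = quad(lambda t: fresnel_R(t, n, pol), theta_min, theta_max)
    return num / (theta_max - theta_min)

# e.g. a grazing-angle objective spanning roughly 65-85 degrees (hypothetical span)
print(aperture_averaged_R(3.42, np.radians(65), np.radians(85)))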
Using Active Learning to Teach Concepts and Methods in Quantitative Biology.
Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A
2015-11-01
This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
How predictive quantitative modelling of tissue organisation can inform liver disease pathogenesis.
Drasdo, Dirk; Hoehme, Stefan; Hengstler, Jan G
2014-10-01
From the more than 100 liver diseases described, many of those with high incidence rates manifest themselves by histopathological changes, such as hepatitis, alcoholic liver disease, fatty liver disease, fibrosis, and, in its later stages, cirrhosis, hepatocellular carcinoma, primary biliary cirrhosis and other disorders. Studies of disease pathogeneses are largely based on integrating -omics data pooled from cells at different locations with spatial information from stained liver structures in animal models. Even though this has led to significant insights, the complexity of interactions as well as the involvement of processes at many different time and length scales constrains the possibility to condense disease processes in illustrations, schemes and tables. The combination of modern imaging modalities with image processing and analysis, and mathematical models opens up a promising new approach towards a quantitative understanding of pathologies and of disease processes. This strategy is discussed for two examples, ammonia metabolism after drug-induced acute liver damage, and the recovery of liver mass as well as architecture during the subsequent regeneration process. This interdisciplinary approach permits integration of biological mechanisms and models of processes contributing to disease progression at various scales into mathematical models. These can be used to perform in silico simulations to promote unravelling the relation between architecture and function as below illustrated for liver regeneration, and bridging from the in vitro situation and animal models to humans. In the near future novel mechanisms will usually not be directly elucidated by modelling. However, models will falsify hypotheses and guide towards the most informative experimental design. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.
Fang, Jiansong; Pang, Xiaocong; Wu, Ping; Yan, Rong; Gao, Li; Li, Chao; Lian, Wenwen; Wang, Qi; Liu, Ai-lin; Du, Guan-hua
2016-05-01
A dataset of 67 berberine derivatives for the inhibition of butyrylcholinesterase (BuChE) was studied based on the combination of quantitative structure-activity relationship (QSAR) models, molecular docking, and molecular dynamics methods. First, a series of berberine derivatives were reported, and their inhibitory activities toward butyrylcholinesterase (BuChE) were evaluated. In 2D-QSAR studies, the best model, built by partial least squares, had a conventional correlation coefficient for the training set (R²) of 0.883, a cross-validation correlation coefficient (Q²cv) of 0.777, and a conventional correlation coefficient for the test set (R²pred) of 0.775. The model was also confirmed by Y-randomization examination. In addition, molecular docking and molecular dynamics simulation were performed to better elucidate the inhibitory mechanism of three typical berberine derivatives (berberine, C2, and C55) toward BuChE. The predicted binding free energy results were consistent with the experimental data and showed that the van der Waals energy term (ΔEvdw) difference played the most important role in differentiating the activity among the three inhibitors (berberine, C2, and C55). The developed QSAR models provide details on the fine relationship linking structure and activity and offer clues for structural modifications, and the molecular simulation helps to understand the inhibitory mechanism of the three typical inhibitors. In conclusion, the results of this study provide useful clues for new drug design and discovery of BuChE inhibitors from berberine derivatives. © 2015 John Wiley & Sons A/S.
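The statistics quoted above (R², Q²cv, R²pred) correspond to the standard PLS-QSAR workflow sketched below with scikit-learn; the descriptor matrices, component count, and train/test split are placeholders rather than the study's actual setup.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

def fit_pls_qsar(X_train, y_train, X_test, y_test, n_components=3):
    pls = PLSRegression(n_components=n_components).fit(X_train, y_train)
    r2 = r2_score(y_train, pls.predict(X_train).ravel())        # R2 of the training fit
    y_cv = cross_val_predict(pls, X_train, y_train, cv=LeaveOneOut())
    q2 = r2_score(y_train, np.ravel(y_cv))                      # Q2cv (leave-one-out cross-validation)
    r2_pred = r2_score(y_test, pls.predict(X_test).ravel())     # R2pred on the external test set
    return pls, r2, q2, r2_pred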
USDA-ARS's Scientific Manuscript database
The genomics revolution provides vital tools to address global food security. Yet to be incorporated into livestock breeding, molecular techniques need to be integrated into a quantitative genetics framework. Within the U.S., with shrinking faculty numbers with the requisite skills, the capacity to ...
Measuring the Beginning: A Quantitative Study of the Transition to Higher Education
ERIC Educational Resources Information Center
Brooman, Simon; Darwent, Sue
2014-01-01
This quantitative study measures change in certain factors known to influence success of first-year students during the transition to higher education: self-efficacy, autonomous learning and social integration. A social integration scale was developed with three subscales: "sense of belonging", "relationship with staff" and…
A framework for scalable parameter estimation of gene circuit models using structural information.
Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin
2013-07-01
Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
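A toy illustration of the decomposition idea described above, for a two-gene circuit: each rate equation is integrated on its own, with the other species held to its previously reconstructed trajectory via interpolation, and the one-dimensional integrations are iterated. The circuit, parameter names, and values are invented for illustration, not taken from the framework.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d

t = np.linspace(0, 50, 200)
x2_guess = np.zeros_like(t)            # initial guess for the gene-2 trajectory

def integrate_decoupled(x2_traj, k1=1.0, k2=0.6, d1=0.1, d2=0.1, K=2.0):
    x2_of_t = interp1d(t, x2_traj, fill_value="extrapolate")
    # gene 1 repressed by gene 2 (x2 treated as a known input, not a coupled state)
    sol1 = solve_ivp(lambda tt, x: k1 / (1 + (x2_of_t(tt) / K)**2) - d1 * x,
                     (t[0], t[-1]), [0.0], t_eval=t)
    x1_traj = sol1.y[0]
    x1_of_t = interp1d(t, x1_traj, fill_value="extrapolate")
    # gene 2 activated by gene 1
    sol2 = solve_ivp(lambda tt, x: k2 * x1_of_t(tt) / (K + x1_of_t(tt)) - d2 * x,
                     (t[0], t[-1]), [0.0], t_eval=t)
    return x1_traj, sol2.y[0]

# iterate the decoupled integrations until the reconstructed trajectories settle
x1, x2 = integrate_decoupled(x2_guess)
for _ in range(20):
    x1, x2 = integrate_decoupled(x2)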
Quantitative analysis of microbial biomass yield in aerobic bioreactor.
Watanabe, Osamu; Isoda, Satoru
2013-12-01
We have studied the integrated model of reaction rate equations with thermal energy balance in aerobic bioreactor for food waste decomposition and showed that the integrated model has the capability both of monitoring microbial activity in real time and of analyzing biodegradation kinetics and thermal-hydrodynamic properties. On the other hand, concerning microbial metabolism, it was known that balancing catabolic reactions with anabolic reactions in terms of energy and electron flow provides stoichiometric metabolic reactions and enables the estimation of microbial biomass yield (stoichiometric reaction model). We have studied a method for estimating real-time microbial biomass yield in the bioreactor during food waste decomposition by combining the integrated model with the stoichiometric reaction model. As a result, it was found that the time course of microbial biomass yield in the bioreactor during decomposition can be evaluated using the operational data of the bioreactor (weight of input food waste and bed temperature) by the combined model. The combined model can be applied to manage a food waste decomposition not only for controlling system operation to keep microbial activity stable, but also for producing value-added products such as compost on optimum condition. Copyright © 2013 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.
Brock, P M; Fornace, K M; Parmiter, M; Cox, J; Drakeley, C J; Ferguson, H M; Kao, R R
2016-04-01
The public health threat posed by zoonotic Plasmodium knowlesi appears to be growing: it is increasingly reported across South East Asia, and is the leading cause of malaria in Malaysian Borneo. Plasmodium knowlesi threatens progress towards malaria elimination as aspects of its transmission, such as spillover from wildlife reservoirs and reliance on outdoor-biting vectors, may limit the effectiveness of conventional methods of malaria control. The development of new quantitative approaches that address the ecological complexity of P. knowlesi, particularly through a focus on its primary reservoir hosts, will be required to control it. Here, we review what is known about P. knowlesi transmission, identify key knowledge gaps in the context of current approaches to transmission modelling, and discuss the integration of these approaches with clinical parasitology and geostatistical analysis. We highlight the need to incorporate the influences of fine-scale spatial variation, rapid changes to the landscape, and reservoir population and transmission dynamics. The proposed integrated approach would address the unique challenges posed by malaria as a zoonosis, aid the identification of transmission hotspots, provide insight into the mechanistic links between incidence and land use change and support the design of appropriate interventions.
Quantitative Methods in the Study of Local History
ERIC Educational Resources Information Center
Davey, Pene
1974-01-01
The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)
NASA Astrophysics Data System (ADS)
Tian, Y.; Zheng, Y.; Zheng, C.; Han, F., Sr.
2017-12-01
Physically based and fully-distributed integrated hydrological models (IHMs) can quantitatively depict hydrological processes, both surface and subsurface, with sufficient spatial and temporal details. However, the complexity involved in pre-processing data and setting up models seriously hindered the wider application of IHMs in scientific research and management practice. This study introduces our design and development of Visual HEIFLOW, hereafter referred to as VHF, a comprehensive graphical data processing and modeling system for integrated hydrological simulation. The current version of VHF has been structured to accommodate an IHM named HEIFLOW (Hydrological-Ecological Integrated watershed-scale FLOW model). HEIFLOW is a model being developed by the authors, which has all typical elements of physically based and fully-distributed IHMs. It is based on GSFLOW, a representative integrated surface water-groundwater model developed by USGS. HEIFLOW provides several ecological modules that enable to simulate growth cycle of general vegetation and special plants (maize and populus euphratica). VHF incorporates and streamlines all key steps of the integrated modeling, and accommodates all types of GIS data necessary to hydrological simulation. It provides a GIS-based data processing framework to prepare an IHM for simulations, and has functionalities to flexibly display and modify model features (e.g., model grids, streams, boundary conditions, observational sites, etc.) and their associated data. It enables visualization and various spatio-temporal analyses of all model inputs and outputs at different scales (i.e., computing unit, sub-basin, basin, or user-defined spatial extent). The above system features, as well as many others, can significantly reduce the difficulty and time cost of building and using a complex IHM. The case study in the Heihe River Basin demonstrated the applicability of VHF for large scale integrated SW-GW modeling. Visualization and spatial-temporal analysis of the modeling results by HEIFLOW greatly facilitates our understanding on the complicated hydrologic cycle and relationship among the hydrological and ecological variables in the study area, and provides insights into the regional water resources management.
Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo
2016-09-01
Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approach as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support of drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Evolution of flowering strategies in Oenothera glazioviana: an integral projection model approach.
Rees, Mark; Rose, Karen E
2002-01-01
The timing of reproduction is a key determinant of fitness. Here, we develop parameterized integral projection models of size-related flowering for the monocarpic perennial Oenothera glazioviana and use these to predict the evolutionarily stable strategy (ESS) for flowering. For the most part there is excellent agreement between the model predictions and the results of quantitative field studies. However, the model predicts a much steeper relationship between plant size and the probability of flowering than observed in the field, indicating selection for a 'threshold size' flowering function. Elasticity and sensitivity analysis of population growth rate lambda and net reproductive rate R(0) are used to identify the critical traits that determine fitness and control the ESS for flowering. Using the fitted model we calculate the fitness landscape for invading genotypes and show that this is characterized by a ridge of approximately equal fitness. The implications of these results for the maintenance of genetic variation are discussed. PMID:12137582
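A generic numerical sketch of an integral projection model of this kind: the kernel is discretised on a size mesh with the midpoint rule and the dominant eigenvalue of the resulting matrix gives lambda. The survival, growth, flowering, and fecundity functions below are arbitrary stand-ins, not the fitted Oenothera functions.

import numpy as np

n, L, U = 100, 0.0, 10.0                       # mesh size and size limits (e.g. log rosette diameter)
h = (U - L) / n
x = L + (np.arange(n) + 0.5) * h               # midpoints of the size mesh

def s(x):  return 1 / (1 + np.exp(-(x - 3)))              # survival probability
def pf(x): return 1 / (1 + np.exp(-2 * (x - 6)))          # probability of flowering (monocarpic: flowering ends the life cycle)
def g(y, x):                                               # growth kernel: size y next year given size x
    return np.exp(-(y - (0.6 * x + 2))**2 / 2) / np.sqrt(2 * np.pi)
def f(y, x):                                               # fecundity kernel: recruits of size y from a size-x flowering plant
    seeds = np.exp(0.5 * x)
    recruit = np.exp(-(y - 1.5)**2 / (2 * 0.5**2)) / np.sqrt(2 * np.pi * 0.5**2)
    return 0.01 * seeds * recruit

Y, X = np.meshgrid(x, x, indexing="ij")
P = s(X) * (1 - pf(X)) * g(Y, X) * h            # survival and growth of non-flowering plants
F = s(X) * pf(X) * f(Y, X) * h                  # reproduction by flowering plants
K = P + F
lam = np.max(np.abs(np.linalg.eigvals(K)))      # population growth rate lambda
print(lam)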
Integrated Experimental and Modelling Research for Non-Ferrous Smelting and Recycling Systems
NASA Astrophysics Data System (ADS)
Jak, Evgueni; Hidayat, Taufiq; Shishin, Denis; Mehrjardi, Ata Fallah; Chen, Jiang; Decterov, Sergei; Hayes, Peter
The chemistries of industrial pyrometallurgical non-ferrous smelting and recycling processes are becoming increasingly complex. Optimisation of process conditions, charge composition, temperature, oxygen partial pressure, and partitioning of minor elements between phases and different process streams requires an accurate description of phase equilibria and thermodynamics, which are the focus of the present research. The experiments involve high temperature equilibration in controlled gas atmospheres, rapid quenching and direct measurement of equilibrium phase compositions with quantitative microanalytical techniques including electron probe X-ray microanalysis and Laser Ablation ICP-MS. The thermodynamic modelling is undertaken using the computer package FactSage with the quasi-chemical model for the liquid slag phase and other advanced models. Experimental and modelling studies are combined into an integrated research program focused on the major-element Cu-Pb-Fe-O-Si-S system, slagging Al, Ca, Mg and other minor elements. The ongoing development of the research methodologies has resulted in significant advances in research capabilities. Examples of applications are given.
A Computational Model of Linguistic Humor in Puns.
Kao, Justine T; Levy, Roger; Goodman, Noah D
2016-07-01
Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we propose two information-theoretic measures, ambiguity and distinctiveness, derived from a simple model of sentence processing. We test these measures on a set of puns and regular sentences and show that they correlate significantly with human judgments of funniness. Moreover, within a set of puns, the distinctiveness measure distinguishes exceptionally funny puns from mediocre ones. Our work is the first, to our knowledge, to integrate a computational model of general language understanding and humor theory to quantitatively predict humor at a fine-grained level. We present it as an example of a framework for applying models of language processing to understand higher-level linguistic and cognitive phenomena. © 2015 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
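A rough sketch of the two measures named above, computed from a distribution over candidate meanings and per-word "support" for each meaning; the paper's actual probabilistic sentence-processing model is richer, so treat this as an assumed simplification.

import numpy as np
from scipy.stats import entropy

def ambiguity(p_meanings):
    # Entropy of the posterior over meanings given the sentence.
    return entropy(p_meanings, base=2)

def distinctiveness(support_m1, support_m2):
    # Symmetrised KL divergence between the distributions of per-word evidence
    # supporting each of the two meanings.
    p = np.asarray(support_m1, float); p /= p.sum()
    q = np.asarray(support_m2, float); q /= q.sum()
    return 0.5 * (entropy(p, q, base=2) + entropy(q, p, base=2))

# e.g. a pun: two meanings nearly equally plausible, supported by different words
print(ambiguity([0.55, 0.45]), distinctiveness([4, 1, 1, 3], [1, 4, 3, 1]))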
NASA Astrophysics Data System (ADS)
Yamamoto, Takahiro; Nadaoka, Kazuo
2018-04-01
Atmospheric, watershed and coastal ocean models were integrated to provide a holistic analysis approach for coastal ocean simulation. The coupled model was applied to a coastal ocean area in the Philippines where terrestrial sediment loads delivered from several adjacent watersheds play a major role in influencing coastal turbidity and are partly responsible for the coastal ecosystem degradation. The coupled model was validated using weather and hydrologic measurements to examine its potential applicability. The results revealed that coastal water quality may be governed by the loads not only from the adjacent watersheds but also from distant watersheds via coastal currents. This important feature of the multiple linkages can be quantitatively characterized by a "stress connectivity matrix", which indicates the complex underlying structure of environmental stresses in the coastal ocean. The multiple stress connectivity concept shows the potential advantage of the integrated modelling approach for coastal ocean assessment, which may also help compensate for the lack of measured data, especially in tropical basins.
Dynamic Redox Regulation of IL-4 Signaling.
Dwivedi, Gaurav; Gran, Margaret A; Bagchi, Pritha; Kemp, Melissa L
2015-11-01
Quantifying the magnitude and dynamics of protein oxidation during cell signaling is technically challenging. Computational modeling provides tractable, quantitative methods to test hypotheses of redox mechanisms that may be simultaneously operative during signal transduction. The interleukin-4 (IL-4) pathway, which has previously been reported to induce reactive oxygen species and oxidation of PTP1B, may be controlled by several other putative mechanisms of redox regulation; widespread proteomic thiol oxidation observed via 2D redox differential gel electrophoresis upon IL-4 treatment suggests more than one redox-sensitive protein implicated in this pathway. Through computational modeling and a model selection strategy that relied on characteristic STAT6 phosphorylation dynamics of IL-4 signaling, we identified reversible protein tyrosine phosphatase (PTP) oxidation as the primary redox regulatory mechanism in the pathway. A systems-level model of IL-4 signaling was developed that integrates synchronous pan-PTP oxidation with ROS-independent mechanisms. The model quantitatively predicts the dynamics of IL-4 signaling over a broad range of new redox conditions, offers novel hypotheses about regulation of JAK/STAT signaling, and provides a framework for interrogating putative mechanisms involving receptor-initiated oxidation.
Approaches to developing alternative and predictive toxicology based on PBPK/PD and QSAR modeling.
Yang, R S; Thomas, R S; Gustafson, D L; Campain, J; Benjamin, S A; Verhaar, H J; Mumtaz, M M
1998-01-01
Systematic toxicity testing, using conventional toxicology methodologies, of single chemicals and chemical mixtures is highly impractical because of the immense numbers of chemicals and chemical mixtures involved and the limited scientific resources. Therefore, the development of unconventional, efficient, and predictive toxicology methods is imperative. Using carcinogenicity as an end point, we present approaches for developing predictive tools for toxicologic evaluation of chemicals and chemical mixtures relevant to environmental contamination. Central to the approaches presented is the integration of physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) and quantitative structure-activity relationship (QSAR) modeling with focused mechanistically based experimental toxicology. In this development, molecular and cellular biomarkers critical to the carcinogenesis process are evaluated quantitatively between different chemicals and/or chemical mixtures. Examples presented include the integration of PBPK/PD and QSAR modeling with a time-course medium-term liver foci assay, molecular biology and cell proliferation studies, Fourier transform infrared spectroscopic analyses of DNA changes, and cancer modeling to assess and attempt to predict the carcinogenicity of the series of 12 chlorobenzene isomers. Also presented is an ongoing effort to develop and apply a similar approach to chemical mixtures using in vitro cell culture (Syrian hamster embryo cell transformation assay and human keratinocytes) methodologies and in vivo studies. The promise and pitfalls of these developments are elaborated. When successfully applied, these approaches may greatly reduce animal usage, personnel, resources, and time required to evaluate the carcinogenicity of chemicals and chemical mixtures. PMID:9860897
A multi-objective constraint-based approach for modeling genome-scale microbial ecosystems.
Budinich, Marko; Bourdon, Jérémie; Larhlimi, Abdelhalim; Eveillard, Damien
2017-01-01
Interplay within microbial communities impacts ecosystems on several scales, and elucidation of the consequent effects is a difficult task in ecology. In particular, the integration of genome-scale data within quantitative models of microbial ecosystems remains elusive. This study advocates the use of constraint-based modeling to build predictive models from recent high-resolution -omics datasets. Following recent studies that have demonstrated the accuracy of constraint-based models (CBMs) for simulating single-strain metabolic networks, we sought to study microbial ecosystems as a combination of single-strain metabolic networks that exchange nutrients. This study presents two multi-objective extensions of CBMs for modeling communities: multi-objective flux balance analysis (MO-FBA) and multi-objective flux variability analysis (MO-FVA). Both methods were applied to a hot spring mat model ecosystem. As a result, multiple trade-offs between nutrients and growth rates, as well as thermodynamically favorable relative abundances at community level, were emphasized. We expect this approach to be used for integrating genomic information in microbial ecosystems. Following models will provide insights about behaviors (including diversity) that take place at the ecosystem scale.
Extended Kalman Filter framework for forecasting shoreline evolution
Long, Joseph; Plant, Nathaniel G.
2012-01-01
A shoreline change model incorporating both long- and short-term evolution is integrated into a data assimilation framework that uses sparse observations to generate an updated forecast of shoreline position and to estimate unobserved geophysical variables and model parameters. Application of the assimilation algorithm provides quantitative statistical estimates of combined model-data forecast uncertainty, which is crucial for developing hazard vulnerability assessments, evaluating prediction skill, and identifying future data collection needs. Significant attention is given to the estimation of four non-observable parameter values and to the separation of two scales of shoreline evolution using only one observable morphological quantity (i.e., shoreline position).
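As a minimal sketch of the assimilation cycle such a framework relies on, the following example runs a Kalman predict/update loop on a toy linear shoreline model (position plus long-term trend). All values are illustrative assumptions; the actual framework uses a nonlinear shoreline change model, augments the state with four non-observable parameters, and propagates the full forecast uncertainty.

```python
# Toy (linear) Kalman filter cycle for shoreline position and erosion trend.
import numpy as np

dt = 1.0                                   # years between surveys (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition: position, trend
H = np.array([[1.0, 0.0]])                 # only shoreline position is observed
Q = np.diag([0.5, 0.01])                   # model (process) error covariance
R = np.array([[4.0]])                      # observation error covariance

x = np.array([0.0, -1.0])                  # initial state: 0 m, eroding 1 m/yr
P = np.diag([10.0, 1.0])

observations = [-1.2, -2.5, -2.9, -4.8]    # sparse shoreline surveys (m), toy
for z in observations:
    # Forecast step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step (assimilate the survey)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    print(f"position {x[0]:6.2f} m, trend {x[1]:6.2f} m/yr, var {P[0,0]:.2f}")
```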
QSAR modeling of cumulative environmental end-points for the prioritization of hazardous chemicals.
Gramatica, Paola; Papa, Ester; Sangion, Alessandro
2018-01-24
The hazard of chemicals in the environment is inherently related to the molecular structure and derives simultaneously from various chemical properties/activities/reactivities. Models based on Quantitative Structure Activity Relationships (QSARs) are useful to screen, rank and prioritize chemicals that may have an adverse impact on humans and the environment. This paper reviews a selection of QSAR models (based on theoretical molecular descriptors) developed for cumulative multivariate endpoints, which were derived by mathematical combination of multiple effects and properties. The cumulative end-points provide an integrated holistic point of view to address environmentally relevant properties of chemicals.
The structure and timescales of heat perception in larval zebrafish.
Haesemeyer, Martin; Robson, Drew N; Li, Jennifer M; Schier, Alexander F; Engert, Florian
2015-11-25
Avoiding temperatures outside the physiological range is critical for animal survival, but how temperature dynamics are transformed into behavioral output is largely not understood. Here, we used an infrared laser to challenge freely swimming larval zebrafish with "white-noise" heat stimuli and built quantitative models relating external sensory information and internal state to behavioral output. These models revealed that larval zebrafish integrate temperature information over a time-window of 400 ms preceding a swim bout and that swimming is suppressed right after the end of a bout. Our results suggest that larval zebrafish compute both an integral and a derivative of the heat stimulus over time to guide their next movement. Our models put important constraints on the type of computations that occur in the nervous system and reveal principles of how somatosensory temperature information is processed to guide behavioral decisions, such as sensitivity to both absolute levels and changes in stimulation.
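A rough sketch of such an integral-plus-derivative readout is shown below: temperature over the preceding 400 ms is summarized by a windowed integral and a slope, combined linearly, and passed through a logistic link to give a bout probability. The weights, bias, and stimulus are hypothetical illustration values, not the fitted model from the study.

```python
# Toy integral + derivative readout of a temperature trace (values assumed).
import numpy as np

fs = 100                                     # sampling rate in Hz (assumed)
window = int(0.4 * fs)                       # 400 ms integration window
t = np.arange(0, 5, 1 / fs)
temperature = 26 + 2 * np.clip(t - 2, 0, 1)  # toy heat ramp starting at t = 2 s

w_int, w_deriv, bias = 0.05, 3.0, -1.6       # hypothetical model weights

drive = []
for i in range(window, len(temperature)):
    seg = temperature[i - window:i]
    integral = np.trapz(seg, dx=1 / fs)      # integral over preceding 400 ms
    derivative = (seg[-1] - seg[0]) / 0.4    # mean slope across the window
    drive.append(w_int * integral + w_deriv * derivative + bias)

bout_probability = 1 / (1 + np.exp(-np.array(drive)))   # squashed to [0, 1]
print(f"peak bout probability: {bout_probability.max():.2f}")
```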
Guo, Yujie; Shen, Jie; Ye, Xuchun; Chen, Huali; Jiang, Anli
2013-08-01
This paper aims to report the design and test the effectiveness of an innovative caring teaching model based on the theoretical framework of caring in the Chinese context. Since the 1970s, caring has been a core value in nursing education. In a previous study, a theoretical framework of caring in the Chinese context was explored using a grounded theory study and considered beneficial for caring education. A caring teaching model was designed theoretically, and a one-group pre- and post-test quasi-experimental study was administered to test its effectiveness. From October 2009 to July 2010, a cohort of grade-2 undergraduate nursing students (n=64) in a Chinese medical school was recruited to participate in the study. Data were gathered through quantitative and qualitative methods to evaluate the effectiveness of the caring teaching model. The caring teaching model created an esthetic situation and experiential learning style for teaching caring that was integrated within the curricula. Quantitative data from the quasi-experimental study showed that the post-test scores of each item were higher than those on the pre-test (p<0.01). Thematic analysis of 1220 narratives from students' caring journals and reports of participant class observation revealed two main thematic categories, which reflected, from the students' points of view, the development of student caring character and the impact that the caring teaching model had in this regard. The model could be used as an integrated approach to teach caring in nursing curricula. It would also be beneficial for nursing administrators in cultivating caring nurse practitioners. Copyright © 2012 Elsevier Ltd. All rights reserved.
Vlot, Anna H C; de Witte, Wilhelmus E A; Danhof, Meindert; van der Graaf, Piet H; van Westen, Gerard J P; de Lange, Elizabeth C M
2017-12-04
Selectivity is an important attribute of effective and safe drugs, and prediction of in vivo target and tissue selectivity would likely improve drug development success rates. However, a lack of understanding of the underlying (pharmacological) mechanisms and a lack of directly applicable predictive methods complicate the prediction of selectivity. We explore the value of combining physiologically based pharmacokinetic (PBPK) modeling with quantitative structure-activity relationship (QSAR) modeling to predict the influence of the target dissociation constant (KD) and the target dissociation rate constant on target and tissue selectivity. The KD values of CB1 ligands in the ChEMBL database are predicted by QSAR random forest (RF) modeling for the CB1 receptor and known off-targets (TRPV1, mGlu5, 5-HT1a). Of these CB1 ligands, rimonabant, CP-55940, and Δ8-tetrahydrocannabinol, one of the active ingredients of cannabis, were selected for simulations of target occupancy for CB1, TRPV1, mGlu5, and 5-HT1a in three brain regions, to illustrate the principles of the combined PBPK-QSAR modeling. Our combined PBPK and target binding modeling demonstrated that the optimal values of the KD and koff for target and tissue selectivity were dependent on target concentration and tissue distribution kinetics. Interestingly, if the target concentration is high and the perfusion of the target site is low, the optimal KD value is often not the lowest KD value, suggesting that optimization towards high drug-target affinity can decrease the benefit-risk ratio. The presented integrative structure-pharmacokinetic-pharmacodynamic modeling provides an improved understanding of tissue and target selectivity.
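The qualitative point about KD, koff, target concentration, and perfusion can be illustrated with a single hypothetical tissue compartment in which drug delivery competes with target binding. The sketch below uses invented parameter values and is not the brain-region PBPK model from the study.

```python
# Toy occupancy simulation: perfusion-limited delivery plus reversible binding.
import numpy as np

kon, koff = 0.1, 0.01          # 1/(nM*h), 1/h  -> KD = koff/kon = 0.1 nM
R_tot = 50.0                   # target concentration in tissue (nM), assumed
Q_over_V = 0.2                 # perfusion rate constant (1/h), low perfusion
k_el = 0.3                     # plasma elimination rate (1/h)
C0_plasma = 10.0               # initial plasma concentration (nM)

dt, t_end = 0.01, 48.0
C_tissue, occ = 0.0, 0.0       # free tissue concentration, fractional occupancy
for step in range(int(t_end / dt)):
    t = step * dt
    C_plasma = C0_plasma * np.exp(-k_el * t)
    binding = kon * C_tissue * (1.0 - occ) - koff * occ   # net binding flux
    C_tissue += dt * (Q_over_V * (C_plasma - C_tissue) - R_tot * binding)
    occ += dt * binding
print(f"occupancy after {t_end:.0f} h: {occ:.2f}")
```

Varying koff, R_tot, and Q_over_V in such a toy model shows why the highest affinity is not always the most selective choice when the target site is poorly perfused and target-mediated binding retains the drug.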
NASA Astrophysics Data System (ADS)
Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen
2016-07-01
Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.
Saitou, Takashi; Imamura, Takeshi
2016-01-01
Cell cycle progression is strictly coordinated to ensure proper tissue growth, development, and regeneration of multicellular organisms. Spatiotemporal visualization of cell cycle phases directly helps us to obtain a deeper understanding of controlled, multicellular, cell cycle progression. The fluorescent ubiquitination-based cell cycle indicator (Fucci) system allows us to monitor, in living cells, the G1 and the S/G2/M phases of the cell cycle in red and green fluorescent colors, respectively. Since the discovery of Fucci technology, it has found numerous applications in the characterization of the timing of cell cycle phase transitions under diverse conditions and various biological processes. However, due to the complexity of cell cycle dynamics, understanding of specific patterns of cell cycle progression is still far from complete. In order to tackle this issue, quantitative approaches combined with mathematical modeling seem to be essential. Here, we review several studies that attempted to integrate Fucci technology and mathematical models to obtain quantitative information regarding cell cycle regulatory patterns. Focusing on the technological development of utilizing mathematics to retrieve meaningful information from the Fucci producing data, we discuss how the combined methods advance a quantitative understanding of cell cycle regulation. © 2015 Japanese Society of Developmental Biologists.
Yuan, Naiming; Fu, Zuntao; Liu, Shida
2014-01-01
Long-term memory (LTM) in climate variability is studied by means of fractional integral techniques. Using a recently developed model, the Fractional Integral Statistical Model (FISM), in this report we propose a new method with which one can quantitatively estimate the long-lasting influences of historical climate states on the present and further extract this influence as climate memory signals. To show the usability of this method, two examples, the Northern Hemisphere monthly Temperature Anomalies (NHTA) and the Pacific Decadal Oscillation index (PDO), are analyzed in this study. We find that the climate memory signals can indeed be extracted and that the whole variation can be further decomposed into two parts: the cumulative climate memory (CCM) and the weather-scale excitation (WSE). The stronger the LTM, the larger the proportion of the whole variation that the climate memory signals account for. With the climate memory signals extracted, one can at least determine on what basis the considered time series will continue to change. Therefore, this report provides a new perspective on climate prediction. PMID:25300777
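As a generic illustration of how fractional integration produces long-term memory (the construction FISM builds on), the sketch below filters white noise through the binomial weights of (1 - B)^(-d) and separates the cumulative contribution of past shocks from the current innovation. The memory parameter d and the decomposition shown are illustrative and do not reproduce the estimation procedure of the paper.

```python
# Generic fractional-integration illustration (not the FISM estimator).
import numpy as np

rng = np.random.default_rng(0)
n, d = 2000, 0.3                       # series length and memory parameter

# Binomial weights of the fractional integration operator (1 - B)^(-d)
psi = np.empty(n)
psi[0] = 1.0
for k in range(1, n):
    psi[k] = psi[k - 1] * (k - 1 + d) / k

noise = rng.standard_normal(n)          # stands in for weather-scale excitation
series = np.array([np.dot(psi[:k + 1][::-1], noise[:k + 1]) for k in range(n)])

# The memory part at time k is the contribution of past shocks,
# i.e. everything except the current innovation.
memory = series - noise
print(f"series std {series.std():.2f}, memory std {memory.std():.2f}")
```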
Integrated presentation of ecological risk from multiple stressors
NASA Astrophysics Data System (ADS)
Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman
2016-10-01
Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.
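A hedged sketch of the prevalence idea follows: sample environmental scenarios, run an effect model per scenario, and report the fraction of scenarios in which the endpoint violates a protection goal. The distributions, effect model, and threshold below are hypothetical placeholders rather than the models used in the framework.

```python
# Toy Monte Carlo "prevalence" calculation over sampled environmental scenarios.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
temperature = rng.normal(15, 4, n)                       # regional condition (deg C)
exposure = rng.lognormal(mean=0.0, sigma=1.0, size=n)    # chemical stressor (ug/L)

# Toy effect model: growth declines with exposure, and faster when it is warm.
relative_growth = 1.0 / (1.0 + exposure / (5.0 - 0.1 * (temperature - 15)))

prevalence = np.mean(relative_growth < 0.8)   # protection goal: >= 80% growth
print(f"prevalence of unacceptable effect: {prevalence:.1%}")
```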
Bridging the Engineering and Medicine Gap
NASA Technical Reports Server (NTRS)
Walton, M.; Antonsen, E.
2018-01-01
A primary challenge NASA faces is communication between the disparate entities of engineers and human system experts in life sciences. Clear communication is critical for exploration mission success from the perspective of both risk analysis and data handling. The engineering community uses probabilistic risk assessment (PRA) models to inform their own risk analysis and has extensive experience managing mission data, but does not always fully consider human systems integration (HSI). The medical community, as a part of HSI, has been working 1) to develop a suite of tools to express medical risk in quantitative terms that are relatable to the engineering approaches commonly in use, and 2) to manage and integrate HSI data with engineering data. This talk will review the development of the Integrated Medical Model as an early attempt to bridge the communication gap between the medical and engineering communities in the language of PRA. This will also address data communication between the two entities in the context of data management considerations of the Medical Data Architecture. Lessons learned from these processes will help identify important elements to consider in future communication and integration of these two groups.
The SAGE Model of Social Psychological Research
Power, Séamus A.; Velez, Gabriel; Qadafi, Ahmad; Tennant, Joseph
2018-01-01
We propose a SAGE model for social psychological research. Encapsulated in our acronym is a proposal to have a synthetic approach to social psychological research, in which qualitative methods are augmentative to quantitative ones, qualitative methods can be generative of new experimental hypotheses, and qualitative methods can capture experiences that evade experimental reductionism. We remind social psychological researchers that psychology was founded in multiple methods of investigation at multiple levels of analysis. We discuss historical examples and our own research as contemporary examples of how a SAGE model can operate in part or as an integrated whole. The implications of our model are discussed. PMID:29361241
Carroll, Linda J; Rothe, J Peter
2010-09-01
As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Moreover, using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.
NASA Astrophysics Data System (ADS)
Poltavsky, Igor; DiStasio, Robert A.; Tkatchenko, Alexandre
2018-03-01
Nuclear quantum effects (NQE), which include both zero-point motion and tunneling, exhibit quite an impressive range of influence over the equilibrium and dynamical properties of molecules and materials. In this work, we extend our recently proposed perturbed path-integral (PPI) approach for modeling NQE in molecular systems [I. Poltavsky and A. Tkatchenko, Chem. Sci. 7, 1368 (2016)], which successfully combines the advantages of thermodynamic perturbation theory with path-integral molecular dynamics (PIMD), in a number of important directions. First, we demonstrate the accuracy, performance, and general applicability of the PPI approach to both molecules and extended (condensed-phase) materials. Second, we derive a series of estimators within the PPI approach to enable calculations of structural properties such as radial distribution functions (RDFs) that exhibit rapid convergence with respect to the number of beads in the PIMD simulation. Finally, we introduce an effective nuclear temperature formalism within the framework of the PPI approach and demonstrate that such effective temperatures can be an extremely useful tool in quantitatively estimating the "quantumness" associated with different degrees of freedom in the system as well as providing a reliable quantitative assessment of the convergence of PIMD simulations. Since the PPI approach only requires the use of standard second-order imaginary-time PIMD simulations, these developments enable one to include a treatment of NQE in equilibrium thermodynamic properties (such as energies, heat capacities, and RDFs) with the accuracy of higher-order methods but at a fraction of the computational cost, thereby enabling first-principles modeling that simultaneously accounts for the quantum mechanical nature of both electrons and nuclei in large-scale molecules and materials.
NASA Astrophysics Data System (ADS)
Doummar, Joanna; Kassem, Assaad
2017-04-01
In the framework of a three-year PEER (USAID/NSF) funded project, flow in a karst system in Lebanon (Assal) dominated by snow and semi-arid conditions was simulated and successfully calibrated using an integrated numerical model (MIKE SHE 2016) based on high-resolution input data and detailed catchment characterization. Point source infiltration and fast flow pathways were simulated by a bypass function and a highly conductive lens, respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and assessing their influence on recharge signals in the different hydrological karst compartments (atmosphere, unsaturated zone and saturated zone) based on the integrated numerical model. These parameters are usually attributed different weights according to their estimated impact on groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and outline parameters that are not accounted for in standard methods, but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from atmosphere to unsaturated zone to saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., in varying slopes, land cover, precipitation intensity, and soil properties, as well as point source infiltration). Furthermore, a sensitivity analysis of individual or coupled major parameters allows quantifying their impact on recharge and indirectly on vulnerability. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each one of them, based on a quantitative approach.
Blatt, Michael R.; Wang, Yizhou; Leonhardt, Nathalie; Hills, Adrian
2014-01-01
It is widely recognized that the nature and characteristics of transport across eukaryotic membranes are so complex as to defy intuitive understanding. In these circumstances, quantitative mathematical modeling is an essential tool, both to integrate detailed knowledge of individual transporters and to extract the properties emergent from their interactions. As the first, fully integrated and quantitative modeling environment for the study of ion transport dynamics in a plant cell, OnGuard offers a unique tool for exploring homeostatic properties emerging from the interactions of ion transport, both at the plasma membrane and tonoplast in the guard cell. OnGuard has already yielded detail sufficient to guide phenotypic and mutational studies, and it represents a key step toward ‘reverse engineering’ of stomatal guard cell physiology, based on rational design and testing in simulation, to improve water use efficiency and carbon assimilation. Its construction from the HoTSig libraries enables translation of the software to other cell types, including growing root hairs and pollen. The problems inherent to transport are nonetheless challenging, and are compounded for those unfamiliar with the conceptual ‘mindset’ of the modeler. Here we set out guidelines for the use of OnGuard and outline a standardized approach that will enable users to advance quickly to its application both in the classroom and laboratory. We also highlight the uncanny and emergent property of OnGuard models to reproduce the ‘communication’ evident between the plasma membrane and tonoplast of the guard cell. PMID:24268743
An integrative model of evolutionary covariance: a symposium on body shape in fishes.
Walker, Jeffrey A
2010-12-01
A major direction of current and future biological research is to understand how multiple, interacting functional systems coordinate in producing a body that works. This understanding is complicated by the fact that organisms need to work well in multiple environments, with both predictable and unpredictable environmental perturbations. Furthermore, organismal design reflects a history of past environments and not a plan for future environments. How complex, interacting functional systems evolve, then, is a truly grand challenge. In accepting the challenge, an integrative model of evolutionary covariance is developed. The model combines quantitative genetics, functional morphology/physiology, and functional ecology. The model is used to convene scientists ranging from geneticists, to physiologists, to ecologists, to engineers to facilitate the emergence of body shape in fishes as a model system for understanding how complex, interacting functional systems develop and evolve. Body shape of fish is a complex morphology that (1) results from many developmental paths and (2) functions in many different behaviors. Understanding the coordination and evolution of the many paths from genes to body shape, body shape to function, and function to a working fish body in a dynamic environment is now possible given new technologies from genetics to engineering and new theoretical models that integrate the different levels of biological organization (from genes to ecology).
Practice to research: integrating evidence-based practices with culture and context.
Weisner, Thomas S; Hay, M Cameron
2015-04-01
There are ways to integrate culturally competent services (CCS) and evidence-based practices (EBP) which can improve the experiences of patients and their families and communities when faced with health problems, as well as the effectiveness and positive experiences of practitioners. CCS and EBP evidence should be jointly deployed for helping patients and clinicians. Partnership research models are useful for achieving the integration of CCS and EBP, since they involve close observation of and participation by clinicians and practitioners in the research process, and often use integrated qualitative and quantitative mixed methods. We illustrate this with 3 examples of work that can help integrate CCS and EBP: ongoing collection of information from patients, clinicians and staff, or "evidence farming"; close study and continuous improvement of activities and accommodations; and use of evidence of tacit, implicit cultural scripts and norms, such as being "productive," as well as explicit scripts. From a research practice point of view, collaborative partnerships will likely produce research with culture and context bracketed in, and will contribute stronger research models, methods, and units of analysis. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
NASA Astrophysics Data System (ADS)
Wang, He; Zhang, Wen-Hao; Wong, K. Y. Michael; Wu, Si
Extensive studies suggest that the brain integrates multisensory signals in a Bayesian optimal way. However, it remains largely unknown how the sensory reliability and the prior information shape the neural architecture. In this work, we propose a biologically plausible neural field model, which can perform optimal multisensory integration and encode the whole profile of the posterior. Our model is composed of two modules, each for one modality. Crosstalk between the two modules can be carried out through feedforward cross-links and reciprocal connections. We found that the reciprocal couplings are crucial to optimal multisensory integration in that the reciprocal coupling pattern is shaped by the correlation in the joint prior distribution of the sensory stimuli. A perturbative approach is developed to illustrate the relation between the prior information and features in coupling patterns quantitatively. Our results show that a decentralized architecture based on reciprocal connections is able to accommodate complex correlation structures across modalities and utilize this prior information in optimal multisensory integration. This work is supported by the Research Grants Council of Hong Kong (N_HKUST606/12 and 605813) and National Basic Research Program of China (2014CB846101) and the Natural Science Foundation of China (31261160495).
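For reference, the Bayesian-optimal computation the network is meant to approximate reduces, for two Gaussian cues about the same stimulus, to reliability-weighted averaging. The short sketch below shows that calculation with illustrative numbers; the paper's model encodes the full posterior in neural population activity rather than just these summary statistics.

```python
# Reliability-weighted fusion of two Gaussian cues (the Bayes-optimal estimate).
def fuse(mu1, var1, mu2, var2):
    """Combine two cues by weighting each with its inverse variance."""
    w1 = (1 / var1) / (1 / var1 + 1 / var2)
    w2 = 1 - w1
    mu = w1 * mu1 + w2 * mu2
    var = 1 / (1 / var1 + 1 / var2)
    return mu, var

# e.g. a visual cue at 10 deg (precise) and a vestibular cue at 20 deg (noisy)
print(fuse(10.0, 2.0, 20.0, 8.0))   # -> (12.0, 1.6)
```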
Wilczynski, Bartek; Furlong, Eileen E M
2010-04-15
Development is regulated by dynamic patterns of gene expression, which are orchestrated through the action of complex gene regulatory networks (GRNs). Substantial progress has been made in modeling transcriptional regulation in recent years, ranging from qualitative "coarse-grain" models operating at the gene level to very "fine-grain" quantitative models operating at the biophysical "transcription factor-DNA level". Recent advances in genome-wide studies have revealed an enormous increase in the size and complexity of GRNs. Even relatively simple developmental processes can involve hundreds of regulatory molecules, with extensive interconnectivity and cooperative regulation. This leads to an explosion in the number of regulatory functions, effectively impeding Boolean-based qualitative modeling approaches. At the same time, the lack of information on the biophysical properties for the majority of transcription factors within a global network restricts quantitative approaches. In this review, we explore the current challenges in moving from modeling medium-scale, well-characterized networks to more poorly characterized global networks. We suggest integrating coarse- and fine-grain approaches to model gene regulatory networks in cis. We focus on two very well-studied examples from Drosophila, which likely represent typical developmental regulatory modules across metazoans. Copyright (c) 2009 Elsevier Inc. All rights reserved.
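As a reminder of what a coarse-grain description looks like in practice, the sketch below steps a three-gene Boolean network through its synchronous updates. The wiring is hypothetical and chosen only to show how qualitative logic rules generate dynamics without any biophysical parameters.

```python
# Hypothetical three-gene Boolean network with synchronous updates.
def step(state):
    a, b, c = state["A"], state["B"], state["C"]
    return {
        "A": not c,          # C represses A
        "B": a,              # A activates B
        "C": a and b,        # C needs both A and B
    }

state = {"A": True, "B": False, "C": False}
for t in range(6):
    print(t, state)
    state = step(state)      # the trajectory settles into a cyclic attractor
```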
System Modeling and Diagnostics for Liquefying-Fuel Hybrid Rockets
NASA Technical Reports Server (NTRS)
Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann
2003-01-01
A Hybrid Combustion Facility (HCF) was recently built at NASA Ames Research Center to study the combustion properties of a new fuel formulation that burns approximately three times faster than conventional hybrid fuels. Researchers at Ames working in the area of Integrated Vehicle Health Management recognized a good opportunity to apply IVHM techniques to a candidate technology for next generation launch systems. Five tools were selected to examine various IVHM techniques for the HCF. Three of the tools, TEAMS (Testability Engineering and Maintenance System), L2 (Livingstone2), and RODON, are model-based reasoning (or diagnostic) systems. Two other tools in this study, ICS (Interval Constraint Simulator) and IMS (Inductive Monitoring System) do not attempt to isolate the cause of the failure but may be used for fault detection. Models of varying scope and completeness were created, both qualitative and quantitative. In each of the models, the structure and behavior of the physical system are captured. In the qualitative models, the temporal aspects of the system behavior and the abstraction of sensor data are handled outside of the model and require the development of additional code. In the quantitative model, less extensive processing code is also necessary. Examples of fault diagnoses are given.
Aarons, Gregory A; Green, Amy E; Willging, Cathleen E; Ehrhart, Mark G; Roesch, Scott C; Hecht, Debra B; Chaffin, Mark J
2014-12-10
This study examines sustainment of an EBI implemented in 11 United States service systems across two states, and delivered in 87 counties. The aims are to 1) determine the impact of state and county policies and contracting on EBI provision and sustainment; 2) investigate the role of public, private, and academic relationships and collaboration in long-term EBI sustainment; 3) assess organizational and provider factors that affect EBI reach/penetration, fidelity, and organizational sustainment climate; and 4) integrate findings through a collaborative process involving the investigative team, consultants, and system and community-based organization (CBO) stakeholders in order to further develop and refine a conceptual model of sustainment to guide future research and provide a resource for service systems to prepare for sustainment as the ultimate goal of the implementation process. A mixed-method prospective and retrospective design will be used. Semi-structured individual and group interviews will be used to collect information regarding influences on EBI sustainment including policies, attitudes, and practices; organizational factors and external policies affecting model implementation; involvement of or collaboration with other stakeholders; and outer- and inner-contextual supports that facilitate ongoing EBI sustainment. Document review (e.g., legislation, executive orders, regulations, monitoring data, annual reports, agendas and meeting minutes) will be used to examine the roles of state, county, and local policies in EBI sustainment. Quantitative measures will be collected via administrative data and web surveys to assess EBI reach/penetration, staff turnover, EBI model fidelity, organizational culture and climate, work attitudes, implementation leadership, sustainment climate, attitudes toward EBIs, program sustainment, and level of institutionalization. Hierarchical linear modeling will be used for quantitative analyses. Qualitative analyses will be tailored to each of the qualitative methods (e.g., document review, interviews). Qualitative and quantitative approaches will be integrated through an inclusive process that values stakeholder perspectives. The study of sustainment is critical to capitalizing on and benefiting from the time and fiscal investments in EBI implementation. Sustainment is also critical to realizing broad public health impact of EBI implementation. The present study takes a comprehensive mixed-method approach to understanding sustainment and refining a conceptual model of sustainment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.
2011-02-01
This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.
Venkatakrishnan, K; Friberg, L E; Ouellet, D; Mettetal, J T; Stein, A; Trocóniz, I F; Bruno, R; Mehrotra, N; Gobburu, J; Mould, D R
2015-01-01
Despite advances in biomedical research that have deepened our understanding of cancer hallmarks, resulting in the discovery and development of targeted therapies, the success rates of oncology drug development remain low. Opportunities remain for objective dose selection informed by exposure-response understanding to optimize the benefit-risk balance of novel therapies for cancer patients. This review article discusses the principles and applications of modeling and simulation approaches across the lifecycle of development of oncology therapeutics. Illustrative examples are used to convey the value gained from integration of quantitative clinical pharmacology strategies from the preclinical-translational phase through confirmatory clinical evaluation of efficacy and safety. © 2014 American Society for Clinical Pharmacology and Therapeutics.
Qualitative and Quantitative Distinctions in Personality Disorder
Wright, Aidan G. C.
2011-01-01
The “categorical-dimensional debate” has catalyzed a wealth of empirical advances in the study of personality pathology. However, this debate is merely one articulation of a broader conceptual question regarding whether to define and describe psychopathology as a quantitatively extreme expression of normal functioning or as qualitatively distinct in its process. In this paper I argue that dynamic models of personality (e.g., object-relations, cognitive-affective processing system) offer the conceptual scaffolding to reconcile these seemingly incompatible approaches to characterizing the relationship between normal and pathological personality. I propose that advances in personality assessment that sample behavior and experiences intensively provide the empirical techniques, whereas interpersonal theory offers an integrative theoretical framework, for accomplishing this goal. PMID:22804676
Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik
2016-01-01
Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840
NASA Astrophysics Data System (ADS)
Torres-Verdin, C.
2007-05-01
This paper describes the successful implementation of a new 3D AVA stochastic inversion algorithm to quantitatively integrate pre-stack seismic amplitude data and well logs. The stochastic inversion algorithm is used to characterize flow units of a deepwater reservoir located in the central Gulf of Mexico. Conventional fluid/lithology sensitivity analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generates typical Class III AVA responses. On the other hand, layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution. Accordingly, AVA stochastic inversion, which combines the advantages of AVA analysis with those of geostatistical inversion, provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties (P-velocity, S-velocity, density) and lithotype (sand-shale) distributions. The quantitative use of rock/fluid information through AVA seismic amplitude data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, yields accurate 3D models of petrophysical properties such as porosity and permeability. Finally, by fully integrating pre-stack seismic amplitude data and well logs, the vertical resolution of inverted products is higher than that of deterministic inversion methods.
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, in which two key questions will be faced: one is how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning work and to find a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework of land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, which emphasizes quantitative processes, effectively compensated for the SA method, which emphasizes qualitative processes, thereby realizing the organic combination of qualitative and quantitative land-use planning work and providing a new idea and method for land-use planning and the sustainable management of land resources.
Protein kinetic signatures of the remodeling heart following isoproterenol stimulation.
Lam, Maggie P Y; Wang, Ding; Lau, Edward; Liem, David A; Kim, Allen K; Ng, Dominic C M; Liang, Xiangbo; Bleakley, Brian J; Liu, Chenguang; Tabaraki, Jason D; Cadeiras, Martin; Wang, Yibin; Deng, Mario C; Ping, Peipei
2014-04-01
Protein temporal dynamics play a critical role in time-dimensional pathophysiological processes, including the gradual cardiac remodeling that occurs in early-stage heart failure. Methods for quantitative assessments of protein kinetics are lacking, and despite knowledge gained from single-protein studies, integrative views of the coordinated behavior of multiple proteins in cardiac remodeling are scarce. Here, we developed a workflow that integrates deuterium oxide (2H2O) labeling, high-resolution mass spectrometry (MS), and custom computational methods to systematically interrogate in vivo protein turnover. Using this workflow, we characterized the in vivo turnover kinetics of 2,964 proteins in a mouse model of β-adrenergic-induced cardiac remodeling. The data provided a quantitative and longitudinal view of cardiac remodeling at the molecular level, revealing widespread kinetic regulations in calcium signaling, metabolism, proteostasis, and mitochondrial dynamics. We translated the workflow to human studies, creating a reference dataset of 496 plasma protein turnover rates from 4 healthy adults. The approach is applicable to short, minimal label enrichment and can be performed on as little as a single biopsy, thereby overcoming critical obstacles to clinical investigations. The protein turnover quantitation experiments and computational workflow described here should be widely applicable to large-scale biomolecular investigations of human disease mechanisms with a temporal perspective.
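The core quantitative step, estimating a first-order turnover rate from the rise of label incorporation, can be sketched as a simple curve fit. The time points and "fraction labeled" values below are synthetic; the published workflow infers labeling from peptide isotopomer patterns in the MS data rather than from such ready-made values.

```python
# Fit a first-order turnover rate to synthetic label-incorporation data.
import numpy as np
from scipy.optimize import curve_fit

def labeled_fraction(t, k):
    return 1.0 - np.exp(-k * t)       # first-order approach to plateau

days = np.array([0.0, 1.0, 3.0, 7.0, 14.0])
measured = np.array([0.0, 0.18, 0.45, 0.75, 0.93])   # synthetic example data

(k_fit,), _ = curve_fit(labeled_fraction, days, measured, p0=[0.1])
print(f"turnover rate k = {k_fit:.3f} /day, half-life = {np.log(2)/k_fit:.1f} days")
```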
An integrated conceptual framework for evaluating and improving 'understanding' in informed consent.
Bossert, Sabine; Strech, Daniel
2017-10-17
The development of understandable informed consent (IC) documents has proven to be one of the most important challenges in research with humans as well as in healthcare settings. Therefore, evaluating and improving understanding has been of increasing interest for empirical research on IC. However, several conceptual and practical challenges for the development of understandable IC documents remain unresolved. In this paper, we will outline and systematize some of these challenges. On the basis of our own experiences in empirical user testing of IC documents as well as the relevant literature on understanding in IC, we propose an integrated conceptual model for the development of understandable IC documents. The proposed conceptual model integrates different methods for the participatory improvement of written information, including IC, as well as quantitative methods for measuring understanding in IC. In most IC processes, understandable written information is an important prerequisite for valid IC. To improve the quality of IC documents, a conceptual model for participatory procedures of testing, revising, and retesting can be applied. However, the model presented in this paper needs further theoretical and empirical elaboration and clarification of several conceptual and practical challenges.
Bilinearity in Spatiotemporal Integration of Synaptic Inputs
Li, Songting; Liu, Nan; Zhang, Xiao-hui; Zhou, Douglas; Cai, David
2014-01-01
Neurons process information via integration of synaptic inputs from dendrites. Many experimental results demonstrate that dendritic integration can be highly nonlinear, yet few theoretical analyses have been performed to obtain a precise quantitative characterization analytically. Based on asymptotic analysis of a two-compartment passive cable model, given a pair of time-dependent synaptic conductance inputs, we derive a bilinear spatiotemporal dendritic integration rule. The summed somatic potential can be well approximated by the linear summation of the two postsynaptic potentials elicited separately, plus a third, bilinear term equal to their product multiplied by a proportionality coefficient. The rule is valid for a pair of synaptic inputs of all types, including excitation-inhibition, excitation-excitation, and inhibition-inhibition. In addition, the rule is valid during the whole dendritic integration process for a pair of synaptic inputs with arbitrary input time differences and input locations. This coefficient is demonstrated to be nearly independent of the input strengths but is dependent on input times and input locations. This rule is then verified through simulation of a realistic pyramidal neuron model and in electrophysiological experiments on rat hippocampal CA1 neurons. The rule is further generalized to describe the spatiotemporal dendritic integration of multiple excitatory and inhibitory synaptic inputs. The integration of multiple inputs can be decomposed into the sum of all possible pairwise integrations, where each paired integration obeys the bilinear rule. This decomposition leads to a graph representation of dendritic integration, which can be viewed as functionally sparse. PMID:25521832
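Numerically, the rule states that the summed response is approximately the linear sum of the individually measured responses plus a product term. The sketch below evaluates it for one hypothetical excitatory-inhibitory pair; the coefficient value (here called k) is assumed for illustration, whereas in the study it depends on input times and locations and is derived from the two-compartment cable model.

```python
# Bilinear summation of two postsynaptic potentials (illustrative numbers).
def bilinear_sum(v1, v2, k):
    """Predicted summed PSP (mV) from individually measured PSPs v1, v2."""
    return v1 + v2 + k * v1 * v2

v_exc, v_inh = 4.0, -2.5        # excitatory and inhibitory PSPs measured alone
k = -0.05                       # shunting coefficient (1/mV), assumed

print(bilinear_sum(v_exc, v_inh, k))   # 4.0 - 2.5 + (-0.05)*(-10.0) = 2.0
```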
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially after it discovered that these models are consequently highly complex: they require not only a large number of parameters (not all of which can be easily, or at all, measured or identified, and which are often associated with large uncertainties), but also deep knowledge, on the part of their users, of most of the implemented physical, mechanical, chemical, and biological processes. The real or perceived complexity of these models then discourages users from applying them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar/overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models are still an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
Global, quantitative and dynamic mapping of protein subcellular localization.
Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh
2016-06-09
Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.
qF-SSOP: real-time optical property corrected fluorescence imaging
Valdes, Pablo A.; Angelo, Joseph P.; Choi, Hak Soo; Gioux, Sylvain
2017-01-01
Fluorescence imaging is well suited to provide image guidance during resections in oncologic and vascular surgery. However, the distorting effects of tissue optical properties on the emitted fluorescence are poorly compensated for on even the most advanced fluorescence image guidance systems, leading to subjective and inaccurate estimates of tissue fluorophore concentrations. Here we present a novel fluorescence imaging technique that performs real-time (i.e., video rate) optical property corrected fluorescence imaging. We perform full field of view simultaneous imaging of tissue optical properties using Single Snapshot of Optical Properties (SSOP) and fluorescence detection. The estimated optical properties are used to correct the emitted fluorescence with a quantitative fluorescence model to provide quantitative fluorescence-Single Snapshot of Optical Properties (qF-SSOP) images with less than 5% error. The technique is rigorous, fast, and quantitative, enabling ease of integration into the surgical workflow with the potential to improve molecular guidance intraoperatively. PMID:28856038
Bashiri, Azadeh; Shahmoradi, Leila; Beigy, Hamid; Savareh, Behrouz A; Nosratabadi, Masood; N Kalhori, Sharareh R; Ghazisaeedi, Marjan
2018-06-01
Quantitative EEG provides valuable information in the clinical evaluation of psychological disorders. The purpose of the present study is to identify the most prominent features of quantitative electroencephalography (QEEG) that affect attention and response control parameters in children with attention deficit hyperactivity disorder. The QEEG features and the Integrated Visual and Auditory Continuous Performance Test (IVA-CPT) results of 95 attention deficit hyperactivity disorder subjects were preprocessed using the Independent Evaluation Criterion for Binary Classification. Then, the importance of the selected features in classifying the desired outputs was evaluated using an artificial neural network. The findings identified the highest-ranking QEEG features for each IVA-CPT parameter related to attention and response control. The designed model could help therapists determine the presence or absence of defects in attention and response control relying on QEEG.
Rusyn, Ivan; Sedykh, Alexander; Guyton, Kathryn Z.; Tropsha, Alexander
2012-01-01
Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules; however, traditional QSAR models often have limited predictive power due to the lack of data and complexity of in vivo endpoints. Although it has been indeed difficult to obtain experimentally derived toxicity data on a large number of chemicals in the past, the results of quantitative in vitro screening of thousands of environmental chemicals in hundreds of experimental systems are now available and continue to accumulate. In addition, publicly accessible toxicogenomics data collected on hundreds of chemicals provide another dimension of molecular information that is potentially useful for predictive toxicity modeling. These new characteristics of molecular bioactivity arising from short-term biological assays, i.e., in vitro screening and/or in vivo toxicogenomics data can now be exploited in combination with chemical structural information to generate hybrid QSAR–like quantitative models to predict human toxicity and carcinogenicity. Using several case studies, we illustrate the benefits of a hybrid modeling approach, namely improvements in the accuracy of models, enhanced interpretation of the most predictive features, and expanded applicability domain for wider chemical space coverage. PMID:22387746
ERIC Educational Resources Information Center
Luyt, Russell
2012-01-01
A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watney, W.L.
1992-08-01
Interdisciplinary studies of the Upper Pennsylvanian Lansing and Kansas City groups have been undertaken in order to improve the geologic characterization of petroleum reservoirs and to develop a quantitative understanding of the processes responsible for formation of associated depositional sequences. To this end, concepts and methods of sequence stratigraphy are being used to define and interpret the three-dimensional depositional framework of the Kansas City Group. The investigation includes characterization of reservoir rocks in oil fields in western Kansas, description of analog equivalents in near-surface and surface sites in southeastern Kansas, and construction of a regional structural and stratigraphic framework to link the site-specific studies. Geologic inverse and simulation models are being developed to integrate quantitative estimates of controls on sedimentation to produce reconstructions of reservoir-bearing strata in an attempt to enhance our ability to predict reservoir characteristics.
Bayram, Jamil D; Zuabi, Shawki; Subbarao, Italo
2011-06-01
Hospital surge capacity in multiple casualty events (MCE) is the core of hospital medical response, and an integral part of the total medical capacity of the community affected. To date, however, there has been no consensus regarding the definition or quantification of hospital surge capacity. The first objective of this study was to quantitatively benchmark the various components of hospital surge capacity pertaining to the care of critically and moderately injured patients in trauma-related MCE. The second objective was to illustrate the applications of those quantitative parameters in local, regional, national, and international disaster planning; in the distribution of patients to various hospitals by prehospital medical services; and in the decision-making process for ambulance diversion. A 2-step approach was adopted in the methodology of this study. First, an extensive literature search was performed, followed by mathematical modeling. Quantitative studies on hospital surge capacity for trauma injuries were used as the framework for our model. The North Atlantic Treaty Organization triage categories (T1-T4) were used in the modeling process for simplicity purposes. Hospital Acute Care Surge Capacity (HACSC) was defined as the maximum number of critical (T1) and moderate (T2) casualties a hospital can adequately care for per hour, after recruiting all possible additional medical assets. HACSC was modeled to be equal to the number of emergency department beds (#EDB), divided by the emergency department time (EDT); HACSC = #EDB/EDT. In trauma-related MCE, the EDT was quantitatively benchmarked to be 2.5 (hours). Because most of the critical and moderate casualties arrive at hospitals within a 6-hour period requiring admission (by definition), the hospital bed surge capacity must match the HACSC at 6 hours to ensure coordinated care, and it was mathematically benchmarked to be 18% of the staffed hospital bed capacity. Defining and quantitatively benchmarking the different components of hospital surge capacity is vital to hospital preparedness in MCE. Prospective studies of our mathematical model are needed to verify its applicability, generalizability, and validity.
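The surge-capacity relations quoted above (HACSC = #EDB/EDT, with EDT benchmarked at 2.5 hours and hospital bed surge capacity at roughly 18% of staffed beds over 6 hours) lend themselves to a direct calculation. The following sketch applies those benchmarks to a hypothetical hospital; the ED bed count and staffed bed capacity are illustrative assumptions, not values from the study.

```python
# Hedged sketch of the hospital surge-capacity benchmarks described above.
# The numeric inputs (40 ED beds, 500 staffed beds) are hypothetical.

ED_BEDS = 40          # #EDB: emergency department beds (assumed)
EDT_HOURS = 2.5       # benchmarked ED time per T1/T2 casualty (from the abstract)
STAFFED_BEDS = 500    # staffed hospital bed capacity (assumed)

# Hospital Acute Care Surge Capacity: T1 + T2 casualties treatable per hour
hacsc = ED_BEDS / EDT_HOURS

# Casualties arriving over the first 6 hours must be matched by inpatient beds
casualties_6h = hacsc * 6

# The abstract benchmarks hospital bed surge capacity at ~18% of staffed beds
bed_surge_capacity = 0.18 * STAFFED_BEDS

print(f"HACSC: {hacsc:.1f} casualties/hour")
print(f"Casualties over 6 h: {casualties_6h:.0f}")
print(f"Benchmarked bed surge capacity: {bed_surge_capacity:.0f} beds")
```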
Cardozo, Erwing Fabian; Andrade, Adriana; Mellors, John W.; ...
2017-07-05
The kinetics of HIV-1 decay under treatment depends on the class of antiretrovirals used. Mathematical models are useful to interpret the different profiles, providing quantitative information about the kinetics of virus replication and the cell populations contributing to viral decay. We modeled proviral integration in short- and long-lived infected cells to compare viral kinetics under treatment with and without the integrase inhibitor raltegravir (RAL). We fitted the model to data obtained from participants treated with RAL-containing regimens or with a four-drug regimen of protease and reverse transcriptase inhibitors. Our model explains the existence and quantifies the three phases of HIV-1 RNA decay in RAL-based regimens vs. the two phases observed in therapies without RAL. Our findings indicate that HIV-1 infection is mostly sustained by short-lived infected cells with fast integration and a short viral production period, and by long-lived infected cells with slow integration but an equally short viral production period. We propose that these cells represent activated and resting infected CD4+ T-cells, respectively, and estimate that infection of resting cells represents ~4% of productive reverse transcription events in chronic infection. RAL reveals the kinetics of proviral integration, showing that in short-lived cells the pre-integration population has a half-life of ~7 hours, whereas in long-lived cells this half-life is ~6 weeks. We also show that the efficacy of RAL can be estimated by the difference in viral load at the start of the second phase in protocols with and without RAL. Altogether, we provide a mechanistic model of viral infection that parsimoniously explains the kinetics of viral load decline under multiple classes of antiretrovirals.
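The reported half-lives of the pre-integration populations (~7 hours in short-lived cells, ~6 weeks in long-lived cells) translate directly into first-order decay rates, and a sum of exponentials gives the flavour of multi-phase decay. The sketch below shows only that conversion; the initial amplitudes are invented and the full fitted model of the study is not reproduced.

```python
import numpy as np

# Convert the half-lives reported above into first-order decay rates (per day).
# The initial amplitudes A_short and A_long are illustrative assumptions.
t_half_short = 7 / 24      # ~7 hours, short-lived pre-integration population
t_half_long = 6 * 7        # ~6 weeks, long-lived pre-integration population

k_short = np.log(2) / t_half_short
k_long = np.log(2) / t_half_long

t = np.linspace(0, 56, 200)       # days on treatment
A_short, A_long = 1e5, 1e3        # assumed initial contributions

decay = A_short * np.exp(-k_short * t) + A_long * np.exp(-k_long * t)
print(f"k_short = {k_short:.2f} /day, k_long = {k_long:.4f} /day")
```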
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.
Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa
2014-01-01
Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students' apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course in which we integrated application of prerequisite mathematical skills with biology content and reasoning throughout all aspects of the course. In this paper, we describe the principles of our course design and present illustrative examples of course materials integrating mathematics and biology. We also designed an outcome assessment made up of items testing students' understanding of biology concepts and their ability to apply mathematical skills in biological contexts and administered it as a pre/postcourse test to students in the experimental section and other sections of the same course. Precourse results confirmed students' inability to spontaneously transfer their prerequisite mathematics skills to biological problems. Pre/postcourse outcome assessment comparisons showed that, compared with students in other sections, students in the experimental section made greater gains on integrated math/biology items. They also made comparable gains on biology items, indicating that integrating quantitative skills into an introductory biology course does not have a deleterious effect on students' biology learning.
Research of MPPT for photovoltaic generation based on two-dimensional cloud model
NASA Astrophysics Data System (ADS)
Liu, Shuping; Fan, Wei
2013-03-01
The cloud model is a mathematical representation of fuzziness and randomness in linguistic concepts. It represents a qualitative concept with an expected value Ex, entropy En and hyper-entropy He, and integrates the fuzziness and randomness of a linguistic concept in a unified way. The model offers a new method for transforming between qualitative and quantitative knowledge. This paper introduces an MPPT (maximum power point tracking) controller based on a two-dimensional cloud model, developed by analysing the auto-optimizing MPPT control of a photovoltaic power system and combining it with cloud-model theory. Simulation results show that the cloud controller is simple, intuitive, strongly robust and offers better control performance.
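A one-dimensional forward normal cloud generator shows how the three numerical characteristics (Ex, En, He) produce cloud drops and their membership degrees; the two-dimensional controller draws paired drops for its two inputs. This is a generic sketch of the standard algorithm with invented parameter values, not the authors' MPPT controller.

```python
import numpy as np

def normal_cloud(Ex, En, He, n=1000, rng=None):
    """Forward normal cloud generator: returns drops x and memberships mu."""
    rng = rng or np.random.default_rng()
    # For each drop, draw a perturbed entropy En' ~ N(En, He^2),
    # then a drop x ~ N(Ex, En'^2); membership follows a Gaussian law.
    En_prime = rng.normal(En, He, n)
    x = rng.normal(Ex, np.abs(En_prime), n)
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))
    return x, mu

# Example: qualitative concept "voltage error is small" (illustrative numbers)
drops, memberships = normal_cloud(Ex=0.0, En=0.5, He=0.05)
```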
Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.
Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth
2017-03-01
Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated is interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is: is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering, and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model that gives the rates of biochemical reactions, to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models built using the results of reverse engineering opens up the possibility of harnessing the wealth of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house. C++ code templates available in Supplementary material. liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
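One simple way rates from a reverse-engineered population model can seed rules for a particle/agent model is to convert each first-order rate constant into a per-agent reaction probability per time step. The sketch below shows only that conversion; it is a generic illustration, not the EGFR-HER2 pipeline of the paper, and the rate and time step are assumed values.

```python
import math
import random

def reaction_probability(k, dt):
    """Probability that a single agent reacts within time step dt,
    given a first-order rate constant k (same time units as dt)."""
    return 1.0 - math.exp(-k * dt)

# Illustrative numbers only: internalisation rate 0.05 /min, 1-second steps
p = reaction_probability(k=0.05, dt=1 / 60)

# Agent update rule: each surface receptor internalises with probability p
receptors = ["surface"] * 1000
receptors = ["internalised" if random.random() < p else s for s in receptors]
print(f"p = {p:.5f}; internalised after one step: {receptors.count('internalised')}")
```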
Dancik, Yuri; Bigliardi, Paul L; Bigliardi-Qi, Mei
2015-12-01
Animal-based developmental and reproductive toxicological studies involving skin exposure rarely incorporate information on skin permeation kinetics. For practical reasons, animal studies cannot investigate the many factors which can affect human skin permeation and systemic uptake kinetics in real-life scenarios. Traditional route-to-route extrapolation is based on the same types of experiments and requires assumptions regarding route similarity. Pharmacokinetic modeling based on skin physiology and structure is the most efficient way to incorporate the variety of intrinsic skin and exposure-dependent parameters occurring in clinical and occupational settings into one framework. Physiologically-based pharmacokinetic models enable the integration of available in vivo, in vitro and in silico data to quantitatively predict the kinetics of uptake at the site of interest, as needed for 21st century toxicology and risk assessment. As demonstrated herein, proper interpretation and integration of these data is a multidisciplinary endeavor requiring toxicological, risk assessment, mathematical, pharmaceutical, biological and dermatological expertise. Copyright © 2015 Elsevier Inc. All rights reserved.
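As a minimal example of the kind of calculation such a skin model builds on, the steady-state dermal flux can be written with Fick's first law, J = Kp * Cv, and scaled by the exposed area and duration to estimate an absorbed dose. All input values below are assumptions for illustration, and the sketch ignores the lag times, layered skin compartments and systemic kinetics that a full physiologically based model would include.

```python
# Minimal steady-state dermal uptake estimate (Fick's first law); all inputs assumed.

Kp = 1e-3        # permeability coefficient, cm/h (assumed)
Cv = 10.0        # concentration in the applied vehicle, mg/cm^3 (assumed)
area = 100.0     # exposed skin area, cm^2 (assumed)
duration = 8.0   # exposure duration, h (assumed)

flux = Kp * Cv                        # mg per cm^2 per hour
absorbed_dose = flux * area * duration

print(f"Steady-state flux: {flux:.3f} mg/cm^2/h; dose: {absorbed_dose:.1f} mg")
```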
Low, Lian Leng; Maulod, Adlina; Lee, Kheng Hock
2017-10-08
Poorer health outcomes and disproportionate healthcare use in socioeconomically disadvantaged patients is well established. However, there is sparse literature on effective integrated care interventions that specifically target these high-risk individuals. The Integrated Community of Care (ICoC) is a novel care model that integrates hospital-based transitional care with health and social care in the community for high-risk individuals living in socially deprived communities. This study aims to evaluate the effectiveness of the ICoC in reducing acute hospital use and investigate the implementation process and its effects on clinical outcomes using a mixed-methods participatory action research (PAR) approach. This is a single-centre prospective, controlled, observational study performed in the SingHealth Regional Health System. A total of 250 eligible patients from an urbanised low-income community in Singapore will be enrolled during their index hospitalisation. Our PAR model combines two research components: quantitative and qualitative, at different phases of the intervention. Outcomes of acute hospital use and health-related quality of life are compared with controls, at 30 days and 1 year. The qualitative study aims at developing a more context-specific social ecological model of health behaviour. This model will identify how influences within one's social environment: individual, interpersonal, organisational, community and policy factors affect people's experiences and behaviours during care transitions from hospital to home. Knowledge on the operational aspects of ICoC will enrich our evidence-based strategies to understand the impact of the ICoC. The blending of qualitative and quantitative mixed methods recognises the dynamic implementation processes as well as the complex and evolving needs of community stakeholders in shaping outcomes. Ethics approval was granted by the SingHealth Centralised Institutional Review Board (CIRB 2015/2277). The findings from this study will be disseminated by publications in peer-reviewed journals, scientific meetings and presentations to government policy-makers. NCT02678273. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Hanan, Lu; Qiushi, Li; Shaobin, Li
2016-12-01
This paper presents an integrated optimization design method in which uniform design, response surface methodology and a genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain, and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to obtain the optimal solution subject to the constraints. The method has been applied to the optimization design of an axisymmetric diverging duct with three design variables, one qualitative and two quantitative. The modeling and optimization design method performs well in improving the duct's aerodynamic performance and can also be applied to wider fields of mechanical design, serving as a useful tool for engineering designers by reducing design time and computational cost.
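The surrogate-plus-optimizer loop described above can be mimicked compactly by fitting a quadratic response surface to sampled design points and searching it with an evolutionary optimizer. The sketch below uses random sampling instead of uniform design, a least-squares quadratic fit, and SciPy's differential evolution as a stand-in for the genetic algorithm; the objective function and bounds are invented for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Invented stand-in for the expensive CFD evaluation (illustration only)
def expensive_evaluation(x):
    return (x[0] - 0.3) ** 2 + 2 * (x[1] + 0.1) ** 2 + 0.05 * x[0] * x[1]

# 1) Sample the design space (random here; the paper uses uniform design)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))
y = np.array([expensive_evaluation(x) for x in X])

# 2) Fit a quadratic response surface by least squares
def features(x):
    x1, x2 = x
    return [1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2]

A = np.array([features(x) for x in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
surrogate = lambda x: float(np.dot(features(x), coef))

# 3) Search the surrogate with an evolutionary optimizer (GA stand-in)
result = differential_evolution(surrogate, bounds=[(-1, 1), (-1, 1)], seed=0)
print(result.x, result.fun)
```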
Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits
ERIC Educational Resources Information Center
Almalki, Sami
2016-01-01
This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…
Evaluation of an Integrated Curriculum in Physics, Mathematics, Engineering, and Chemistry
NASA Astrophysics Data System (ADS)
Beichner, Robert
1997-04-01
An experimental, student centered, introductory curriculum called IMPEC (for Integrated Mathematics, Physics, Engineering, and Chemistry curriculum) is in its third year of pilot-testing at NCSU. The curriculum is taught by a multidisciplinary team of professors using a combination of traditional lecturing and alternative instructional methods including cooperative learning, activity-based class sessions, and extensive use of computer modeling, simulations, and the world wide web. This talk will discuss the research basis for our design and implementation of the curriculum, the qualitative and quantitative methods we have been using to assess its effectiveness, and the educational outcomes we have noted so far.
Ocean regional circulation model sensitivity to the resolution of the lateral boundary conditions
NASA Astrophysics Data System (ADS)
Pham, Van Sy; Hwang, Jin Hwan
2017-04-01
Dynamical downscaling with nested regional oceanographic models is an effective approach for operational coastal weather forecasting and for long-term climate projection over the ocean. Nesting procedures, however, introduce unwanted errors into dynamic downscaling because of differences in numerical grid sizes and updating steps. Such unavoidable errors restrict the application of Ocean Regional Circulation Models (ORCMs) in both short-term forecasts and long-term projections. The current work identifies the effects of errors induced by computational limitations during nesting procedures on the downscaled results of the ORCMs. The errors are quantitatively evaluated for each error source and its characteristics using the Big-Brother Experiment (BBE). The BBE separates the identified errors from each other and quantitatively assesses the uncertainties by employing the same model for both the nesting and the nested simulations. Here, we focus on errors arising from the two main aspects of the nesting procedure: the differences in spatial grids and the temporal updating steps. After running the diverse BBE cases separately, a Taylor diagram is used to analyze the results and to suggest an optimum in terms of grid size, updating period and domain size. Key words: lateral boundary condition, error, ocean regional circulation model, Big-Brother Experiment. Acknowledgement: This research was supported by grants from the Korean Ministry of Oceans and Fisheries entitled "Development of integrated estuarine management system" and a National Research Foundation of Korea (NRF) Grant (No. 2015R1A5A 7037372) funded by MSIP of Korea. The authors thank the Integrated Research Institute of Construction and Environmental Engineering of Seoul National University for administrative support.
ERIC Educational Resources Information Center
Detering, Brad
2017-01-01
This research study, grounded in the theoretical framework of education change, used the Concerns-Based Adoption Model of change to examine the concerns of Illinois high school teachers and administrators regarding the implementation of 1:1 computing programs. A quantitative study of educators investigated the stages of concern and the mathematics…
The system of technical diagnostics of the industrial safety information network
NASA Astrophysics Data System (ADS)
Repp, P. V.
2017-01-01
This research is devoted to problems of safety of the industrial information network. Basic sub-networks ensuring reliable operation of the elements of the industrial Automatic Process Control System were identified. The core tasks of technical diagnostics of industrial information safety were presented, and the structure of the technical diagnostics system for information safety was proposed. It includes two parts: a generator of cyber-attacks and a virtual model of the enterprise information network. The virtual model was obtained by scanning a real enterprise network. A new classification of cyber-attacks was proposed; this classification enables the design of an efficient generator of cyber-attack sets for testing the virtual models of the industrial information network. The Monte Carlo numerical method (with Sobol LPτ sequences) and Markov chains were considered as the design basis for the cyber-attack generation algorithm. The proposed system also includes a diagnostic analyzer performing expert functions. The stability factor (Kstab) was selected as an integrative quantitative indicator of network reliability. This factor is determined by the weight of the sets of cyber-attacks that identify vulnerabilities of the network; the weight depends on the frequency and complexity of the cyber-attacks, the degree of damage, and the complexity of remediation. The proposed Kstab is an effective integral quantitative measure of information network reliability.
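The abstract does not give an explicit formula for Kstab, so the sketch below only illustrates one plausible way a weighted set of cyber-attacks could be condensed into a single stability factor; the weighting scheme, normalisation and all numbers are assumptions, not taken from the paper.

```python
# Hypothetical illustration of a weighted stability factor (all values assumed).

attacks = [
    # (frequency, attack_complexity, damage, remediation_complexity), each in [0, 1]
    (0.8, 0.2, 0.3, 0.2),
    (0.3, 0.6, 0.7, 0.5),
    (0.1, 0.9, 0.9, 0.8),
]

def attack_weight(freq, complexity, damage, remediation):
    # Assumed rule: frequent, easy, damaging, hard-to-remediate attacks weigh most
    return freq * (1.0 - complexity) * damage * remediation

total_weight = sum(attack_weight(*a) for a in attacks)
k_stab = 1.0 - total_weight / len(attacks)   # assumed normalisation to [0, 1]
print(f"K_stab = {k_stab:.2f}")
```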
Biomechanics-based in silico medicine: the manifesto of a new science.
Viceconti, Marco
2015-01-21
In this perspective article we discuss the role of contemporary biomechanics in the light of recent applications such as the development of the so-called Virtual Physiological Human technologies for physiology-based in silico medicine. In order to build Virtual Physiological Human (VPH) models, computer models that capture and integrate the complex systemic dynamics of living organisms across radically different space-time scales, we need to re-formulate a vast body of existing biology and physiology knowledge as quantitative hypotheses that can be expressed in mathematical terms. Once the predictive accuracy of these models is confirmed against controlled experiments and against clinical observations, we will have VPH models that can reliably predict certain quantitative changes in the health status of a given patient; more importantly, we will also have a theory, in the true meaning this word has in the scientific method. In this scenario, biomechanics plays a very important role: it is one of the few areas of the life sciences where we attempt to build full mechanistic explanations based on quantitative observations, in other words, where we investigate living organisms as physical systems. This is in our opinion a Copernican revolution, around which the scope of biomechanics should be re-defined. Thus, we propose a new definition for our research domain: "Biomechanics is the study of living organisms as mechanistic systems". Copyright © 2014 Elsevier Ltd. All rights reserved.
Path analysis of the genetic integration of traits in the sand cricket: a novel use of BLUPs.
Roff, D A; Fairbairn, D J
2011-09-01
This study combines path analysis with quantitative genetics to analyse a key life history trade-off in the cricket, Gryllus firmus. We develop a path model connecting five traits associated with the trade-off between flight capability and reproduction and test this model using phenotypic data and estimates of breeding values (best linear unbiased predictors) from a half-sibling experiment. Strong support by both types of data validates our causal model and indicates concordance between the phenotypic and genetic expression of the trade-off. Comparisons of the trade-off between sexes and wing morphs reveal that these discrete phenotypes are not genetically independent and that the evolutionary trajectories of the two wing morphs are more tightly constrained to covary than those of the two sexes. Our results illustrate the benefits of combining a quantitative genetic analysis, which examines statistical correlations between traits, with a path model that focuses upon the causal components of variation. © 2011 The Authors. Journal of Evolutionary Biology © 2011 European Society For Evolutionary Biology.
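Estimated breeding values of the kind used here are commonly obtained as BLUPs from Henderson's mixed-model equations. The sketch below solves those equations for a toy data set; the design matrices, relationship matrix and variance ratio are invented for illustration and do not represent the half-sibling design of the cricket study.

```python
import numpy as np

# Toy mixed model y = Xb + Zu + e, with u ~ N(0, A * sigma_a^2).
# All matrices and the variance ratio are illustrative assumptions.
y = np.array([2.1, 2.9, 3.2, 2.5])
X = np.ones((4, 1))                     # fixed effect: overall mean
Z = np.eye(4)                           # one record per individual
A = np.array([[1.0, 0.5, 0.0, 0.0],     # additive relationship matrix (assumed)
              [0.5, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.25],
              [0.0, 0.0, 0.25, 1.0]])
lam = 2.0                               # sigma_e^2 / sigma_a^2 (assumed)

# Henderson's mixed-model equations
lhs = np.vstack([
    np.hstack([X.T @ X, X.T @ Z]),
    np.hstack([Z.T @ X, Z.T @ Z + lam * np.linalg.inv(A)]),
])
rhs = np.concatenate([X.T @ y, Z.T @ y])

solution = np.linalg.solve(lhs, rhs)
fixed_effects, blups = solution[:1], solution[1:]
print("BLUPs (estimated breeding values):", np.round(blups, 3))
```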
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
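The hypothesise-and-test cycle over multiple knowledge sources described above follows the classic blackboard pattern. The stripped-down sketch below shows only that control pattern; the two knowledge sources and the landmark values are placeholders, not the actual cephalometric modules of the system.

```python
# Minimal blackboard control loop; knowledge sources are placeholders.

class Blackboard:
    def __init__(self):
        self.hypotheses = {}               # shared, partially solved problem state

class KnowledgeSource:
    def can_contribute(self, bb): ...
    def contribute(self, bb): ...

class LocateLandmark(KnowledgeSource):
    def can_contribute(self, bb):
        return "sella" not in bb.hypotheses
    def contribute(self, bb):
        bb.hypotheses["sella"] = {"xy": (128, 96), "confidence": 0.7}   # stub value

class VerifyLandmark(KnowledgeSource):
    def can_contribute(self, bb):
        hyp = bb.hypotheses.get("sella")
        return hyp is not None and "verified" not in hyp
    def contribute(self, bb):
        bb.hypotheses["sella"]["verified"] = True                       # stub test

def control_loop(bb, sources, max_cycles=10):
    for _ in range(max_cycles):
        runnable = [ks for ks in sources if ks.can_contribute(bb)]
        if not runnable:
            break                          # no knowledge source can act: stop
        runnable[0].contribute(bb)         # simple scheduler: first applicable source
    return bb.hypotheses

print(control_loop(Blackboard(), [LocateLandmark(), VerifyLandmark()]))
```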
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dongsheng; Lavender, Curt
2015-05-08
Improving yield strength and asymmetry is critical to expand applications of magnesium alloys in industry for higher fuel efficiency and lower CO2 production. Grain refinement is an efficient method for strengthening low-symmetry magnesium alloys, achievable by precipitate refinement. This study provides guidance on how precipitate engineering will improve mechanical properties through grain refinement. Precipitate refinement for improving yield strengths and asymmetry is simulated quantitatively by coupling a stochastic second-phase grain refinement model and a modified polycrystalline crystal viscoplasticity φ-model. Using the stochastic second-phase grain refinement model, grain size is quantitatively determined from the precipitate size and volume fraction. Yield strengths, yield asymmetry, and deformation behavior are calculated from the modified φ-model. If the precipitate shape and size remain constant, grain size decreases with increasing precipitate volume fraction. If the precipitate volume fraction is kept constant, grain size decreases with decreasing precipitate size during precipitate refinement. Yield strengths increase and yield asymmetry approaches one with decreasing grain size, as a result of increasing precipitate volume fraction or decreasing precipitate size.
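The chain from precipitates to grain size to yield strength sketched in this abstract is often approximated with two textbook relations: a Zener-pinning estimate of the limiting grain size, d ≈ 4r/(3f), and a Hall-Petch law, sigma_y = sigma_0 + k * d^(-1/2). The sketch below uses those generic relations with invented constants; it is not the stochastic grain-refinement model or the modified φ-model of the study.

```python
import numpy as np

def zener_grain_size(r_um, f):
    """Limiting grain size (um) for precipitates of radius r_um (um) and
    volume fraction f; generic Zener-pinning estimate."""
    return 4.0 * r_um / (3.0 * f)

def hall_petch(d_um, sigma0=50.0, k=250.0):
    """Yield strength (MPa) from a Hall-Petch law; sigma0 and k are invented
    constants, not calibrated to any magnesium alloy."""
    return sigma0 + k / np.sqrt(d_um)

for f in (0.005, 0.01, 0.02):                 # increasing precipitate volume fraction
    d = zener_grain_size(r_um=0.5, f=f)       # fixed precipitate size
    print(f"f = {f:.3f}: d = {d:6.1f} um, sigma_y = {hall_petch(d):6.1f} MPa")
```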
Damman, Peter; Holmvang, Lene; Tijssen, Jan G P; Lagerqvist, Bo; Clayton, Tim C; Pocock, Stuart J; Windhausen, Fons; Hirsch, Alexander; Fox, Keith A A; Wallentin, Lars; de Winter, Robbert J
2012-01-01
The aim of this study was to evaluate the independent prognostic value of qualitative and quantitative admission electrocardiographic (ECG) analysis regarding long-term outcomes after non-ST-segment elevation acute coronary syndromes (NSTE-ACS). From the Fragmin and Fast Revascularization During Instability in Coronary Artery Disease (FRISC II), Invasive Versus Conservative Treatment in Unstable Coronary Syndromes (ICTUS), and Randomized Intervention Trial of Unstable Angina 3 (RITA-3) patient-pooled database, 5,420 patients with NSTE-ACS with qualitative ECG data, of whom 2,901 had quantitative data, were included in this analysis. The main outcome was 5-year cardiovascular death or myocardial infarction. Hazard ratios (HRs) were calculated with Cox regression models, and adjustments were made for established outcome predictors. The additional discriminative value was assessed with the category-less net reclassification improvement and integrated discrimination improvement indexes. In the 5,420 patients, the presence of ST-segment depression (≥1 mm; adjusted HR 1.43, 95% confidence interval [CI] 1.25 to 1.63) and left bundle branch block (adjusted HR 1.64, 95% CI 1.18 to 2.28) were independently associated with long-term cardiovascular death or myocardial infarction. Risk increases were short and long term. On quantitative ECG analysis, cumulative ST-segment depression (≥5 mm; adjusted HR 1.34, 95% CI 1.05 to 1.70), the presence of left bundle branch block (adjusted HR 2.15, 95% CI 1.36 to 3.40) or ≥6 leads with inverse T waves (adjusted HR 1.22, 95% CI 0.97 to 1.55) was independently associated with long-term outcomes. No interaction was observed with treatment strategy. No improvements in net reclassification improvement and integrated discrimination improvement were observed after the addition of quantitative characteristics to a model including qualitative characteristics. In conclusion, in the FRISC II, ICTUS, and RITA-3 NSTE-ACS patient-pooled data set, admission ECG characteristics provided long-term prognostic value for cardiovascular death or myocardial infarction. Quantitative ECG characteristics provided no incremental discrimination compared to qualitative data. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Balbi, Stefano; Villa, Ferdinando; Mojtahed, Vahid; Hegetschweiler, Karin Tessa; Giupponi, Carlo
2016-06-01
This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; and produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of (1) likelihood of non-fatal physical injury, (2) likelihood of post-traumatic stress disorder and (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the effect of improving an existing early warning system, taking into account the reliability, lead time and scope (i.e., coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event.
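A tiny discrete Bayesian-network fragment illustrates the kind of inference the flood-risk model performs: conditioning the probability of non-fatal injury on warning coverage and exposure, then marginalising over the parents. The structure, states and probabilities below are invented and far simpler than the expert-calibrated, spatially explicit network of the study.

```python
# Hand-rolled two-parent Bayesian network fragment (all probabilities assumed).

p_warned = 0.7        # probability a person receives the early warning
p_exposed = 0.3       # probability a person is in the flooded area

p_injury = {          # conditional probability table P(injury | warned, exposed)
    (True, True): 0.02,
    (True, False): 0.0,
    (False, True): 0.10,
    (False, False): 0.0,
}

# Marginal probability of non-fatal injury, summing over the parent states
marginal = sum(
    p_injury[(w, e)]
    * (p_warned if w else 1 - p_warned)
    * (p_exposed if e else 1 - p_exposed)
    for w in (True, False)
    for e in (True, False)
)
print(f"P(non-fatal injury) = {marginal:.4f}")
```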
A cognitive perspective on health systems integration: results of a Canadian Delphi study.
Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan
2014-05-19
Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs - that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada's National Symposium on Integrated Care. Respondents were asked to rate the framework's clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskal-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. The re-named "Integration Mindsets Framework" consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas. The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives.
Designing a mixed methods study in primary care.
Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V
2004-01-01
Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach could be a possibility of integrating quantitative and qualitative approaches and offers an alternative strategy to study the impact of economic disadvantage on family processes and adolescent development.
ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra
NASA Astrophysics Data System (ADS)
Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.
2011-08-01
Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and they are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
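The core operation such a tool automates, integrating the same signal regions across many spectra and exporting the results as CSV, can be sketched in a few lines. The file layout, ppm ranges and file names below are assumptions for illustration; the sketch does not describe ImatraNMR's internals.

```python
import csv
import numpy as np

# Assumed layout: each spectrum stored as two columns (ppm, intensity) in a text file.
spectra_files = ["sample_01.txt", "sample_02.txt"]              # hypothetical names
regions = {"signal_A": (1.25, 1.40), "signal_B": (3.20, 3.90)}  # hypothetical ppm ranges

with open("integrals.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["spectrum"] + list(regions))
    for path in spectra_files:
        ppm, intensity = np.loadtxt(path, unpack=True)
        row = [path]
        for lo, hi in regions.values():
            mask = (ppm >= lo) & (ppm <= hi)
            # Trapezoidal integral; abs() guards against a descending ppm axis
            row.append(abs(np.trapz(intensity[mask], ppm[mask])))
        writer.writerow(row)
```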
Methods, Tools and Current Perspectives in Proteogenomics *
Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.
2017-01-01
With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies have identified novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches and in this article, we systematically classify published methods and tools into four major categories, (1) Sequence-centric proteogenomics; (2) Analysis of proteogenomic relationships; (3) Integrative modeling of proteogenomic data; and (4) Data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751
Integrated modeling approach for optimal management of water, energy and food security nexus
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Vesselinov, Velimir V.
2017-03-01
Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, food production as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.
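The trade-off analysis described above can be imagined, in highly stylised form, as a cost-minimising allocation subject to resource demand constraints. The linear program below is a toy single-period example with invented coefficients and only water and electricity constraints; WEFO itself is a multi-period socioeconomic model and is not reproduced here.

```python
from scipy.optimize import linprog

# Decision variables: x = [groundwater_supply, surface_water_supply,
#                          gas_power, solar_power]  (invented units and costs)
cost = [0.8, 0.5, 60.0, 75.0]

# Inequality constraints A_ub @ x <= b_ub (">=" demands are negated)
A_ub = [
    [-1, -1, 0.02, 0.0],   # water supply minus cooling-water use >= water demand
    [0, 0, -1, -1],        # total generation >= electricity demand
]
b_ub = [-100.0, -50.0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print("allocation:", res.x, "total cost:", res.fun)
```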
Advances in understanding river-groundwater interactions
NASA Astrophysics Data System (ADS)
Brunner, Philip; Therrien, René; Renard, Philippe; Simmons, Craig T.; Franssen, Harrie-Jan Hendricks
2017-09-01
River-groundwater interactions are at the core of a wide range of major contemporary challenges, including the provision of high-quality drinking water in sufficient quantities, the loss of biodiversity in river ecosystems, or the management of environmental flow regimes. This paper reviews state of the art approaches in characterizing and modeling river and groundwater interactions. Our review covers a wide range of approaches, including remote sensing to characterize the streambed, emerging methods to measure exchange fluxes between rivers and groundwater, and developments in several disciplines relevant to the river-groundwater interface. We discuss approaches for automated calibration, and real-time modeling, which improve the simulation and understanding of river-groundwater interactions. Although the integration of these various approaches and disciplines is advancing, major research gaps remain to be filled to allow more complete and quantitative integration across disciplines. New possibilities for generating realistic distributions of streambed properties, in combination with more data and novel data types, have great potential to improve our understanding and predictive capabilities for river-groundwater systems, especially in combination with the integrated simulation of the river and groundwater flow as well as calibration methods. Understanding the implications of different data types and resolution, the development of highly instrumented field sites, ongoing model development, and the ultimate integration of models and data are important future research areas. These developments are required to expand our current understanding to do justice to the complexity of natural systems.
Circuit-Host Coupling Induces Multifaceted Behavioral Modulations of a Gene Switch.
Blanchard, Andrew E; Liao, Chen; Lu, Ting
2018-02-06
Quantitative modeling of gene circuits is fundamentally important to synthetic biology, as it offers the potential to transform circuit engineering from trial-and-error construction to rational design and, hence, facilitates the advance of the field. Currently, typical models regard gene circuits as isolated entities and focus only on the biochemical processes within the circuits. However, this standard paradigm is increasingly challenged by experimental evidence suggesting that circuits and their host are intimately connected, and their interactions can potentially impact circuit behaviors. Here we systematically examined the roles of circuit-host coupling in shaping circuit dynamics by using a self-activating gene switch as a model circuit. Through a combination of deterministic modeling, stochastic simulation, and Fokker-Planck equation formalism, we found that circuit-host coupling alters switch behaviors across multiple scales. At the single-cell level, it slows the switch dynamics in the high protein production regime and enlarges the difference between stable steady-state values. At the population level, it favors cells with low protein production through differential growth amplification. Together, the two-level coupling effects induce both quantitative and qualitative modulations of the switch, with the primary component of the effects determined by the circuit's architectural parameters. This study illustrates the complexity and importance of circuit-host coupling in modulating circuit behaviors, demonstrating the need for a new paradigm, integrated modeling of the circuit-host system, for quantitative understanding of engineered gene networks. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
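The isolated-circuit baseline that the paper extends can be written as a single ODE for a self-activating switch, dx/dt = alpha * x^n / (K^n + x^n) + beta - delta * x, whose bistability is what host coupling then reshapes. The sketch below integrates that standard equation from two initial conditions; the parameter values are illustrative and the circuit-host coupling terms of the study are deliberately omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Standard self-activating switch, without the circuit-host coupling terms
# discussed above. Parameter values are illustrative assumptions.
alpha, beta, K, n, delta = 2.0, 0.05, 1.0, 2, 1.0

def switch(t, x):
    return alpha * x ** n / (K ** n + x ** n) + beta - delta * x

for x0 in (0.1, 3.0):          # low and high initial protein levels
    sol = solve_ivp(switch, (0, 50), [x0])
    print(f"x0 = {x0}: steady state ~ {sol.y[0, -1]:.2f}")
```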
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
Barani, T.; Bruschi, E.; Pizzocri, D.; ...
2017-01-03
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of burst release in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of diffusion-based models to allow for the burst release effect. The concept and governing equations of the model are presented, and the effect of the newly introduced parameters is evaluated through an analytic sensitivity analysis. Then, the model is assessed for application to integral fuel rod analysis. The approach that we take for model assessment involves implementation in two structurally different fuel performance codes, namely, BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D semi-analytic code). The model is validated against 19 Light Water Reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the qualitative representation of the FGR kinetics and the quantitative predictions of integral fuel rod FGR, relative to the canonical, purely diffusion-based models, with both codes. The overall quantitative improvement of the FGR predictions in the two codes is comparable. Furthermore, calculated radial profiles of xenon concentration are investigated and compared to experimental data, demonstrating the representation of the underlying mechanisms of burst release by the new model.
A newly developed integrated cell culture reverse transcriptase quantitative PCR (ICC-RTqPCR) method and its applicability in UV disinfection studies is described. This method utilizes a singular cell culture system coupled with four RTqPCR assays to detect infectious serotypes t...
NASA Astrophysics Data System (ADS)
Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher
2018-05-01
Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.
Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan
2004-01-01
Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335
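The extended cellular automaton itself is specific to the study, but a toy excitable-medium automaton (Greenberg-Hastings style) conveys the rule-based propagation that such models scale up to whole-heart geometry. Grid size, refractory length, and the periodic boundary below are arbitrary choices for the sketch.

```python
# Toy excitable-medium cellular automaton (Greenberg-Hastings style), illustrating
# the kind of rule-based propagation that whole-heart automata scale up.
import numpy as np

N, REFRACTORY = 100, 5                 # grid size and refractory period (hypothetical)
grid = np.zeros((N, N), dtype=int)     # 0 = resting, 1 = excited, 2..REFRACTORY+1 = refractory
grid[N // 2, N // 2] = 1               # single stimulus in the centre

def update(g):
    new = g.copy()
    # excited and refractory cells advance through their recovery sequence
    new[(g >= 1) & (g <= REFRACTORY)] += 1
    new[g == REFRACTORY + 1] = 0
    # resting cells become excited if any 4-neighbour is excited (periodic boundary)
    excited = (g == 1)
    neighbour_excited = (np.roll(excited, 1, 0) | np.roll(excited, -1, 0) |
                         np.roll(excited, 1, 1) | np.roll(excited, -1, 1))
    new[(g == 0) & neighbour_excited] = 1
    return new

for _ in range(60):
    grid = update(grid)
print("excited cells after 60 steps:", int((grid == 1).sum()))
```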
Gloaguen, Pauline; Alban, Claude; Ravanel, Stéphane; Seigneurin-Berny, Daphné; Matringe, Michel; Ferro, Myriam; Bruley, Christophe; Rolland, Norbert; Vandenbrouck, Yves
2017-01-01
Higher plants, as autotrophic organisms, are effective sources of molecules. They hold great promise for metabolic engineering, but the behavior of plant metabolism at the network level is still incompletely described. Although structural models (stoichiometry matrices) and pathway databases are extremely useful, they cannot describe the complexity of the metabolic context, and new tools are required to visually represent integrated biocurated knowledge for use by both humans and computers. Here, we describe ChloroKB, a Web application (http://chlorokb.fr/) for visual exploration and analysis of the Arabidopsis (Arabidopsis thaliana) metabolic network in the chloroplast and related cellular pathways. The network was manually reconstructed through extensive biocuration to provide transparent traceability of experimental data. Proteins and metabolites were placed in their biological context (spatial distribution within cells, connectivity in the network, participation in supramolecular complexes, and regulatory interactions) using CellDesigner software. The network contains 1,147 reviewed proteins (559 localized exclusively in plastids, 68 in at least one additional compartment, and 520 outside the plastid), 122 proteins awaiting biochemical/genetic characterization, and 228 proteins for which genes have not yet been identified. The visual presentation is intuitive and browsing is fluid, providing instant access to the graphical representation of integrated processes and to a wealth of refined qualitative and quantitative data. ChloroKB will be a significant support for structural and quantitative kinetic modeling, for biological reasoning, when comparing novel data with established knowledge, for computer analyses, and for educational purposes. ChloroKB will be enhanced by continuous updates following contributions from plant researchers. PMID:28442501
Practical Modeling Concepts for Connective Tissue Stem Cell and Progenitor Compartment Kinetics
2003-01-01
Stem cell activation and development is central to skeletal development, maintenance, and repair, as it is for all tissues. However, an integrated model of stem cell proliferation, differentiation, and transit between functional compartments has yet to evolve. In this paper, the authors review current concepts in stem cell biology and progenitor cell growth and differentiation kinetics in the context of bone formation. A cell-based modeling strategy is developed and offered as a tool for conceptual and quantitative exploration of the key kinetic variables and possible organizational hierarchies in bone tissue development and remodeling, as well as in tissue engineering strategies for bone repair. PMID:12975533
NASA Astrophysics Data System (ADS)
Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent
2017-03-01
Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging features extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging features extraction tools allow the user to collect imaging features and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
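A hedged sketch of the kind of mixed-model analysis such a GLMM module performs is shown below, using statsmodels with a random intercept per subject. The column names and toy values are hypothetical and do not come from the system described.

```python
# Hedged sketch of a mixed model with imaging features as regressors
# (column names and values are hypothetical; this is not the system's actual code).
import pandas as pd
import statsmodels.formula.api as smf

# toy dataset: repeated motor-function scores with per-subject imaging biomarkers
data = pd.DataFrame({
    "subject": ["s1"] * 3 + ["s2"] * 3 + ["s3"] * 3 + ["s4"] * 3,
    "visit": [0, 1, 2] * 4,
    "motor_score": [30, 33, 35, 22, 24, 27, 40, 44, 48, 18, 19, 21],
    "lesion_volume": [12.0] * 3 + [30.5] * 3 + [5.2] * 3 + [41.0] * 3,
    "vb_ratio": [0.05] * 3 + [0.06] * 3 + [0.03] * 3 + [0.04] * 3,
})

# random intercept per subject; imaging features enter as fixed-effect regressors
model = smf.mixedlm("motor_score ~ visit + lesion_volume + vb_ratio",
                    data, groups=data["subject"])
result = model.fit()
print(result.summary())
```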
The Pathway for Oxygen: Tutorial Modelling on Oxygen Transport from Air to Mitochondrion
Bassingthwaighte, James B.; Raymond, Gary M.; Dash, Ranjan K.; Beard, Daniel A.; Nolan, Margaret
2016-01-01
The ‘Pathway for Oxygen’ is captured in a set of models describing quantitative relationships between fluxes and driving forces for the flux of oxygen from the external air source to the mitochondrial sink at cytochrome oxidase. The intervening processes involve convection, membrane permeation, diffusion of free and heme-bound O2 and enzymatic reactions. While this system’s basic elements are simple: ventilation, alveolar gas exchange with blood, circulation of the blood, perfusion of an organ, uptake by tissue, and consumption by chemical reaction, integration of these pieces quickly becomes complex. This complexity led us to construct a tutorial on the ideas and principles; these first PathwayO2 models are simple but quantitative and cover: 1) a ‘one-alveolus lung’ with airway resistance, lung volume compliance, 2) bidirectional transport of solute gasses like O2 and CO2, 3) gas exchange between alveolar air and lung capillary blood, 4) gas solubility in blood, and circulation of blood through the capillary syncytium and back to the lung, and 5) blood-tissue gas exchange in capillaries. These open-source models are at Physiome.org and provide background for the many respiratory models there. PMID:26782201
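As an illustration of model (1), the one-alveolus lung reduces to a resistance-compliance system; a minimal sketch with assumed parameter values is given below (it is not the Physiome.org implementation).

```python
# Minimal "one-alveolus lung" sketch in the spirit of model (1): airway resistance R
# in series with a compliant volume C, driven by a sinusoidal mouth pressure.
import numpy as np
from scipy.integrate import solve_ivp

R = 2.0      # airway resistance  [cmH2O / (L/s)]   (hypothetical value)
C = 0.1      # lung compliance    [L / cmH2O]        (hypothetical value)
V0 = 2.5     # relaxation volume  [L]

def mouth_pressure(t):
    return 5.0 * np.sin(2 * np.pi * t / 4.0)   # 4 s breathing cycle

def lung(t, y):
    V = y[0]
    P_alv = (V - V0) / C                       # elastic recoil pressure
    flow = (mouth_pressure(t) - P_alv) / R     # flow through the airway resistance
    return [flow]

sol = solve_ivp(lung, (0, 20), [V0], max_step=0.01)
tidal_volume = sol.y[0].max() - sol.y[0].min()
print(f"approximate tidal volume: {tidal_volume:.2f} L")
```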
Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.
Li, Zitong; Sillanpää, Mikko J
2015-12-01
Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Beck, Christopher W.
2018-01-01
Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…
Computational physiology and the Physiome Project.
Crampin, Edmund J; Halstead, Matthew; Hunter, Peter; Nielsen, Poul; Noble, Denis; Smith, Nicolas; Tawhai, Merryn
2004-01-01
Bioengineering analyses of physiological systems use the computational solution of physical conservation laws on anatomically detailed geometric models to understand the physiological function of intact organs in terms of the properties and behaviour of the cells and tissues within the organ. By linking behaviour in a quantitative, mathematically defined sense across multiple scales of biological organization--from proteins to cells, tissues, organs and organ systems--these methods have the potential to link patient-specific knowledge at the two ends of these spatial scales. A genetic profile linked to cardiac ion channel mutations, for example, can be interpreted in relation to body surface ECG measurements via a mathematical model of the heart and torso, which includes the spatial distribution of cardiac ion channels throughout the myocardium and the individual kinetics for each of the approximately 50 types of ion channel, exchanger or pump known to be present in the heart. Similarly, linking molecular defects such as mutations of chloride ion channels in lung epithelial cells to the integrated function of the intact lung requires models that include the detailed anatomy of the lungs, the physics of air flow, blood flow and gas exchange, together with the large deformation mechanics of breathing. Organizing this large body of knowledge into a coherent framework for modelling requires the development of ontologies, markup languages for encoding models, and web-accessible distributed databases. In this article we review the state of the field at all the relevant levels, and the tools that are being developed to tackle such complexity. Integrative physiology is central to the interpretation of genomic and proteomic data, and is becoming a highly quantitative, computer-intensive discipline.
DOE R&D Accomplishments Database
Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.
1978-01-01
Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This facility, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and which has been well characterized in terms of spatial resolution, sensitivity and signal to noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.
Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers
NASA Astrophysics Data System (ADS)
Kowalski, Benjamin Andrew
Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.
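A reduced 1-D reaction/diffusion sketch of optically driven diffusion is given below: monomer is consumed where the writing intensity is high and diffuses in from dark fringes, and the accumulated polymer stands in for the recorded index change. The diffusivity, rate constant, and exposure pattern are illustrative assumptions, not the calibrated values from this work.

```python
# Hedged 1-D sketch of optically driven diffusion in a photopolymer
# (parameters are illustrative, not the thesis' calibrated values).
import numpy as np

nx, L = 200, 10.0                 # grid points, domain length [micron]
dx = L / nx
D, k = 0.05, 1.0                  # monomer diffusivity and polymerisation rate (hypothetical)
dt = 0.2 * dx**2 / D              # well below the explicit-scheme stability limit

x = np.linspace(0, L, nx)
intensity = 0.5 * (1 + np.cos(2 * np.pi * x / 2.0))   # sinusoidal exposure fringe
monomer = np.ones(nx)
polymer = np.zeros(nx)

for _ in range(2000):
    lap = (np.roll(monomer, 1) - 2 * monomer + np.roll(monomer, -1)) / dx**2
    reaction = k * intensity * monomer
    monomer += dt * (D * lap - reaction)   # diffusion into bright regions, consumption there
    polymer += dt * reaction               # converted polymer ~ recorded index change

contrast = polymer.max() - polymer.min()
print(f"recorded polymer (index) modulation, arbitrary units: {contrast:.3f}")
```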
Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories
NASA Astrophysics Data System (ADS)
Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly
The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In this setting, successful project health checks and quality monitoring of software processes require strong project management skills, well-built onshore-offshore coordination, and often regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.
Modulating Wnt Signaling Pathway to Enhance Allograft Integration in Orthopedic Trauma Treatment
2013-10-01
presented below. Quantitative output provides an extensive set of data but we have chosen to present the most relevant parameters that are reflected in... multiple parameters. Most samples have been mechanically tested and data extracted for multiple parameters. Histological evaluation of subset of... Sumner, D. R. Saline Irrigation Does Not Affect Bone Formation or Fixation Strength of Hydroxyapatite/Tricalcium Phosphate-Coated Implants in a Rat Model
Patel, Dhaval D; Anderson, Bradley D
2014-05-05
This study quantitatively explores the mechanisms underpinning the effects of model pharmaceutical polymeric precipitation inhibitors (PPIs) on the crystal growth and, in turn, maintenance of supersaturation of indomethacin, a model poorly water-soluble drug. A recently developed second-derivative UV spectroscopy method and a first-order empirical crystal growth model were used to determine indomethacin crystal growth rates in the presence of model PPIs. All three model PPIs including HP-β-CD, PVP, and HPMC inhibited indomethacin crystal growth at both high and low degrees of supersaturation (S). The bulk viscosity changes in the presence of model PPIs could not explain their crystal growth inhibitory effects. At 0.05% w/w, PVP (133-fold) and HPMC (28-fold) were better crystal growth inhibitors than HP-β-CD at high S. The inhibitory effect of HP-β-CD on the bulk diffusion-controlled indomethacin crystal growth at high S was successfully modeled using reactive diffusion layer theory, which assumes reversible complexation in the diffusion layer. Although HP-β-CD only modestly inhibited indomethacin crystal growth at either high S (∼15%) or low S (∼2-fold), the crystal growth inhibitory effects of PVP and HPMC were more dramatic, particularly at high S (0.05% w/w). The superior crystal growth inhibitory effects of PVP and HPMC as compared with HP-β-CD at high S were attributed to a change in the indomethacin crystal growth rate-limiting step from bulk diffusion to surface integration. Indomethacin crystal growth inhibitory effects of all three model PPIs at low S were attributed to retardation of the rate of surface integration of indomethacin, a phenomenon that may reflect the adsorption of PPIs onto the growing crystal surface. The quantitative approaches outlined in this study should be useful in future studies to develop tools to predict supersaturation maintenance effects of PPIs.
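The first-order empirical growth model referred to above amounts to exponential decay of the excess concentration toward solubility; the sketch below illustrates that form, with an inhibitor represented crudely as a smaller growth constant (the tenfold reduction is a made-up value, not a fitted result).

```python
# Illustrative first-order desupersaturation sketch (not the paper's fitted model):
# dissolved drug concentration C decays toward solubility C_s as crystals grow,
# and a precipitation inhibitor is represented simply as a reduced growth constant.
import numpy as np

C_s = 1.0                      # solubility (arbitrary units)
C0 = 10.0                      # initial supersaturated concentration (S = 10)
t = np.linspace(0, 10, 200)    # hours

def concentration(kg):
    # analytic solution of dC/dt = -kg * (C - C_s)
    return C_s + (C0 - C_s) * np.exp(-kg * t)

for label, kg in [("no inhibitor", 1.0), ("with inhibitor (hypothetical 10x slower)", 0.1)]:
    C = concentration(kg)
    print(f"{label}: supersaturation after 5 h = {C[t >= 5][0] / C_s:.2f}")
```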
Lightness computation by the human visual system
NASA Astrophysics Data System (ADS)
Rudd, Michael E.
2017-05-01
A model of achromatic color computation by the human visual system is presented, which is shown to account in an exact quantitative way for a large body of appearance matching data collected with simple visual displays. The model equations are closely related to those of the original Retinex model of Land and McCann. However, the present model differs in important ways from Land and McCann's theory in that it invokes additional biological and perceptual mechanisms, including contrast gain control, different inherent neural gains for incremental and decremental luminance steps, and two types of top-down influence on the perceptual weights applied to local luminance steps in the display: edge classification and spatial integration attentional windowing. Arguments are presented to support the claim that these various visual processes must be instantiated by a particular underlying neural architecture. By pointing to correspondences between the architecture of the model and findings from visual neurophysiology, this paper suggests that edge classification involves a top-down gating of neural edge responses in early visual cortex (cortical areas V1 and/or V2) while spatial integration windowing occurs in cortical area V4 or beyond.
An integrative formal model of motivation and decision making: The MGPM*.
Ballard, Timothy; Yeo, Gillian; Loft, Shayne; Vancouver, Jeffrey B; Neal, Andrew
2016-09-01
We develop and test an integrative formal model of motivation and decision making. The model, referred to as the extended multiple-goal pursuit model (MGPM*), is an integration of the multiple-goal pursuit model (Vancouver, Weinhardt, & Schmidt, 2010) and decision field theory (Busemeyer & Townsend, 1993). Simulations of the model generated predictions regarding the effects of goal type (approach vs. avoidance), risk, and time sensitivity on prioritization. We tested these predictions in an experiment in which participants pursued different combinations of approach and avoidance goals under different levels of risk. The empirical results were consistent with the predictions of the MGPM*. Specifically, participants pursuing 1 approach and 1 avoidance goal shifted priority from the approach to the avoidance goal over time. Among participants pursuing 2 approach goals, those with low time sensitivity prioritized the goal with the larger discrepancy, whereas those with high time sensitivity prioritized the goal with the smaller discrepancy. Participants pursuing 2 avoidance goals generally prioritized the goal with the smaller discrepancy. Finally, all of these effects became weaker as the level of risk increased. We used quantitative model comparison to show that the MGPM* explained the data better than the original multiple-goal pursuit model, and that the major extensions from the original model were justified. The MGPM* represents a step forward in the development of a general theory of decision making during multiple-goal pursuit. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Ghebremichael, Lula T; Veith, Tamie L; Hamlett, James M
2013-01-15
Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals, such as the total maximum daily load (TMDL) requirements. Targeting critical source areas (CSAs) that generate disproportionately high pollutant loads within a watershed is a crucial step in successfully controlling nonpoint source pollution. The importance of watershed simulation models in assisting with the quantitative assessments of CSAs of pollution (relative to their magnitudes and extents) and of the effectiveness of associated BMPs has been well recognized. However, due to the distinct disconnect between the hydrological scale in which these models conduct their evaluation and the farm scale at which feasible BMPs are actually selected and implemented, and due to the difficulty and uncertainty involved in transferring watershed model data to farm fields, there are limited practical applications of these tools in the current nonpoint source pollution control efforts by conservation specialists for delineating CSAs and planning targeting measures. There are also limited approaches developed that can assess impacts of CSA-targeted BMPs on farm productivity and profitability together with the assessment of water quality improvements expected from applying these measures. This study developed a modeling framework that integrates farm economics and environmental aspects (such as identification and mitigation of CSAs) through joint use of watershed- and farm-scale models in a closed feedback loop. The integration of models in a closed feedback loop provides a way for environmental changes to be evaluated with regard to the impact on the practical aspects of farm management and economics, adjusted or reformulated as necessary, and re-evaluated with respect to effectiveness of environmental mitigation at the farm- and watershed-levels. This paper also outlines steps needed to extract important CSA-related information from a watershed model to help inform targeting decisions at the farm scale. The modeling framework is demonstrated with two unique case studies in the northeastern United States (New York and Vermont), with supporting data from numerous published, location-specific studies at both the watershed and farm scales. Using the integrated modeling framework, it is possible to compare the costs (in terms of changes required in farm system components or financial compensations for retiring crop lands) and benefits (in terms of measurable water quality improvement goals) of implementing targeted BMPs. This multi-scale modeling approach can be used in the multi-objective task of mitigating CSAs of pollution to meet water quality goals while maintaining farm-level economic viability. Copyright © 2012 Elsevier Ltd. All rights reserved.
Mansfield, Theodore J; MacDonald Gibson, Jacqueline
2015-01-01
Recently, two quantitative tools have emerged for predicting the health impacts of projects that change population physical activity: the Health Economic Assessment Tool (HEAT) and Dynamic Modeling for Health Impact Assessment (DYNAMO-HIA). HEAT has been used to support health impact assessments of transportation infrastructure projects, but DYNAMO-HIA has not been previously employed for this purpose nor have the two tools been compared. To demonstrate the use of DYNAMO-HIA for supporting health impact assessments of transportation infrastructure projects, we employed the model in three communities (urban, suburban, and rural) in North Carolina. We also compared DYNAMO-HIA and HEAT predictions in the urban community. Using DYNAMO-HIA, we estimated benefit-cost ratios of 20.2 (95% C.I.: 8.7-30.6), 0.6 (0.3-0.9), and 4.7 (2.1-7.1) for the urban, suburban, and rural projects, respectively. For a 40-year time period, the HEAT predictions of deaths avoided by the urban infrastructure project were three times as high as DYNAMO-HIA's predictions due to HEAT's inability to account for changing population health characteristics over time. Quantitative health impact assessment coupled with economic valuation is a powerful tool for integrating health considerations into transportation decision-making. However, to avoid overestimating benefits, such quantitative HIAs should use dynamic, rather than static, approaches.
NASA Astrophysics Data System (ADS)
Holt, Jason; Icarus Allen, J.; Anderson, Thomas R.; Brewin, Robert; Butenschön, Momme; Harle, James; Huse, Geir; Lehodey, Patrick; Lindemann, Christian; Memery, Laurent; Salihoglu, Baris; Senina, Inna; Yool, Andrew
2014-12-01
It has long been recognised that there are strong interactions and feedbacks between climate, upper ocean biogeochemistry and marine food webs, and also that food web structure and phytoplankton community distribution are important determinants of variability in carbon production and export from the euphotic zone. Numerical models provide a vital tool to explore these interactions, given their capability to investigate multiple connected components of the system and the sensitivity to multiple drivers, including potential future conditions. A major driver for ecosystem model development is the demand for quantitative tools to support ecosystem-based management initiatives. The purpose of this paper is to review approaches to the modelling of marine ecosystems with a focus on the North Atlantic Ocean and its adjacent shelf seas, and to highlight the challenges they face and suggest ways forward. We consider the state of the art in simulating oceans and shelf sea physics, planktonic and higher trophic level ecosystems, and look towards building an integrative approach with these existing tools. We note how the different approaches have evolved historically and that many of the previous obstacles to harmonisation may no longer be present. We illustrate this with examples from the on-going and planned modelling effort in the Integrative Modelling Work Package of the EURO-BASIN programme.
Schneck, Karen B; Zhang, Xin; Bauer, Robert; Karlsson, Mats O; Sinha, Vikram P
2013-02-01
A proof of concept study was conducted to investigate the safety and tolerability of a novel oral glucokinase activator, LY2599506, during multiple dose administration to healthy volunteers and subjects with Type 2 diabetes mellitus (T2DM). To analyze the study data, a previously established semi-mechanistic integrated glucose-insulin model was extended to include characterization of glucagon dynamics. The model captured endogenous glucose and insulin dynamics, including the amplifying effects of glucose on insulin production and of insulin on glucose elimination, as well as the inhibitory influence of glucose and insulin on hepatic glucose production. The hepatic glucose production in the model was increased by glucagon and glucagon production was inhibited by elevated glucose concentrations. The contribution of exogenous factors to glycemic response, such as ingestion of carbohydrates in meals, was also included in the model. The effect of LY2599506 on glucose homeostasis in subjects with T2DM was investigated by linking a one-compartment, pharmacokinetic model to the semi-mechanistic, integrated glucose-insulin-glucagon system. Drug effects were included on pancreatic insulin secretion and hepatic glucose production. The relationships between LY2599506, glucose, insulin, and glucagon concentrations were described quantitatively and consequently, the improved understanding of the drug-response system could be used to support further clinical study planning during drug development, such as dose selection.
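For orientation, a drastically reduced glucose-insulin-glucagon feedback loop with the qualitative structure described (glucagon drives hepatic glucose production, glucose and insulin suppress it, glucose stimulates insulin secretion and suppresses glucagon secretion) can be written as three ODEs. All rate expressions and parameters below are illustrative assumptions, not the published semi-mechanistic model.

```python
# Very reduced sketch of a glucose-insulin-glucagon feedback structure
# (illustrative parameters; not the published semi-mechanistic model).
import numpy as np
from scipy.integrate import solve_ivp

def system(t, y):
    G, I, N = y                                   # glucose, insulin, glucagon (arbitrary units)
    hgp = 2.0 * N / (1.0 + 0.5 * G + 0.5 * I)     # hepatic glucose production: driven by
                                                  # glucagon, inhibited by glucose and insulin
    uptake = 0.1 * G + 0.3 * I * G                # insulin-independent and -dependent uptake
    dG = hgp - uptake
    dI = 0.5 * G**2 / (4.0 + G**2) - 0.5 * I      # glucose-stimulated insulin secretion
    dN = 1.0 / (1.0 + G**2) - 0.5 * N             # glucagon secretion suppressed by glucose
    return [dG, dI, dN]

sol = solve_ivp(system, (0, 100), [8.0, 0.2, 0.5], max_step=0.1)
print("approximate fasting steady state (G, I, N):",
      [round(v, 2) for v in sol.y[:, -1]])
```

A drug effect such as the one described could then be sketched as an extra stimulus on the insulin-secretion term and an extra inhibition on the hepatic-production term, scaled by a pharmacokinetic concentration.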
NASA Astrophysics Data System (ADS)
Zhang, Jianyuan; Hu, Bin; Chen, Wenjuan; Moore, Philip; Xu, Tingting; Dong, Qunxi; Liu, Zhenyu; Luo, Yuejia; Chen, Shanguang
2014-12-01
The focus of the study is the estimation of the effects of microgravity on central nervous activity and its underlying influencing mechanisms. To validate the microgravity-induced physiological and psychological effects on EEG, quantitative EEG features, cardiovascular indicators, mood state, and cognitive performance data were collected over a 45-day period using a -6° head-down bed rest (HDBR) integrated approach. The results demonstrated significant differences in EEG data, namely an increased Theta wave, a decreased Beta wave, and a reduced complexity of the brain, accompanied by an increased heart rate and pulse rate, decreased positive emotion, and degraded emotion conflict monitoring performance. The canonical correlation analysis (CCA) based cardiovascular- and cognitive-related EEG model showed that the cardiovascular effect on EEG mainly affected the bilateral temporal region, while the cognitive effect impacted the parietal-occipital and frontal regions. The results obtained in the study support an approach that combines multiple influencing factors into the mechanism hypothesis. The changes in the EEG data may be influenced by both cardiovascular and cognitive effects.
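A hedged sketch of a CCA between EEG features and cardiovascular/cognitive indicators follows, using scikit-learn on random toy data with one injected shared component; the feature names are hypothetical.

```python
# Hedged sketch of canonical correlation analysis between EEG features and
# cardiovascular/cognitive indicators (random toy data; feature names hypothetical).
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 45                                     # e.g. one observation per HDBR day
eeg = rng.normal(size=(n, 4))              # theta power, beta power, complexity, ...
cardio_cog = rng.normal(size=(n, 3))       # heart rate, pulse rate, conflict-task score

# give the toy data one shared latent component so the CCA has something to find
latent = rng.normal(size=(n, 1))
eeg[:, :1] += latent
cardio_cog[:, :1] += latent

cca = CCA(n_components=2)
eeg_scores, cc_scores = cca.fit_transform(eeg, cardio_cog)
r = [np.corrcoef(eeg_scores[:, k], cc_scores[:, k])[0, 1] for k in range(2)]
print("canonical correlations:", [round(v, 2) for v in r])
```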
Rinschen, Markus M; Gödel, Markus; Grahammer, Florian; Zschiedrich, Stefan; Helmstädter, Martin; Kretz, Oliver; Zarei, Mostafa; Braun, Daniela A; Dittrich, Sebastian; Pahmeyer, Caroline; Schroder, Patricia; Teetzen, Carolin; Gee, HeonYung; Daouk, Ghaleb; Pohl, Martin; Kuhn, Elisa; Schermer, Bernhard; Küttner, Victoria; Boerries, Melanie; Busch, Hauke; Schiffer, Mario; Bergmann, Carsten; Krüger, Marcus; Hildebrandt, Friedhelm; Dengjel, Joern; Benzing, Thomas; Huber, Tobias B
2018-05-22
Damage to and loss of glomerular podocytes has been identified as the culprit lesion in progressive kidney diseases. Here, we combine mass spectrometry-based proteomics with mRNA sequencing, bioinformatics, and hypothesis-driven studies to provide a comprehensive and quantitative map of mammalian podocytes that identifies unanticipated signaling pathways. Comparison of the in vivo datasets with proteomics data from podocyte cell cultures showed a limited value of available cell culture models. Moreover, in vivo stable isotope labeling by amino acids uncovered surprisingly rapid synthesis of mitochondrial proteins under steady-state conditions that was perturbed under autophagy-deficient, disease-susceptible conditions. Integration of acquired omics dimensions suggested FARP1 as a candidate essential for podocyte function, which could be substantiated by genetic analysis in humans and knockdown experiments in zebrafish. This work exemplifies how the integration of multi-omics datasets can identify a framework of cell-type-specific features relevant for organ health and disease. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Danyluk, Michelle D; Schaffner, Donald W
2011-05-01
This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 in leafy greens lag time models, and validation of the importance of cross-contamination during the washing process.
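A bare-bones Monte Carlo of the farm-to-fork logic described (field contamination, growth during temperature abuse, exponential dose-response) is sketched below with NumPy rather than @RISK; the distributions and the dose-response parameter are illustrative, not the study's inputs.

```python
# Simplified Monte Carlo sketch of a QMRA-style calculation for leafy greens
# (all distributions and the dose-response parameter are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

log_c0 = rng.normal(-1.0, 0.5, n)                 # field contamination, log CFU/g
abuse_days = rng.uniform(0, 3, n)                 # days of temperature abuse
growth = rng.uniform(0, 1, n) * abuse_days        # up to 1 log CFU/day when abused
serving_g = 85.0                                  # serving size, g
dose = 10 ** (log_c0 + growth) * serving_g        # CFU ingested per contaminated serving

r = 2.2e-3                                        # hypothetical exponential dose-response
p_ill = 1 - np.exp(-r * dose)
contaminated_fraction = 0.001                     # 0.1% of servings contaminated
print(f"mean risk per (random) serving: {contaminated_fraction * p_ill.mean():.2e}")
```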
Liu, Gang; Neelamegham, Sriram
2015-01-01
The glycome constitutes the entire complement of free carbohydrates and glycoconjugates expressed on whole cells or tissues. ‘Systems Glycobiology’ is an emerging discipline that aims to quantitatively describe and analyse the glycome. Here, instead of developing a detailed understanding of single biochemical processes, a combination of computational and experimental tools are used to seek an integrated or ‘systems-level’ view. This can explain how multiple biochemical reactions and transport processes interact with each other to control glycome biosynthesis and function. Computational methods in this field commonly build in silico reaction network models to describe experimental data derived from structural studies that measure cell-surface glycan distribution. While considerable progress has been made, several challenges remain due to the complex and heterogeneous nature of this post-translational modification. First, for the in silico models to be standardized and shared among laboratories, it is necessary to integrate glycan structure information and glycosylation-related enzyme definitions into the mathematical models. Second, as glycoinformatics resources grow, it would be attractive to utilize ‘Big Data’ stored in these repositories for model construction and validation. Third, while the technology for profiling the glycome at the whole-cell level has been standardized, there is a need to integrate mass spectrometry derived site-specific glycosylation data into the models. The current review discusses progress that is being made to resolve the above bottlenecks. The focus is on how computational models can bridge the gap between ‘data’ generated in wet-laboratory studies with ‘knowledge’ that can enhance our understanding of the glycome. PMID:25871730
NASA Astrophysics Data System (ADS)
Rashid, A. A.; Sidek, A. A.; Suffian, S. A.; Daud, M. R. C.
2018-01-01
The idea of assimilating a green supply chain is to integrate and establish environmental management within supply chain practices. The study aims to explore how environmental management competitive pressure influences an SME company in Malaysia to incorporate green supply chain integration, which is an efficient platform for developing environmental innovation. This study further advances green supply chain management research in Malaysia by using quantitative analysis of the developed model, with data collected from a sample of Malaysian SMEs in the manufacturing sector. The model developed in this study illustrates how environmental management competitive pressure from main competitors affects three fundamental dimensions of green supply chain integration. The research findings suggest that environmental management competitive pressure is a vital driving force for an SME company to incorporate internal and external collaboration in developing green product innovation. From the analysis conducted, the study strongly demonstrated that the best way for a company to counteract a competitor’s environmental management success is to first implement a strong internal green product development process and then incorporate external environmental management innovation with its suppliers and customers. The findings also show that internal integration of green product innovation fully mediates the relationship between environmental management competitive pressure and the external integration of green product innovation.
ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.
Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S
2011-08-01
Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D ¹H and ¹³C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.
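The core batch-integration step, summing the same chemical-shift regions across many spectra and exporting a CSV table, can be sketched in a few lines of NumPy (synthetic spectra and hypothetical region names; this is not ImatraNMR code).

```python
# Sketch of the batch-integration idea in plain NumPy (not ImatraNMR itself): integrate
# the same ppm regions across many 1-D spectra and write the results to CSV.
import csv
import numpy as np

ppm = np.linspace(10, 0, 2**14)                              # common chemical-shift axis
regions = {"peak_A": (7.30, 7.20), "peak_B": (3.80, 3.65)}   # hypothetical integration regions

rng = np.random.default_rng(2)
spectra = {f"sample_{i}": np.abs(rng.normal(0, 0.01, ppm.size)) for i in range(5)}
for name, spec in spectra.items():                           # add two synthetic peaks
    spec += 1.0 * np.exp(-((ppm - 7.25) / 0.01) ** 2)
    spec += 0.5 * np.exp(-((ppm - 3.72) / 0.01) ** 2)

with open("integrals.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["sample"] + list(regions))
    for name, spec in spectra.items():
        row = [name]
        for hi, lo in regions.values():
            mask = (ppm <= hi) & (ppm >= lo)
            row.append(-np.trapz(spec[mask], ppm[mask]))     # negate: ppm axis is descending
        writer.writerow(row)
print("wrote", len(spectra), "rows to integrals.csv")
```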
Strategies for the physiome project.
Bassingthwaighte, J B
2000-08-01
The physiome is the quantitative description of the functioning organism in normal and pathophysiological states. The human physiome can be regarded as the virtual human. It is built upon the morphome, the quantitative description of anatomical structure, chemical and biochemical composition, and material properties of an intact organism, including its genome, proteome, cell, tissue, and organ structures up to those of the whole intact being. The Physiome Project is a multicentric integrated program to design, develop, implement, test and document, archive and disseminate quantitative information, and integrative models of the functional behavior of molecules, organelles, cells, tissues, organs, and intact organisms from bacteria to man. A fundamental and major feature of the project is the databasing of experimental observations for retrieval and evaluation. Technologies allowing many groups to work together are being rapidly developed. Internet II will facilitate this immensely. When problems are huge and complex, a particular working group can be expert in only a small part of the overall project. The strategies to be worked out must therefore include how to pull models composed of many submodules together even when the expertise in each is scattered amongst diverse institutions. The technologies of bioinformatics will contribute greatly to this effort. Developing and implementing code for large-scale systems has many problems. Most of the submodules are complex, requiring consideration of spatial and temporal events and processes. Submodules have to be linked to one another in a way that preserves mass balance and gives an accurate representation of variables in nonlinear complex biochemical networks with many signaling and controlling pathways. Microcompartmentalization vitiates the use of simplified model structures. The stiffness of the systems of equations is computationally costly. Faster computation is needed when using models as thinking tools and for iterative data analysis. Perhaps the most serious problem is the current lack of definitive information on kinetics and dynamics of systems, due in part to the almost total lack of databased observations, but also because, though we are nearly drowning in new information being published each day, either the information required for the modeling cannot be found or has never been obtained. "Simple" things like tissue composition, material properties, and mechanical behavior of cells and tissues are not generally available. The development of comprehensive models of biological systems is a key to pharmaceutics and drug design, for the models will become gradually better predictors of the results of interventions, both genomic and pharmaceutic. Good models will be useful in predicting the side effects and long term effects of drugs and toxins, and when the models are really good, to predict where genomic intervention will be effective and where the multiple redundancies in our biological systems will render a proposed intervention useless. The Physiome Project will provide the integrating scientific basis for the Genes to Health initiative, and make physiological genomics a reality applicable to whole organisms, from bacteria to man.
An affinity-structure database of helix-turn-helix: DNA complexes with a universal coordinate system
AlQuraishi, Mohammed; Tang, Shengdong; Xia, Xide
2015-11-19
Molecular interactions between proteins and DNA molecules underlie many cellular processes, including transcriptional regulation, chromosome replication, and nucleosome positioning. Computational analyses of protein-DNA interactions rely on experimental data characterizing known protein-DNA interactions structurally and biochemically. While many databases exist that contain either structural or biochemical data, few integrate these two data sources in a unified fashion. Such integration is becoming increasingly critical with the rapid growth of structural and biochemical data, and the emergence of algorithms that rely on the synthesis of multiple data types to derive computational models of molecular interactions. We have developed an integrated affinity-structure database in which the experimental and quantitative DNA binding affinities of helix-turn-helix proteins are mapped onto the crystal structures of the corresponding protein-DNA complexes. This database provides access to: (i) protein-DNA structures, (ii) quantitative summaries of protein-DNA binding affinities using position weight matrices, and (iii) raw experimental data of protein-DNA binding instances. Critically, this database establishes a correspondence between experimental structural data and quantitative binding affinity data at the single basepair level. Furthermore, we present a novel alignment algorithm that structurally aligns the protein-DNA complexes in the database and creates a unified residue-level coordinate system for comparing the physico-chemical environments at the interface between complexes. Using this unified coordinate system, we compute the statistics of atomic interactions at the protein-DNA interface of helix-turn-helix proteins. We provide an interactive website for visualization, querying, and analyzing this database, and a downloadable version to facilitate programmatic analysis. Lastly, this database will facilitate the analysis of protein-DNA interactions and the development of programmatic computational methods that capitalize on integration of structural and biochemical datasets. The database can be accessed at http://ProteinDNA.hms.harvard.edu.
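For readers unfamiliar with position weight matrices, the sketch below builds a simple log-odds PWM from a handful of aligned sites and scores a sequence against it; the database derives its PWMs from quantitative affinity data, so this only illustrates the representation, not their estimation procedure.

```python
# Minimal illustration of a position weight matrix (log-odds form) built from aligned
# binding-site sequences; toy data only, not the database's affinity-derived PWMs.
import numpy as np

sites = ["TTGACA", "TTGACT", "TTTACA", "ATGACA"]      # toy aligned binding sites
bases = "ACGT"
counts = np.ones((len(bases), len(sites[0])))          # +1 pseudocount per cell
for s in sites:
    for j, b in enumerate(s):
        counts[bases.index(b), j] += 1

freqs = counts / counts.sum(axis=0)
pwm = np.log2(freqs / 0.25)                            # log-odds against a uniform background

def score(seq):
    return sum(pwm[bases.index(b), j] for j, b in enumerate(seq))

print("PWM score of TTGACA:", round(score("TTGACA"), 2))
print("PWM score of GGGGGG:", round(score("GGGGGG"), 2))
```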
Integrated presentation of ecological risk from multiple stressors
Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman
2016-01-01
Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic. PMID:27782171
Rudd, Michael E.
2014-01-01
Previous work has demonstrated that perceived surface reflectance (lightness) can be modeled in simple contexts in a quantitatively exact way by assuming that the visual system first extracts information about local, directed steps in log luminance, then spatially integrates these steps along paths through the image to compute lightness (Rudd and Zemach, 2004, 2005, 2007). This method of computing lightness is called edge integration. Recent evidence (Rudd, 2013) suggests that human vision employs a default strategy to integrate luminance steps only along paths from a common background region to the targets whose lightness is computed. This implies a role for gestalt grouping in edge-based lightness computation. Rudd (2010) further showed the perceptual weights applied to edges in lightness computation can be influenced by the observer's interpretation of luminance steps as resulting from either spatial variation in surface reflectance or illumination. This implies a role for top-down factors in any edge-based model of lightness (Rudd and Zemach, 2005). Here, I show how the separate influences of grouping and attention on lightness can be modeled in tandem by a cortical mechanism that first employs top-down signals to spatially select regions of interest for lightness computation. An object-based network computation, involving neurons that code for border-ownership, then automatically sets the neural gains applied to edge signals surviving the earlier spatial selection stage. Only the borders that survive both processing stages are spatially integrated to compute lightness. The model assumptions are consistent with those of the cortical lightness model presented earlier by Rudd (2010, 2013), and with neurophysiological data indicating extraction of local edge information in V1, network computations to establish figure-ground relations and border ownership in V2, and edge integration to encode lightness and darkness signals in V4. PMID:25202253
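A toy one-dimensional illustration of edge integration is sketched below: the lightness of a target is estimated by summing weighted steps in log luminance along a path from a common background region to the target. The luminance profile and the perceptual weights are invented for illustration; they are not fitted values from the cited papers.

```python
# Edge integration on a piecewise-constant 1-D luminance profile:
# background | ring | target. Weighted log-luminance steps are summed along
# the path from background to target.
import numpy as np

luminance = np.array([100.0] * 10 + [60.0] * 5 + [30.0] * 5)  # cd/m^2

log_lum = np.log(luminance)
steps = np.diff(log_lum)                  # directed log-luminance edge signals
edge_idx = np.nonzero(steps)[0]           # positions of the two edges

# Hypothetical perceptual weights applied to each edge (e.g. the remote edge
# weighted less strongly than the edge bordering the target).
weights = {edge_idx[0]: 0.6, edge_idx[1]: 1.0}

# Integrate weighted edge steps along the path background -> target.
target_log_lightness = log_lum[0] + sum(w * steps[i] for i, w in weights.items())
print("predicted target lightness (relative):", np.exp(target_log_lightness))
```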
State-of-the-lagoon reports as vehicles of cross-disciplinary integration.
Zaucha, Jacek; Davoudi, Simin; Slob, Adriaan; Bouma, Geiske; van Meerkerk, Ingmar; Oen, Amy Mp; Breedveld, Gijs D
2016-10-01
An integrative approach across disciplines is needed for sustainable lagoon and estuary management as identified by integrated coastal zone management. The ARCH research project (Architecture and roadmap to manage multiple pressures on lagoons) has taken initial steps to overcome the boundaries between disciplines and focus on cross-disciplinary integration by addressing the driving forces, challenges, and problems at various case study sites. A model was developed as a boundary-spanning activity to produce joint knowledge and understanding. The backbone of the model is formed by the interaction between the natural and human systems, including economy and governance-based subsystems. The model was used to create state-of-the-lagoon reports for 10 case study sites (lagoons and estuarine coastal areas), with a geographical distribution covering all major seas surrounding Europe. The reports functioned as boundary objects to build joint knowledge. The experiences related to the framing of the model and its subsequent implementation at the case study sites have resulted in key recommendations on how to address the challenges of cross-disciplinary work required for the proper management of complex social-ecological systems such as lagoons, estuarine areas, and other land-sea regions. Cross-disciplinary integration is initially resource intensive and time consuming; one should set aside the required resources and invest efforts at the forefront. It is crucial to create engagement among the group of researchers by focusing on a joint, appealing overall concept that will stimulate cross-sectoral thinking and focusing on the identified problems as a link between collected evidence and future management needs. Different methods for collecting evidence should be applied including both quantitative (jointly agreed indicators) and qualitative (narratives) information. Cross-disciplinary integration is facilitated by functional boundary objects. Integration offers important rewards in terms of developing a better understanding and subsequently improved management of complex social-ecological systems. Integr Environ Assess Manag 2016;12:690-700. © 2016 SETAC.
Automated simulation as part of a design workstation
NASA Technical Reports Server (NTRS)
Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.
1990-01-01
A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that, to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.
Structure-Property Correlations in Al-Li Alloy Integrally Stiffened Extrusions
NASA Technical Reports Server (NTRS)
Hales, Stephen J.; Hafley, Robert A.
2001-01-01
The objective of this investigation was to establish the relationship between mechanical property anisotropy, microstructure and crystallographic texture in integrally 'T'-stiffened extruded panels fabricated from the Al-Li alloys 2195, 2098 and 2096. In-plane properties were measured as a function of orientation at two locations in the panels, namely mid-way between (Skin), and directly beneath (Base), the integral 'T' stiffeners. The 2195 extrusion exhibited the best combination of strength and toughness, but was the most anisotropic. The 2098 extrusion exhibited lower strength and comparable toughness, but was more isotropic than 2195. The 2096 extrusion exhibited the lowest strength and poor toughness, but was the most isotropic. All three alloys exhibited highly elongated grain structures and similar location-dependent variations in grain morphology. The textural characteristics comprised a beta + <100> fiber texture, similar to rolled product, in the Skin regions and alpha <111> + <100> fiber texture, comparable to axisymmetric extruded product, in the Base regions. In an attempt to quantitatively correlate texture with yield strength anisotropy, the original 'full constraint' Taylor model and a variant of the 'relaxed constraint' model, explored by Wert et al., were applied to the data. A comparison of the results revealed that the Wert model was consistently more accurate than the Taylor model.
Sun, WaiChing; Chen, Qiushi; Ostien, Jakob T.
2013-11-22
A stabilized enhanced strain finite element procedure for poromechanics is fully integrated with an elasto-plastic cap model to simulate the hydro-mechanical interactions of fluid-infiltrating porous rocks with associative and non-associative plastic flow. We present a quantitative analysis on how macroscopic plastic volumetric response caused by pore collapse and grain rearrangement affects the seepage of pore fluid, and vice versa. Results of finite element simulations imply that the dissipation of excess pore pressure may significantly affect the stress path and thus alter the volumetric plastic responses.
EM Modelling of RF Propagation Through Plasma Plumes
NASA Astrophysics Data System (ADS)
Pandolfo, L.; Bandinelli, M.; Araque Quijano, J. L.; Vecchi, G.; Pawlak, H.; Marliani, F.
2012-05-01
Electric propulsion is a commercially attractive solution for attitude and position control of geostationary satellites. Hall-effect ion thrusters generate a localized plasma flow in the surrounding of the satellite, whose impact on the communication system needs to be qualitatively and quantitatively assessed. An electromagnetic modelling tool has been developed and integrated into the Antenna Design Framework- ElectroMagnetic Satellite (ADF-EMS). The system is able to guide the user from the plume definition phases through plume installation and simulation. A validation activity has been carried out and the system has been applied to the plume modulation analysis of SGEO/Hispasat mission.
Hertäg, Loreen; Durstewitz, Daniel; Brunel, Nicolas
2014-01-01
Computational models offer a unique tool for understanding the network-dynamical mechanisms which mediate between physiological and biophysical properties, and behavioral function. A traditional challenge in computational neuroscience is, however, that simple neuronal models which can be studied analytically fail to reproduce the diversity of electrophysiological behaviors seen in real neurons, while detailed neuronal models which do reproduce such diversity are intractable analytically and computationally expensive. A number of intermediate models have been proposed whose aim is to capture the diversity of firing behaviors and spike times of real neurons while entailing the simplest possible mathematical description. One such model is the exponential integrate-and-fire neuron with spike rate adaptation (aEIF) which consists of two differential equations for the membrane potential (V) and an adaptation current (w). Despite its simplicity, it can reproduce a wide variety of physiologically observed spiking patterns, can be fit to physiological recordings quantitatively, and, once done so, is able to predict spike times on traces not used for model fitting. Here we compute the steady-state firing rate of aEIF in the presence of Gaussian synaptic noise, using two approaches. The first approach is based on the 2-dimensional Fokker-Planck equation that describes the (V,w)-probability distribution, which is solved using an expansion in the ratio between the time constants of the two variables. The second is based on the firing rate of the EIF model, which is averaged over the distribution of the w variable. These analytically derived closed-form expressions were tested on simulations from a large variety of model cells quantitatively fitted to in vitro electrophysiological recordings from pyramidal cells and interneurons. Theoretical predictions closely agreed with the firing rate of the simulated cells fed with in-vivo-like synaptic noise.
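The aEIF (AdEx) model described above can be made concrete with a short Euler simulation under Gaussian synaptic noise. The parameter values below are textbook-style defaults used only for illustration; they are not the cells fitted in the study, and the reported rate is from the toy simulation, not the analytical expressions of the paper.

```python
# Euler simulation of the adaptive exponential integrate-and-fire (aEIF) neuron
# driven by noisy input current; units are pF, nS, mV, pA, ms.
import numpy as np

C, gL, EL = 281.0, 30.0, -70.6
VT, DT = -50.4, 2.0
tau_w, a, b = 144.0, 4.0, 80.5
V_reset, V_peak = -70.6, 0.0

dt, T = 0.05, 2000.0                 # time step and duration (ms)
steps = int(T / dt)
mu_I, sigma_I = 600.0, 150.0         # mean and std of the noisy input current (pA)

rng = np.random.default_rng(0)
V, w = EL, 0.0
spike_times = []

for k in range(steps):
    I = mu_I + sigma_I * rng.standard_normal() / np.sqrt(dt)  # white-noise current
    dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= V_peak:                  # spike: reset membrane, increment adaptation
        V = V_reset
        w += b
        spike_times.append(k * dt)

print("steady-state firing rate: %.1f Hz" % (1000.0 * len(spike_times) / T))
```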
Aerosol Mapping From Space: Strengths, Limitations, and Applications
NASA Technical Reports Server (NTRS)
Kahn, Ralph
2010-01-01
The aerosol data products from the NASA Earth Observing System's MISR and MODIS instruments provide significant advances in regional and global aerosol optical depth (AOD) mapping, aerosol type measurement, and source plume characterization from space. These products have been and are being used for many applications, ranging from regional air quality assessment, to aerosol air mass type identification and evolution, to wildfire smoke injection height and aerosol transport model validation. However, retrieval uncertainties and coverage gaps still limit the quantitative constraints these satellite data place on some important questions, such as global-scale long-term trends and direct aerosol radiative forcing. Major advances in these areas seem to require a different paradigm, involving the integration of satellite with suborbital data and with models. This presentation will briefly summarize where we stand, and what incremental improvements we can expect, with the current MISR and MODIS aerosol products, and will then elaborate on some initial steps aimed at the necessary integration of satellite data with data from other sources and with chemical transport models.
A statistical approach to the brittle fracture of a multi-phase solid
NASA Technical Reports Server (NTRS)
Liu, W. K.; Lua, Y. I.; Belytschko, T.
1991-01-01
A stochastic damage model is proposed to quantify the inherent statistical distribution of the fracture toughness of a brittle, multi-phase solid. The model, based on the macrocrack-microcrack interaction, incorporates uncertainties in locations and orientations of microcracks. Due to the high concentration of microcracks near the macro-tip, a higher order analysis based on traction boundary integral equations is formulated first for an arbitrary array of cracks. The effects of uncertainties in locations and orientations of microcracks at a macro-tip are analyzed quantitatively by using the boundary integral equations method in conjunction with the computer simulation of the random microcrack array. The short range interactions resulting from surrounding microcracks closet to the main crack tip are investigated. The effects of microcrack density parameter are also explored in the present study. The validity of the present model is demonstrated by comparing its statistical output with the Neville distribution function, which gives correct fits to sets of experimental data from multi-phase solids.
van Ditmarsch, Dave; Xavier, João B
2011-06-17
Online spectrophotometric measurements allow monitoring dynamic biological processes with high-time resolution. Contrastingly, numerous other methods require laborious treatment of samples and can only be carried out offline. Integrating both types of measurement would allow analyzing biological processes more comprehensively. A typical example of this problem is acquiring quantitative data on rhamnolipid secretion by the opportunistic pathogen Pseudomonas aeruginosa. P. aeruginosa cell growth can be measured by optical density (OD600) and gene expression can be measured using reporter fusions with a fluorescent protein, allowing high time resolution monitoring. However, measuring the secreted rhamnolipid biosurfactants requires laborious sample processing, which makes this an offline measurement. Here, we propose a method to integrate growth curve data with endpoint measurements of secreted metabolites that is inspired by a model of exponential cell growth. If serial diluting an inoculum gives reproducible time series shifted in time, then time series of endpoint measurements can be reconstructed using calculated time shifts between dilutions. We illustrate the method using measured rhamnolipid secretion by P. aeruginosa as endpoint measurements and we integrate these measurements with high-resolution growth curves measured by OD600 and expression of rhamnolipid synthesis genes monitored using a reporter fusion. Two-fold serial dilution allowed integrating rhamnolipid measurements at a ~0.4 h-1 frequency with high-time resolved data measured at a 6 h-1 frequency. We show how this simple method can be used in combination with mutants lacking specific genes in the rhamnolipid synthesis or quorum sensing regulation to acquire rich dynamic data on P. aeruginosa virulence regulation. Additionally, the linear relation between the ratio of inocula and the time-shift between curves produces high-precision measurements of maximum specific growth rates, which were determined with a precision of ~5.4%. Growth curve synchronization allows integration of rich time-resolved data with endpoint measurements to produce time-resolved quantitative measurements. Such data can be valuable to unveil the dynamic regulation of virulence in P. aeruginosa. More generally, growth curve synchronization can be applied to many biological systems thus helping to overcome a key obstacle in dynamic regulation: the scarceness of quantitative time-resolved data.
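The synchronization idea lends itself to a small sketch: two-fold serial dilutions of the same inoculum give OD600 curves shifted in time, the shift between dilutions estimates the maximum specific growth rate, and endpoint measurements taken at one clock time can be placed on a common synchronized time axis. The growth model, noise level, and numbers below are synthetic, not the published data.

```python
# Growth-curve synchronization on synthetic logistic growth curves.
import numpy as np

rng = np.random.default_rng(3)
mu_true = 0.5                        # 1/h
t = np.arange(0.0, 16.0, 0.1)        # h, high-resolution OD600 sampling
N0 = 1e-3                            # OD of the undiluted inoculum

def logistic(t, n0, mu, K=1.0):
    return K * n0 * np.exp(mu * t) / (K + n0 * (np.exp(mu * t) - 1.0))

dilutions = [1, 2, 4, 8]             # two-fold serial dilution series
curves = {d: logistic(t, N0 / d, mu_true) + rng.normal(0, 0.003, t.size)
          for d in dilutions}

# Time shift of each curve relative to the undiluted culture, estimated from
# the time at which OD crosses a fixed threshold.
threshold = 0.1
crossing = {d: t[np.argmax(curves[d] > threshold)] for d in dilutions}
shifts = {d: crossing[d] - crossing[1] for d in dilutions}

# Linear fit of shift vs ln(dilution): slope = 1/mu.
slope = np.polyfit(np.log(dilutions), [shifts[d] for d in dilutions], 1)[0]
print("mu from a single two-fold shift: %.3f 1/h (true %.3f)" % (np.log(2) / shifts[2], mu_true))
print("mu from linear fit over all dilutions: %.3f 1/h" % (1.0 / slope))

# An endpoint measured at clock time 12 h in the 8-fold dilution corresponds to
# synchronized time 12 - shifts[8] on the undiluted culture's time axis.
```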
Women's Sleep Disorders: Integrative Care
Frange, Cristina; Banzoli, Carolina Vicente; Colombo, Ana Elisa; Siegler, Marcele; Coelho, Glaury; Bezerra, Andréia Gomes; Csermak, Marcelo; Naufel, Maria Fernanda; Cesar-Netto, Cristiana; Andersen, Monica Levy; Girão, Manoel João Batista Castelo; Tufik, Sergio; Hachul, Helena
2017-01-01
The integrative care model is rooted in a biopsychosocial approach. Integrative is a term which refers to increasing the harmony and coherence of your whole being, and integrative care is therefore focused on the person, not on either the disease or a therapy. It is provided collaboratively by a health team comprising physicians, psychologists, physiotherapists, acupuncturists, and practitioners of meditation, nutrition, and floral therapy. Previous studies have demonstrated that interventions based on the integrative care model improved women's lifestyle and quality of life. Our aim was to describe the use of complementary and alternative medicine (CAM) alongside traditional medicine among women with sleep conditions in our Women's Sleep Disorders Integrative Treatment Outpatient Clinic. We are sharing our experiences and clinical practice, as the model we developed seems to have both physical and psychological benefits for women with sleep problems. We discuss the wide range of benefits that result from this type of complex intervention, and the contextual factors that may influence these benefits. This will inform future practitioners, and we hope to contribute to quantitative research in the clinical setting. The study highlights the importance of treating sleep complaints with a caring relationship and a CAM approach, alongside conventional medicine. Exploration of the lived experience of CAM and its meaning enables healthcare professionals to gain insights into the patients' needs, preferences, and values. Gynecologists, clinicians, and health care providers should support and guide patients in their decision to use CAM by providing evidence-based and comprehensive advice on the potential benefits, risks and related safety issues of this approach. PMID:29410750
Helmlinger, Gabriel; Al-Huniti, Nidal; Aksenov, Sergey; Peskov, Kirill; Hallow, Karen M; Chu, Lulu; Boulton, David; Eriksson, Ulf; Hamrén, Bengt; Lambert, Craig; Masson, Eric; Tomkinson, Helen; Stanski, Donald
2017-11-15
Modeling & simulation (M&S) methodologies are established quantitative tools, which have proven to be useful in supporting the research, development (R&D), regulatory approval, and marketing of novel therapeutics. Applications of M&S help design efficient studies and interpret their results in context of all available data and knowledge to enable effective decision-making during the R&D process. In this mini-review, we focus on two sets of modeling approaches: population-based models, which are well-established within the pharmaceutical industry today, and fall under the discipline of clinical pharmacometrics (PMX); and systems dynamics models, which encompass a range of models of (patho-)physiology amenable to pharmacological intervention, of signaling pathways in biology, and of substance distribution in the body (today known as physiologically-based pharmacokinetic models) - which today may be collectively referred to as quantitative systems pharmacology models (QSP). We next describe the convergence - or rather selected integration - of PMX and QSP approaches into 'middle-out' drug-disease models, which retain selected mechanistic aspects, while remaining parsimonious, fit-for-purpose, and able to address variability and the testing of covariates. We further propose development opportunities for drug-disease systems models, to increase their utility and applicability throughout the preclinical and clinical spectrum of pharmaceutical R&D. Copyright © 2017 Elsevier B.V. All rights reserved.
Designing A Mixed Methods Study In Primary Care
Creswell, John W.; Fetters, Michael D.; Ivankova, Nataliya V.
2004-01-01
BACKGROUND Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. METHODS We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. RESULTS Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. DISCUSSION We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research. PMID:15053277
Cognitive niches: an ecological model of strategy selection.
Marewski, Julian N; Schooler, Lael J
2011-07-01
How do people select among different strategies to accomplish a given task? Across disciplines, the strategy selection problem represents a major challenge. We propose a quantitative model that predicts how selection emerges through the interplay among strategies, cognitive capacities, and the environment. This interplay carves out for each strategy a cognitive niche, that is, a limited number of situations in which the strategy can be applied, simplifying strategy selection. To illustrate our proposal, we consider selection in the context of 2 theories: the simple heuristics framework and the ACT-R (adaptive control of thought-rational) architecture of cognition. From the heuristics framework, we adopt the thesis that people make decisions by selecting from a repertoire of simple decision strategies that exploit regularities in the environment and draw on cognitive capacities, such as memory and time perception. ACT-R provides a quantitative theory of how these capacities adapt to the environment. In 14 simulations and 10 experiments, we consider the choice between strategies that operate on the accessibility of memories and those that depend on elaborate knowledge about the world. Based on Internet statistics, our model quantitatively predicts people's familiarity with and knowledge of real-world objects, the distributional characteristics of the associated speed of memory retrieval, and the cognitive niches of classic decision strategies, including those of the fluency, recognition, integration, lexicographic, and sequential-sampling heuristics. In doing so, the model specifies when people will be able to apply different strategies and how accurate, fast, and effortless people's decisions will be.
Lin, Kai; Zhang, Lanwei; Han, Xue; Meng, Zhaoxu; Zhang, Jianming; Wu, Yifan; Cheng, Dayou
2018-03-28
In this study, Qula casein derived from yak milk casein was hydrolyzed using a two-enzyme combination approach, and peptides with high angiotensin I-converting enzyme (ACE) inhibitory activity were screened by quantitative structure-activity relationship (QSAR) modeling integrated with molecular docking analysis. Hydrolysates (<3 kDa) derived from combinations of thermolysin + alcalase and thermolysin + proteinase K demonstrated high ACE inhibitory activities. Peptide sequences in hydrolysates derived from these two combinations were identified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). On the basis of the QSAR modeling prediction, a total of 16 peptides were selected for molecular docking analysis. The docking study revealed that four of the peptides (KFPQY, MPFPKYP, MFPPQ, and QWQVL) bound the active site of ACE. These four novel peptides were chemically synthesized, and their IC50 values were determined. Among these peptides, KFPQY showed the highest ACE inhibitory activity (IC50 = 12.37 ± 0.43 μM). Our study indicated that Qula casein presents an excellent source to produce ACE inhibitory peptides.
Bloksgaard, Maria; Leurgans, Thomas M; Spronck, Bart; Heusinkveld, Maarten H G; Thorsted, Bjarne; Rosenstand, Kristoffer; Nissen, Inger; Hansen, Ulla M; Brewer, Jonathan R; Bagatolli, Luis A; Rasmussen, Lars M; Irmukhamedov, Akhmadjon; Reesink, Koen D; De Mey, Jo G R
2017-07-01
The impact of disease-related changes in the extracellular matrix (ECM) on the mechanical properties of human resistance arteries largely remains to be established. Resistance arteries from both pig and human parietal pericardium (PRA) display a different ECM microarchitecture compared with frequently used rodent mesenteric arteries. We hypothesized that the biaxial mechanics of PRA mirror pressure-induced changes in the ECM microarchitecture. This was tested using isolated pig PRA as a model system, integrating vital imaging, pressure myography, and mathematical modeling. Collagenase and elastase digestions were applied to evaluate the load-bearing roles of collagen and elastin, respectively. The incremental elastic modulus linearly related to the straightness of adventitial collagen fibers circumferentially and longitudinally (both R 2 ≥ 0.99), whereas there was a nonlinear relationship to the internal elastic lamina elastin fiber branching angles. Mathematical modeling suggested a collagen recruitment strain (means ± SE) of 1.1 ± 0.2 circumferentially and 0.20 ± 0.01 longitudinally, corresponding to a pressure of ~40 mmHg, a finding supported by the vital imaging. The integrated method was tested on human PRA to confirm its validity. These showed limited circumferential distensibility and elongation and a collagen recruitment strain of 0.8 ± 0.1 circumferentially and 0.06 ± 0.02 longitudinally, reached at a distending pressure below 20 mmHg. This was confirmed by vital imaging showing negligible microarchitectural changes of elastin and collagen upon pressurization. In conclusion, we show here, for the first time in resistance arteries, a quantitative relationship between pressure-induced changes in the extracellular matrix and the arterial wall mechanics. The strength of the integrated methods invites for future detailed studies of microvascular pathologies. NEW & NOTEWORTHY This is the first study to quantitatively relate pressure-induced microstructural changes in resistance arteries to the mechanics of their wall. Principal findings using a pig model system were confirmed in human arteries. The combined methods provide a strong tool for future hypothesis-driven studies of microvascular pathologies. Copyright © 2017 the American Physiological Society.
Dallmann, André; Ince, Ibrahim; Coboeken, Katrin; Eissing, Thomas; Hempel, Georg
2017-09-18
Physiologically based pharmacokinetic modeling is considered a valuable tool for predicting pharmacokinetic changes in pregnancy to subsequently guide in-vivo pharmacokinetic trials in pregnant women. The objective of this study was to extend and verify a previously developed physiologically based pharmacokinetic model for pregnant women for the prediction of pharmacokinetics of drugs metabolized via several cytochrome P450 enzymes. Quantitative information on gestation-specific changes in enzyme activity available in the literature was incorporated in a pregnancy physiologically based pharmacokinetic model and the pharmacokinetics of eight drugs metabolized via one or multiple cytochrome P450 enzymes was predicted. The tested drugs were caffeine, midazolam, nifedipine, metoprolol, ondansetron, granisetron, diazepam, and metronidazole. Pharmacokinetic predictions were evaluated by comparison with in-vivo pharmacokinetic data obtained from the literature. The pregnancy physiologically based pharmacokinetic model successfully predicted the pharmacokinetics of all tested drugs. The observed pregnancy-induced pharmacokinetic changes were qualitatively and quantitatively reasonably well predicted for all drugs. Ninety-seven percent of the mean plasma concentrations predicted in pregnant women fell within a twofold error range and 63% within a 1.25-fold error range. For all drugs, the predicted area under the concentration-time curve was within a 1.25-fold error range. The presented pregnancy physiologically based pharmacokinetic model can quantitatively predict the pharmacokinetics of drugs that are metabolized via one or multiple cytochrome P450 enzymes by integrating prior knowledge of the pregnancy-related effect on these enzymes. This pregnancy physiologically based pharmacokinetic model may thus be used to identify potential exposure changes in pregnant women a priori and to eventually support informed decision making when clinical trials are designed in this special population.
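The fold-error evaluation used above is simple to state in code. The helper and the data points below are made up to show usage; they are not the study's predictions or observations.

```python
# Fraction of predictions falling within a given fold-error of the observations,
# as used to summarize PBPK prediction accuracy (e.g. 2-fold and 1.25-fold).
import numpy as np

def fraction_within_fold(predicted, observed, fold):
    ratio = np.asarray(predicted, float) / np.asarray(observed, float)
    return np.mean((ratio <= fold) & (ratio >= 1.0 / fold))

pred = np.array([1.2, 0.9, 2.3, 0.45, 1.8])   # hypothetical mean plasma conc.
obs  = np.array([1.0, 1.0, 2.0, 0.50, 1.5])

print("within 2-fold   :", fraction_within_fold(pred, obs, 2.0))
print("within 1.25-fold:", fraction_within_fold(pred, obs, 1.25))
```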
Designing and encoding models for synthetic biology.
Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas
2009-08-06
A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology 'loop'.
ERIC Educational Resources Information Center
Maree, Jacobus G.
2018-01-01
This article reports on the value of career construction counselling (integrating qualitative and quantitative strategies and associated techniques) for a young person. The participant was purposefully selected from a number of people who had sought career counselling in a private practice context. An intrinsic, single-case study design was…
Accurate object tracking system by integrating texture and depth cues
NASA Astrophysics Data System (ADS)
Chen, Ju-Chin; Lin, Yu-Hang
2016-03-01
A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important to distinguish the object from a complicated background, is integrated. We propose two depth-based models that can complement texture information to cope with both appearance variations and background clutter. Moreover, in order to reduce the risk of drift, which increases for textureless depth templates, an update mechanism is proposed to select more precise tracking results and avoid incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system provides the best success rate and more accurate tracking results than other well-known algorithms.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Quality of software not only is vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, as well as launching schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to provide a representation of the number as well as types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on the failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
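The abstract does not give the model's functional form, so the sketch below uses a standard Goel-Okumoto non-homogeneous Poisson process purely as a stand-in to illustrate the kind of quantity involved: fitting a reliability-growth curve to failure times and estimating the expected number of remaining defects, which can feed a test-termination criterion. Failure times and starting values are hypothetical.

```python
# Maximum-likelihood fit of a Goel-Okumoto NHPP to failure times (illustration,
# not the author's model). m(t) = a(1 - exp(-b t)), intensity a*b*exp(-b t).
import numpy as np
from scipy.optimize import minimize

failure_times = np.array([2., 5., 9., 14., 20., 28., 39., 55., 70., 95.])  # hours
T = 120.0  # total observed test time

def neg_log_likelihood(params):
    a, b = np.exp(params)                      # keep a, b positive
    lam = a * b * np.exp(-b * failure_times)   # intensity at each failure time
    mean_total = a * (1.0 - np.exp(-b * T))    # expected failures in (0, T]
    return -(np.sum(np.log(lam)) - mean_total)

res = minimize(neg_log_likelihood, x0=np.log([20.0, 0.01]), method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)
remaining = a_hat * np.exp(-b_hat * T)         # expected defects still undetected
print("estimated total defects a = %.1f, expected remaining after %g h: %.1f"
      % (a_hat, T, remaining))
```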
Using mixed methods research in medical education: basic guidelines for researchers.
Schifferdecker, Karen E; Reed, Virginia A
2009-07-01
Mixed methods research involves the collection, analysis and integration of both qualitative and quantitative data in a single study. The benefits of a mixed methods approach are particularly evident when studying new questions or complex initiatives and interactions, which is often the case in medical education research. Basic guidelines for when to use mixed methods research and how to design a mixed methods study in medical education research are not readily available. The purpose of this paper is to remedy that situation by providing an overview of mixed methods research, research design models relevant for medical education research, examples of each research design model in medical education research, and basic guidelines for medical education researchers interested in mixed methods research. Mixed methods may prove superior in increasing the integrity and applicability of findings when studying new or complex initiatives and interactions in medical education research. They deserve an increased presence and recognition in medical education research.
Methods for assessing geodiversity
NASA Astrophysics Data System (ADS)
Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco
2017-04-01
The accepted systematics of geodiversity assessment methods will be presented in three categories: qualitative, quantitative and qualitative-quantitative. Qualitative methods are usually descriptive methods that are suited to nominal and ordinal data. Quantitative methods use a different set of parameters and indicators to determine the characteristics of geodiversity in the area being researched. Qualitative-quantitative methods are a good combination of the collection of quantitative data (i.e. digital) and cause-effect data (i.e. relational and explanatory). It seems that at the current stage of the development of geodiversity research methods, qualitative-quantitative methods are the most advanced and best assess the geodiversity of the study area. Their particular advantage is the integration of data from different sources and with different substantive content. Among the distinguishing features of the quantitative and qualitative-quantitative methods for assessing geodiversity are their wide use within geographic information systems, both at the stage of data collection and data integration, as well as numerical processing and their presentation. The unresolved problem for these methods, however, is the possibility of their validation. It seems that currently the best method of validation is direct field confrontation. Looking to the next few years, the development of qualitative-quantitative methods connected with cognitive issues should be expected, oriented towards ontology and the Semantic Web.
Integrative eQTL analysis of tumor and host omics data in individuals with bladder cancer.
Pineda, Silvia; Van Steen, Kristel; Malats, Núria
2017-09-01
Integrative analyses of several omics data are emerging. The data are usually generated from the same source material (i.e., tumor sample) representing one level of regulation. However, integrating different regulatory levels (i.e., blood) with those from tumor may also reveal important knowledge about the human genetic architecture. To model this multilevel structure, an integrative expression quantitative trait loci (eQTL) analysis applying two-stage regression (2SR) was proposed. This approach first regressed tumor gene expression levels on tumor markers, and the adjusted residuals from the previous model were then regressed on the germline genotypes measured in blood. Previously, we demonstrated that penalized regression methods in combination with a permutation-based MaxT method (Global-LASSO) are a promising tool to fix some of the challenges that high-throughput omics data analysis imposes. Here, we assessed whether Global-LASSO can also be applied when tumor and blood omics data are integrated. We further compared our strategy with two 2SR approaches, one using multiple linear regression (2SR-MLR) and the other using LASSO (2SR-LASSO). We applied the three models to integrate genomic, epigenomic, and transcriptomic data from tumor tissue with blood germline genotypes from 181 individuals with bladder cancer included in the TCGA Consortium. Global-LASSO provided a larger list of eQTLs than the 2SR methods, identified a previously reported eQTL in prostate stem cell antigen (PSCA), and provided further clues on the complexity of the APOBEC3B locus, with a minimal false-positive rate not achieved by 2SR-MLR. It also represents an important contribution for omics integrative analysis because it is easy to apply and adaptable to any type of data. © 2017 WILEY PERIODICALS, INC.
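The two-stage regression idea can be sketched on synthetic data: stage 1 regresses tumor gene expression on tumor-level markers, stage 2 regresses the residuals on blood germline genotypes with LASSO, and a "global" variant puts both data types in one penalized model. Sample sizes, effects, and marker types are invented, and the permutation-based MaxT step of Global-LASSO is omitted for brevity.

```python
# 2SR-LASSO vs a single joint LASSO on synthetic tumor/germline data.
import numpy as np
from sklearn.linear_model import LinearRegression, LassoCV

rng = np.random.default_rng(0)
n, n_snps = 181, 200
genotypes = rng.integers(0, 3, size=(n, n_snps)).astype(float)  # 0/1/2 alleles
tumor_markers = rng.normal(size=(n, 2))                         # e.g. CN, methylation
# Synthetic expression: depends on tumor markers and on SNP 7.
expression = (1.5 * tumor_markers[:, 0] - 0.8 * tumor_markers[:, 1]
              + 0.6 * genotypes[:, 7] + rng.normal(scale=1.0, size=n))

# Stage 1: remove the tumor-level signal.
stage1 = LinearRegression().fit(tumor_markers, expression)
residuals = expression - stage1.predict(tumor_markers)

# Stage 2: penalized regression of residuals on germline genotypes.
stage2 = LassoCV(cv=5).fit(genotypes, residuals)
print("SNPs selected by 2SR-LASSO:", np.nonzero(stage2.coef_)[0])

# Global variant: one LASSO over tumor markers and genotypes jointly.
X = np.hstack([tumor_markers, genotypes])
global_fit = LassoCV(cv=5).fit(X, expression)
print("SNPs selected by joint LASSO:", np.nonzero(global_fit.coef_[2:])[0])
```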
Simulated linear test applied to quantitative proteomics.
Pham, T V; Jimenez, C R
2016-09-01
Omics studies aim to find significant changes due to biological or functional perturbation. However, gene and protein expression profiling experiments contain inherent technical variation. In discovery proteomics studies where the number of samples is typically small, technical variation plays an important role because it contributes considerably to the observed variation. Previous methods place both technical and biological variations in tightly integrated mathematical models that are difficult to adapt for different technological platforms. Our aim is to derive a statistical framework that allows the inclusion of a wide range of technical variability. We introduce a new method called the simulated linear test, or the s-test, that is easy to implement and easy to adapt for different models of technical variation. It generates virtual data points from the observed values according to a pre-defined technical distribution and subsequently employs linear modeling for significance analysis. We demonstrate the flexibility of the proposed approach by deriving a new significance test for quantitative discovery proteomics for which missing values have been a major issue for traditional methods such as the t-test. We evaluate the result on two label-free (phospho)proteomics datasets based on ion-intensity quantitation. Availability: http://www.oncoproteomics.nl/software/stest.html. Contact: t.pham@vumc.nl. © The Author 2016. Published by Oxford University Press. All rights reserved.
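A simplified sketch of the underlying idea follows: virtual data points are drawn from the observed intensities according to an assumed technical-noise distribution, a linear model is fitted to each virtual data set, and the resulting statistics are aggregated. This is an illustration of the concept only, not a reimplementation of the published s-test; the handling of the missing value via a low-abundance draw is an assumption standing in for a detection limit.

```python
# Concept sketch: simulate virtual data sets from observed log2 intensities and
# aggregate the group-comparison statistics across simulations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
log_group_a = np.array([20.1, 20.4, 19.8])      # observed log2 intensities
log_group_b = np.array([21.2, 21.0, np.nan])    # one missing value
technical_sd = 0.3                              # assumed technical variation

def one_virtual_dataset(values, floor=17.0):
    # One virtual point per observation; missing values drawn near a
    # hypothetical detection floor.
    return np.where(np.isnan(values),
                    rng.normal(floor, technical_sd, values.size),
                    rng.normal(values, technical_sd))

p_values = []
for _ in range(1000):
    a = one_virtual_dataset(log_group_a)
    b = one_virtual_dataset(log_group_b)
    # A two-group linear model is equivalent to a t-test on the group means.
    p_values.append(stats.ttest_ind(a, b).pvalue)

print("median p-value over virtual data sets: %.4g" % np.median(p_values))
```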
A quantitative study on magnesium alloy stent biodegradation.
Gao, Yuanming; Wang, Lizhen; Gu, Xuenan; Chu, Zhaowei; Guo, Meng; Fan, Yubo
2018-06-06
Insufficient scaffolding time due to rapid corrosion is the main problem of magnesium alloy stents (MAS). The finite element method has been used to investigate the corrosion of MAS. However, related studies have mostly treated all elements as undergoing one-dimensional corrosion. Multi-dimensional corrosion significantly influences the mechanical integrity of MAS structures such as edges and corners. In this study, the effects of multi-dimensional corrosion were studied quantitatively by experiment, and a phenomenological corrosion model was then developed to account for these effects. We implemented an immersion test with magnesium alloy (AZ31B) cubes having different numbers of exposed surfaces to analyze the effect of dimensionality. The corrosion rates of the cubes were found to be almost proportional to their exposed-surface numbers, especially when pitting corrosion is not marked. The cubes also represent the hexahedral elements used in the simulation. In conclusion, the corrosion rate of every element accelerates with increasing corrosion-surface number in multi-dimensional corrosion. The damage ratios among elements of the same size are proportional to the ratios of corrosion-surface numbers under uniform corrosion. The finite element simulation using the proposed model provided more details of the changes in morphology and mechanics during the scaffolding time by removing 25.7% of the elements of the MAS. The proposed corrosion model reflects the effects of multi-dimensionality on corrosion and can be used to predict the degradation process of MAS quantitatively. Copyright © 2018 Elsevier Ltd. All rights reserved.
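The surface-count idea can be sketched with a toy element-removal simulation: each element accumulates damage at a rate proportional to its number of exposed faces and is removed when its damage reaches 1. The rate constant and the tiny one-dimensional strut geometry are illustrative only, not the published model parameters.

```python
# Toy 1-D strut of hexahedral elements corroding from its exposed faces.
import numpy as np

k = 0.02            # damage per exposed face per time step (hypothetical)
n_elems = 10
alive = np.ones(n_elems, dtype=bool)
damage = np.zeros(n_elems)

for step in range(500):
    # Exposed faces: outer ends of the strut plus faces adjacent to removed elements.
    exposed = np.zeros(n_elems)
    for i in np.nonzero(alive)[0]:
        left_open = (i == 0) or (not alive[i - 1])
        right_open = (i == n_elems - 1) or (not alive[i + 1])
        exposed[i] = left_open + right_open
    damage[alive] += k * exposed[alive]
    alive[alive & (damage >= 1.0)] = False     # remove fully corroded elements
    if not alive.any():
        print("strut fully corroded at step", step)
        break

print("remaining elements:", int(alive.sum()), "of", n_elems)
```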
NASA Astrophysics Data System (ADS)
Gottschalk, Ian P.; Hermans, Thomas; Knight, Rosemary; Caers, Jef; Cameron, David A.; Regnery, Julia; McCray, John E.
2017-12-01
Geophysical data have proven to be very useful for lithological characterization. However, quantitatively integrating the information gained from acquiring geophysical data generally requires colocated lithological and geophysical data for constructing a rock-physics relationship. In this contribution, the issue of integrating noncolocated geophysical and lithological data is addressed, and the results are applied to simulate groundwater flow in a heterogeneous aquifer in the Prairie Waters Project North Campus aquifer recharge site, Colorado. Two methods of constructing a rock-physics transform between electrical resistivity tomography (ERT) data and lithology measurements are assessed. In the first approach, a maximum likelihood estimation (MLE) is used to fit a bimodal lognormal distribution to horizontal cross sections of the ERT resistivity histogram. In the second approach, a spatial bootstrap is applied to approximate the rock-physics relationship. The rock-physics transforms provide soft data for multiple point statistics (MPS) simulations. Subsurface models are used to run groundwater flow and tracer test simulations. Each model's uncalibrated, predicted breakthrough time is evaluated based on its agreement with measured subsurface travel time values from infiltration basins to selected groundwater recovery wells. We find that incorporating geophysical information into uncalibrated flow models reduces the difference with observed values, as compared to flow models without geophysical information incorporated. The integration of geophysical data also narrows the variance of predicted tracer breakthrough times substantially. Accuracy is highest and variance is lowest in breakthrough predictions generated by the MLE-based rock-physics transform. Calibrating the ensemble of geophysically constrained models would help produce a suite of realistic flow models for predictive purposes at the site. We find that the success of breakthrough predictions is highly sensitive to the definition of the rock-physics transform; it is therefore important to model this transfer function accurately.
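One way to picture the MLE rock-physics transform is as a two-component lognormal mixture fitted to resistivity, i.e. a Gaussian mixture fitted to log-resistivity, whose posterior component probabilities serve as soft facies data. The sketch below uses synthetic resistivities and invented mode parameters; it is not the site data or the exact fitting procedure of the study.

```python
# Fit a two-component lognormal mixture to synthetic resistivities and use the
# posterior component probabilities as soft facies probabilities.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Synthetic ERT resistivities (ohm-m): a fine-grained and a coarse-grained mode.
resistivity = np.concatenate([rng.lognormal(np.log(30), 0.30, 400),
                              rng.lognormal(np.log(120), 0.35, 600)])

gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(np.log(resistivity).reshape(-1, 1))

def facies_probability(res_values):
    """Posterior probability of each mixture component (proxy for facies)."""
    return gmm.predict_proba(np.log(np.asarray(res_values)).reshape(-1, 1))

print("component medians (ohm-m):", np.exp(gmm.means_.ravel()))
print("P(facies | 80 ohm-m):", facies_probability([80.0])[0])
```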
ERIC Educational Resources Information Center
Small, Christine J.; Newtoff, Kiersten N.
2013-01-01
Undergraduate biology education is undergoing dramatic changes, emphasizing student training in the "tools and practices" of science, particularly quantitative and problem-solving skills. We redesigned a freshman ecology lab to emphasize the importance of scientific inquiry and quantitative reasoning in biology. This multi-week investigation uses…
Exploring Phytoplankton Population Growth to Enhance Quantitative Literacy
ERIC Educational Resources Information Center
Baumgartner, Erin; Biga, Lindsay; Bledsoe, Karen; Dawson, James; Grammer, Julie; Howard, Ava; Snyder, Jeffrey
2015-01-01
Quantitative literacy is essential to biological literacy (and is one of the core concepts in "Vision and Change in Undergraduate Biology Education: A Call to Action"; AAAS 2009). Building quantitative literacy is a challenging endeavor for biology instructors. Integrating mathematical skills into biological investigations can help build…
Integrating Quantitative and Ethnographic Methods to Describe the Classroom. Report No. 5083.
ERIC Educational Resources Information Center
Malitz, David; And Others
The debate between proponents of ethnographic and quantitative methodology in classroom observation is reviewed, and the respective strengths and weaknesses of the two approaches are discussed. These methodologies are directly compared in a study that conducted simultaneous ethnographic and quantitative observations on nine classrooms. It is…
Air Force Personnel Can Improve Compliance With the Berry Amendment and Buy American Act
2016-02-24
The review drew on personnel from the DoD OIG's Quantitative Methods Division. Prior coverage during the last 5 years includes reports by the Government Accountability Office (GAO).
Integrating prediction, provenance, and optimization into high energy workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schram, M.; Bansal, V.; Friese, R. D.
We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
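Two of the components named above, performance prediction and task-to-resource assignment, can be pictured with a tiny list-scheduling sketch: a per-resource speed model predicts task runtimes and a greedy heuristic assigns each task (longest first) to whichever resource becomes free earliest. Resource names, speeds, and task sizes are invented; this is not the Belle II workflow scheduler.

```python
# Greedy longest-task-first scheduling with a simple per-resource runtime model.
import heapq

speed = {"cluster_a": 1.0, "cluster_b": 0.6, "cloud": 1.4}  # predicted s per work unit
tasks = [120, 80, 200, 40, 160, 90]                         # work units per task

free_at = [(0.0, name) for name in speed]   # (time the resource becomes free, name)
heapq.heapify(free_at)

schedule = []
for work in sorted(tasks, reverse=True):    # longest task first
    t_free, name = heapq.heappop(free_at)
    finish = t_free + work * speed[name]
    schedule.append((name, work, finish))
    heapq.heappush(free_at, (finish, name))

print("predicted makespan: %.0f s" % max(f for _, _, f in schedule))
for name, work, finish in schedule:
    print(f"{name}: work={work}, finishes at {finish:.0f} s")
```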
Introductory life science mathematics and quantitative neuroscience courses.
Duffus, Dwight; Olifer, Andrei
2010-01-01
We describe two sets of courses designed to enhance the mathematical, statistical, and computational training of life science undergraduates at Emory College. The first course is an introductory sequence in differential and integral calculus, modeling with differential equations, probability, and inferential statistics. The second is an upper-division course in computational neuroscience. We provide a description of each course, detailed syllabi, examples of content, and a brief discussion of the main issues encountered in developing and offering the courses.
NASA Technical Reports Server (NTRS)
Hoehler, Tori M.
2010-01-01
The remarkable challenges and possibilities of the coming few decades will compel the biogeochemical and astrobiological sciences to characterize the interactions between biology and its environment in a fundamental, mechanistic, and quantitative fashion. The clear need for integrative and scalable biology-environment models is exemplified in the Earth sciences by the challenge of effectively addressing anthropogenic global change, and in the space sciences by the challenge of mounting a well-constrained yet sufficiently adaptive and inclusive search for life beyond Earth. Our understanding of the life-planet interaction is still, however, largely empirical. A variety of approaches seek to move from empirical to mechanistic descriptions. One approach focuses on the relationship between biology and energy, which is at once universal (all life requires energy), unique (life manages energy flow in a fashion not seen in abiotic systems), and amenable to characterization and quantification in thermodynamic terms. Simultaneously, a focus on energy flow addresses a critical point of interface between life and its geological, chemical, and physical environment. Characterizing and quantifying this relationship for life on Earth will support the development of integrative and predictive models for biology-environment dynamics. Understanding this relationship at its most fundamental level holds potential for developing concepts of habitability and biosignatures that can optimize astrobiological exploration strategies and are extensible to all life.
Zhu, Xin-Guang; Lynch, Jonathan P; LeBauer, David S; Millar, Andrew J; Stitt, Mark; Long, Stephen P
2016-05-01
A paradigm shift is needed and timely in moving plant modelling from largely isolated efforts to a connected community endeavour that can take full advantage of advances in computer science and in mechanistic understanding of plant processes. Plants in silico (Psi) envisions a digital representation of layered dynamic modules, linking from gene networks and metabolic pathways through to cellular organization, tissue, organ and whole plant development, together with resource capture and use efficiency in dynamic competitive environments, ultimately allowing a mechanistically rich simulation of the plant or of a community of plants in silico. The concept is to integrate models or modules from different layers of organization spanning from genome to phenome to ecosystem in a modular framework allowing the use of modules of varying mechanistic detail representing the same biological process. Developments in high-performance computing, functional knowledge of plants, the internet and open-source version controlled software make achieving the concept realistic. Open source will enhance collaboration and move towards testing and consensus on quantitative theoretical frameworks. Importantly, Psi provides a quantitative knowledge framework where the implications of a discovery at one level, for example, single gene function or developmental response, can be examined at the whole plant or even crop and natural ecosystem levels. © 2015 The Authors Plant, Cell & Environment Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Balbi, S.; Villa, F.; Mojtahed, V.; Hegetschweiler, K. T.; Giupponi, C.
2015-10-01
This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of: (1) likelihood of non-fatal physical injury; (2) likelihood of post-traumatic stress disorder; (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the benefits of improving an existing Early Warning System, taking into account the reliability, lead-time and scope (i.e. coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event: about 75 % of fatalities, 25 % of injuries and 18 % of post-traumatic stress disorders could be avoided.
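A minimal hand-built sketch of the kind of calculation such a network performs is given below: the likelihood of injury is obtained by marginalising over whether a person receives, and can act on, an early warning. The structure and every probability are invented for illustration; they are not the calibrated expert-elicited values of the study.

```python
# Marginalising injury risk over early-warning reliability, lead time and scope.
def p_injury(reliability, lead_time_ok, scope):
    """Probability of non-fatal injury for one exposed person (toy numbers)."""
    p_warned = reliability * scope                              # warning issued and received
    p_effective = p_warned * (0.9 if lead_time_ok else 0.4)     # acted on in time
    p_injury_given_protected = 0.02
    p_injury_given_unprotected = 0.15
    return (p_effective * p_injury_given_protected
            + (1.0 - p_effective) * p_injury_given_unprotected)

baseline = p_injury(reliability=0.7, lead_time_ok=False, scope=0.6)
improved = p_injury(reliability=0.9, lead_time_ok=True, scope=0.9)
print("injury probability, baseline EWS: %.3f" % baseline)
print("injury probability, improved EWS: %.3f" % improved)
print("relative reduction: %.0f%%" % (100 * (1 - improved / baseline)))
```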
Zhang, Tisheng; Niu, Xiaoji; Ban, Yalong; Zhang, Hongping; Shi, Chuang; Liu, Jingnan
2015-01-01
A GNSS/INS deeply-coupled system can improve satellite signal tracking performance by using the INS to aid the tracking loops under dynamics. However, no literature was available on the complete modeling of the INS branch in the INS-aided tracking loop, leaving no theoretical tool to guide the selection of inertial sensors, parameter optimization, and quantitative analysis of INS-aided PLLs. This paper addresses the modeling of the INS branch and the parameter optimization of phase-locked loops (PLLs) in a scalar-based GNSS/INS deeply-coupled system. It establishes the transfer function between all known error sources and the PLL tracking error, which can be used to quantitatively evaluate how a candidate inertial measurement unit (IMU) affects the carrier phase tracking error. Based on that, a steady-state error model is proposed to design INS-aided PLLs and to analyze their tracking performance. Based on the modeling and error analysis, an integrated deeply-coupled hardware prototype is developed, with optimization of the aiding information. Finally, the performance of the INS-aided PLLs designed with the proposed steady-state error model is evaluated through simulation and road tests of the hardware prototype. PMID:25569751
Paillet, Frederick L.; Crowder, R.E.
1996-01-01
Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) The continuous depth scale of the measurements along the well bore; (2) The in situ measurement of lithologic properties and the correlation with hydraulic properties of the formations over a finite sample volume; and (3) Multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis designed to assist formulation of the interpretation model, and quantitative analysis used to assign numerical values to model parameters. The approach bases a decision as to whether quantitative inversion is statistically warranted by formulating an over-determined inversion. If no such inversion is consistent with the inversion model, quantitative inversion is judged not possible with the given data set. Additional statistical criteria such as the statistical significance of regressions are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.
Zhang, Yu-Tian; Xiao, Mei-Feng; Deng, Kai-Wen; Yang, Yan-Tao; Zhou, Yi-Qun; Zhou, Jin; He, Fu-Yuan; Liu, Wen-Long
2018-06-01
Formulating an efficient extraction system for Chinese herbal medicine poses a great challenge for quality management; to address it, Prof. Liu recently proposed the transitivity of Q-markers in quantitative analysis of traditional Chinese medicine (TCM). In order to improve the quality of extraction from raw medicinal materials for clinical preparations, a series of integrated mathematical models for the transitivity of Q-markers in quantitative analysis of TCM were established. Buyanghuanwu decoction (BYHWD), a commonly used TCM prescription for preventing and treating ischemic heart and brain diseases, was selected as the experimental subject for studying quantitative transitivity. Based on Fick's law and the Noyes-Whitney equation, novel kinetic models were established for the extraction of active components. Kinetic equations were fitted to the extraction data, and the inherent parameters of the raw material pieces and the quantitative transfer coefficients of the Q-markers were calculated; these served as indexes to evaluate the transitivity of Q-markers during the extraction of BYHWD. HPLC was applied to screen and analyze the potential Q-markers in the extraction process, and kinetic parameters were fitted and calculated with the Statistical Program for Social Sciences (SPSS) 20.0 software. The transferable efficiency was described and evaluated along the Q-marker transfer trajectory by the transitivity availability AUC, the extraction ratio P, and the decomposition ratio D. Astragaloside IV (AST-IV), laetrile, paeoniflorin, and ferulic acid (FA) were studied as potential Q-markers of BYHWD. The relative technological parameters derived from the mathematical models adequately illustrate the inherent properties of the raw material preparation and the factors affecting Q-marker transitivity at equilibrium. The AUC, P, and D values obtained for AST-IV, laetrile, paeoniflorin, and FA were 289.9 mAU·s, 46.24%, 22.35%; 1730 mAU·s, 84.48%, 1.963%; 5600 mAU·s, 70.22%, 0.4752%; and 7810 mAU·s, 24.29%, 4.235%, respectively. The results showed that the most suitable Q-markers in this study were laetrile and paeoniflorin, which exhibited acceptable traceability and transitivity in the extraction process. These mathematical models might therefore be developed into a new standard for controlling TCM quality from raw medicinal materials to product manufacturing. Copyright © 2018 Elsevier GmbH. All rights reserved.
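A hedged sketch of the kinetic-fitting step, assuming a simple first-order (Noyes-Whitney-type) extraction curve and invented concentration data; the AUC and extraction-ratio definitions here are simplified stand-ins for the paper's HPLC-based quantities.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a first-order extraction curve C(t) = C_inf * (1 - exp(-k t)) to
# concentration-time data for one marker, then compute crude indexes.
# All data and amounts below are invented for illustration.

t = np.array([5, 10, 20, 30, 45, 60, 90, 120], dtype=float)   # minutes
conc = np.array([0.8, 1.4, 2.1, 2.5, 2.8, 2.9, 3.0, 3.0])     # mg/mL (synthetic)

def extraction(t, c_inf, k):
    return c_inf * (1.0 - np.exp(-k * t))

(c_inf, k), _ = curve_fit(extraction, t, conc, p0=[3.0, 0.05])

# "Availability" taken here simply as the area under the concentration-time
# curve; the paper's AUC is an HPLC peak-area quantity.
auc = np.trapz(conc, t)

# Extraction ratio P: extracted amount relative to the amount assayed in the
# raw material pieces (hypothetical numbers).
extract_volume_mL = 500.0
amount_in_raw_material_mg = 2000.0
P = c_inf * extract_volume_mL / amount_in_raw_material_mg * 100.0

print(f"fitted C_inf={c_inf:.2f} mg/mL, k={k:.3f} 1/min, AUC={auc:.1f}, P={P:.1f}%")
```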
Binfa, Lorena; Pantoja, Loreto; Ortiz, Jovita; Gurovich, Marcela; Cavada, Gabriel
2013-10-01
During 2007 the Chilean Ministry of Public Health introduced the Model of Integrated and Humanized Health Services, in addition to the Clinical Guide for Humanized Care during Delivery. Three years after its implementation, a study was conducted (i) to describe selected clinical outcomes of women who received care within this model, (ii) to identify the degree of maternal-newborn well-being and (iii) to explore the perception of this humanised attention during labour and delivery by both the professional staff (obstetricians and midwives) and consumers. A cross-sectional, descriptive study using both quantitative and qualitative methods was conducted with 508 women who delivered in two major hospitals within the National Health System in the metropolitan area of Santiago, Chile, from September 2010 until June 2011. The quantitative methods included a validated survey of maternal well-being and an adapted version of the American College of Nurse-Midwives (ACNM) standardised antepartum and intrapartum data set. The qualitative methods included six focus group discussions (FGDs) with midwives, obstetricians and consumers. Additionally, two in-depth interviews were carried out with the directors of the maternity units. The quantitative findings showed poor implementation of the guidelines: 92.7% of the women had medically induced labours (artificial rupture of the membranes, oxytocin, and epidural anaesthesia), and almost one-third of the women reported discontent with the care they received. The qualitative findings showed that the main complaint perceived by the midwives was that the health system was highly hierarchical and medicalised and that the obstetricians were not engaged in this modality of assistance. The women (consumers) highlighted that professionals (midwives and obstetricians) were highly technically skilled, and they felt confident in their assistance. However, women complained about receiving inadequate personal treatment from these professionals. The obstetricians showed no self-critique, stating that they always expressed concern for their patients and that they provided humanised professional assistance. By illuminating the main strengths and weaknesses with regard to the application of the model, these findings can help to inform strategies and actions to improve its implementation. © 2013 Elsevier Ltd. All rights reserved.
Integrated Modeling Approach for Optimal Management of Water, Energy and Food Security Nexus
Zhang, Xiaodong; Vesselinov, Velimir Valentinov
2016-12-28
We report that water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production/delivery may also produce environmental impacts; as a result, green-house-gas emission control will impact WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors, generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components including energy supply, electricity generation, water supply-demand, food production as well as mitigation of environmental impacts. WEFO is demonstrated to solve a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. Lastly, the obtained results demonstrate how these types of analyses can be helpful for decision-makers and stakeholders to make cost-effective decisions for optimal WEF management.
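A toy illustration of the kind of multi-sector cost minimization a nexus tool such as WEFO performs, written as a small linear program; all costs, demands, and coefficients below are hypothetical and are not taken from the WEFO model.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: [surface_water, ground_water, gas_power, solar_power, crop_area]
cost = np.array([0.05, 0.12, 60.0, 45.0, 300.0])   # $/unit (hypothetical)

A_ub = np.array([
    # water balance: crop and power-cooling use must not exceed water supplied
    [-1.0, -1.0, 0.02, 0.0, 5.0],
    # energy balance: pumping energy must not exceed generation
    [0.3, 0.6, -1.0, -1.0, 0.0],
    # emissions cap on gas generation
    [0.0, 0.0, 0.5, 0.0, 0.0],
    # food demand: crop_area >= 100  ->  -crop_area <= -100
    [0.0, 0.0, 0.0, 0.0, -1.0],
])
b_ub = np.array([0.0, 0.0, 400.0, -100.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 5, method="highs")
print("total cost:", round(res.fun, 1))
print("decision  :", np.round(res.x, 1))
```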
Global, quantitative and dynamic mapping of protein subcellular localization
Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg HH
2016-01-01
Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology. DOI: http://dx.doi.org/10.7554/eLife.16950.001 PMID:27278775
Hlusko, Leslea J; Schmitt, Christopher A; Monson, Tesla A; Brasil, Marianne F; Mahaney, Michael C
2016-08-16
Developmental genetics research on mice provides a relatively sound understanding of the genes necessary and sufficient to make mammalian teeth. However, mouse dentitions are highly derived compared with human dentitions, complicating the application of these insights to human biology. We used quantitative genetic analyses of data from living nonhuman primates and extensive osteological and paleontological collections to refine our assessment of dental phenotypes so that they better represent how the underlying genetic mechanisms actually influence anatomical variation. We identify ratios that better characterize the output of two dental genetic patterning mechanisms for primate dentitions. These two newly defined phenotypes are heritable with no measurable pleiotropic effects. When we consider how these two phenotypes vary across neontological and paleontological datasets, we find that the major Middle Miocene taxonomic shift in primate diversity is characterized by a shift in these two genetic outputs. Our results build on the mouse model by combining quantitative genetics and paleontology, and thereby elucidate how genetic mechanisms likely underlie major events in primate evolution.
An integrated biomechanical modeling approach to the ergonomic evaluation of drywall installation.
Yuan, Lu; Buchholz, Bryan; Punnett, Laura; Kriebel, David
2016-03-01
Three different methodologies: work sampling, computer simulation and biomechanical modeling, were integrated to study the physical demands of drywall installation. PATH (Posture, Activity, Tools, and Handling), a work-sampling based method, was used to quantify the percent of time that the drywall installers were conducting different activities with different body segment (trunk, arm, and leg) postures. Utilizing Monte-Carlo simulation to convert the categorical PATH data into continuous variables as inputs for the biomechanical models, the required muscle contraction forces and joint reaction forces at the low back (L4/L5) and shoulder (glenohumeral and sternoclavicular joints) were estimated for a typical eight-hour workday. To demonstrate the robustness of this modeling approach, a sensitivity analysis was conducted to examine the impact of some quantitative assumptions that have been made to facilitate the modeling approach. The results indicated that the modeling approach seemed to be the most sensitive to both the distribution of work cycles for a typical eight-hour workday and the distribution and values of Euler angles that are used to determine the "shoulder rhythm." Other assumptions including the distribution of trunk postures did not appear to have a significant impact on the model outputs. It was concluded that the integrated approach might provide an applicable examination of physical loads during the non-routine construction work, especially for those operations/tasks that have certain patterns/sequences for the workers to follow. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
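A minimal sketch of the Monte-Carlo step described above, assuming invented PATH category frequencies, angle ranges, and anthropometry; the low-back load proxy is a crude static moment, not the biomechanical models used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# PATH records trunk posture as categories; to drive a biomechanical model,
# each sampled category is mapped to a continuous flexion angle drawn from an
# assumed range. Frequencies and ranges below are illustrative only.
posture_freq = {"neutral": 0.55, "mild_flexion": 0.25,
                "severe_flexion": 0.15, "twisted": 0.05}
angle_range_deg = {"neutral": (0, 20), "mild_flexion": (20, 45),
                   "severe_flexion": (45, 90), "twisted": (10, 60)}

n_samples = 10_000   # posture "snapshots" over a simulated 8-h day
cats = rng.choice(list(posture_freq), size=n_samples, p=list(posture_freq.values()))
angles = np.array([rng.uniform(*angle_range_deg[c]) for c in cats])

# Crude static low-back load proxy: gravitational moment of the upper body
# about L4/L5 (hypothetical anthropometry).
upper_body_mass = 45.0        # kg
lever_arm_m = 0.25            # m, trunk centre of mass to L4/L5
moment = upper_body_mass * 9.81 * lever_arm_m * np.sin(np.radians(angles))

print(f"mean L4/L5 moment: {moment.mean():.0f} N*m, "
      f"95th percentile: {np.percentile(moment, 95):.0f} N*m")
```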
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) computational (in this case agent-based) Model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.
Hong, Weizhe; Kennedy, Ann; Burgos-Artizzu, Xavier P; Zelikowsky, Moriel; Navonne, Santiago G; Perona, Pietro; Anderson, David J
2015-09-22
A lack of automated, quantitative, and accurate assessment of social behaviors in mammalian animal models has limited progress toward understanding mechanisms underlying social interactions and their disorders such as autism. Here we present a new integrated hardware and software system that combines video tracking, depth sensing, and machine learning for automatic detection and quantification of social behaviors involving close and dynamic interactions between two mice of different coat colors in their home cage. We designed a hardware setup that integrates traditional video cameras with a depth camera, developed computer vision tools to extract the body "pose" of individual animals in a social context, and used a supervised learning algorithm to classify several well-described social behaviors. We validated the robustness of the automated classifiers in various experimental settings and used them to examine how genetic background, such as that of Black and Tan Brachyury (BTBR) mice (a previously reported autism model), influences social behavior. Our integrated approach allows for rapid, automated measurement of social behaviors across diverse experimental designs and also affords the ability to develop new, objective behavioral metrics.
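A hedged sketch of the supervised-classification step, assuming synthetic per-frame pose features and labels in place of real tracking output; the random-forest classifier stands in for whichever supervised learner the system actually uses.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Per-frame pose features for the two mice (e.g. nose-nose distance, speeds,
# relative orientation) and behavior labels; synthetic placeholders here.
n_frames = 5000
features = rng.normal(size=(n_frames, 6))
labels = rng.choice(["attack", "mounting", "close_investigation", "other"],
                    size=n_frames, p=[0.1, 0.1, 0.3, 0.5])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, features, labels, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```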
NASA Astrophysics Data System (ADS)
SATO, H.; Iwahana, G.; Ohta, T.
2013-12-01
The Siberian larch forest is the largest coniferous forest region in the world. In this vast region, larch often forms nearly pure stands regenerated by recurrent fire. The region is characterized by a short and dry growing season; the annual mean precipitation at Yakutsk is only about 240 mm. The underlying permafrost and the seasonal soil freeze-thaw cycle are thought to play important roles in maintaining the forest ecosystem under such low precipitation: (1) frozen ground inhibits percolation of soil water into deep soil layers, and (2) excess soil water at the end of the growing season can be carried over as ice until the next growing season, when larch trees can use the melt water. Supporting this explanation, the geographical distribution of the Siberian larch region coincides closely with the continuous and discontinuous permafrost zones. Recent observations and simulation studies suggest that the larch forest and the subsurface permafrost are co-dependent: permafrost maintains the larch forest by enhancing the water-use efficiency of trees, while the larch forest maintains the permafrost by intercepting solar radiation and limiting heat exchange between soil and atmosphere. Owing to this complexity and to the scarcity of available ecosystem data, current-generation Earth System Models diverge significantly in their predictions of the structure and key ecosystem functions of the Siberian larch forest under a changing climate. Such uncertainty in turn expands the uncertainty of climate predictions, because the Siberian larch forest plays a major role in the global carbon balance through its huge area and vast potential carbon pool in biomass and soil, and because changes in boreal forest albedo can have a considerable effect on Northern Hemisphere climate. In this study, we developed an integrated ecosystem model that treats the interactions between plant dynamics and freeze-thaw cycles. The integrated model contains the dynamic global vegetation model SEIB-DGVM, which simulates plant and carbon dynamics, and the one-dimensional land surface model NOAH 2.7.1, which simulates soil moisture (both liquid and frozen), soil temperature, snowpack depth and density, canopy water content, and the energy and water fluxes. The integrated model quantitatively reconstructs the post-fire development of forest structure (i.e., LAI and biomass) and of the organic soil layer, which dampens heat exchange between soil and atmosphere. With the post-fire development of LAI and the soil organic layer, the model also quantitatively reconstructs changes in the seasonal maximum active layer depth. The integrated model is then driven by the IPCC A1B scenario of rising atmospheric CO2 and by the climate changes during the twenty-first century resulting from that rise. The simulation suggests that the forecasted global warming would cause decay of the Siberian larch ecosystem, but that this response could be delayed for hundreds of years by the "memory effect" of the soil organic layer.
NASA Astrophysics Data System (ADS)
Love, Curtis Clinton
New hybrid educational programs are evolving to challenge traditional definitions of distance education. One such program is the Integrated Science (IS) program of The University of Alabama's Center for Communication and Educational Technology (CCET), which was developed to address concerns about scientific illiteracy in middle school education. IS relies on a multilayered use of communication technologies (primarily videotape and e-mail) for delivery of student instruction, as a delivery vehicle for curriculum materials, and as a feedback mechanism. The IS program serves to enhance classroom science instruction by providing professionally developed videotaped educational lectures and curriculum materials used by classroom science teachers. To date, such hybrid forms of distance education have seldom been examined. Using both qualitative and quantitative methodologies, this study examines 64 IS classrooms visited from October 1992 to April 1995 by researchers at the Institute for Communication Research at The University of Alabama. Detailed qualitative information was gathered from each classroom by student, teacher, and administrator interviews; focus groups; questionnaires; and recording observations of classroom activity. From the reports of the site visits, key components of the IS classroom experience thought to be predictors of the success of the program for individual classrooms are identified. Exemplars of both positive and negative components are provided in narrative form. A model is posited to describe the potential relationships between the various components and their impact on the overall success of the IS program in an individual classroom. Quantitative assessments were made of the 21 key variables identified in the qualitative data that appeared to enhance the likelihood of success for the IS program in an individual classroom. Accounting for 90% of the variance in the regression model, the factor with the greatest predictive potential for success of Integrated Science was "how effective the teacher was in using classroom management skills." The results suggest that despite extensive research and curriculum development, use of sophisticated communication technologies, high video production standards, and expertise of IS video instructors, ultimately the classroom teacher plays the most critical role in determining a class's success and in achieving the goals of the Integrated Science program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang Shuli; Yeh Chiatsung; Budd, William W.
2009-02-15
Sustainability indicators have been widely developed to monitor and assess sustainable development. They are expected to guide political decision-making based on their capability to represent states and trends of development. However, using indicators to assess the sustainability of urban strategies and policies has limitations: indicators neither reflect the systemic interactions among them nor indicate in what direction they should be developed. This paper uses a semi-quantitative systematic model tool (Sensitivity Model Tools, SM) to analyze the role of urban development in Taiwan's sustainability. The results indicate that the natural environment in urban areas is one of the most critical components and that urban economic production plays a highly active role in affecting Taiwan's sustainable development. The semi-quantitative simulation model integrates sustainability indicators and urban development policy to provide decision-makers with information about the impacts of their decisions on urban development. The system approach incorporated by this paper can be seen as a necessary, but not sufficient, condition for a sustainability assessment. The participatory process in which expert participants provide judgments on the relations between indicator variables is also discussed.
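A minimal sketch of the cross-impact logic behind a Vester-type sensitivity model, assuming an invented expert-scored impact matrix; row sums (active sums) and column sums (passive sums) suggest each variable's systemic role.

```python
import numpy as np

# Experts score how strongly each variable influences every other (0-3).
# The variables and matrix below are invented for illustration.
variables = ["urban economy", "natural environment", "population", "land use"]
impact = np.array([
    [0, 3, 2, 3],
    [1, 0, 1, 2],
    [2, 2, 0, 3],
    [1, 3, 1, 0],
])

active = impact.sum(axis=1)    # how strongly a variable drives the system
passive = impact.sum(axis=0)   # how strongly it is driven by the system
a_mean, p_mean = active.mean(), passive.mean()

for name, a, p in zip(variables, active, passive):
    if a > a_mean and p > p_mean:
        role = "critical"
    elif a > a_mean:
        role = "active (good policy lever)"
    elif p > p_mean:
        role = "reactive / indicator"
    else:
        role = "buffering"
    print(f"{name:20s} active={a} passive={p} -> {role}")
```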
XRF map identification problems based on a PDE electrodeposition model
NASA Astrophysics Data System (ADS)
Sgura, Ivonne; Bozzini, Benedetto
2017-04-01
In this paper we focus on the following map identification problem (MIP): given a morphochemical reaction-diffusion (RD) PDE system modeling an electrodeposition process, we look for a time $t^*$, belonging to the transient dynamics, and a set of parameters $\mathbf{p}$, such that the PDE solution for the morphology $h(x,y,t^*;\mathbf{p})$ and for the chemistry $\theta(x,y,t^*;\mathbf{p})$ approximates a given experimental map $M^*$. Towards this aim, we introduce a numerical algorithm using singular value decomposition (SVD) and the Frobenius norm to give a measure of the error distance between experimental maps for $h$ and $\theta$ and simulated solutions of the RD-PDE system on a fixed time integration interval. The technique proposed allows quantitative use of microspectroscopy images, such as XRF maps. Specifically, in this work we have modelled the morphology and manganese distributions of nanostructured components of innovative batteries and we have followed their changes resulting from ageing under operating conditions. The availability of quantitative information on the space-time evolution of active materials in terms of model parameters will allow dramatic improvements in knowledge-based optimization of battery fabrication and operation.
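A hedged sketch of the map-identification step, assuming random placeholder maps: each simulated map h(x, y, t_i) is compared with the experimental map via a Frobenius-norm error distance and the best-matching time index is returned; the paper additionally uses an SVD of the maps, which is only hinted at here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated morphology maps h(x, y, t_i) from the RD-PDE versus one
# experimental XRF map M; both are random placeholders here.
nx, ny, n_times = 64, 64, 50
sim_maps = rng.normal(size=(n_times, nx, ny)).cumsum(axis=0)  # fake transient
M_exp = sim_maps[30] + 0.1 * rng.normal(size=(nx, ny))        # fake "experiment"

errors = np.array([np.linalg.norm(h - M_exp, "fro") for h in sim_maps])
t_star = int(errors.argmin())
print("best-matching time index t* =", t_star)   # expected: 30

# The leading singular values summarise each map's dominant spatial modes and
# can be compared in the same spirit (illustration only).
s_exp = np.linalg.svd(M_exp, compute_uv=False)[:5]
s_fit = np.linalg.svd(sim_maps[t_star], compute_uv=False)[:5]
print("leading singular values (exp vs fit):", np.round(s_exp), np.round(s_fit))
```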
Tadeo, Irene; Piqueras, Marta; Montaner, David; Villamón, Eva; Berbegall, Ana P; Cañete, Adela; Navarro, Samuel; Noguera, Rosa
2014-02-01
Risk classification and treatment stratification for cancer patients is restricted by our incomplete picture of the complex and unknown interactions between the patient's organism and tumor tissues (transformed cells supported by tumor stroma). Moreover, all clinical factors and laboratory studies used to indicate treatment effectiveness and outcomes are by their nature a simplification of the biological system of cancer, and cannot yet incorporate all possible prognostic indicators. A multiparametric analysis on 184 tumor cylinders was performed. To highlight the benefit of integrating digitized medical imaging into this field, we present the results of computational studies carried out on quantitative measurements, taken from stromal and cancer cells and various extracellular matrix fibers interpenetrated by glycosaminoglycans, and eight current approaches to risk stratification systems in patients with primary and nonprimary neuroblastoma. New tumor tissue indicators from both fields, the cellular and the extracellular elements, emerge as reliable prognostic markers for risk stratification and could be used as molecular targets of specific therapies. The key to dealing with personalized therapy lies in the mathematical modeling. The use of bioinformatics in patient-tumor-microenvironment data management allows a predictive model in neuroblastoma.
Overview: early history of crop growth and photosynthesis modeling.
El-Sharkawy, Mabrouk A
2011-02-01
As in industrial and engineering systems, there is a need to quantitatively study and analyze the many constituents of complex natural biological systems as well as agro-ecosystems via research-based mechanistic modeling. This objective is normally addressed by developing mathematically built descriptions of multilevel biological processes to provide biologists a means to integrate quantitatively experimental research findings that might lead to a better understanding of the whole systems and their interactions with surrounding environments. Aided with the power of computational capacities associated with computer technology then available, pioneering cropping systems simulations took place in the second half of the 20th century by several research groups across continents. This overview summarizes that initial pioneering effort made to simulate plant growth and photosynthesis of crop canopies, focusing on the discovery of gaps that exist in the current scientific knowledge. Examples are given for those gaps where experimental research was needed to improve the validity and application of the constructed models, so that their benefit to mankind was enhanced. Such research necessitates close collaboration among experimentalists and model builders while adopting a multidisciplinary/inter-institutional approach. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lin, T.; Lin, Z.; Lim, S.
2017-12-01
We present an integrated modeling framework to simulate groundwater level change under the dramatic increase of hydraulic fracturing water use in the Bakken Shale oil production area. The framework combines an agent-based model (ABM) with the Fox Hills-Hell Creek (FH-HC) groundwater model. In developing the ABM, institution theory is used to model the regulation policies of the North Dakota State Water Commission, while evolutionary programming and cognitive maps are used to model the social structure that emerges from the behavior of competing individual water businesses. Evolutionary programming allows individuals to select an appropriate strategy when annually applying for potential water use permits, whereas cognitive maps endow agents with the ability and willingness to compete for more water sales. All agents have their own influence boundaries that inhibit their competitive behavior toward their neighbors but not toward non-neighbors. The decision-making process is constructed and parameterized with both quantitative and qualitative information, i.e., empirical water use data and knowledge gained from surveys with stakeholders. By linking institution theory, evolutionary programming, and cognitive maps, our approach captures more of the complexity of the real decision-making process. Furthermore, this approach is a new exploration for modeling the dynamics of Coupled Human and Natural Systems. After integrating the ABM with the FH-HC model, drought and limited water accessibility scenarios are simulated to predict future FH-HC groundwater level changes. The integrated ABM and FH-HC modeling framework can be used to support scientifically sound policies for water allocation and management.
A cognitive perspective on health systems integration: results of a Canadian Delphi study
2014-01-01
Background: Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs – that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. Methods: We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada’s National Symposium on Integrated Care. Respondents were asked to rate the framework’s clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskal-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. Results: In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. The re-named “Integration Mindsets Framework” consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas. Conclusions: The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives. PMID:24885659
Information-theoretic approach to interactive learning
NASA Astrophysics Data System (ADS)
Still, S.
2009-01-01
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer's world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process's causal organization in the presence of the learner's actions. A fundamental consequence of the proposed principle is that the learner's optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.
Li, Jing; Kim, Seongho; Shields, Anthony F; Douglas, Kirk A; McHugh, Christopher I; Lawhorn-Crews, Jawana M; Wu, Jianmei; Mangner, Thomas J; LoRusso, Patricia M
2016-11-01
FAU, a pyrimidine nucleotide analogue, is a prodrug bioactivated by intracellular thymidylate synthase to form FMAU, which is incorporated into DNA, causing cell death. This study presents a model-based approach to integrating dynamic positron emission tomography (PET) and conventional plasma pharmacokinetic studies to characterize the plasma and tissue pharmacokinetics of FAU and FMAU. Twelve cancer patients were enrolled into a phase 1 study, where conventional plasma pharmacokinetic evaluation of therapeutic FAU (50-1600 mg/m 2 ) and dynamic PET assessment of 18 F-FAU were performed. A parent-metabolite population pharmacokinetic model was developed to simultaneously fit PET-derived tissue data and conventional plasma pharmacokinetic data. The developed model enabled separation of PET-derived total tissue concentrations into the parent drug and metabolite components. The model provides quantitative, mechanistic insights into the bioactivation of FAU and retention of FMAU in normal and tumor tissues and has potential utility to predict tumor responsiveness to FAU treatment. © 2016, The American College of Clinical Pharmacology.
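A minimal sketch of a generic parent-metabolite kinetic scheme, assuming one compartment each and invented rate constants; it is not the population PK/PET model fitted in the study, but illustrates how bioactivation and retention can be simulated.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic parent-metabolite model: ka absorption, kmet parent->metabolite
# conversion (standing in for thymidylate-synthase bioactivation), ke_p/ke_m
# elimination. All rate constants and the dose are assumed values.
ka, kmet, ke_p, ke_m = 1.2, 0.4, 0.3, 0.05   # 1/h
dose = 100.0                                  # arbitrary units

def rhs(t, y):
    gut, parent, metab = y
    return [-ka * gut,
            ka * gut - (kmet + ke_p) * parent,
            kmet * parent - ke_m * metab]

sol = solve_ivp(rhs, (0, 24), [dose, 0.0, 0.0], dense_output=True, max_step=0.1)

t = np.linspace(0, 24, 200)
gut, parent, metab = sol.sol(t)
print(f"parent Cmax ~ {parent.max():.1f} at t = {t[parent.argmax()]:.1f} h; "
      f"metabolite at 24 h ~ {metab[-1]:.1f}")
```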
Multi-object segmentation framework using deformable models for medical imaging analysis.
Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel
2016-08-01
Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed framework has a wide range of applications especially in the presence of adjacent structures of interest or under intra-structure inhomogeneities giving excellent quantitative results.
NASA Astrophysics Data System (ADS)
Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine
2016-04-01
Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions, and for some models it is possible to integrate land-use and climatic change. Their major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thicknesses and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account; this is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based slope-stability model (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of the mechanical parameters is handled by a Monte Carlo approach. The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event, and the dispersion of the distribution gives the uncertainty of the result. Finally, a map is created displaying a probability of occurrence for each computing cell of the studied area. In order to take land-use change into account, a complementary module integrating the effects of vegetation on soil properties has recently been developed. In recent years, the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesian islands; (ii) at local scale (1:10,000) for two complex mountainous areas; and (iii) at site-specific scale (1:2,000) for one landslide. For each study the 3D geotechnical model has been adapted. These studies have made it possible (i) to discuss the different factors included in the model, especially the initial 3D geotechnical models; (ii) to pinpoint the locations of probable failures under different hydrological scenarios; and (iii) to test the effects of climatic change and land use on slopes for two cases. In this way, future changes in temperature, precipitation and vegetation cover can be analyzed, making it possible to address the impacts of global change on landslides. Finally, the results show that reliable information about future slope failures can be obtained at different working scales and for different scenarios with an integrated approach.
The final information about landslide susceptibility (i.e. probability of failure) can be integrated in landslide hazard assessment and could be an essential information source for future land planning. As it has been performed in the ANR Project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes a first step in the chain for risk assessment for different climate and economical development scenarios, to evaluate the resilience of mountainous areas.
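A hedged sketch of the probabilistic core of such an approach, assuming an infinite-slope factor-of-safety formula (a simplification of the Morgenstern-Price analysis used by ALICE®) and invented soil parameters; the Monte Carlo loop returns the probability of a safety factor below 1.

```python
import numpy as np

rng = np.random.default_rng(7)

# Sample uncertain soil parameters, compute an infinite-slope factor of
# safety for one hydrological scenario, and report P(FS < 1).
# All parameter values below are invented for illustration.
n = 20_000
cohesion = rng.normal(8e3, 2e3, n).clip(min=0)        # Pa
phi = np.radians(rng.normal(30, 4, n))                # friction angle
gamma_soil, gamma_w = 19e3, 9.81e3                    # unit weights, N/m^3
slope = np.radians(28.0)                              # slope angle
z = 2.0                                               # slip-surface depth, m
m = 0.8                                               # water-table ratio (scenario)

fs = (cohesion + (gamma_soil - m * gamma_w) * z
      * np.cos(slope) ** 2 * np.tan(phi)) / (gamma_soil * z
      * np.sin(slope) * np.cos(slope))

print(f"P(failure) = {(fs < 1.0).mean():.2%}, median FS = {np.median(fs):.2f}")
```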
Modeling a dielectric elastomer as driven by triboelectric nanogenerator
NASA Astrophysics Data System (ADS)
Chen, Xiangyu; Jiang, Tao; Wang, Zhong Lin
2017-01-01
By integrating a triboelectric nanogenerator (TENG) and a thin film dielectric elastomer actuator (DEA), the DEA can be directly powered and controlled by the output of the TENG, which demonstrates a self-powered actuation system toward various practical applications in the fields of electronic skin and soft robotics. This paper describes a method to construct a physical model for this integrated TENG-DEA system on the basis of nonequilibrium thermodynamics and electrostatics induction theory. The model can precisely simulate the influences from both the viscoelasticity and current leakage to the output performance of the TENG, which can help us to better understand the interaction between TENG and DEA devices. Accordingly, the established electric field, the deformation strain of the DEA, and the output current from the TENG are systemically analyzed by using this model. A comparison between real measurements and simulation results confirms that the proposed model can predict the dynamic response of the DEA driven by contact-electrification and can also quantitatively analyze the relaxation of the tribo-induced strain due to the leakage behavior. Hence, the proposed model in this work could serve as a guidance for optimizing the devices in the future studies.
Absolute quantitation of isoforms of post-translationally modified proteins in transgenic organism.
Li, Yaojun; Shu, Yiwei; Peng, Changchao; Zhu, Lin; Guo, Guangyu; Li, Ning
2012-08-01
Post-translational modification isoforms of a protein are known to play versatile biological functions in diverse cellular processes. To measure the molar amount of each post-translational modification isoform (P(isf)) of a target protein present in the total protein extract using mass spectrometry, a quantitative proteomic protocol, absolute quantitation of isoforms of post-translationally modified proteins (AQUIP), was developed. A recombinant ERF110 gene overexpression transgenic Arabidopsis plant was used as the model organism for demonstration of the proof of concept. Both Ser-62-independent (14)N-coded synthetic peptide standards and (15)N-coded ERF110 protein standard isolated from the heavy nitrogen-labeled transgenic plants were employed simultaneously to determine the concentration of all isoforms (T(isf)) of ERF110 in the whole plant cell lysate, whereas a pair of Ser-62-dependent synthetic peptide standards were used to quantitate the Ser-62 phosphosite occupancy (R(aqu)). The P(isf) was finally determined by integrating the two empirically measured variables using the following equation: P(isf) = T(isf) · R(aqu). The absolute amount of Ser-62-phosphorylated isoform of ERF110 determined using AQUIP was substantiated with a stable isotope labeling in Arabidopsis-based relative and accurate quantitative proteomic approach. The biological role of the Ser-62-phosphorylated isoform was demonstrated in transgenic plants.
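A trivial worked example of the AQUIP relation P_isf = T_isf · R_aqu, with invented numbers in place of the measured ERF110 values.

```python
# Both inputs below are assumed, not the paper's measurements.
T_isf = 1.8e-15   # mol of all ERF110 isoforms per microgram total protein
R_aqu = 0.35      # measured Ser-62 phosphosite occupancy

P_isf = T_isf * R_aqu
print(f"phosphorylated-isoform amount: {P_isf:.2e} mol per microgram total protein")
```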
A Quantitative ADME-based Tool for Exploring Human ...
Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically-defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “population,” or “professional” time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways. It
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christensen, Craig
Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least-cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.
Microstructure development in Kolmogorov, Johnson-Mehl, and Avrami nucleation and growth kinetics
NASA Astrophysics Data System (ADS)
Pineda, Eloi; Crespo, Daniel
1999-08-01
A statistical model with the ability to evaluate the microstructure developed in nucleation and growth kinetics is built in the framework of the Kolmogorov, Johnson-Mehl, and Avrami theory. A populational approach is used to compute the observed grain-size distribution. The impingement process which delays grain growth is analyzed, and the effective growth rate of each population is estimated considering the previous grain history. The proposed model is integrated for a wide range of nucleation and growth protocols, including constant nucleation, pre-existing nuclei, and intermittent nucleation with interface or diffusion-controlled grain growth. The results are compared with Monte Carlo simulations, giving quantitative agreement even in cases where previous models fail.
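A minimal sketch of the classical KJMA kinetics underlying the model, assuming constant nucleation rate and constant interface-controlled growth in 3D (illustrative rate values); the populational grain-size computation itself is not reproduced.

```python
import numpy as np

# Standard KJMA extended-volume result: X_ext = (pi/3) * I * G^3 * t^4 for
# constant nucleation rate I and constant growth rate G in three dimensions,
# with X = 1 - exp(-X_ext) correcting for impingement. I and G are assumed.
I = 1e-3      # nuclei per unit volume per unit time
G = 0.05      # interface velocity

t = np.linspace(0, 200, 400)
X_ext = (np.pi / 3.0) * I * G**3 * t**4
X = 1.0 - np.exp(-X_ext)

# Avrami exponent recovered from the kinetics (should be ~4 for this protocol)
mask = (X > 0.02) & (X < 0.98)
n_avrami = np.polyfit(np.log(t[mask]), np.log(-np.log(1 - X[mask])), 1)[0]
print(f"fitted Avrami exponent n = {n_avrami:.2f}")
```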
Features and functions of nonlinear spatial integration by retinal ganglion cells.
Gollisch, Tim
2013-11-01
Ganglion cells in the vertebrate retina integrate visual information over their receptive fields. They do so by pooling presynaptic excitatory inputs from typically many bipolar cells, which themselves collect inputs from several photoreceptors. In addition, inhibitory interactions mediated by horizontal cells and amacrine cells modulate the structure of the receptive field. In many models, this spatial integration is assumed to occur in a linear fashion. Yet, it has long been known that spatial integration by retinal ganglion cells also incurs nonlinear phenomena. Moreover, several recent examples have shown that nonlinear spatial integration is tightly connected to specific visual functions performed by different types of retinal ganglion cells. This work discusses these advances in understanding the role of nonlinear spatial integration and reviews recent efforts to quantitatively study the nature and mechanisms underlying spatial nonlinearities. These new insights point towards a critical role of nonlinearities within ganglion cell receptive fields for capturing responses of the cells to natural and behaviorally relevant visual stimuli. In the long run, nonlinear phenomena of spatial integration may also prove important for implementing the actual neural code of retinal neurons when designing visual prostheses for the eye. Copyright © 2012 Elsevier Ltd. All rights reserved.
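A toy illustration of why nonlinear spatial integration matters, assuming a bank of identical subunits driven by a contrast-reversing grating: a purely linear receptive field cancels, whereas pooling rectified subunit outputs yields the classic frequency-doubled response. All numbers are invented.

```python
import numpy as np

n_subunits = 10
t = np.linspace(0, 1, 200)                       # one second
grating_phase = np.sin(2 * np.pi * 2 * t)        # 2 Hz contrast reversal

# Half the subunits see the bright bar, half the dark bar (opposite sign).
subunit_drive = np.array([grating_phase if i < n_subunits // 2 else -grating_phase
                          for i in range(n_subunits)])

linear_rgc = subunit_drive.sum(axis=0)                      # cancels to ~0
nonlinear_rgc = np.maximum(subunit_drive, 0.0).sum(axis=0)  # rectify, then pool

print("peak linear response:   ", np.abs(linear_rgc).max().round(3))
print("peak nonlinear response:", nonlinear_rgc.max().round(3))
```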
Titz, Alexandra; Döll, Petra
2009-02-01
Widespread presence of human pharmaceuticals in water resources across the globe is documented. While some, but certainly not enough, research on the occurrence, fate and effect of pharmaceuticals in water resources has been carried out, a holistic risk management strategy is missing. The transdisciplinary research project "start" aimed to develop an integrative strategy by the participation of experts representing key actors in the problem field "pharmaceuticals in drinking water". In this paper, we describe a novel modelling method, actor modelling with the semi-quantitative software DANA (Dynamic Actor Network Analysis), and its application in support of identifying an integrative risk management strategy. Based on the individual perceptions of different actors, the approach allows the identification of optimal strategies. Actors' perceptions were elicited by participatory model building and interviews, and were then modelled in perception graphs. Actor modelling indicated that an integrative strategy that targets environmentally-responsible prescription, therapy, and disposal of pharmaceuticals on one hand, and the development of environmentally-friendly pharmaceuticals on the other hand, will likely be most effective for reducing the occurrence of pharmaceuticals in drinking water (at least in Germany where the study was performed). However, unlike most other actors, the pharmaceutical industry itself does not perceive that the production of environmentally-friendly pharmaceuticals is an action that helps to achieve its goals, but contends that continued development of highly active pharmaceutical ingredients will help to reduce the occurrence of pharmaceuticals in the water cycle. Investment in advanced waste or drinking water treatment is opposed by both the wastewater treatment company and the drinking water supplier, and is not mentioned as appropriate by the other actors. According to our experience, actor modelling is a useful method to suggest effective and realisable integrative risk management strategies in complex problem fields that involve many societal actors.
A new organismal systems biology: how animals walk the tight rope between stability and change.
Padilla, Dianna K; Tsukimura, Brian
2014-07-01
The amount of knowledge in the biological sciences is growing at an exponential rate. Simultaneously, the incorporation of new technologies in gathering scientific information has greatly accelerated our capacity to ask, and answer, new questions. How do we, as organismal biologists, meet these challenges, and develop research strategies that will allow us to address the grand challenge question: how do organisms walk the tightrope between stability and change? Organisms and organismal systems are complex, and multi-scale in both space and time. It is clear that addressing major questions about organismal biology will not come from "business as usual" approaches. Rather, we require the collaboration of a wide range of experts and integration of biological information with more quantitative approaches traditionally found in engineering and applied mathematics. Research programs designed to address grand challenge questions will require deep knowledge and expertise within subfields of organismal biology, collaboration and integration among otherwise disparate areas of research, and consideration of organisms as integrated systems. Our ability to predict which features of complex integrated systems provide the capacity to be robust in changing environments is poorly developed. A predictive organismal biology is needed, but will require more quantitative approaches than are typical in biology, including complex systems-modeling approaches common to engineering. This new organismal systems biology will have reciprocal benefits for biologists, engineers, and mathematicians who address similar questions, including those working on control theory and dynamical systems biology, and will develop the tools we need to address the grand challenge questions of the 21st century. © The Author 2014. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
Quantifying patterns of research interest evolution
NASA Astrophysics Data System (ADS)
Jia, Tao; Wang, Dashun; Szymanski, Boleslaw
Changing and shifting research interest is an integral part of a scientific career. Despite extensive investigations of various factors that influence a scientist's choice of research topics, quantitative assessments of mechanisms that give rise to macroscopic patterns characterizing research interest evolution of individual scientists remain limited. Here we perform a large-scale analysis of extensive publication records, finding that research interest change follows a reproducible pattern characterized by an exponential distribution. We identify three fundamental features responsible for the observed exponential distribution, which arise from a subtle interplay between exploitation and exploration in research interest evolution. We develop a random walk based model, which adequately reproduces our empirical observations. Our study presents one of the first quantitative analyses of macroscopic patterns governing research interest change, documenting a high degree of regularity underlying scientific research and individual careers.
NASA Technical Reports Server (NTRS)
Sibonga, J. D.; Truskowski, P.
2010-01-01
This slide presentation reviews the concern that astronauts on long-duration flights might face a greater risk of bone fracture as they age than the general population. A panel of experts was convened to review the information and recommend mechanisms to monitor the health of bones in astronauts. The use of Quantitative Computed Tomography (QCT) scans for risk surveillance, to detect the clinical trigger and to inform countermeasure evaluation, is reviewed. An added benefit of QCT is that it facilitates an individualized estimation of bone strength by Finite Element Modeling (FEM), which can inform approaches for bone rehabilitation. FEM is reviewed as a process that integrates multiple factors into a single composite estimate of bone strength.
NASA Astrophysics Data System (ADS)
Emori, Seita; Takahashi, Kiyoshi; Yamagata, Yoshiki; Oki, Taikan; Mori, Shunsuke; Fujigaki, Yuko
2013-04-01
With the aim of proposing strategies for global climate risk management, we have launched a five-year research project called ICA-RUS (Integrated Climate Assessment - Risks, Uncertainties and Society). In this project, with the phrase "risk management" in its title, we aim for a comprehensive assessment of climate change risks, explicit consideration of uncertainties, utilization of the best available information, and consideration of all possible conditions and options. We also regard the problem as one of decision-making at the human level, one that involves social value judgments and adapts to future changes in circumstances. The ICA-RUS project consists of the following five themes: 1) Synthesis of global climate risk management strategies, 2) Optimization of land, water and ecosystem uses for climate risk management, 3) Identification and analysis of critical climate risks, 4) Evaluation of climate risk management options under technological, social and economic uncertainties and 5) Interactions between scientific and social rationalities in climate risk management (see also: http://www.nies.go.jp/ica-rus/en/). For the integration of quantitative knowledge of climate change risks and responses, we apply a tool named AIM/Impact [Policy], which consists of an energy-economic model, a simplified climate model and impact projection modules. At the same time, in order to make use of qualitative knowledge as well, we hold monthly project meetings to discuss risk management strategies and publish annual reports based on the quantitative and qualitative information. To enhance the comprehensiveness of the analyses, we maintain an inventory of risks and risk management options. The inventory is revised iteratively through interactive meetings with stakeholders such as policymakers, government officials and industrial representatives.
Gloaguen, Pauline; Bournais, Sylvain; Alban, Claude; Ravanel, Stéphane; Seigneurin-Berny, Daphné; Matringe, Michel; Tardif, Marianne; Kuntz, Marcel; Ferro, Myriam; Bruley, Christophe; Rolland, Norbert; Vandenbrouck, Yves; Curien, Gilles
2017-06-01
Higher plants, as autotrophic organisms, are effective sources of molecules. They hold great promise for metabolic engineering, but the behavior of plant metabolism at the network level is still incompletely described. Although structural models (stoichiometry matrices) and pathway databases are extremely useful, they cannot describe the complexity of the metabolic context, and new tools are required to visually represent integrated biocurated knowledge for use by both humans and computers. Here, we describe ChloroKB, a Web application (http://chlorokb.fr/) for visual exploration and analysis of the Arabidopsis (Arabidopsis thaliana) metabolic network in the chloroplast and related cellular pathways. The network was manually reconstructed through extensive biocuration to provide transparent traceability of experimental data. Proteins and metabolites were placed in their biological context (spatial distribution within cells, connectivity in the network, participation in supramolecular complexes, and regulatory interactions) using CellDesigner software. The network contains 1,147 reviewed proteins (559 localized exclusively in plastids, 68 in at least one additional compartment, and 520 outside the plastid), 122 proteins awaiting biochemical/genetic characterization, and 228 proteins for which genes have not yet been identified. The visual presentation is intuitive and browsing is fluid, providing instant access to the graphical representation of integrated processes and to a wealth of refined qualitative and quantitative data. ChloroKB will be a significant support for structural and quantitative kinetic modeling, for biological reasoning when comparing novel data with established knowledge, for computer analyses, and for educational purposes. ChloroKB will be enhanced by continuous updates following contributions from plant researchers. © 2017 American Society of Plant Biologists. All Rights Reserved.
Comparison of two weighted integration models for the cueing task: linear and likelihood
NASA Technical Reports Server (NTRS)
Shimozaki, Steven S.; Eckstein, Miguel P.; Abbey, Craig K.
2003-01-01
In a task in which the observer must detect a signal at one of two locations, presenting a precue that predicts the location of the signal leads to improved performance with a valid cue (signal location matches the cue) compared to an invalid cue (signal location does not match the cue). The cue validity effect has often been explained by a limited-capacity attentional mechanism improving the perceptual quality at the cued location. Alternatively, the cueing effect can also be explained by unlimited-capacity models that assume a weighted combination of noisy responses across the two locations. We compare two weighted integration models: a linear model and a sum-of-weighted-likelihoods model based on a Bayesian observer. While qualitatively these models are similar, quantitatively they predict different cue validity effects as the signal-to-noise ratio (SNR) increases. To test these models, three observers performed a cued discrimination task of Gaussian targets with an 80% valid precue across a broad range of SNRs. A limited-capacity attentional-switching model was also analyzed and rejected. The sum-of-weighted-likelihoods model best described the psychophysical results, suggesting that human observers approximate a weighted combination of likelihoods, and not a weighted linear combination.
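The contrast between the two decision rules can be illustrated with a small simulation. The sketch below (assumed signal strengths, weights, and trial counts; not the authors' fitting procedure) simulates a two-location choice task with an 80%-valid precue. In the weighted-likelihood rule the fixed log-prior term is increasingly dominated by the signal-driven term as SNR grows, so its cue validity effect shrinks at high SNR, whereas the linear rule's fixed weights keep penalizing invalid trials.

import numpy as np

rng = np.random.default_rng(0)
n_trials, validity = 100_000, 0.8

def run(d, w_cued=0.7):
    # Signal appears at the cued location (index 0) on 80% of trials.
    signal_loc = (rng.random(n_trials) > validity).astype(int)
    x = rng.normal(0.0, 1.0, size=(n_trials, 2))
    x[np.arange(n_trials), signal_loc] += d

    # Linear rule: fixed weights, larger weight at the cued location.
    lin_choice = np.argmax(x * np.array([w_cued, 1.0 - w_cued]), axis=1)

    # Weighted-likelihood rule: argmax of log prior + d * response
    # (log likelihood of "signal at i" is d*x_i - d**2/2, up to a constant).
    log_prior = np.log(np.array([validity, 1.0 - validity]))
    lik_choice = np.argmax(log_prior + d * x, axis=1)

    for name, choice in (("linear", lin_choice), ("likelihood", lik_choice)):
        valid = signal_loc == 0
        print(f"d={d:.1f} {name:10s} valid acc={np.mean(choice[valid] == 0):.3f} "
              f"invalid acc={np.mean(choice[~valid] == 1):.3f}")

for d in (0.5, 1.5, 3.0):
    run(d)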
Optimal cue integration in ants.
Wystrach, Antoine; Mangan, Michael; Webb, Barbara
2015-10-07
In situations with redundant or competing sensory information, humans have been shown to perform cue integration, weighting different cues according to their certainty in a quantifiably optimal manner. Ants have been shown to merge the directional information available from their path integration (PI) and visual memory, but as yet it is not clear that they do so in a way that reflects the relative certainty of the cues. In this study, we manipulate the variance of the PI home vector by allowing ants (Cataglyphis velox) to run different distances and testing their directional choice when the PI vector direction is put in competition with visual memory. Ants show progressively stronger weighting of their PI direction as PI length increases. The weighting is quantitatively predicted by modelling the expected directional variance of home vectors of different lengths and assuming optimal cue integration. However, a subsequent experiment suggests ants may not actually compute an internal estimate of the PI certainty, but are using the PI home vector length as a proxy. © 2015 The Author(s).
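Reliability-weighted combination of this kind is straightforward to sketch. The snippet below uses a small-angle Gaussian approximation with made-up directional variances (a proper treatment of circular data would use a vector-sum or von Mises formulation): each cue is weighted by its inverse variance, so as the PI vector lengthens and its directional variance falls, the combined heading shifts toward the PI direction.

def combine(theta_pi, var_pi, theta_vis, var_vis):
    """Inverse-variance weighted average of two directional estimates (radians)."""
    w_pi, w_vis = 1.0 / var_pi, 1.0 / var_vis
    theta = (w_pi * theta_pi + w_vis * theta_vis) / (w_pi + w_vis)
    return theta, 1.0 / (w_pi + w_vis)

theta_vis, var_vis = 0.0, 0.05           # visual memory points at 0 rad
for run_length, var_pi in [(1.0, 0.40), (4.0, 0.10), (8.0, 0.05)]:
    theta, var = combine(theta_pi=0.6, var_pi=var_pi,
                         theta_vis=theta_vis, var_vis=var_vis)
    print(f"PI run of ~{run_length:3.0f} m: combined heading = {theta:.2f} rad "
          f"(variance {var:.3f})")

The variances here are illustrative only; the abstract's point is that longer PI vectors carry lower directional variance and therefore earn higher weight, whether the ants compute that variance explicitly or use vector length as a proxy.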
Chromosomal integration of adenoviral vector DNA in vivo.
Stephen, Sam Laurel; Montini, Eugenio; Sivanandam, Vijayshankar Ganesh; Al-Dhalimy, Muhseen; Kestler, Hans A; Finegold, Milton; Grompe, Markus; Kochanek, Stefan
2010-10-01
So far there has been no report of any clinical or preclinical evidence for chromosomal vector integration following adenovirus (Ad) vector-mediated gene transfer in vivo. We used liver gene transfer with high-capacity Ad vectors in the FAH(Δexon5) mouse model to analyze homologous and heterologous recombination events between vector and chromosomal DNA. Intravenous injection of Ad vectors either expressing a fumarylacetoacetate hydrolase (FAH) cDNA or carrying part of the FAH genomic locus resulted in liver nodules of FAH-expressing hepatocytes, demonstrating chromosomal vector integration. Analysis of junctions between vector and chromosomal DNA following heterologous recombination indicated integration of the vector genome through its termini. Heterologous recombination occurred with a median frequency of 6.72 × 10^-5 per transduced hepatocyte, while homologous recombination occurred more rarely, with a median frequency of 3.88 × 10^-7. This study has established quantitative and qualitative data on recombination of adenoviral vector DNA with genomic DNA in vivo, contributing to a risk-benefit assessment of the biosafety of Ad vector-mediated gene transfer.
Students' Perception of Technology Use in Nursing Education.
Williamson, Kathleen M; Muckle, Janelle
2018-02-01
Technology is an integral part of a nurse's practice; therefore, it must be integrated into the nursing curriculum. Nursing schools are shifting paradigms by integrating technology into the teaching environment to foster active and meaningful learning experiences. Factors related to external influences on individual beliefs, attitudes, and intention to use need to be studied so that nurse educators can support the integration of technology into pedagogy. The Technology Acceptance Model was used to evaluate perceptions of the usefulness and ease of use of technology among students matriculated in a baccalaureate-level nursing program. Quantitative and qualitative data were collected to uncover how nursing students (N = 375) perceived the usefulness and ease of use of technology while in nursing school. Almost every student (99.7%) owned a smartphone, and 95% were reasonably comfortable using various technologies. Selecting and incorporating technological tools that successfully support learning is essential to overcoming challenges and supporting the innovative delivery of content and students' use of technology.
Lee, Christina; Rowlands, Ingrid J
2015-02-01
To discuss an example of mixed methods in health psychology, involving separate quantitative and qualitative studies of women's mental health in relation to miscarriage, in which the two methods produced different but complementary results, and to consider ways in which the findings can be integrated. We describe two quantitative projects involving statistical analysis of data from 998 young women who had had miscarriages, and 8,083 who had not, across three waves of the Australian Longitudinal Study on Women's Health. We also describe a qualitative project involving thematic analysis of interviews with nine Australian women who had had miscarriages. The quantitative analyses indicate that the main differences between young women who do and do not experience miscarriage relate to social disadvantage (and thus likelihood of relatively early pregnancy) and to a lifestyle that makes pregnancy likely: once these factors are accounted for, there are no differences in mental health. Further, longitudinal modelling demonstrates that women who have had miscarriages show a gradual increase in mental health over time, with the exception of women with prior diagnoses of anxiety, depression, or both. By contrast, qualitative analysis of the interviews indicates that women who have had miscarriages experience deep emotional responses and a long and difficult process of coming to terms with their loss. A contextual model of resilience provides a possible framework for understanding these apparently disparate results. Considering positive mental health as including the ability to deal constructively with negative life events, and consequent emotional distress, offers a model that distinguishes between poor mental health and the processes of coping with major life events. In the context of miscarriage, women's efforts to struggle with difficult emotions, and search for meaning, can be viewed as pathways to resilience rather than to psychological distress. Statement of contribution: What is already known on this subject? Quantitative research shows that women who miscarry usually experience moderate depression and anxiety, which persists for around 6 months. Qualitative research shows that women who miscarry frequently experience deep grief, which can last for years. What does this study add? We consider ways in which these disparate findings might triangulate. The results suggest a need to distinguish between poor mental health and the experience of loss and grief. Adjusting to miscarriage is often emotionally challenging but not always associated with poor mental health. © 2014 The British Psychological Society.