Sample records for existing models quantitatively

  1. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  2. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Teaching Note--"By Any Means Necessary!" Infusing Socioeconomic Justice Content into Quantitative Research Course Work

    ERIC Educational Resources Information Center

    Slayter, Elspeth M.

    2017-01-01

    Existing research suggests a majority of faculty include social justice content in research courses but not through the use of existing quantitative data for in-class activities that foster mastery of data analysis and interpretation and curiosity about social justice-related topics. By modeling data-driven dialogue and the deconstruction of…

  4. A generalised individual-based algorithm for modelling the evolution of quantitative herbicide resistance in arable weed populations.

    PubMed

    Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul

    2017-02-01

    Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters were impactful on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.

  5. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. systems engineering and computing science degree programs, of gaps that exist in the literature, and of a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
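
    As an illustration of the Monte Carlo approach this record describes for predicting customer satisfaction index scores, a minimal sketch follows. The driver names, weights, and distributions are illustrative assumptions, not the dissertation's actual model or the ACSI formula.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Assumed satisfaction drivers on a 0-100 scale, each with its own uncertainty.
      expectations = rng.normal(78, 4, n)
      perceived_quality = rng.normal(82, 3, n)
      perceived_value = rng.normal(75, 5, n)

      # Illustrative weights for combining drivers into an ACSI-style index.
      index = 0.2 * expectations + 0.5 * perceived_quality + 0.3 * perceived_value

      print(f"baseline (mean) index: {index.mean():.1f}")
      print(f"90% prediction interval: {np.percentile(index, [5, 95]).round(1)}")

      # Crude sensitivity analysis: rank drivers by correlation with the output index.
      for name, x in [("expectations", expectations),
                      ("quality", perceived_quality),
                      ("value", perceived_value)]:
          print(name, round(np.corrcoef(x, index)[0, 1], 2))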

  6. A Transformative Model for Undergraduate Quantitative Biology Education

    ERIC Educational Resources Information Center

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…

  7. Current issues with standards in the measurement and documentation of human skeletal anatomy.

    PubMed

    Magee, Justin; McClelland, Brian; Winder, John

    2012-09-01

    Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18-65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. © 2012 The Authors. Journal of Anatomy © 2012 Anatomical Society.

  8. Current issues with standards in the measurement and documentation of human skeletal anatomy

    PubMed Central

    Magee, Justin; McClelland, Brian; Winder, John

    2012-01-01

    Digital modeling of human anatomy has become increasingly important and relies on well-documented quantitative anatomy literature. This type of documentation is common for the spine and pelvis; however, significant issues exist due to the lack of standardization in measurement and technique. Existing literature on quantitative anatomy for the spine and pelvis of white adults (aged 18–65 years, separated into decadal categories) was reviewed from the disciplines of anatomy, manipulative therapy, anthropometrics, occupational ergonomics, biomechanics and forensic science. The data were unified into a single normative model of the sub-axial spine. Two-dimensional orthographic drawings were produced from the 590 individual measurements identified, which informed the development of a 3D digital model. A similar review of full range of motion data was conducted as a meta-analysis and the results were applied to the existing model, providing an inter-connected, articulated digital spine. During these data analysis processes several inconsistencies were observed accompanied by an evidential lack of standardization with measurement and recording of data. These have been categorized as: anatomical terminology; scaling of measurements; measurement methodology, dimension and anatomical reference positions; global coordinate systems. There is inconsistency in anatomical terminology where independent researchers use the same terms to describe different aspects of anatomy or different terms for the same anatomy. Published standards exist for measurement methods of the human body regarding spatial interaction, anthropometric databases, automotive applications, clothing industries and for computer manikins, but none exists for skeletal anatomy. Presentation of measurements often lacks formal structure in clinical publications, seldom providing geometric reference points, therefore making digital reconstruction difficult. Published quantitative data does not follow existing international published standards relating to engineering drawing and visual communication. Large variations are also evident in standards or guidelines used for global coordinate systems across biomechanics, ergonomics, software systems and 3D software applications. This paper identifies where established good practice exists and suggests additional recommendations, informing an improved communication protocol, to assist reconstruction of skeletal anatomy using 3D digital modeling. PMID:22747678

  9. A Way Forward Commentary

    EPA Science Inventory

    Models for predicting adverse outcomes can help reduce and focus animal testing with new and existing chemicals. This short "thought starter" describes how quantitative structure-activity relationship and systems biology models can be used to help define toxicity pathways and li...

  10. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

    Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  11. A cost-effectiveness comparison of existing and Landsat-aided snow water content estimation systems

    NASA Technical Reports Server (NTRS)

    Sharp, J. M.; Thomas, R. W.

    1975-01-01

    This study describes how Landsat imagery can be cost-effectively employed to augment an operational hydrologic model. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model presently used by the California Department of Water Resources. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the Landsat-aided approach.
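
    The stratified double sampling scheme mentioned in this record is not spelled out in the abstract; the sketch below shows only the generic two-phase (double sampling) regression estimator that such designs build on, with an inexpensive satellite-derived variable observed on a large sample and ground measurements on a small subsample. All numbers are synthetic.

      import numpy as np

      rng = np.random.default_rng(1)

      # Phase 1: large sample of an inexpensive auxiliary variable
      # (e.g., a satellite-derived snow-cover index).
      n1 = 500
      x1 = rng.normal(50.0, 10.0, n1)

      # Phase 2: small subsample with the expensive ground measurement y
      # (e.g., snow water content from snow courses).
      n2 = 40
      idx = rng.choice(n1, n2, replace=False)
      x2 = x1[idx]
      y2 = 0.8 * x2 + rng.normal(0.0, 3.0, n2)   # synthetic ground-truth relation

      # Double-sampling (regression) estimator of the mean of y.
      slope = np.polyfit(x2, y2, 1)[0]
      y_mean_hat = y2.mean() + slope * (x1.mean() - x2.mean())
      print(round(y_mean_hat, 2))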

  12. Lack of quantitative training among early-career ecologists: a survey of the problem and potential solutions

    PubMed Central

    Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée

    2014-01-01

    Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862

  13. The Structure of Psychopathology: Toward an Expanded Quantitative Empirical Model

    PubMed Central

    Wright, Aidan G.C.; Krueger, Robert F.; Hobbs, Megan J.; Markon, Kristian E.; Eaton, Nicholas R.; Slade, Tim

    2013-01-01

    There has been substantial recent interest in the development of a quantitative, empirically based model of psychopathology. However, the majority of pertinent research has focused on analyses of diagnoses, as described in current official nosologies. This is a significant limitation because existing diagnostic categories are often heterogeneous. In the current research, we aimed to redress this limitation of the existing literature, and to directly compare the fit of categorical, continuous, and hybrid (i.e., combined categorical and continuous) models of syndromes derived from indicators more fine-grained than diagnoses. We analyzed data from a large representative epidemiologic sample (the 2007 Australian National Survey of Mental Health and Wellbeing; N = 8,841). Continuous models provided the best fit for each syndrome we observed (Distress, Obsessive Compulsivity, Fear, Alcohol Problems, Drug Problems, and Psychotic Experiences). In addition, the best fitting higher-order model of these syndromes grouped them into three broad spectra: Internalizing, Externalizing, and Psychotic Experiences. We discuss these results in terms of future efforts to refine the emerging empirically based, dimensional-spectrum model of psychopathology, and to use the model to frame psychopathology research more broadly. PMID:23067258

  14. Technical manual for basic version of the Markov chain nest productivity model (MCnest)

    EPA Science Inventory

    The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...

  15. User’s manual for basic version of MCnest Markov chain nest productivity model

    EPA Science Inventory

    The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...

  16. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
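
    For reference, the statistic that model-based linkage programs such as LODLINK maximize is the standard two-point LOD score, which compares the likelihood of the marker-trait data at a recombination fraction theta with the likelihood under free recombination. This is standard background, not a detail taken from the record above:

      \mathrm{LOD}(\theta) \;=\; \log_{10}\frac{L(\theta)}{L(\theta = 1/2)}, \qquad 0 \le \theta \le \tfrac{1}{2}

    with linkage conventionally regarded as supported when the maximized LOD exceeds about 3.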

  17. A Review of Energy Models with Particular Reference to Employment and Manpower Analysis.

    ERIC Educational Resources Information Center

    Eckstein, Albert J.; Heien, Dale M.

    To analyze the application of quantitative models to energy-employment issues, the energy problem was viewed in three distinct, but related, phases: the post-embargo shock effects, the intermediate-term process of adjustment, and the long-run equilibrium. Against this background eighteen existing energy models (government supported as well as…

  18. Military and Veteran Student Achievement in Postsecondary Education: A Structural Equation Model Using the Community College Survey of Men (CCSM)

    ERIC Educational Resources Information Center

    De LaGarza, Thomas R.; Manuel, Marcus A.; Wood, J. Luke; Harris, Frank, III

    2016-01-01

    Few quantitative studies exist on veteran success in postsecondary education, and existing qualitative research has also not accurately identified factors related to veteran achievement or pathways to success in postsecondary education. In this article, the Community College Survey of Men (CCSM) evaluates predictors of student success for…

  19. Quantitative interpretation of Great Lakes remote sensing data

    NASA Technical Reports Server (NTRS)

    Shook, D. F.; Salzman, J.; Svehla, R. A.; Gedney, R. T.

    1980-01-01

    The paper discusses the quantitative interpretation of Great Lakes remote sensing water quality data. Remote sensing using color information must take into account (1) the existence of many different organic and inorganic species throughout the Great Lakes, (2) the occurrence of a mixture of species in most locations, and (3) spatial variations in types and concentration of species. The radiative transfer model provides a potential method for an orderly analysis of remote sensing data and a physical basis for developing quantitative algorithms. Predictions and field measurements of volume reflectances are presented which show the advantage of using a radiative transfer model. Spectral absorptance and backscattering coefficients for two inorganic sediments are reported.

  20. Genetic and Environmental Influences on Behavior: Capturing All the Interplay

    ERIC Educational Resources Information Center

    Johnson, Wendy

    2007-01-01

    Basic quantitative genetic models of human behavioral variation have made clear that individual differences in behavior cannot be understood without acknowledging the importance of genetic influences. Yet these basic models estimate average, population-level genetic and environmental influences, obscuring differences that might exist within the…

  1. Perceptions and receptivity of non-spousal family support: A mixed methods study of psychological distress among older, church-going African American men

    PubMed Central

    Watkins, Daphne C.; Wharton, Tracy; Mitchell, Jamie A.; Matusko, Niki; Kales, Helen

    2016-01-01

    The purpose of this study was to explore the role of non-spousal family support on mental health among older, church-going African American men. The mixed methods objective was to employ a design that used existing qualitative and quantitative data to explore the interpretive context within which social and cultural experiences occur. Qualitative data (n=21) were used to build a conceptual model that was tested using quantitative data (n= 401). Confirmatory factor analysis indicated an inverse association between non-spousal family support and distress. The comparative fit index, Tucker-Lewis fit index, and root mean square error of approximation indicated good model fit. This study offers unique methodological approaches to using existing, complementary data sources to understand the health of African American men. PMID:28943829

  2. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  3. A comparison of operational and LANDSAT-aided snow water content estimation systems. [Feather River Basin, California

    NASA Technical Reports Server (NTRS)

    Sharp, J. M.; Thomas, R. W.

    1975-01-01

    How LANDSAT imagery can be cost effectively employed to augment an operational hydrologic model is described. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the LANDSAT-aided approach.

  4. Logistic regression models of factors influencing the location of bioenergy and biofuels plants

    Treesearch

    T.M. Young; R.L. Zaretzki; J.H. Perdue; F.M. Guess; X. Liu

    2011-01-01

    Logistic regression models were developed to identify significant factors that influence the location of existing wood-using bioenergy/biofuels plants and traditional wood-using facilities. Logistic models provided quantitative insight for variables influencing the location of woody biomass-using facilities. Availability of "thinnings to a basal area of 31.7m2/ha...

  5. A cascading failure model for analyzing railway accident causation

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing the railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model can describe the cascading process of the railway accident more accurately than the previous models, and can quantitatively analyze the sensitivities and the influence of the causes. In conclusion, this model can assist us to reveal the latent rules of accident causation to reduce the occurrence of railway accidents.
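
    The record gives no equations, so the sketch below is only a generic cascading-failure simulation on a weighted causal graph, illustrating the kind of load-redistribution rule described: when a cause node fails, its load is passed to downstream causes in proportion to the causal edge weights, and any node pushed past its capacity fails in turn. The graph, loads, and capacities are invented for illustration.

      def cascade(graph, load, capacity, seed):
          """graph[u] = {v: weight}; a failed node's load is redistributed to its
          still-working successors in proportion to the causal edge weights."""
          failed, frontier = set(), [seed]
          while frontier:
              u = frontier.pop()
              if u in failed:
                  continue
              failed.add(u)
              succ = {v: w for v, w in graph.get(u, {}).items() if v not in failed}
              total = sum(succ.values())
              if total == 0:
                  continue
              for v, w in succ.items():
                  load[v] += load[u] * w / total      # proportional redistribution
                  if load[v] > capacity[v]:           # overloaded causes fail in turn
                      frontier.append(v)
          return failed

      graph = {"signal fault": {"driver error": 0.7, "dispatch delay": 0.3},
               "driver error": {"collision": 1.0},
               "dispatch delay": {"collision": 1.0}}
      load = {"signal fault": 1.0, "driver error": 0.5, "dispatch delay": 0.4, "collision": 0.2}
      capacity = {"signal fault": 0.8, "driver error": 0.9, "dispatch delay": 1.5, "collision": 0.6}
      print(cascade(graph, load, capacity, "signal fault"))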

  6. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  7. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

    Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially after it was discovered that these models are consequently highly complex, requiring not only a large number of parameters, not all of which can be easily (or at all) measured and/or identified, and which are often associated with large uncertainties, but also requiring from their users deep knowledge of all or most of the implemented physical, mechanical, chemical and biological processes. The real, or perceived, complexity of these models then discourages users from using them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  8. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing as it offers numerous advantages in terms of economical saving and minimal management effort. Although elasticity and flexibility bring tremendous benefits, they still raise many information security issues due to the unique characteristic of cloud computing that allows ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses all the characteristics of cloud computing, which the existing models do not.

  9. A review of quantitative structure-property relationships for the fate of ionizable organic chemicals in water matrices and identification of knowledge gaps.

    PubMed

    Nolte, Tom M; Ragas, Ad M J

    2017-03-22

    Many organic chemicals are ionizable by nature. After use and release into the environment, various fate processes determine their concentrations, and hence exposure to aquatic organisms. In the absence of suitable data, such fate processes can be estimated using Quantitative Structure-Property Relationships (QSPRs). In this review we compiled available QSPRs from the open literature and assessed their applicability towards ionizable organic chemicals. Using quantitative and qualitative criteria we selected the 'best' QSPRs for sorption, (a)biotic degradation, and bioconcentration. The results indicate that many suitable QSPRs exist, but some critical knowledge gaps remain. Specifically, future focus should be directed towards the development of QSPR models for biodegradation in wastewater and sediment systems, direct photolysis and reaction with singlet oxygen, as well as additional reactive intermediates. Adequate QSPRs for bioconcentration in fish exist, but more accurate assessments can be achieved using pharmacologically based toxicokinetic (PBTK) models. No adequate QSPRs exist for bioconcentration in non-fish species. Due to the high variability of chemical and biological species as well as environmental conditions in QSPR datasets, accurate predictions for specific systems and inter-dataset conversions are problematic, for which standardization is needed. For all QSPR endpoints, additional data requirements involve supplementing the current chemical space covered and accurately characterizing the test systems used.

  10. A new mean estimator using auxiliary variables for randomized response models

    NASA Astrophysics Data System (ADS)

    Ozgul, Nilgun; Cingi, Hulya

    2013-10-01

    Randomized response models are commonly used in surveys dealing with sensitive questions such as abortion, alcoholism, sexual orientation, drug taking, annual income, and tax evasion, to ensure interviewee anonymity and to reduce nonresponse rates and biased responses. Starting from the pioneering work of Warner [7], many versions of RRMs have been developed that can deal with quantitative responses. In this study, a new mean estimator is suggested for RRMs with quantitative responses. The mean square error is derived, and a simulation study is performed to show the efficiency of the proposed estimator relative to other existing estimators in RRMs.
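
    The proposed estimator itself is not given in the record; the sketch below shows only the basic additive scrambled-response idea that such estimators build on: each respondent reports the sensitive value plus noise drawn from a known scrambling distribution, so the population mean can be recovered without the surveyor observing any individual answer. Distributions and numbers are illustrative.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 2000

      y = rng.lognormal(mean=10.0, sigma=0.5, size=n)   # sensitive variable, e.g. income
      s = rng.normal(loc=100.0, scale=30.0, size=n)     # scrambling noise, distribution known
      z = y + s                                         # only z is reported to the surveyor

      y_mean_hat = z.mean() - 100.0    # subtract the known scrambling mean
      print(round(y_mean_hat), round(y.mean()))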

  11. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    PubMed

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
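
    A minimal sketch of the ensemble idea described in this record: spectral counts are mapped to observation probabilities, an ensemble of binary interaction matrices is sampled, an existing binary scoring method is applied to each draw, and the results are averaged. The count-to-probability mapping and the placeholder scorer below are illustrative assumptions, not the published method.

      import numpy as np

      rng = np.random.default_rng(3)

      spectral_counts = np.array([[0, 5, 1],
                                  [5, 0, 12],
                                  [1, 12, 0]])           # bait x prey co-purification counts

      p_observed = 1.0 - np.exp(-0.5 * spectral_counts)  # assumed saturating mapping
      np.fill_diagonal(p_observed, 0.0)

      def binary_scorer(adj):
          """Placeholder for an existing binary PPI scoring method."""
          return adj * 1.0

      n_draws = 200
      scores = np.zeros_like(p_observed)
      for _ in range(n_draws):
          draw = (rng.random(p_observed.shape) < p_observed).astype(int)
          scores += binary_scorer(draw)
      scores /= n_draws
      print(scores.round(2))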

  12. A quantitative systems physiology model of renal function and blood pressure regulation: Model description.

    PubMed

    Hallow, K M; Gebremichael, Y

    2017-06-01

    Renal function plays a central role in cardiovascular, kidney, and multiple other diseases, and many existing and novel therapies act through renal mechanisms. Even with decades of accumulated knowledge of renal physiology, pathophysiology, and pharmacology, the dynamics of renal function remain difficult to understand and predict, often resulting in unexpected or counterintuitive therapy responses. Quantitative systems pharmacology modeling of renal function integrates this accumulated knowledge into a quantitative framework, allowing evaluation of competing hypotheses, identification of knowledge gaps, and generation of new experimentally testable hypotheses. Here we present a model of renal physiology and control mechanisms involved in maintaining sodium and water homeostasis. This model represents the core renal physiological processes involved in many research questions in drug development. The model runs in R and the code is made available. In a companion article, we present a case study using the model to explore mechanisms and pharmacology of salt-sensitive hypertension. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  13. Toward a Theoretical Model of Decision-Making and Resistance to Change among Higher Education Online Course Designers

    ERIC Educational Resources Information Center

    Dodd, Bucky J.

    2013-01-01

    Online course design is an emerging practice in higher education, yet few theoretical models currently exist to explain or predict how the diffusion of innovations occurs in this space. This study used a descriptive, quantitative survey research design to examine theoretical relationships between decision-making style and resistance to change…

  14. Toward a descriptive model of galactic cosmic rays in the heliosphere

    NASA Technical Reports Server (NTRS)

    Mewaldt, R. A.; Cummings, A. C.; Adams, James H., Jr.; Evenson, Paul; Fillius, W.; Jokipii, J. R.; Mckibben, R. B.; Robinson, Paul A., Jr.

    1988-01-01

    Researchers review the elements that enter into phenomenological models of the composition, energy spectra, and the spatial and temporal variations of galactic cosmic rays, including the so-called anomalous cosmic ray component. Starting from an existing model, designed to describe the behavior of cosmic rays in the near-Earth environment, researchers suggest possible updates and improvements to this model, and then propose a quantitative approach for extending such a model into other regions of the heliosphere.

  15. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practices, but it is currently still the industry standard to use deterministic safety margin approaches for dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes because of the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
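
    A minimal sketch of the load-resistance Monte Carlo step this record recommends, with invented distributions and costs: the failure probability is estimated as the fraction of sampled load values exceeding sampled resistance values, and can then feed a life cycle cost figure.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000

      load = rng.normal(300.0, 40.0, n)                    # e.g. applied stress in MPa
      resistance = rng.lognormal(np.log(450.0), 0.08, n)   # e.g. material strength in MPa

      p_failure = np.mean(load > resistance)
      print(f"estimated failure probability: {p_failure:.2e}")

      # Expected cost contribution of this failure mode over a fleet (illustrative numbers).
      cost_per_failure, fleet_size = 25_000.0, 10_000
      print(f"expected failure cost: {p_failure * cost_per_failure * fleet_size:,.0f}")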

  16. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    NASA Astrophysics Data System (ADS)

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

    Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase; thus it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes a prioritization of mechanisms for implementation in models challenging. Therefore this paper takes a novel approach to systematically mine and organize existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield. We can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of the literature. The method can easily be applied to many other research fields.
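
    A small sketch of the network idea, not the authors' curated network: literature statements become weighted directed edges from drivers through physiological processes to yield, and a crude relative-importance score for a driver sums, over all simple paths to yield, the product of edge weights along each path. Edges and weights are invented for illustration.

      import networkx as nx

      g = nx.DiGraph()
      g.add_weighted_edges_from([
          ("heat stress", "pollen sterility", 0.8),
          ("heat stress", "leaf senescence", 0.5),
          ("drought", "stomatal closure", 0.9),
          ("stomatal closure", "assimilation", 0.9),
          ("pollen sterility", "grain number", 0.9),
          ("leaf senescence", "assimilation", 0.6),
          ("assimilation", "yield", 0.8),
          ("grain number", "yield", 0.9)])

      def importance(graph, driver, target="yield"):
          """Sum over all simple paths of the product of edge weights along each path."""
          total = 0.0
          for path in nx.all_simple_paths(graph, driver, target):
              w = 1.0
              for u, v in zip(path, path[1:]):
                  w *= graph[u][v]["weight"]
              total += w
          return total

      for driver in ("heat stress", "drought"):
          print(driver, round(importance(g, driver), 3))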

  17. General Model Study of Scour at Proposed Pier Extensions - Santa Ana River at BNSF Bridge, Corona, California

    DTIC Science & Technology

    2017-11-01

    model of the bridge piers, other related structures, and the adjacent channel. Data from the model provided a qualitative and quantitative evaluation of... [Figure-list fragments omitted: pre- versus post-test lidar survey comparisons for Test 1 (30,000 cfs, existing conditions) and Test 7 (15,000 cfs, original proposed conditions).]

  18. Assessing crown fire potential by linking models of surface and crown fire behavior

    Treesearch

    Joe H. Scott; Elizabeth D. Reinhardt

    2001-01-01

    Fire managers are increasingly concerned about the threat of crown fires, yet only now are quantitative methods for assessing crown fire hazard being developed. Links among existing mathematical models of fire behavior are used to develop two indices of crown fire hazard-the Torching Index and Crowning Index. These indices can be used to ordinate different forest...

  19. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  20. A Review of Mathematical Models for Leukemia and Lymphoma

    PubMed Central

    Clapp, Geoffrey; Levy, Doron

    2014-01-01

    Recently, there has been significant activity in the mathematical community, aimed at developing quantitative tools for studying leukemia and lymphoma. Mathematical models have been applied to evaluate existing therapies and to suggest novel therapies. This article reviews the recent contributions of mathematical modeling to leukemia and lymphoma research. These developments suggest that mathematical modeling has great potential in this field. Collaboration between mathematicians, clinicians, and experimentalists can significantly improve leukemia and lymphoma therapy. PMID:26744598

  1. Fuzzy object modeling

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.

    2011-03-01

    To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendants representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.

  2. Multiscale digital Arabidopsis predicts individual organ and whole-organism growth.

    PubMed

    Chew, Yin Hoon; Wenden, Bénédicte; Flis, Anna; Mengin, Virginie; Taylor, Jasper; Davey, Christopher L; Tindal, Christopher; Thomas, Howard; Ougham, Helen J; de Reffye, Philippe; Stitt, Mark; Williams, Mathew; Muetzelfeldt, Robert; Halliday, Karen J; Millar, Andrew J

    2014-09-30

    Understanding how dynamic molecular networks affect whole-organism physiology, analogous to mapping genotype to phenotype, remains a key challenge in biology. Quantitative models that represent processes at multiple scales and link understanding from several research domains can help to tackle this problem. Such integrated models are more common in crop science and ecophysiology than in the research communities that elucidate molecular networks. Several laboratories have modeled particular aspects of growth in Arabidopsis thaliana, but it was unclear whether these existing models could productively be combined. We test this approach by constructing a multiscale model of Arabidopsis rosette growth. Four existing models were integrated with minimal parameter modification (leaf water content and one flowering parameter used measured data). The resulting framework model links genetic regulation and biochemical dynamics to events at the organ and whole-plant levels, helping to understand the combined effects of endogenous and environmental regulators on Arabidopsis growth. The framework model was validated and tested with metabolic, physiological, and biomass data from two laboratories, for five photoperiods, three accessions, and a transgenic line, highlighting the plasticity of plant growth strategies. The model was extended to include stochastic development. Model simulations gave insight into the developmental control of leaf production and provided a quantitative explanation for the pleiotropic developmental phenotype caused by overexpression of miR156, which was an open question. Modular, multiscale models, assembling knowledge from systems biology to ecophysiology, will help to understand and to engineer plant behavior from the genome to the field.

  3. Quantitative physiologically based modeling of subjective fatigue during sleep deprivation.

    PubMed

    Fulcher, B D; Phillips, A J K; Robinson, P A

    2010-05-21

    A quantitative physiologically based model of the sleep-wake switch is used to predict variations in subjective fatigue-related measures during total sleep deprivation. The model includes the mutual inhibition of the sleep-active neurons in the hypothalamic ventrolateral preoptic area (VLPO) and the wake-active monoaminergic brainstem populations (MA), as well as circadian and homeostatic drives. We simulate sleep deprivation by introducing a drive to the MA, which we call wake effort, to maintain the system in a wakeful state. Physiologically this drive is proposed to be afferent from the cortex or the orexin group of the lateral hypothalamus. It is hypothesized that the need to exert this effort to maintain wakefulness at high homeostatic sleep pressure correlates with subjective fatigue levels. The model's output indeed exhibits good agreement with existing clinical time series of subjective fatigue-related measures, supporting this hypothesis. Subjective fatigue, adrenaline, and body temperature variations during two 72h sleep deprivation protocols are reproduced by the model. By distinguishing a motivation-dependent orexinergic contribution to the wake-effort drive, the model can be extended to interpret variation in performance levels during sleep deprivation in a way that is qualitatively consistent with existing, clinically derived results. The example of sleep deprivation thus demonstrates the ability of physiologically based sleep modeling to predict psychological measures from the underlying physiological interactions that produce them. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  4. Chemical Sensor Array Response Modeling Using Quantitative Structure-Activity Relationships Technique

    NASA Astrophysics Data System (ADS)

    Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.

    We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.
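
    A minimal regression sketch of the QSAR step described in this record, using ordinary least squares on a tiny invented descriptor matrix; the actual work selects model terms with Genetic Function Approximation, which this stand-in does not attempt. Descriptor names and values are illustrative.

      import numpy as np

      # Rows = training analytes, columns = descriptors
      # (e.g. molar volume, polarizability, a film-analyte interaction term).
      X = np.array([[90.0, 10.2, 0.31],
                    [55.0,  6.4, 0.12],
                    [120.0, 13.5, 0.45],
                    [72.0,  8.1, 0.22],
                    [101.0, 11.8, 0.38]])
      y = np.array([1.8, 0.7, 2.9, 1.1, 2.3])   # measured dR/R responses (illustrative)

      X1 = np.column_stack([np.ones(len(X)), X])      # add intercept
      coef, *_ = np.linalg.lstsq(X1, y, rcond=None)   # least-squares fit

      x_new = np.array([1.0, 85.0, 9.9, 0.28])        # descriptor vector of a test analyte
      print(f"predicted response: {x_new @ coef:.2f}")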

  5. Quantitative photoacoustic elasticity and viscosity imaging for cirrhosis detection

    NASA Astrophysics Data System (ADS)

    Wang, Qian; Shi, Yujiao; Yang, Fen; Yang, Sihua

    2018-05-01

    Elasticity and viscosity assessments are essential for understanding and characterizing the physiological and pathological states of tissue. In this work, by establishing a photoacoustic (PA) shear wave model, an approach for quantitative PA elasticity imaging based on measurement of the rise time of the thermoelastic displacement was developed. Thus, using an existing PA viscoelasticity imaging method that features a phase delay measurement, quantitative PA elasticity imaging and viscosity imaging can be obtained in a simultaneous manner. The method was tested and validated by imaging viscoelastic agar phantoms prepared at different agar concentrations, and the imaging data were in good agreement with rheometry results. Ex vivo experiments on liver pathological models demonstrated the capability for cirrhosis detection, and the results were consistent with the corresponding histological results. This method expands the scope of conventional PA imaging and has potential to become an important alternative imaging modality.

  6. New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity.

    PubMed

    Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R

    2004-11-21

    Quantitative structure-activity relationship (QSAR) analysis is a main cornerstone of modern informatic disciplines. Predictive computational models, based on QSAR technology, of peptide-major histocompatibility complex (MHC) binding affinity have now become a vital component of modern day computational immunovaccinology. Historically, such approaches have been built around semi-qualitative, classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D(b), H2-K(b) and H2-K(k). As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online ( http://www.jenner.ac.uk/MHCPred).
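
    A minimal sketch of an additive-style peptide QSAR: each 9-mer is encoded as position-specific amino-acid indicators, and the predicted affinity is the sum of per-position residue contributions obtained from a linear fit. The peptides, affinities, and the plain least-squares fit below are invented placeholders, not the study's data or its exact parameterization.

      import numpy as np

      AA = "ACDEFGHIKLMNPQRSTVWY"
      peptides = ["SIINFEKLA", "ASNENMETM", "SSLENFRAY", "TYQRTRALV"]
      pIC50 = np.array([8.1, 7.2, 6.5, 5.9])   # illustrative affinities

      def encode(pep):
          """Flatten a 9-mer into a 9 x 20 indicator vector (one block per position)."""
          v = np.zeros(len(pep) * len(AA))
          for pos, aa in enumerate(pep):
              v[pos * len(AA) + AA.index(aa)] = 1.0
          return v

      X = np.array([encode(p) for p in peptides])
      coef, *_ = np.linalg.lstsq(X, pIC50, rcond=None)   # per-position residue contributions

      print(f"predicted pIC50: {encode('SIINFEKLA') @ coef:.2f}")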

  7. Platform-independent and label-free quantitation of proteomic data using MS1 extracted ion chromatograms in skyline: application to protein acetylation and phosphorylation.

    PubMed

    Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; Maccoss, Michael J; Gibson, Bradford W

    2012-05-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.

  8. Platform-independent and Label-free Quantitation of Proteomic Data Using MS1 Extracted Ion Chromatograms in Skyline

    PubMed Central

    Schilling, Birgit; Rardin, Matthew J.; MacLean, Brendan X.; Zawadzka, Anna M.; Frewen, Barbara E.; Cusack, Michael P.; Sorensen, Dylan J.; Bereman, Michael S.; Jing, Enxuan; Wu, Christine C.; Verdin, Eric; Kahn, C. Ronald; MacCoss, Michael J.; Gibson, Bradford W.

    2012-01-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models. PMID:22454539

  9. Returners and explorers dichotomy in human mobility

    PubMed Central

    Pappalardo, Luca; Simini, Filippo; Rinzivillo, Salvatore; Pedreschi, Dino; Giannotti, Fosca; Barabási, Albert-László

    2015-01-01

    The availability of massive digital traces of human whereabouts has offered a series of novel insights on the quantitative patterns characterizing human mobility. In particular, numerous recent studies have led to an unexpected consensus: the considerable variability in the characteristic travelled distance of individuals coexists with a high degree of predictability of their future locations. Here we shed light on this surprising coexistence by systematically investigating the impact of recurrent mobility on the characteristic distance travelled by individuals. Using both mobile phone and GPS data, we discover the existence of two distinct classes of individuals: returners and explorers. As existing models of human mobility cannot explain the existence of these two classes, we develop more realistic models able to capture the empirical findings. Finally, we show that returners and explorers play a distinct quantifiable role in spreading phenomena and that a correlation exists between their mobility patterns and social interactions. PMID:26349016

  10. Balancing energy and entropy: A minimalist model for the characterization of protein folding landscapes

    PubMed Central

    Das, Payel; Matysiak, Silvina; Clementi, Cecilia

    2005-01-01

    Coarse-grained models have been extremely valuable in promoting our understanding of protein folding. However, the quantitative accuracy of existing simplified models is strongly hindered either from the complete removal of frustration (as in the widely used Gō-like models) or from the compromise with the minimal frustration principle and/or realistic protein geometry (as in the simple on-lattice models). We present a coarse-grained model that “naturally” incorporates sequence details and energetic frustration into an overall minimally frustrated folding landscape. The model is coupled with an optimization procedure to design the parameters of the protein Hamiltonian to fold into a desired native structure. The application to the study of the src SH3 (Src homology 3) domain shows that this coarse-grained model contains the main physical-chemical ingredients that are responsible for shaping the folding landscape of this protein. The results illustrate the importance of nonnative interactions and energetic heterogeneity for a quantitative characterization of folding mechanisms. PMID:16006532

  11. Pulsar distances and the galactic distribution of free electrons

    NASA Technical Reports Server (NTRS)

    Taylor, J. H.; Cordes, J. M.

    1993-01-01

    The present quantitative model for Galactic free electron distribution abandons the assumption of axisymmetry and explicitly incorporates spiral arms; their shapes and locations are derived from existing radio and optical observations of H II regions. The Gum Nebula's dispersion-measure contributions are also explicitly modeled. Adjustable quantities are calibrated by reference to three different types of data. The new model is estimated to furnish distance estimates to known pulsars that are accurate to about 25 percent.

  12. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and their identification is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
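
    A minimal sketch of the scoring scheme described in this record, on synthetic data: a drug's quantitative score is a weighted sum of its side-effect indicators, and an average scoring ensemble averages the scores predicted by separate models built on different feature types. The feature matrices, weights, and model choice below are illustrative assumptions.

      # Hedged sketch: quantitative side-effect scores and an average scoring ensemble.
      import numpy as np
      from sklearn.linear_model import Ridge

      rng = np.random.default_rng(1)
      n_drugs, n_effects = 50, 30
      side_effects = rng.integers(0, 2, size=(n_drugs, n_effects))   # 0/1 side-effect profiles
      weights = rng.uniform(0.1, 1.0, size=n_effects)                # empirical weights (assumed)
      scores = side_effects @ weights                                # quantitative scores

      features = {
          "substructures": rng.normal(size=(n_drugs, 20)),
          "targets":       rng.normal(size=(n_drugs, 15)),
          "indications":   rng.normal(size=(n_drugs, 10)),
      }
      # one model per feature type; the ensemble prediction averages their outputs
      predictions = [Ridge().fit(X, scores).predict(X) for X in features.values()]
      ensemble = np.mean(predictions, axis=0)
      print("correlation with true scores: %.2f" % np.corrcoef(ensemble, scores)[0, 1])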

  13. Comparison of 3D quantitative structure-activity relationship methods: Analysis of the in vitro antimalarial activity of 154 artemisinin analogues by hypothetical active-site lattice and comparative molecular field analysis

    NASA Astrophysics Data System (ADS)

    Woolfrey, John R.; Avery, Mitchell A.; Doweyko, Arthur M.

    1998-03-01

    Two three-dimensional quantitative structure-activity relationship (3D-QSAR) methods, comparative molecular field analysis (CoMFA) and hypothetical active site lattice (HASL), were compared with respect to the analysis of a training set of 154 artemisinin analogues. Five models were created, including a complete HASL and two trimmed versions, as well as two CoMFA models (leave-one-out standard CoMFA and the guided-region selection protocol). Similar r2 and q2 values were obtained by each method, although some striking differences existed between CoMFA contour maps and the HASL output. Each of the four predictive models exhibited a similar ability to predict the activity of a test set of 23 artemisinin analogues, although some differences were noted as to which compounds were described well by either model.

  14. The Intransitivity of Educational Preferences

    ERIC Educational Resources Information Center

    Smith, Debra Candace

    2013-01-01

    This study sought to answer the question of whether cycles in education are random events, or whether they are to be expected on a regular basis due to intransitive decision-making patterns of stakeholders. This was a quantitative study, modeled after two previously conducted studies (Davis, 1958/59; May, 1954),…

  15. Principals' Leadership Behaviors as Perceived by Teachers in At-Risk Middle Schools

    ERIC Educational Resources Information Center

    Johnson, R. Anthony

    2011-01-01

    A need exists for greater understanding of teachers' (N = 530) perceptions of the leadership behaviors of principals in Title I middle schools (n = 13). The researcher used the "Audit of Principal Effectiveness" survey to collect data and Hierarchical Linear Modeling for the quantitative analysis.…

  16. Extensive characterization of human tear fluid collected using different techniques unravels the presence of novel lipid amphiphiles

    PubMed Central

    Lam, Sin Man; Tong, Louis; Duan, Xinrui; Petznick, Andrea; Wenk, Markus R.; Shui, Guanghou

    2014-01-01

    The tear film covers the anterior eye and the precise balance of its various constituent components is critical for maintaining ocular health. The composition of the tear film amphiphilic lipid sublayer, in particular, has largely remained a matter of contention due to the limiting concentrations of these lipid amphiphiles in tears, which render their detection and accurate quantitation tedious. Using systematic and sensitive lipidomic approaches, we validated different tear collection techniques and report the most comprehensive human tear lipidome to date, comprising more than 600 lipid species from 17 major lipid classes. Our study provides novel insights into the compositional details of the existing tear film model, in particular the disputed amphiphilic lipid sublayer constituents, by demonstrating the presence of cholesteryl sulfate, O-acyl-ω-hydroxyfatty acids, and various sphingolipids and phospholipids in tears. The discovery and quantitation of the relative abundance of various tear lipid amphiphiles reported herein are expected to have a profound impact on the current understanding of the human tear film model. PMID:24287120

  17. Chapter 8: US geological survey Circum-Arctic Resource Appraisal (CARA): Introduction and summary of organization and methods

    USGS Publications Warehouse

    Charpentier, R.R.; Gautier, D.L.

    2011-01-01

    The USGS has assessed undiscovered petroleum resources in the Arctic through geological mapping, basin analysis and quantitative assessment. The new map compilation provided the base from which geologists subdivided the Arctic for burial history modelling and quantitative assessment. The CARA was a probabilistic, geologically based study that used existing USGS methodology, modified somewhat for the circumstances of the Arctic. The assessment relied heavily on analogue modelling, with numerical input as lognormal distributions of sizes and numbers of undiscovered accumulations. Probabilistic results for individual assessment units were statistically aggregated taking geological dependencies into account. Fourteen papers in this Geological Society volume present summaries of various aspects of the CARA. © 2011 The Geological Society of London.
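
    A minimal sketch of the kind of probabilistic aggregation this record describes, on hypothetical inputs: for each assessment unit, sample the number of undiscovered accumulations and lognormally distributed accumulation sizes, then aggregate across units by Monte Carlo. All distribution parameters below are illustrative and are not CARA inputs; geological dependencies between units are ignored here.

      # Hedged sketch: Monte Carlo aggregation of lognormal accumulation sizes.
      import numpy as np

      rng = np.random.default_rng(2)
      n_trials = 10_000
      units = [  # (mean number of accumulations, lognormal mu, lognormal sigma) per assessment unit
          (3.0, np.log(50.0), 1.0),
          (1.5, np.log(120.0), 0.8),
      ]

      totals = np.zeros(n_trials)
      for lam, mu, sigma in units:
          counts = rng.poisson(lam, size=n_trials)                 # number of accumulations per trial
          for i, k in enumerate(counts):
              totals[i] += rng.lognormal(mu, sigma, size=k).sum()  # sum of accumulation sizes

      print("P50 resource: %.0f; 5th-95th percentile range: %.0f-%.0f" %
            (np.median(totals), np.percentile(totals, 5), np.percentile(totals, 95)))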

  18. Image-Based Quantification of Plant Immunity and Disease.

    PubMed

    Laflamme, Bradley; Middleton, Maggie; Lo, Timothy; Desveaux, Darrell; Guttman, David S

    2016-12-01

    Measuring the extent and severity of disease is a critical component of plant pathology research and crop breeding. Unfortunately, existing visual scoring systems are qualitative, subjective, and the results are difficult to transfer between research groups, while existing quantitative methods can be quite laborious. Here, we present plant immunity and disease image-based quantification (PIDIQ), a quantitative, semi-automated system to rapidly and objectively measure disease symptoms in a biologically relevant context. PIDIQ applies an ImageJ-based macro to plant photos in order to distinguish healthy tissue from tissue that has yellowed due to disease. It can process a directory of images in an automated manner and report the relative ratios of healthy to diseased leaf area, thereby providing a quantitative measure of plant health that can be statistically compared with appropriate controls. We used the Arabidopsis thaliana-Pseudomonas syringae model system to show that PIDIQ is able to identify both enhanced plant health associated with effector-triggered immunity as well as elevated disease symptoms associated with effector-triggered susceptibility. Finally, we show that the quantitative results provided by PIDIQ correspond to those obtained via traditional in planta pathogen growth assays. PIDIQ provides a simple and effective means to nondestructively quantify disease from whole plants and we believe it will be equally effective for monitoring disease on excised leaves and stems.
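
    A minimal sketch, in Python rather than the ImageJ macro that PIDIQ actually uses, of the core measurement this record describes: classify plant pixels as green (healthy) or yellow (diseased) with simple RGB rules and report the healthy-to-diseased area ratio. The image is a synthetic array and the thresholds are illustrative assumptions, not the macro's rules.

      # Hedged sketch: healthy vs. yellowed leaf area from an RGB image.
      import numpy as np

      rng = np.random.default_rng(3)
      img = rng.integers(0, 256, size=(100, 100, 3)).astype(float)   # stand-in for a plant photo

      r, g, b = img[..., 0], img[..., 1], img[..., 2]
      plant    = g > 80                                    # crude foreground mask
      healthy  = plant & (g > r + 20)                      # green dominates red -> healthy tissue
      yellowed = plant & (np.abs(g - r) <= 20) & (b < g)   # red ~ green, low blue -> yellowed tissue

      ratio = healthy.sum() / max(yellowed.sum(), 1)
      print("healthy:diseased pixel ratio = %.2f" % ratio)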

  19. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime.

    PubMed

    Fitterer, Jessica L; Nelson, Trisalyn A

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the predominant modelling choice (n = 78), though, depending on the data, many variations existed. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks).

  20. A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime

    PubMed Central

    Fitterer, Jessica L.; Nelson, Trisalyn A.

    2015-01-01

    Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the predominant modelling choice (n = 78), though, depending on the data, many variations existed. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016

  1. [Modeling continuous scaling of NDVI based on fractal theory].

    PubMed

    Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng

    2013-07-01

    The scale effect is one of the most important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-dependent behavior of retrievals across an entire series of scales; meanwhile, they face serious parameter-correction issues (geometric correction, spectral correction, etc.) because imaging parameters vary between sensors. Using a single-sensor image, a fractal methodology was employed to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists and can be described by a fractal model of continuous scaling; and (2) the fractal method is suitable for the validation of NDVI. These results show that fractal analysis is an effective methodology for studying scaling in quantitative remote sensing.
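
    A minimal sketch of the continuous-scaling idea this record describes, on synthetic fields: compute NDVI from block-averaged red and near-infrared radiance at a series of coarser scales and fit a power-law (fractal) model, ln(NDVI) = ln(c) + D·ln(scale). The radiance fields here are random stand-ins for ETM+ data, so the fitted exponent is purely illustrative.

      # Hedged sketch: fitting a fractal (power-law) model of NDVI across scales.
      import numpy as np

      rng = np.random.default_rng(4)
      size = 256
      red = rng.uniform(0.05, 0.20, size=(size, size))   # synthetic red-band radiance
      nir = rng.uniform(0.20, 0.60, size=(size, size))   # synthetic NIR-band radiance

      def ndvi_at_scale(block):
          # aggregate radiance to the coarser scale first, then compute mean NDVI
          r = red.reshape(size // block, block, size // block, block).mean(axis=(1, 3))
          n = nir.reshape(size // block, block, size // block, block).mean(axis=(1, 3))
          return ((n - r) / (n + r)).mean()

      scales = np.array([1, 2, 4, 8, 16, 32])
      values = np.array([ndvi_at_scale(s) for s in scales])
      D, log_c = np.polyfit(np.log(scales), np.log(values), 1)
      print("fitted scaling exponent D = %.4f" % D)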

  2. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows have a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new NN-based workflow is presented to form suitability maps for solid waste planning at the regional scale. A feed-forward neural network was employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling, and the final learned network was used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training and testing datasets, respectively. The workflow was found to be capable of reducing human interference to generate highly reliable maps. The proposed workflow demonstrates the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
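
    A minimal sketch of the workflow this record outlines, on synthetic data: train a feed-forward network on criterion values versus suitability labels, then derive relative criterion weights from the magnitude of the first-layer connection weights (a Garson-style summary; the record does not state its exact weight-extraction rule, so this step is an illustrative assumption, as are the data and the use of 6 rather than 34 criteria).

      # Hedged sketch: NN-derived criterion weights for an MCDA suitability model.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(5)
      n_sites, n_criteria = 500, 6
      X = rng.normal(size=(n_sites, n_criteria))
      y = (X[:, 0] + 0.5 * X[:, 2] - 0.8 * X[:, 4] > 0).astype(int)   # synthetic suitability labels

      net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)

      first_layer = np.abs(net.coefs_[0]).sum(axis=1)   # total absolute weight leaving each input
      weights = first_layer / first_layer.sum()         # normalized criterion weights for MCDA
      print("criterion weights:", np.round(weights, 3))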

  3. Structure of polyacrylic acid and polymethacrylic acid solutions : a small angle neutron scattering study

    NASA Astrophysics Data System (ADS)

    Moussaid, A.; Schosseler, F.; Munch, J. P.; Candau, S. J.

    1993-04-01

    The intensity scattered from polyacrylic acid and polymethacrylic acid solutions has been measured by small angle neutron scattering experiments. The influence of polymer concentration, ionization degree, temperature and salt content has been investigated. Results are in qualitative agreement with a model which predicts the existence of microphases in the unstable region of the phase diagram. Quantitative comparison with the theory is performed by fitting the theoretical structure factor to the experimental data. For a narrow range of ionization degrees, nearly quantitative agreement with the theory is found for the polyacrylic acid system.

  4. Quantitative risk stratification in Markov chains with limiting conditional distributions.

    PubMed

    Chan, David C; Pollett, Philip K; Weinstein, Milton C

    2009-01-01

    Many clinical decisions require patient risk stratification. The authors introduce the concept of limiting conditional distributions, which describe the equilibrium proportion of surviving patients occupying each disease state in a Markov chain with death. Such distributions can quantitatively describe risk stratification. The authors first establish conditions for the existence of a positive limiting conditional distribution in a general Markov chain and describe a framework for risk stratification using the limiting conditional distribution. They then apply their framework to a clinical example of a treatment indicated for high-risk patients, first to infer the risk of patients selected for treatment in clinical trials and then to predict the outcomes of expanding treatment to other populations of risk. For the general chain, a positive limiting conditional distribution exists only if patients in the earliest state have the lowest combined risk of progression or death. The authors show that in their general framework, outcomes and population risk are interchangeable. For the clinical example, they estimate that previous clinical trials have selected the upper quintile of patient risk for this treatment, but they also show that expanded treatment would weakly dominate this degree of targeted treatment, and universal treatment may be cost-effective. Limiting conditional distributions exist in most Markov models of progressive diseases and are well suited to represent risk stratification quantitatively. This framework can characterize patient risk in clinical trials and predict outcomes for other populations of risk.
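
    A minimal sketch of the limiting conditional distribution this record uses, on an illustrative three-state example: for a discrete-time Markov chain with an absorbing death state, restrict the transition matrix to the living disease states and take the left eigenvector of its dominant eigenvalue; after normalization this gives the equilibrium mix of disease states among survivors. The transition probabilities below are invented for illustration.

      # Hedged sketch: limiting conditional (quasi-stationary) distribution of a Markov chain with death.
      import numpy as np

      # rows/cols: mild, moderate, severe; the death column is omitted, so rows sum to < 1
      Q = np.array([
          [0.90, 0.07, 0.01],
          [0.05, 0.85, 0.07],
          [0.01, 0.06, 0.83],
      ])

      eigvals, left_vecs = np.linalg.eig(Q.T)    # right eigenvectors of Q.T are left eigenvectors of Q
      dominant = np.argmax(eigvals.real)
      pi = np.abs(left_vecs[:, dominant].real)
      pi /= pi.sum()                             # limiting conditional distribution over surviving states
      print("equilibrium share of survivors per state:", np.round(pi, 3))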

  5. Boundary cooled rocket engines for space storable propellants

    NASA Technical Reports Server (NTRS)

    Kesselring, R. C.; Mcfarland, B. L.; Knight, R. M.; Gurnitz, R. N.

    1972-01-01

    An evaluation of an existing analytical heat transfer model was made to extend the technology of boundary film/conduction cooled rocket thrust chambers to the space storable propellant combination oxygen difluoride/diborane. Critical design parameters were identified and their importance determined. Test reduction methods were developed to enable data obtained from short duration hot firings with a thin walled (calorimeter) chamber to be used to quantitatively evaluate the heat absorbing capability of the vapor film. The modification of the existing like-doublet injector was based on the results obtained from the calorimeter firings.

  6. Health impact assessment – A survey on quantifying tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org

    Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.

  7. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans, and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.

  8. Focal Point Theory Models for Dissecting Dynamic Duality Problems of Microbial Infections

    PubMed Central

    Huang, S.-H.; Zhou, W.; Jong, A.

    2008-01-01

    Extending along the dynamic continuum from conflict to cooperation, microbial infections always involve symbiosis (Sym) and pathogenesis (Pat). There exists a dynamic Sym-Pat duality (DSPD) in microbial infection that is the most fundamental problem in infectomics. DSPD is encoded by the genomes of both the microbes and their hosts. Three focal point (FP) theory-based game models (pure cooperative, dilemma, and pure conflict) are proposed for resolving those problems. Our health is associated with the dynamic interactions of three microbial communities (nonpathogenic microbiota (NP) (Cooperation), conditional pathogens (CP) (Dilemma), and unconditional pathogens (UP) (Conflict)) with the hosts at different health statuses. Sym and Pat can be quantitated by measuring symbiotic index (SI), which is quantitative fitness for the symbiotic partnership, and pathogenic index (PI), which is quantitative damage to the symbiotic partnership, respectively. Symbiotic point (SP), which bears analogy to FP, is a function of SI and PI. SP-converting and specific pathogen-targeting strategies can be used for the rational control of microbial infections. PMID:18350122

  9. The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge

    ERIC Educational Resources Information Center

    Rice, Amber H.; Kitchel, Tracy

    2015-01-01

    The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…

  10. The International School Effectiveness Research Programme ISERP. First Results of the Quantitative Study.

    ERIC Educational Resources Information Center

    Creemers, Bert P. M.; And Others

    The International School Effectiveness Research Programme (ISERP) is an example of the exchange of research and research results in the field of educational effectiveness. It aims to build on existing models of good practice and to avoid the variations in approach that limit the transferability of data within and between countries. A number of…

  11. A Quantitative and Model-Driven Approach to Assessing Higher Education in the United States of America

    ERIC Educational Resources Information Center

    Huang, Zuqing; Qiu, Robin G.

    2016-01-01

    University ranking or higher education assessment in general has been attracting more and more public attention over the years. However, the subjectivity-based evaluation index and indicator selections and weights that are widely adopted in most existing ranking systems have been called into question. In other words, the objectivity and…

  12. Is the Class Schedule the Only Difference between Morning and Afternoon Shift Schools in Mexico?

    ERIC Educational Resources Information Center

    Cardenas Denham, Sergio

    2009-01-01

    Double-shift schooling has been implemented in Mexico for several decades as a strategy to achieve universal access to basic education. This study provides evidence on the existence of social inequalities related to the implementation of this schooling model. Using quantitative data from several databases including the National Census, the…

  13. A transformative model for undergraduate quantitative biology education.

    PubMed

    Usher, David C; Driscoll, Tobin A; Dhurjati, Prasad; Pelesko, John A; Rossi, Louis F; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions.

  14. A Transformative Model for Undergraduate Quantitative Biology Education

    PubMed Central

    Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions. PMID:20810949

  15. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    PubMed

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
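
    A minimal sketch of the modelling recipe this record describes, on synthetic data: fit a Cox proportional-hazards model on CA19-9 plus a few image-texture features using roughly 70% of the patients, then report the concordance index (c-index) on the held-out 30%. This sketch uses the third-party lifelines package; the feature names, survival times, and event indicators are hypothetical.

      # Hedged sketch: preoperative survival model (CA19-9 + texture features) with a held-out c-index.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.utils import concordance_index

      rng = np.random.default_rng(6)
      n = 160
      df = pd.DataFrame({
          "ca19_9":   rng.lognormal(3.0, 1.0, n),
          "texture1": rng.normal(size=n),
          "texture2": rng.normal(size=n),
      })
      risk = 0.002 * df["ca19_9"] + 0.5 * df["texture1"]
      df["time"] = rng.exponential(scale=np.exp(-risk.to_numpy()) * 24.0)   # months; higher risk -> shorter survival
      df["event"] = rng.integers(0, 2, n)                                   # 1 = death observed, 0 = censored

      train, test = df.iloc[:112], df.iloc[112:]
      cph = CoxPHFitter().fit(train, duration_col="time", event_col="event")
      c_index = concordance_index(test["time"],
                                  -cph.predict_partial_hazard(test),   # higher hazard -> lower score
                                  test["event"])
      print("held-out c-index: %.2f" % c_index)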

  16. A Bayesian network approach to knowledge integration and representation of farm irrigation: 1. Model development

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Haines, C. L.

    2009-02-01

    Irrigation is important to many agricultural businesses but also has implications for catchment health. A considerable body of knowledge exists on how irrigation management affects farm business and catchment health. However, this knowledge is fragmentary; is available in many forms such as qualitative and quantitative; is dispersed in scientific literature, technical reports, and the minds of individuals; and is of varying degrees of certainty. Bayesian networks allow the integration of dispersed knowledge into quantitative systems models. This study describes the development, validation, and application of a Bayesian network model of farm irrigation in the Shepparton Irrigation Region of northern Victoria, Australia. In this first paper we describe the process used to integrate a range of sources of knowledge to develop a model of farm irrigation. We describe the principal model components and summarize the reaction to the model and its development process by local stakeholders. Subsequent papers in this series describe model validation and the application of the model to assess the regional impact of historical and future management intervention.

  17. Science advancements key to increasing management value of life stage monitoring networks for endangered Sacramento River winter-run Chinook salmon in California

    USGS Publications Warehouse

    Johnson, Rachel C.; Windell, Sean; Brandes, Patricia L.; Conrad, J. Louise; Ferguson, John; Goertler, Pascale A. L.; Harvey, Brett N.; Heublein, Joseph; Isreal, Joshua A.; Kratville, Daniel W.; Kirsch, Joseph E.; Perry, Russell W.; Pisciotto, Joseph; Poytress, William R.; Reece, Kevin; Swart, Brycen G.

    2017-01-01

    A robust monitoring network that provides quantitative information about the status of imperiled species at key life stages and geographic locations over time is fundamental for sustainable management of fisheries resources. For anadromous species, management actions in one geographic domain can substantially affect abundance of subsequent life stages that span broad geographic regions. Quantitative metrics (e.g., abundance, movement, survival, life history diversity, and condition) at multiple life stages are needed to inform how management actions (e.g., hatcheries, harvest, hydrology, and habitat restoration) influence salmon population dynamics. The existing monitoring network for endangered Sacramento River winterrun Chinook Salmon (SRWRC, Oncorhynchus tshawytscha) in California’s Central Valley was compared to conceptual models developed for each life stage and geographic region of the life cycle to identify relevant SRWRC metrics. We concluded that the current monitoring network was insufficient to diagnose when (life stage) and where (geographic domain) chronic or episodic reductions in SRWRC cohorts occur, precluding within- and among-year comparisons. The strongest quantitative data exist in the Upper Sacramento River, where abundance estimates are generated for adult spawners and emigrating juveniles. However, once SRWRC leave the upper river, our knowledge of their identity, abundance, and condition diminishes, despite the juvenile monitoring enterprise. We identified six system-wide recommended actions to strengthen the value of data generated from the existing monitoring network to assess resource management actions: (1) incorporate genetic run identification; (2) develop juvenile abundance estimates; (3) collect data for life history diversity metrics at multiple life stages; (4) expand and enhance real-time fish survival and movement monitoring; (5) collect fish condition data; and (6) provide timely public access to monitoring data in open data formats. To illustrate how updated technologies can enhance the existing monitoring to provide quantitative data on SRWRC, we provide examples of how each recommendation can address specific management issues.

  18. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes

    PubMed Central

    Zhang, Hong; Pei, Yun

    2016-01-01

    Quantitative prediction of construction noise is crucial for evaluating construction plans and making decisions to address noise levels. Considering the limitations of existing methods for measuring or predicting construction noise, particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes a simulation methodology for quantitatively predicting the equivalent continuous noise of construction while considering the relevant uncertainties, dynamics and interactions. PMID:27529266
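
    A minimal sketch of the equivalent continuous sound level that the simulation framework above aggregates over time: Leq = 10·log10((1/T)·Σ t_i·10^(L_i/10)), where each simulated activity segment i lasts t_i and produces level L_i. The durations and levels below are illustrative, not simulation output.

      # Hedged sketch: equivalent continuous sound level (Leq) over a sequence of activity segments.
      import math

      # (duration in minutes, sound level in dB(A)) for a sequence of simulated activities
      segments = [(30, 85.0), (45, 78.0), (15, 92.0), (30, 70.0)]

      total_time = sum(t for t, _ in segments)
      energy_sum = sum(t * 10 ** (level / 10.0) for t, level in segments)
      leq = 10.0 * math.log10(energy_sum / total_time)
      print("equivalent continuous level Leq = %.1f dB(A)" % leq)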

  19. Social dynamics of science.

    PubMed

    Sun, Xiaoling; Kaur, Jasleen; Milojević, Staša; Flammini, Alessandro; Menczer, Filippo

    2013-01-01

    The birth and decline of disciplines are critical to science and society. How do scientific disciplines emerge? No quantitative model to date allows us to validate competing theories on the different roles of endogenous processes, such as social collaborations, and exogenous events, such as scientific discoveries. Here we propose an agent-based model in which the evolution of disciplines is guided mainly by social interactions among agents representing scientists. Disciplines emerge from splitting and merging of social communities in a collaboration network. We find that this social model can account for a number of stylized facts about the relationships between disciplines, scholars, and publications. These results provide strong quantitative support for the key role of social interactions in shaping the dynamics of science. While several "science of science" theories exist, this is the first account for the emergence of disciplines that is validated on the basis of empirical data.

  20. Social Dynamics of Science

    PubMed Central

    Sun, Xiaoling; Kaur, Jasleen; Milojević, Staša; Flammini, Alessandro; Menczer, Filippo

    2013-01-01

    The birth and decline of disciplines are critical to science and society. How do scientific disciplines emerge? No quantitative model to date allows us to validate competing theories on the different roles of endogenous processes, such as social collaborations, and exogenous events, such as scientific discoveries. Here we propose an agent-based model in which the evolution of disciplines is guided mainly by social interactions among agents representing scientists. Disciplines emerge from splitting and merging of social communities in a collaboration network. We find that this social model can account for a number of stylized facts about the relationships between disciplines, scholars, and publications. These results provide strong quantitative support for the key role of social interactions in shaping the dynamics of science. While several “science of science” theories exist, this is the first account for the emergence of disciplines that is validated on the basis of empirical data. PMID:23323212

  1. Social Dynamics of Science

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoling; Kaur, Jasleen; Milojević, Staša; Flammini, Alessandro; Menczer, Filippo

    2013-01-01

    The birth and decline of disciplines are critical to science and society. How do scientific disciplines emerge? No quantitative model to date allows us to validate competing theories on the different roles of endogenous processes, such as social collaborations, and exogenous events, such as scientific discoveries. Here we propose an agent-based model in which the evolution of disciplines is guided mainly by social interactions among agents representing scientists. Disciplines emerge from splitting and merging of social communities in a collaboration network. We find that this social model can account for a number of stylized facts about the relationships between disciplines, scholars, and publications. These results provide strong quantitative support for the key role of social interactions in shaping the dynamics of science. While several "science of science" theories exist, this is the first account for the emergence of disciplines that is validated on the basis of empirical data.

  2. Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism

    PubMed Central

    Chang, Roger L; Ghamsari, Lila; Manichaikul, Ani; Hom, Erik F Y; Balaji, Santhanam; Fu, Weiqi; Shen, Yun; Hao, Tong; Palsson, Bernhard Ø; Salehi-Ashtiani, Kourosh; Papin, Jason A

    2011-01-01

    Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, a pioneering resource for studying light-driven metabolism and quantitative systems biology. PMID:21811229

  3. Simulation-Based Prediction of Equivalent Continuous Noises during Construction Processes.

    PubMed

    Zhang, Hong; Pei, Yun

    2016-08-12

    Quantitative prediction of construction noise is crucial for evaluating construction plans and making decisions to address noise levels. Considering the limitations of existing methods for measuring or predicting construction noise, particularly the equivalent continuous noise level over a period of time, this paper presents a discrete-event simulation method for predicting the construction noise in terms of equivalent continuous level. The noise-calculating models regarding synchronization, propagation and equivalent continuous level are presented. The simulation framework for modeling the noise-affected factors and calculating the equivalent continuous noise by incorporating the noise-calculating models into simulation strategy is proposed. An application study is presented to demonstrate and justify the proposed simulation method in predicting the equivalent continuous noise during construction. The study contributes a simulation methodology for quantitatively predicting the equivalent continuous noise of construction while considering the relevant uncertainties, dynamics and interactions.

  4. Shape Matters: Intravital Microscopy Reveals Surprising Geometrical Dependence for Nanoparticles in Tumor Models of Extravasation

    PubMed Central

    Smith, Bryan Ronain; Kempen, Paul; Bouley, Donna; Xu, Alexander; Liu, Zhuang; Melosh, Nicholas; Dai, Hongjie; Sinclair, Robert; Gambhir, Sanjiv Sam

    2012-01-01

    Delivery is one of the most critical obstacles confronting nanoparticle use in cancer diagnosis and therapy. For most oncological applications, nanoparticles must extravasate in order to reach tumor cells and perform their designated task. However, little understanding exists regarding the effect of nanoparticle shape on extravasation. Herein we use real-time intravital microscopic imaging to meticulously examine how two different nanoparticles behave across three different murine tumor models. The study quantitatively demonstrates that high-aspect ratio single-walled carbon nanotubes (SWNTs) display extravasational behavior surprisingly different from, and counterintuitive to, spherical nanoparticles although the nanoparticles have similar surface coatings, area, and charge. This work quantitatively indicates that nanoscale extravasational competence is highly dependent on nanoparticle geometry and is heterogeneous. PMID:22650417

  5. Wavelet modeling and prediction of the stability of states: the Roman Empire and the European Union

    NASA Astrophysics Data System (ADS)

    Yaroshenko, Tatyana Y.; Krysko, Dmitri V.; Dobriyan, Vitalii; Zhigalov, Maksim V.; Vos, Hendrik; Vandenabeele, Peter; Krysko, Vadim A.

    2015-09-01

    How can the stability of a state be quantitatively determined and its future stability predicted? The rise and collapse of empires and states are very complex and exceedingly difficult to understand and predict. Existing theories are usually formulated as verbal models and, consequently, do not yield sharply defined, quantitative predictions that can be unambiguously validated with data. Here we describe a model that determines whether the state is in a stable or chaotic condition and predicts its future condition. The central model, which we test, is that growth and collapse of states are reflected by the changes of their territories, populations and budgets. The model was simulated within the historical societies of the Roman Empire (400 BC to 400 AD) and the European Union (1957-2007) by using wavelets and analysis of the sign change of the spectrum of Lyapunov exponents. The model matches well with the historical events. During wars and crises, the state becomes unstable; this is reflected in the wavelet analysis by a significant increase in the frequency ω(t) and wavelet coefficients W(ω, t), and the sign of the largest Lyapunov exponent becomes positive, indicating chaos. We successfully reconstructed and forecasted time series in the Roman Empire and the European Union by applying an artificial neural network. The proposed model helps to quantitatively determine and forecast the stability of a state.

  6. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop quantitative QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.

  7. The flow of power law fluids in elastic networks and porous media.

    PubMed

    Sochi, Taha

    2016-02-01

    The flow of power law fluids, which include shear-thinning and shear-thickening fluids as well as Newtonian fluids as a special case, in networks of interconnected elastic tubes is investigated using a residual-based pore scale network modeling method with the employment of newly derived formulae. Two relations describing the mechanical interaction between the local pressure and local cross-sectional area in distensible tubes of elastic nature are considered in the derivation of these formulae. The model can be used to describe shear dependent flows of mainly viscous nature. The behavior of the proposed model is vindicated by several tests in a number of special and limiting cases where the results can be verified quantitatively or qualitatively. The model, which is the first of its kind, incorporates more than one major nonlinearity corresponding to the fluid rheology and conduit mechanical properties, that is, non-Newtonian effects and tube distensibility. The formulation, implementation, and performance indicate that the model enjoys certain advantages over the existing models such as being exact within the restricting assumptions on which the model is based, easy implementation, low computational costs, reliability, and smooth convergence. The proposed model can, therefore, be used as an alternative to the existing Newtonian distensible models; moreover, it stretches the capabilities of the existing modeling approaches to reach non-Newtonian rheologies.
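
    A minimal sketch of the rigid-tube special case that underlies the model this record describes: the generalized Hagen-Poiseuille flow rate of a power law fluid (tau = k·gamma_dot^n) through a circular tube, Q = pi·R^3·(n/(3n+1))·(dp·R/(2kL))^(1/n), which reduces to the Newtonian result for n = 1. The paper's distensible (pressure-dependent radius) formulation is not reproduced here, and the parameter values are illustrative.

      # Hedged sketch: power-law fluid flow rate through a rigid circular tube.
      import math

      def power_law_tube_flow(radius, length, dp, k, n):
          """Volumetric flow rate of a power-law fluid (consistency k, index n) through a rigid tube."""
          return math.pi * radius**3 * (n / (3.0 * n + 1.0)) * (dp * radius / (2.0 * k * length)) ** (1.0 / n)

      R, L, dp = 1e-3, 0.1, 500.0   # tube radius (m), length (m), pressure drop (Pa)
      print("shear thinning   (n=0.6):", power_law_tube_flow(R, L, dp, k=0.05, n=0.6))
      print("Newtonian        (n=1.0):", power_law_tube_flow(R, L, dp, k=0.001, n=1.0))
      print("shear thickening (n=1.4):", power_law_tube_flow(R, L, dp, k=0.01, n=1.4))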

  8. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    ABSTRACT Molecular and activity‐based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity‐based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2‐EphA3 ki/ki, Isl2‐EphA3 ki/+, ephrin‐A2,A3,A5 triple knock‐out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2‐EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin‐A2,A3,A5 TKO phenotype, suggesting either an incomplete knock‐out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software that we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  9. Non-parallel coevolution of sender and receiver in the acoustic communication system of treefrogs.

    PubMed

    Schul, Johannes; Bush, Sarah L

    2002-09-07

    Advertisement calls of closely related species often differ in quantitative features such as the repetition rate of signal units. These differences are important in species recognition. Current models of signal-receiver coevolution predict two possible patterns in the evolution of the mechanism used by receivers to recognize the call: (i) classical sexual selection models (Fisher process, good genes/indirect benefits, direct benefits models) predict that close relatives use qualitatively similar signal recognition mechanisms tuned to different values of a call parameter; and (ii) receiver bias models (hidden preference, pre-existing bias models) predict that if different signal recognition mechanisms are used by sibling species, evidence of an ancestral mechanism will persist in the derived species, and evidence of a pre-existing bias will be detectable in the ancestral species. We describe qualitatively different call recognition mechanisms in sibling species of treefrogs. Whereas Hyla chrysoscelis uses pulse rate to recognize male calls, Hyla versicolor uses absolute measurements of pulse duration and interval duration. We found no evidence of either hidden preferences or pre-existing biases. The results are compared with similar data from katydids (Tettigonia sp.). In both taxa, the data are not adequately explained by current models of signal-receiver coevolution.

  10. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    PubMed

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L(2) penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
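
    The modified Cholesky idea on which the estimator builds can be sketched directly: regress each measurement on its predecessors so that T Σ Tᵀ = D, with T unit lower triangular and D diagonal. The sketch below uses plain least squares on a known covariance and omits the L2 penalty and mixture likelihood of the paper.

```python
# Modified Cholesky decomposition of a covariance matrix: T @ Sigma @ T.T = D, where row t
# of the unit lower-triangular T holds the negated coefficients from regressing measurement t
# on measurements 0..t-1, and D holds the innovation variances. No penalty term here.
import numpy as np

def modified_cholesky(sigma):
    p = sigma.shape[0]
    T = np.eye(p)
    d = np.zeros(p)
    d[0] = sigma[0, 0]
    for t in range(1, p):
        S11, s12 = sigma[:t, :t], sigma[:t, t]
        phi = np.linalg.solve(S11, s12)      # autoregressive coefficients
        T[t, :t] = -phi
        d[t] = sigma[t, t] - s12 @ phi       # innovation variance
    return T, np.diag(d)

sigma = np.array([[1.0, 0.5, 0.25],
                  [0.5, 1.0, 0.5],
                  [0.25, 0.5, 1.0]])
T, D = modified_cholesky(sigma)
print(np.allclose(T @ sigma @ T.T, D))       # True: the decomposition reconstructs Sigma
```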

  11. A Short-Term Population Model of the Suicide Risk: The Case of Spain.

    PubMed

    De la Poza, Elena; Jódar, Lucas

    2018-06-14

    A relevant proportion of deaths by suicide are attributed to other causes, so that part of the true number of suicides remains hidden. The existence of this hidden number of cases is explained by the nature of the problem: suicide involves violence and produces fear and social shame in victims' families, which favours a considerable number of suicides being recorded as accidents or natural deaths. This paper proposes a short-term discrete compartmental mathematical model to measure suicide risk for the case of Spain. The compartmental model classifies and quantifies the Spanish population within the age interval (16, 78) by degree of suicide risk and tracks the changes over time. Transitions between compartments are driven by a combination of quantitative and qualitative factors. Results are computed and simulations are performed to analyze the sensitivity of the model under uncertain coefficients.
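
    As an illustration of the bookkeeping in a discrete compartmental model of this kind, the sketch below propagates three hypothetical risk compartments through a column-stochastic transition matrix. All rates and population sizes are invented placeholders, not the coefficients estimated for Spain.

```python
# Generic discrete-time compartmental update (e.g., low/medium/high suicide-risk compartments).
# The transition matrix and initial populations are hypothetical placeholders.
import numpy as np

# Column-stochastic matrix: entry [i, j] is the fraction of compartment j moving to i per step.
P = np.array([[0.95, 0.05, 0.02],
              [0.04, 0.90, 0.08],
              [0.01, 0.05, 0.90]])

x = np.array([30.0e6, 1.5e6, 0.1e6])    # persons in each risk compartment at time zero
for step in range(8):                    # eight time steps (e.g., quarters)
    x = P @ x
print(x, x.sum())                        # compartment sizes after the simulation; total is conserved
```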

  12. Expert review on poliovirus immunity and transmission.

    PubMed

    Duintjer Tebbens, Radboud J; Pallansch, Mark A; Chumakov, Konstantin M; Halsey, Neal A; Hovi, Tapani; Minor, Philip D; Modlin, John F; Patriarca, Peter A; Sutter, Roland W; Wright, Peter F; Wassilak, Steven G F; Cochi, Stephen L; Kim, Jong-Hoon; Thompson, Kimberly M

    2013-04-01

    Successfully managing risks to achieve wild polioviruses (WPVs) eradication and address the complexities of oral poliovirus vaccine (OPV) cessation to stop all cases of paralytic poliomyelitis depends strongly on our collective understanding of poliovirus immunity and transmission. With increased shifting from OPV to inactivated poliovirus vaccine (IPV), numerous risk management choices motivate the need to understand the tradeoffs and uncertainties and to develop models to help inform decisions. The U.S. Centers for Disease Control and Prevention hosted a meeting of international experts in April 2010 to review the available literature relevant to poliovirus immunity and transmission. This expert review evaluates 66 OPV challenge studies and other evidence to support the development of quantitative models of poliovirus transmission and potential outbreaks. This review focuses on characterization of immunity as a function of exposure history in terms of susceptibility to excretion, duration of excretion, and concentration of excreted virus. We also discuss the evidence of waning of host immunity to poliovirus transmission, the relationship between the concentration of poliovirus excreted and infectiousness, the importance of different transmission routes, and the differences in transmissibility between OPV and WPV. We discuss the limitations of the available evidence for use in polio risk models, and conclude that despite the relatively large number of studies on immunity, very limited data exist to directly support quantification of model inputs related to transmission. Given the limitations in the evidence, we identify the need for expert input to derive quantitative model inputs from the existing data. © 2012 Society for Risk Analysis.

  13. Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data

    PubMed Central

    Gritsenko, Alexey A.; Hulsman, Marc; Reinders, Marcel J. T.; de Ridder, Dick

    2015-01-01

    Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP), a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP) model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates. PMID:26275099

  14. Unbiased Quantitative Models of Protein Translation Derived from Ribosome Profiling Data.

    PubMed

    Gritsenko, Alexey A; Hulsman, Marc; Reinders, Marcel J T; de Ridder, Dick

    2015-08-01

    Translation of RNA to protein is a core process for any living organism. While for some steps of this process the effect on protein production is understood, a holistic understanding of translation still remains elusive. In silico modelling is a promising approach for elucidating the process of protein synthesis. Although a number of computational models of the process have been proposed, their application is limited by the assumptions they make. Ribosome profiling (RP), a relatively new sequencing-based technique capable of recording snapshots of the locations of actively translating ribosomes, is a promising source of information for deriving unbiased data-driven translation models. However, quantitative analysis of RP data is challenging due to high measurement variance and the inability to discriminate between the number of ribosomes measured on a gene and their speed of translation. We propose a solution in the form of a novel multi-scale interpretation of RP data that allows for deriving models with translation dynamics extracted from the snapshots. We demonstrate the usefulness of this approach by simultaneously determining for the first time per-codon translation elongation and per-gene translation initiation rates of Saccharomyces cerevisiae from RP data for two versions of the Totally Asymmetric Exclusion Process (TASEP) model of translation. We do this in an unbiased fashion, by fitting the models using only RP data with a novel optimization scheme based on Monte Carlo simulation to keep the problem tractable. The fitted models match the data significantly better than existing models and their predictions show better agreement with several independent protein abundance datasets than existing models. Results additionally indicate that the tRNA pool adaptation hypothesis is incomplete, with evidence suggesting that tRNA post-transcriptional modifications and codon context may play a role in determining codon elongation rates.
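
    The sketch below is a minimal Monte Carlo simulation of the TASEP on a single lattice with homogeneous rates and a one-site ribosome footprint; the fitted per-codon elongation rates, per-gene initiation rates, and extended ribosome footprint of the paper are deliberately omitted.

```python
# Minimal random-sequential-update simulation of the Totally Asymmetric Exclusion Process
# (TASEP) on one mRNA-like lattice. Rates, length, and footprint are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
L, alpha, beta, k = 300, 0.1, 1.0, 1.0   # codons, initiation, termination, elongation probabilities
lattice = np.zeros(L, dtype=bool)        # True where a ribosome occupies a codon
completed = 0

for _ in range(200_000):
    i = rng.integers(-1, L)              # -1 encodes an initiation attempt
    if i == -1:
        if not lattice[0] and rng.random() < alpha:
            lattice[0] = True
    elif lattice[i]:
        if i == L - 1:
            if rng.random() < beta:      # termination releases the ribosome
                lattice[i] = False
                completed += 1
        elif not lattice[i + 1] and rng.random() < k:
            lattice[i] = False           # hop forward if the next codon is free
            lattice[i + 1] = True

print(f"ribosome density = {lattice.mean():.3f}, completed proteins = {completed}")
```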

  15. A methodology to select a wire insulation for use in habitable spacecraft.

    PubMed

    Paulos, T; Apostolakis, G

    1998-08-01

    This paper investigates electrical overheating events aboard a habitable spacecraft. The wire insulation involved in these failures plays a major role in the entire event scenario from threat development to detection and damage assessment. Ideally, if models of wire overheating events in microgravity existed, the various wire insulations under consideration could be quantitatively compared. However, these models do not exist. In this paper, a methodology is developed that can be used to select a wire insulation that is best suited for use in a habitable spacecraft. The results of this study show that, based upon the Analytic Hierarchy Process and simplifying assumptions, the criteria selected, and data used in the analysis, Tefzel is better than Teflon for use in a habitable spacecraft.
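
    The core Analytic Hierarchy Process step is deriving priority weights from a reciprocal pairwise-comparison matrix. The sketch below shows that step for a made-up three-criterion matrix; the actual criteria and judgments of the wire-insulation study are not reproduced here.

```python
# Analytic Hierarchy Process: priority weights from a reciprocal pairwise-comparison matrix
# via its principal eigenvector, plus a consistency ratio. The matrix is a made-up example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])            # hypothetical judgments over three criteria

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                               # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[i] - n) / (n - 1)       # consistency index
cr = ci / 0.58                             # Saaty's random index for n = 3
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```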

  16. Incorporating temporal and clinical reasoning in a new measure of continuity of care.

    PubMed Central

    Spooner, S. A.

    1994-01-01

    Previously described quantitative methods for measuring continuity of care have assumed that perfect continuity exists when a patient sees only one provider, regardless of the temporal pattern and clinical context of the visits. This paper describes an implementation of a new operational model of continuity--the Temporal Continuity Index--that takes into account time intervals between well visits in a pediatric residency continuity clinic. Ideal continuity in this model is achieved when intervals between visits are appropriate based on the age of the patient and clinical context of the encounters. The fundamental concept in this model is the expectation interval, which contains the length of the maximum ideal follow-up interval for a visit and the maximum follow-up interval. This paper describes an initial implementation of the TCI model and compares TCI calculations to previous quantitative methods and proposes its use as part of the assessment of resident education in outpatient settings. PMID:7950019

  17. Quantitative analysis of intra-Golgi transport shows intercisternal exchange for all cargo

    PubMed Central

    Dmitrieff, Serge; Rao, Madan; Sens, Pierre

    2013-01-01

    The mechanisms controlling the transport of proteins through the Golgi stack of mammalian and plant cells is the subject of intense debate, with two models, cisternal progression and intercisternal exchange, emerging as major contenders. A variety of transport experiments have claimed support for each of these models. We reevaluate these experiments using a single quantitative coarse-grained framework of intra-Golgi transport that accounts for both transport models and their many variants. Our analysis makes a definitive case for the existence of intercisternal exchange both for small membrane proteins and large protein complexes––this implies that membrane structures larger than the typical protein-coated vesicles must be involved in transport. Notwithstanding, we find that current observations on protein transport cannot rule out cisternal progression as contributing significantly to the transport process. To discriminate between the different models of intra-Golgi transport, we suggest experiments and an analysis based on our extended theoretical framework that compare the dynamics of transiting and resident proteins. PMID:24019488

  18. Measuring water and sediment discharge from a road plot with a settling basin and tipping bucket

    Treesearch

    Thomas A. Black; Charles H. Luce

    2013-01-01

    A simple empirical method quantifies water and sediment production from a forest road surface, and is well suited for calibration and validation of road sediment models. To apply this quantitative method, the hydrologic technician installs bordered plots on existing typical road segments and measures coarse sediment production in a settling tank. When a tipping bucket...

  19. Strong plasma turbulence in the earth's electron foreshock

    NASA Technical Reports Server (NTRS)

    Robinson, P. A.; Newman, D. L.

    1991-01-01

    A quantitative model is developed to account for the distribution in magnitude and location of the intense plasma waves observed in the earth's electron foreshock given the observed rms levels of waves. In this model, nonlinear strong-turbulence effects cause solitonlike coherent wave packets to form and decouple from incoherent background beam-excited weak turbulence, after which they convect downstream with the solar wind while collapsing to scales as short as 100 m and fields as high as 2 V/m. The existence of waves with energy densities above the strong-turbulence wave-collapse threshold is inferred from observations from IMP 6 and ISEE 1 and quantitative agreement is found between the predicted distribution of fields in an ensemble of such wave packets and the actual field distribution observed in situ by IMP 6. Predictions for the polarization of plasma waves and the bandwidth of ion-sound waves are also consistent with the observations. It is shown that strong-turbulence effects must be incorporated in any comprehensive theory of the propagation and evolution of electron beams in the foreshock. Previous arguments against the existence of strong turbulence in the foreshock are refuted.

  20. Comparison of GEANT4 very low energy cross section models with experimental data in water.

    PubMed

    Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C

    2010-09-01

    The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.

  1. Visual salience metrics for image inpainting

    NASA Astrophysics Data System (ADS)

    Ardis, Paul A.; Singhal, Amit

    2009-01-01

    Quantitative metrics for successful image inpainting currently do not exist, with researchers instead relying upon qualitative human comparisons to evaluate their methodologies and techniques. In an attempt to rectify this situation, we propose two new metrics to capture the notions of noticeability and visual intent in order to evaluate inpainting results. The proposed metrics use a quantitative measure of visual salience based upon a computational model of human visual attention. We demonstrate how these two metrics repeatably correlate with qualitative opinion in a human observer study, correctly identify the optimum uses for exemplar-based inpainting (as specified in the original publication), and match qualitative opinion in published examples.

  2. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    NASA Astrophysics Data System (ADS)

    Egan, James; McMillan, Normal; Denieffe, David

    2011-08-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology for the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement. These are the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental quantitations over existing methods are discussed.
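
    Currie's three limits take simple closed forms when the blank standard deviation is well characterized and the false-positive and false-negative rates are set to 5%. The sketch below evaluates them; applying these thresholds to optical-power measurements, as the paper proposes, is only an illustration here.

```python
# Currie's measurement limits for a signal judged against a well-characterized blank,
# in the simplified constant-sigma form with alpha = beta = 0.05.
def currie_limits(sigma_blank):
    critical_level = 1.645 * sigma_blank        # L_C: decision threshold for "detected"
    detection_limit = 3.29 * sigma_blank        # L_D: true level detectable with 95% power
    determination_limit = 10.0 * sigma_blank    # L_Q: level quantifiable at ~10% relative precision
    return critical_level, detection_limit, determination_limit

lc, ld, lq = currie_limits(sigma_blank=0.02)    # e.g., blank noise in arbitrary optical-power units
print(lc, ld, lq)
```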

  3. A New Tool for Local Manipulation of Neuronal Micro-Circuitry with Ions and Force

    DTIC Science & Technology

    2017-02-07

    Final report, 7/30/2015-9/30/2016. Keywords: neurons, memory, connectivity, microcircuitry. Our goal is to compute the complete functional... memory trace. To date no functional connectivity map exists for living neurons at the resolution proposed here. In fact, a quantitative model of the... propagation signals are also present in cultures of human iPS-derived neurons and thus could be used to study axonal physiology in human disease models.

  4. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    PubMed Central

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

    Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270

  5. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    PubMed

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  6. A quantitative speciation model for the adsorption of organic pollutants on activated carbon.

    PubMed

    Grivé, M; García, D; Domènech, C; Richard, L; Rojo, I; Martínez, X; Rovira, M

    2013-01-01

    Granular activated carbon (GAC) is commonly used as adsorbent in water treatment plants given its high capacity for retaining organic pollutants in aqueous phase. The current knowledge on GAC behaviour is essentially empirical, and no quantitative description of the chemical relationships between GAC surface groups and pollutants has been proposed. In this paper, we describe a quantitative model for the adsorption of atrazine onto GAC surface. The model is based on results of potentiometric titrations and three types of adsorption experiments which have been carried out in order to determine the nature and distribution of the functional groups on the GAC surface, and evaluate the adsorption characteristics of GAC towards atrazine. Potentiometric titrations have indicated the existence of at least two different families of chemical groups on the GAC surface, including phenolic- and benzoic-type surface groups. Adsorption experiments with atrazine have been satisfactorily modelled with the geochemical code PhreeqC, assuming that atrazine is sorbed onto the GAC surface in equilibrium (log Ks = 5.1 ± 0.5). Independent thermodynamic calculations suggest a possible adsorption of atrazine on a benzoic derivative. The present work opens a new approach for improving the adsorption capabilities of GAC towards organic pollutants by modifying its chemical properties.

  7. Modelling the co-evolution of indirect genetic effects and inherited variability.

    PubMed

    Marjanovic, Jovana; Mulder, Han A; Rönnegård, Lars; Bijma, Piter

    2018-03-28

    When individuals interact, their phenotypes may be affected not only by their own genes but also by genes in their social partners. This phenomenon is known as Indirect Genetic Effects (IGEs). In aquaculture species and some plants, however, competition not only affects trait levels of individuals, but also inflates variability of trait values among individuals. In the field of quantitative genetics, the variability of trait values has been studied as a quantitative trait in itself, and is often referred to as inherited variability. Such studies, however, consider only the genetic effect of the focal individual on trait variability and do not make a connection to competition. Although the observed phenotypic relationship between competition and variability suggests an underlying genetic relationship, the current quantitative genetic models of IGE and inherited variability do not allow for such a relationship. The lack of quantitative genetic models that connect IGEs to inherited variability limits our understanding of the potential of variability to respond to selection, both in nature and agriculture. Models of trait levels, for example, show that IGEs may considerably change heritable variation in trait values. Currently, we lack the tools to investigate whether this result extends to variability of trait values. Here we present a model that integrates IGEs and inherited variability. In this model, the target phenotype, say growth rate, is a function of the genetic and environmental effects of the focal individual and of the difference in trait value between the social partner and the focal individual, multiplied by a regression coefficient. The regression coefficient is a genetic trait, which is a measure of cooperation; a negative value indicates competition, a positive value cooperation, and an increasing value due to selection indicates the evolution of cooperation. In contrast to the existing quantitative genetic models, our model allows for co-evolution of IGEs and variability, as the regression coefficient can respond to selection. Our simulations show that the model results in increased variability of body weight with increasing competition. When competition decreases, i.e., cooperation evolves, variability becomes significantly smaller. Hence, our model facilitates quantitative genetic studies on the relationship between IGEs and inherited variability. Moreover, our findings suggest that we may have been overlooking an entire level of genetic variation in variability, the one due to IGEs.
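
    One way to write the verbal model described above in symbols is sketched below; the notation is ours, not necessarily the authors'.

```latex
% P_i: trait value of focal individual i; P_j: trait value of its social partner;
% A_i, E_i: additive genetic and environmental effects of i;
% psi_i: heritable regression coefficient (negative = competition, positive = cooperation).
\begin{equation}
  P_i = \mu + A_i + E_i + \psi_i \,(P_j - P_i),
  \qquad
  \psi_i = \mu_\psi + A_{\psi,i} + E_{\psi,i}
\end{equation}
```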

  8. Infrasonic waves generated by supersonic auroral arcs

    NASA Astrophysics Data System (ADS)

    Pasko, Victor P.

    2012-10-01

    A finite-difference time-domain (FDTD) model of infrasound propagation in a realistic atmosphere is used to provide quantitative interpretation of infrasonic waves produced by auroral arcs moving with supersonic speed. The Lorentz force and Joule heating are discussed in the existing literature as primary sources producing infrasound waves in the frequency range 0.1-0.01 Hz associated with the auroral electrojet. The results are consistent with original ideas of Swift (1973) and demonstrate that the synchronization of the speed of auroral arc and phase speed of the acoustic wave in the electrojet volume is an important condition for generation of magnitudes and frequency contents of infrasonic waves observable on the ground. The reported modeling also allows accurate quantitative reproduction of previously observed complex infrasonic waveforms including direct shock and reflected shockwaves, which are refracted back to the earth by the thermosphere.

  9. Using quantitative disease dynamics as a tool for guiding response to avian influenza in poultry in the United States of America☆

    PubMed Central

    Pepin, K.M.; Spackman, E.; Brown, J.D.; Pabilonia, K.L.; Garber, L.P.; Weaver, J.T.; Kennedy, D.A.; Patyk, K.A.; Huyvaert, K.P.; Miller, R.S.; Franklin, A.B.; Pedersen, K.; Bogich, T.L.; Rohani, P.; Shriner, S.A.; Webb, C.T.; Riley, S.

    2014-01-01

    Wild birds are the primary source of genetic diversity for influenza A viruses that eventually emerge in poultry and humans. Much progress has been made in the descriptive ecology of avian influenza viruses (AIVs), but contributions are less evident from quantitative studies (e.g., those including disease dynamic models). Transmission between host species, individuals and flocks has not been measured with sufficient accuracy to allow robust quantitative evaluation of alternate control protocols. We focused on the United States of America (USA) as a case study for determining the state of our quantitative knowledge of potential AIV emergence processes from wild hosts to poultry. We identified priorities for quantitative research that would build on existing tools for responding to AIV in poultry and concluded that the following knowledge gaps can be addressed with current empirical data: (1) quantification of the spatio-temporal relationships between AIV prevalence in wild hosts and poultry populations, (2) understanding how the structure of different poultry sectors impacts within-flock transmission, (3) determining mechanisms and rates of between-farm spread, and (4) validating current policy-decision tools with data. The modeling studies we recommend will improve our mechanistic understanding of potential AIV transmission patterns in USA poultry, leading to improved measures of accuracy and reduced uncertainty when evaluating alternative control strategies. PMID:24462191

  10. Toward the prediction of class I and II mouse major histocompatibility complex-peptide-binding affinity: in silico bioinformatic step-by-step guide using quantitative structure-activity relationships.

    PubMed

    Hattotuwagama, Channa K; Doytchinova, Irini A; Flower, Darren R

    2007-01-01

    Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative, classification methods, but these are now giving way to quantitative regression methods. We review three methods--a 2D-QSAR additive-partial least squares (PLS) and a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method--which can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets. The third method is an iterative self-consistent (ISC) PLS-based additive method, which is a recently developed extension to the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we give a step-by-step guide to making these predictions and assessing their reliability; the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online at the URL http://www.jenner.ac.uk/MHCPred.
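
    As a rough illustration of the additive 2D-QSAR/PLS idea, the sketch below encodes peptides as position-specific amino-acid indicators and fits a PLS regression to synthetic affinities. It uses randomly generated data rather than the AntiJen peptide sets, and it is not the published SYBYL or MHCPred implementation.

```python
# Additive-style 2D-QSAR with partial least squares: encode each 9-mer as binary indicators
# (20 amino acids x 9 positions) and regress a placeholder pIC50 on them.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
amino_acids = "ACDEFGHIKLMNPQRSTVWY"

def encode(peptide):
    x = np.zeros(len(peptide) * 20)
    for pos, aa in enumerate(peptide):
        x[pos * 20 + amino_acids.index(aa)] = 1.0
    return x

peptides = ["".join(rng.choice(list(amino_acids), size=9)) for _ in range(300)]
X = np.array([encode(p) for p in peptides])
y = X @ rng.normal(size=X.shape[1]) * 0.1 + rng.normal(scale=0.3, size=len(peptides))  # fake pIC50

pls = PLSRegression(n_components=5).fit(X, y)
print("first predicted affinity:", float(pls.predict(X[:1])[0, 0]))
```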

  11. Shallow cells in directional solidification

    NASA Technical Reports Server (NTRS)

    Merchant, G. J.; Davis, S. H.

    1989-01-01

    The existing theory on two-dimensional transitions (appropriate to thin parallel-plate geometries) is presented in such a way that it is possible to identify easily conditions for the onset of shallow cells. Conditions are given under which succinonitrile-acetone mixtures should undergo supercritical bifurcation in experimentally accessible ranges. These results suggest a means for the quantitative test of the Mullins and Sekerka (1964) model and its weakly nonlinear extensions.

  12. The Use of Mode of Action Information in Risk Assessment: Quantitative Key Events/Dose-Response Framework for Modeling the Dose-Response for Key Events

    EPA Science Inventory

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next-generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Act...

  13. CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis

    PubMed Central

    Choi, Kyoungah; Lee, Impyeong

    2015-01-01

    We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces surveillance coverage index, which is newly defined in this study as a quantitative measure for surveillance performance. This index indicates the proportion of the space being monitored with a sufficient resolution to the entire space of the target area. It is determined by computing surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with a CCTV system. We present full mathematical derivation for the resolution, which depends on the location and orientation of the object as well as the geometric model of a camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative-analysis results, compelling us to examine the design of the CCTV system prior to its installation and understand the surveillance capability of an existing CCTV system. PMID:26389909

  14. Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models

    NASA Astrophysics Data System (ADS)

    Wellmann, J. Florian; Regenauer-Lieb, Klaus

    2012-03-01

    Analyzing, visualizing and communicating uncertainties are important issues as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy was introduced in the 50s and defines a scalar value at every location in the model for predictability. We show that this method not only provides a quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how is the model quality interconnected with the used input data) and model evolution (i.e. does new data - or a changed geological hypothesis - optimize the model). In other words information entropy is a powerful measure to be used for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy on a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to classify the indefiniteness of single units and the mean entropy of a model quantitatively. Due to the relationship of this measure to the missing information, we expect the method to have a great potential in many types of geoscientific data assimilation problems — beyond pure visualization.
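
    The entropy measure itself is straightforward to compute once per-cell unit probabilities are available, for example from an ensemble of simulated models. The sketch below does this for a synthetic ensemble; the grid, units, and probabilities are placeholders.

```python
# Information entropy of a gridded geological model: at every cell, H = -sum_i p_i * log(p_i),
# where p_i is the probability that the cell belongs to unit i, estimated from an ensemble.
import numpy as np

rng = np.random.default_rng(3)
n_models, nx, nz, n_units = 50, 40, 20, 3
ensemble = rng.integers(0, n_units, size=(n_models, nx, nz))     # unit index per cell, per realization

# per-cell probability of each unit across the ensemble
p = np.stack([(ensemble == u).mean(axis=0) for u in range(n_units)], axis=-1)

with np.errstate(divide="ignore", invalid="ignore"):
    H = np.nansum(np.where(p > 0, -p * np.log(p), 0.0), axis=-1)  # convention: 0 * log 0 = 0

print("mean model entropy:", H.mean(), "maximum possible:", np.log(n_units))
```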

  15. Behavioral momentum and resurgence: Effects of time in extinction and repeated resurgence tests

    PubMed Central

    Shahan, Timothy A.

    2014-01-01

    Resurgence is an increase in a previously extinguished operant response that occurs if an alternative reinforcement introduced during extinction is removed. Shahan and Sweeney (2011) developed a quantitative model of resurgence based on behavioral momentum theory that captures existing data well and predicts that resurgence should decrease as time in extinction and exposure to the alternative reinforcement increases. Two experiments tested this prediction. The data from Experiment 1 suggested that without a return to baseline, resurgence decreases with increased exposure to alternative reinforcement and to extinction of the target response. Experiment 2 tested the predictions of the model across two conditions, one with constant alternative reinforcement for five sessions, and the other with alternative reinforcement removed three times. In both conditions, the alternative reinforcement was removed for the final test session. Experiment 2 again demonstrated a decrease in relapse across repeated resurgence tests. Furthermore, comparably little resurgence was observed at the same time point in extinction in the final test, despite dissimilar previous exposures to alternative reinforcement removal. The quantitative model provided a good description of the observed data in both experiments. More broadly, these data suggest that increased exposure to extinction may be a successful strategy to reduce resurgence. The relationship between these data and existing tests of the effect of time in extinction on resurgence is discussed. PMID:23982985

  16. Stochastic Simulation of Actin Dynamics Reveals the Role of Annealing and Fragmentation

    PubMed Central

    Fass, Joseph; Pak, Chi; Bamburg, James; Mogilner, Alex

    2008-01-01

    Recent observations of F-actin dynamics call for theoretical models to interpret and understand the quantitative data. A number of existing models rely on simplifications and do not take into account F-actin fragmentation and annealing. We use Gillespie’s algorithm for stochastic simulations of the F-actin dynamics including fragmentation and annealing. The simulations vividly illustrate that fragmentation and annealing have little influence on the shape of the polymerization curve and on nucleotide profiles within filaments but drastically affect the F-actin length distribution, making it exponential. We find that recent surprising measurements of high length diffusivity at the critical concentration cannot be explained by fragmentation and annealing events unless both fragmentation rates and frequency of undetected fragmentation and annealing events are greater than previously thought. The simulations compare well with experimentally measured actin polymerization data and lend additional support to a number of existing theoretical models. PMID:18279896
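
    A stripped-down Gillespie simulation of a single filament is sketched below, with only monomer addition and loss; the fragmentation, annealing, and nucleotide bookkeeping that the paper focuses on are omitted, and the rate constants are typical literature values rather than those of the study.

```python
# Gillespie simulation of one actin filament with two reactions: monomer addition at
# k_on * [G-actin] and loss at k_off. Fragmentation and annealing are deliberately omitted.
import numpy as np

rng = np.random.default_rng(4)
k_on, k_off = 11.6, 1.4          # 1/(uM*s) and 1/s, typical barbed-end literature values
conc = 0.2                       # free G-actin concentration in uM, near the critical concentration
length, t, t_end = 500, 0.0, 100.0
n_events = 0

while t < t_end:
    rates = np.array([k_on * conc, k_off if length > 0 else 0.0])
    total = rates.sum()
    t += rng.exponential(1.0 / total)           # waiting time to the next event
    if rng.random() < rates[0] / total:
        length += 1                             # monomer added
    else:
        length -= 1                             # monomer lost
    n_events += 1

print("final length:", length, "events simulated:", n_events)
```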

  17. Making riverscapes real

    NASA Astrophysics Data System (ADS)

    Carbonneau, Patrice; Fonstad, Mark A.; Marcus, W. Andrew; Dugdale, Stephen J.

    2012-01-01

    The structure and function of rivers have long been characterized either by: (1) qualitative models such as the River Continuum Concept or Serial Discontinuity Concept which paint broad descriptive portraits of how river habitats and communities vary, or (2) quantitative models, such as downstream hydraulic geometry, which rely on a limited number of measurements spread widely throughout a river basin. In contrast, authors such as Fausch et al. (2002) and Wiens (2002) proposed applying existing quantitative, spatially comprehensive ecology and landscape ecology methods to rivers. This new framework for river sciences which preserves variability and spatial relationships is called a riverine landscape or a 'riverscape'. Application of this riverscape concept requires information on the spatial distribution of organism-scale habitats throughout entire river systems. This article examines the ways in which recent technical and methodological developments can allow us to quantitatively implement and realize the riverscape concept. Using 3-cm true color aerial photos and 5-m resolution elevation data from the River Tromie, Scotland, we apply the newly developed Fluvial Information System which integrates a suite of cutting edge, high resolution, remote sensing methods in a spatially explicit framework. This new integrated approach allows for the extraction of primary fluvial variables such as width, depth, particle size, and elevation. From these first-order variables, we derive second-order geomorphic and hydraulic variables including velocity, stream power, Froude number and shear stress. Channel slope can be approximated from available topographic data. Based on these first and second-order variables, we produce riverscape metrics that begin to explore how geomorphic structures may influence river habitats, including connectivity, patchiness of habitat, and habitat distributions. The results show a complex interplay of geomorphic variable and habitat patchiness that is not predicted by existing fluvial theory. Riverscapes, thus, challenge the existing understanding of how rivers structure themselves and will force development of new paradigms.

  18. A general nonlinear magnetomechanical model for ferromagnetic materials under a constant weak magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Pengpeng; Zheng, Xiaojing, E-mail: xjzheng@xidian.edu.cn; Jin, Ke

    2016-04-14

    Weak magnetic nondestructive testing (e.g., metal magnetic memory method) concerns the magnetization variation of ferromagnetic materials due to the applied load and the weak magnetic field surrounding them. One key issue in these nondestructive technologies is the magnetomechanical effect for quantitative evaluation of magnetization state from stress–strain condition. A representative phenomenological model has been proposed to explain the magnetomechanical effect by Jiles in 1995. However, Jiles' model has some deficiencies in quantification; for instance, there is a visible difference between theoretical prediction and experimental measurements on the stress–magnetization curve, especially in the compression case. Based on the thermodynamic relations and the approach law of irreversible magnetization, a nonlinear coupled model is proposed to improve the quantitative evaluation of the magnetomechanical effect. Excellent agreement has been achieved between the predictions from the present model and previous experimental results. In comparison with Jiles' model, the prediction accuracy is improved greatly by the present model, particularly for the compression case. A detailed study has also been performed to reveal the effects of initial magnetization status, cyclic loading, and demagnetization factor on the magnetomechanical effect. Our theoretical model reveals that the stable weak magnetic signals of nondestructive testing after multiple cyclic loads are attributed to the first few cycles eliminating most of the irreversible magnetization. Remarkably, the existence of a demagnetization field can weaken the magnetomechanical effect and therefore significantly reduce the testing capability. This theoretical model can be adopted to quantitatively analyze magnetic memory signals, and then can be applied in weak magnetic nondestructive testing.

  19. Clines in quantitative traits: The role of migration patterns and selection scenarios

    PubMed Central

    Geroldinger, Ludwig; Bürger, Reinhard

    2015-01-01

    The existence, uniqueness, and shape of clines in a quantitative trait under selection toward a spatially varying optimum is studied. The focus is on deterministic diploid two-locus n-deme models subject to various migration patterns and selection scenarios. Migration patterns may exhibit isolation by distance, as in the stepping-stone model, or random dispersal, as in the island model. The phenotypic optimum may change abruptly in a single environmental step, more gradually, or not at all. Symmetry assumptions are imposed on phenotypic optima and migration rates. We study clines in the mean, variance, and linkage disequilibrium (LD). Clines result from polymorphic equilibria. The possible equilibrium configurations are determined as functions of the migration rate. Whereas for weak migration, many polymorphic equilibria may be simultaneously stable, their number decreases with increasing migration rate. Also for intermediate migration rates polymorphic equilibria are in general not unique, however, for loci of equal effects the corresponding clines in the mean, variance, and LD are unique. For sufficiently strong migration, no polymorphism is maintained. Both migration pattern and selection scenario exert strong influence on the existence and shape of clines. The results for discrete demes are compared with those from models in which space varies continuously and dispersal is modeled by diffusion. Comparisons with previous studies, which investigated clines under neutrality or under linkage equilibrium, are performed. If there is no long-distance migration, the environment does not change abruptly, and linkage is not very tight, populations are almost everywhere close to linkage equilibrium. PMID:25446959

  20. Tissue material properties and computational modelling of the human tibiofemoral joint: a critical review

    PubMed Central

    Akhtar, Riaz; Comerford, Eithne J.; Bates, Karl T.

    2018-01-01

    Understanding how structural and functional alterations of individual tissues impact on whole-joint function is challenging, particularly in humans where direct invasive experimentation is difficult. Finite element (FE) computational models produce quantitative predictions of the mechanical and physiological behaviour of multiple tissues simultaneously, thereby providing a means to study changes that occur through healthy ageing and disease such as osteoarthritis (OA). As a result, significant research investment has been placed in developing such models of the human knee. Previous work has highlighted that model predictions are highly sensitive to the various inputs used to build them, particularly the mathematical definition of material properties of biological tissues. The goal of this systematic review is two-fold. First, we provide a comprehensive summation and evaluation of existing linear elastic material property data for human tibiofemoral joint tissues, tabulating numerical values as a reference resource for future studies. Second, we review efforts to model tibiofemoral joint mechanical behaviour through FE modelling with particular focus on how studies have sourced tissue material properties. The last decade has seen a renaissance in material testing fuelled by development of a variety of new engineering techniques that allow the mechanical behaviour of both soft and hard tissues to be characterised at a spectrum of scales from nano- to bulk tissue level. As a result, there now exists an extremely broad range of published values for human tibiofemoral joint tissues. However, our systematic review highlights gaps and ambiguities that mean quantitative understanding of how tissue material properties alter with age and OA is limited. It is therefore currently challenging to construct FE models of the knee that are truly representative of a specific age or disease-state. Consequently, recent tibiofemoral joint FE models have been highly generic in terms of material properties even relying on non-human data from multiple species. We highlight this by critically evaluating current ability to quantitatively compare and model (1) young and old and (2) healthy and OA human tibiofemoral joints. We suggest that future research into both healthy and diseased knee function will benefit greatly from a subject- or cohort-specific approach in which FE models are constructed using material properties, medical imagery and loading data from cohorts with consistent demographics and/or disease states. PMID:29379690

  1. Tissue material properties and computational modelling of the human tibiofemoral joint: a critical review.

    PubMed

    Peters, Abby E; Akhtar, Riaz; Comerford, Eithne J; Bates, Karl T

    2018-01-01

    Understanding how structural and functional alterations of individual tissues impact on whole-joint function is challenging, particularly in humans where direct invasive experimentation is difficult. Finite element (FE) computational models produce quantitative predictions of the mechanical and physiological behaviour of multiple tissues simultaneously, thereby providing a means to study changes that occur through healthy ageing and disease such as osteoarthritis (OA). As a result, significant research investment has been placed in developing such models of the human knee. Previous work has highlighted that model predictions are highly sensitive to the various inputs used to build them, particularly the mathematical definition of material properties of biological tissues. The goal of this systematic review is two-fold. First, we provide a comprehensive summation and evaluation of existing linear elastic material property data for human tibiofemoral joint tissues, tabulating numerical values as a reference resource for future studies. Second, we review efforts to model tibiofemoral joint mechanical behaviour through FE modelling with particular focus on how studies have sourced tissue material properties. The last decade has seen a renaissance in material testing fuelled by development of a variety of new engineering techniques that allow the mechanical behaviour of both soft and hard tissues to be characterised at a spectrum of scales from nano- to bulk tissue level. As a result, there now exists an extremely broad range of published values for human tibiofemoral joint tissues. However, our systematic review highlights gaps and ambiguities that mean quantitative understanding of how tissue material properties alter with age and OA is limited. It is therefore currently challenging to construct FE models of the knee that are truly representative of a specific age or disease-state. Consequently, recent tibiofemoral joint FE models have been highly generic in terms of material properties even relying on non-human data from multiple species. We highlight this by critically evaluating current ability to quantitatively compare and model (1) young and old and (2) healthy and OA human tibiofemoral joints. We suggest that future research into both healthy and diseased knee function will benefit greatly from a subject- or cohort-specific approach in which FE models are constructed using material properties, medical imagery and loading data from cohorts with consistent demographics and/or disease states.

  2. The analysis of morphometric data on rocky mountain wolves and arctic wolves using statistical method

    NASA Astrophysics Data System (ADS)

    Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Hamzah, Nor Shamsidah Amir; Nor, Maria Elena; Ahmad, Noor’ani; Azia Hazida Mohamad Azmi, Nur; Latip, Muhammad Faez Ab; Hilmi Azman, Ahmad

    2018-04-01

    Morphometrics is a quantitative analysis that depends on the shape and size of several specimens. Morphometric quantitative analyses are commonly used to analyse the fossil record and the shape and size of specimens, among other applications. The aim of the study is to find the differences between rocky mountain wolves and arctic wolves based on gender. The sample utilised secondary data which included seven independent variables and two dependent variables. Statistical modelling was used in the analysis, such as the analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA). The results showed that arctic wolves and rocky mountain wolves differ based on the independent factors and gender.
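
    The sketch below reproduces the type of analysis described, running a MANOVA of two dependent measurements against wolf group and sex with statsmodels. The data are synthetic stand-ins, not the secondary morphometric data set used in the study.

```python
# MANOVA of two (synthetic) skull measurements against wolf group and sex.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(5)
n = 60
df = pd.DataFrame({
    "group": rng.choice(["rocky_mountain", "arctic"], size=n),
    "sex": rng.choice(["male", "female"], size=n),
})
df["skull_length"] = 250 + 5 * (df["group"] == "arctic") + 4 * (df["sex"] == "male") + rng.normal(0, 3, n)
df["skull_width"] = 130 + 3 * (df["group"] == "arctic") + 2 * (df["sex"] == "male") + rng.normal(0, 2, n)

fit = MANOVA.from_formula("skull_length + skull_width ~ group + sex", data=df)
print(fit.mv_test())    # Wilks' lambda, Pillai's trace, etc. for each factor
```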

  3. Transmission of Bacterial Zoonotic Pathogens between Pets and Humans: The Role of Pet Food.

    PubMed

    Lambertini, Elisabetta; Buchanan, Robert L; Narrod, Clare; Pradhan, Abani K

    2016-01-01

    Recent Salmonella outbreaks associated with dry pet food and treats raised the level of concern for these products as a vehicle of pathogen exposure for both pets and their owners. Sufficient specific data to characterize the microbiological and risk profiles of this class of products are currently lacking. This systematic review summarizes existing data on the main variables needed to support an ingredients-to-consumer quantitative risk model to (1) describe the microbial ecology of bacterial pathogens in the dry pet food production chain, (2) estimate pet exposure to pathogens through dry food consumption, and (3) assess human exposure and illness incidence due to contact with pet food and pets in the household. Risk models populated with the data summarized here will provide a tool to quantitatively address the emerging public health concerns associated with pet food and the effectiveness of mitigation measures. Results of such models can provide a basis for improvements in production processes, risk communication to consumers, and regulatory action.

  4. Modelling Ebola virus dynamics: Implications for therapy.

    PubMed

    Martyushev, Alexey; Nakaoka, Shinji; Sato, Kei; Noda, Takeshi; Iwami, Shingo

    2016-11-01

    Ebola virus (EBOV) causes a severe, often fatal Ebola virus disease (EVD), for which no approved antivirals exist. Recently, some promising anti-EBOV drugs, which are experimentally potent in animal models, have been developed. However, because the quantitative dynamics of EBOV replication in humans is uncertain, it remains unclear how much antiviral suppression of viral replication affects EVD outcome in patients. Here, we developed a novel mathematical model to quantitatively analyse human viral load data obtained during the 2000/01 Uganda EBOV outbreak and evaluated the effects of different antivirals. We found that nucleoside analogue- and siRNA-based therapies are effective if a therapy with a >50% inhibition rate is initiated within a few days post-symptom-onset. In contrast, antibody-based therapy requires not only a higher inhibition rate but also an earlier administration, especially for otherwise fatal cases. Our results demonstrate that an appropriate choice of EBOV-specific drugs is required for effective EVD treatment. Copyright © 2016 Elsevier B.V. All rights reserved.
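
    The abstract does not specify the model equations, but a standard target-cell-limited viral dynamics model with an antiviral term acting on virion production illustrates the kind of calculation involved; the sketch below is a generic formulation with purely illustrative parameters, not the authors' fitted model.

```python
# Minimal sketch of a generic target-cell-limited viral dynamics model with an
# antiviral acting on viral production. This is a standard textbook formulation,
# not the specific model fitted in the study; all parameter values are illustrative.
import numpy as np
from scipy.integrate import odeint

def ebov_model(y, t, beta, delta, p, c, eps, t_start):
    T, I, V = y                                  # target cells, infected cells, free virus
    inhibition = eps if t >= t_start else 0.0    # therapy switched on at t_start
    dT = -beta * T * V
    dI = beta * T * V - delta * I
    dV = (1.0 - inhibition) * p * I - c * V
    return [dT, dI, dV]

t = np.linspace(0, 20, 2001)                     # days post symptom onset
y0 = [1e5, 0.0, 1.0]                             # illustrative initial conditions
params = dict(beta=3e-6, delta=0.5, p=200.0, c=5.0)

untreated = odeint(ebov_model, y0, t, args=(*params.values(), 0.0, 0.0))
treated = odeint(ebov_model, y0, t, args=(*params.values(), 0.6, 3.0))  # 60% inhibition from day 3

print("peak log10 viral load, untreated:", np.log10(untreated[:, 2].max()))
print("peak log10 viral load, treated:  ", np.log10(treated[:, 2].max()))
```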

  5. How Well Does LCA Model Land Use Impacts on Biodiversity?--A Comparison with Approaches from Ecology and Conservation.

    PubMed

    Curran, Michael; de Souza, Danielle Maia; Antón, Assumpció; Teixeira, Ricardo F M; Michelsen, Ottar; Vidal-Legaz, Beatriz; Sala, Serenella; Milà i Canals, Llorenç

    2016-03-15

    The modeling of land use impacts on biodiversity is considered a priority in life cycle assessment (LCA). Many diverging approaches have been proposed in an expanding literature on the topic. The UNEP/SETAC Life Cycle Initiative is engaged in building consensus on a shared modeling framework to highlight best-practice and guide model application by practitioners. In this paper, we evaluated the performance of 31 models from both the LCA and the ecology/conservation literature (20 from LCA, 11 from non-LCA fields) according to a set of criteria reflecting (i) model completeness, (ii) biodiversity representation, (iii) impact pathway coverage, (iv) scientific quality, and (v) stakeholder acceptance. We show that LCA models tend to perform worse than those from ecology and conservation (although not significantly), implying room for improvement. We identify seven best-practice recommendations that can be implemented immediately to improve LCA models based on existing approaches in the literature. We further propose building a "consensus model" through weighted averaging of existing information, to complement future development. While our research focuses on conceptual model design, further quantitative comparison of promising models in shared case studies is an essential prerequisite for future informed model choice.

  6. Quantitative EEG features selection in the classification of attention and response control in the children and adolescents with attention deficit hyperactivity disorder.

    PubMed

    Bashiri, Azadeh; Shahmoradi, Leila; Beigy, Hamid; Savareh, Behrouz A; Nosratabadi, Masood; N Kalhori, Sharareh R; Ghazisaeedi, Marjan

    2018-06-01

    Quantitative EEG gives valuable information in the clinical evaluation of psychological disorders. The purpose of the present study is to identify the most prominent features of quantitative electroencephalography (QEEG) that affect attention and response control parameters in children with attention deficit hyperactivity disorder. The QEEG features and the Integrated Visual and Auditory Continuous Performance Test (IVA-CPT) of 95 attention deficit hyperactivity disorder subjects were preprocessed by the Independent Evaluation Criterion for Binary Classification. Then, the importance of the selected features in the classification of the desired outputs was evaluated using an artificial neural network. Findings uncovered the highest-ranked QEEG features for each IVA-CPT parameter related to attention and response control. The designed model could help therapists determine the presence or absence of deficits in attention and response control relying on QEEG.
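
    A minimal sketch of the feature-ranking-plus-neural-network workflow is shown below using scikit-learn; the ranking criterion (mutual information) stands in for the Independent Evaluation Criterion named above, and the data file and column names are hypothetical.

```python
# Illustrative sketch: rank QEEG features and train a small neural network to
# classify an IVA-CPT-derived label. The ranking criterion (mutual information)
# is a stand-in for the paper's criterion; the file and column names are hypothetical.
import pandas as pd
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

data = pd.read_csv("qeeg_features.csv")          # hypothetical file
X = data.drop(columns=["attention_deficit"])     # QEEG features
y = data["attention_deficit"]                    # binary IVA-CPT-based label

# Rank features and keep the top 20.
scores = mutual_info_classif(X, y, random_state=0)
top = X.columns[scores.argsort()[::-1][:20]]

X_train, X_test, y_train, y_test = train_test_split(
    X[top], y, test_size=0.3, random_state=0, stratify=y)
scaler = StandardScaler().fit(X_train)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print("held-out accuracy:", clf.score(scaler.transform(X_test), y_test))
```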

  7. Quantitative Evaluation Method of Each Generation Margin for Power System Planning

    NASA Astrophysics Data System (ADS)

    Su, Su; Tanaka, Kazuyuki

    As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities. An efficient system planning method is therefore needed. This paper proposes a quantitative evaluation method for the (N-1) generation margin that considers both overload and voltage stability restrictions. For the generation margin related to overload, a fast solution method that avoids recalculating the (N-1) Y-matrix is proposed. For voltage stability, an efficient method to search for the stability limit is proposed. The IEEE30 model system, which is composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method reduces the computational cost of evaluating the generation margin related to overload under the (N-1) condition and quantifies its value.

  8. Transforming Boolean models to continuous models: methodology and application to T-cell receptor signaling

    PubMed Central

    Wittmann, Dominik M; Krumsiek, Jan; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Klamt, Steffen; Theis, Fabian J

    2009-01-01

    Background The understanding of regulatory and signaling networks has long been a core objective in Systems Biology. Knowledge about these networks is mainly of qualitative nature, which allows the construction of Boolean models, where the state of a component is either 'off' or 'on'. While often able to capture the essential behavior of a network, these models can never reproduce detailed time courses of concentration levels. Nowadays however, experiments yield more and more quantitative data. An obvious question therefore is how qualitative models can be used to explain and predict the outcome of these experiments. Results In this contribution we present a canonical way of transforming Boolean into continuous models, where the use of multivariate polynomial interpolation allows transformation of logic operations into a system of ordinary differential equations (ODE). The method is standardized and can readily be applied to large networks. Other, more limited approaches to this task are briefly reviewed and compared. Moreover, we discuss and generalize existing theoretical results on the relation between Boolean and continuous models. As a test case a logical model is transformed into an extensive continuous ODE model describing the activation of T-cells. We discuss how parameters for this model can be determined such that quantitative experimental results are explained and predicted, including time-courses for multiple ligand concentrations and binding affinities of different ligands. This shows that from the continuous model we may obtain biological insights not evident from the discrete one. Conclusion The presented approach will facilitate the interaction between modeling and experiments. Moreover, it provides a straightforward way to apply quantitative analysis methods to qualitatively described systems. PMID:19785753
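
    The core of the transformation can be illustrated on a toy network: each Boolean update rule is replaced by its multilinear interpolation on the unit cube, which then drives an ODE with a relaxation time scale. The sketch below is a minimal illustration of this idea, not the full method or the T-cell receptor model described above.

```python
# Minimal illustration of turning Boolean update rules into ODEs via multilinear
# interpolation, in the spirit of the approach described above. This toy
# three-node network is illustrative, not the T-cell receptor model.
import numpy as np
from scipy.integrate import odeint

# Boolean rules:  A* = input,  B* = A AND NOT C,  C* = A OR B
# Multilinear interpolation of each rule on [0,1]^n:
def B_A(u):        return u                      # identity
def B_B(a, c):     return a * (1.0 - c)          # AND NOT
def B_C(a, b):     return a + b - a * b          # OR

def rhs(x, t, u, tau):
    a, b, c = x
    return [(B_A(u) - a) / tau,
            (B_B(a, c) - b) / tau,
            (B_C(a, b) - c) / tau]

t = np.linspace(0, 20, 401)
x0 = [0.0, 0.0, 0.0]
sol = odeint(rhs, x0, t, args=(1.0, 2.0))        # sustained input u=1, time scale tau=2
print("steady state (A, B, C):", np.round(sol[-1], 3))
```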

  9. Testing the Community-Based Learning Collaborative (CBLC) implementation model: a study protocol.

    PubMed

    Hanson, Rochelle F; Schoenwald, Sonja; Saunders, Benjamin E; Chapman, Jason; Palinkas, Lawrence A; Moreland, Angela D; Dopp, Alex

    2016-01-01

    High rates of youth exposure to violence, either through direct victimization or witnessing, result in significant health/mental health consequences and high associated lifetime costs. Evidence-based treatments (EBTs), such as Trauma-Focused Cognitive Behavioral Therapy (TF-CBT), can prevent and/or reduce these negative effects, yet these treatments are not standard practice for therapists working with children identified by child welfare or mental health systems as needing services. While research indicates that collaboration among child welfare and mental health services sectors improves availability and sustainment of EBTs for children, few implementation strategies designed specifically to promote and sustain inter-professional collaboration (IC) and inter-organizational relationships (IOR) have undergone empirical investigation. A potential candidate for evaluation is the Community-Based Learning Collaborative (CBLC) implementation model, an adaptation of the Learning Collaborative which includes strategies designed to develop and strengthen inter-professional relationships between brokers and providers of mental health services to promote IC and IOR and achieve sustained implementation of EBTs for children within a community. This non-experimental, mixed methods study involves two phases: (1) analysis of existing prospective quantitative and qualitative quality improvement and project evaluation data collected pre and post, weekly, and monthly from 998 participants in one of seven CBLCs conducted as part of a statewide initiative; and (2) Phase 2 collection of new quantitative and qualitative (key informant interviews) data during the funded study period to evaluate changes in relations among IC, IOR, social networks and the penetration and sustainment of TF-CBT in targeted communities. Recruitment for Phase 2 is from the pool of 998 CBLC participants to achieve a targeted enrollment of n = 150. Study aims include: (1) Use existing quality improvement (weekly/monthly online surveys; pre-post surveys; interviews) and newly collected quantitative (monthly surveys) and qualitative (key informant interviews) data and social network analysis to test whether CBLC strategies are associated with penetration and sustainment of TF-CBT; and (2) Use existing quantitative quality improvement (weekly/monthly on-line surveys; pre/post surveys) and newly collected qualitative (key informant interviews) data and social network analysis to test whether CBLC strategies are associated with increased IOR and IC intensity. The proposed research leverages an on-going, statewide implementation initiative to generate evidence about implementation strategies needed to make trauma-focused EBTs more accessible to children. This study also provides feasibility data to inform an effectiveness trial that will utilize a time-series design to rigorously evaluate the CBLC model as a mechanism to improve access and sustained use of EBTs for children.

  10. A mixed model for the relationship between climate and human cranial form.

    PubMed

    Katz, David C; Grote, Mark N; Weaver, Timothy D

    2016-08-01

    We expand upon a multivariate mixed model from quantitative genetics in order to estimate the magnitude of climate effects in a global sample of recent human crania. In humans, genetic distances are correlated with distances based on cranial form, suggesting that population structure influences both genetic and quantitative trait variation. Studies controlling for this structure have demonstrated significant underlying associations of cranial distances with ecological distances derived from climate variables. However, to assess the biological importance of an ecological predictor, estimates of effect size and uncertainty in the original units of measurement are clearly preferable to significance claims based on units of distance. Unfortunately, the magnitudes of ecological effects are difficult to obtain with distance-based methods, while models that produce estimates of effect size generally do not scale to high-dimensional data like cranial shape and form. Using recent innovations that extend quantitative genetics mixed models to highly multivariate observations, we estimate morphological effects associated with a climate predictor for a subset of the Howells craniometric dataset. Several measurements, particularly those associated with cranial vault breadth, show a substantial linear association with climate, and the multivariate model incorporating a climate predictor is preferred in model comparison. Previous studies demonstrated the existence of a relationship between climate and cranial form. The mixed model quantifies this relationship concretely. Evolutionary questions that require population structure and phylogeny to be disentangled from potential drivers of selection may be particularly well addressed by mixed models. Am J Phys Anthropol 160:593-603, 2016. © 2015 Wiley Periodicals, Inc.

  11. Real Patient and its Virtual Twin: Application of Quantitative Systems Toxicology Modelling in the Cardiac Safety Assessment of Citalopram.

    PubMed

    Patel, Nikunjkumar; Wiśniowska, Barbara; Jamei, Masoud; Polak, Sebastian

    2017-11-27

    A quantitative systems toxicology (QST) model for citalopram was established to simulate, in silico, a 'virtual twin' of a real patient to predict the occurrence of cardiotoxic events previously reported in patients under various clinical conditions. The QST model considers the effects of citalopram and its most notable electrophysiologically active primary (desmethylcitalopram) and secondary (didesmethylcitalopram) metabolites on cardiac electrophysiology. The in vitro cardiac ion channel current inhibition data was coupled with the biophysically detailed model of human cardiac electrophysiology to investigate the impact of (i) the inhibition of multiple ion currents (IKr, IKs, ICaL); (ii) the inclusion of metabolites in the QST model; and (iii) unbound or total plasma as the operating drug concentration, in predicting clinically observed QT prolongation. The inclusion of multiple ion channel current inhibition and metabolites in the simulation with unbound plasma citalopram concentration provided the lowest prediction error. The predictive performance of the model was verified with three additional therapeutic and supra-therapeutic drug exposure clinical cases. The results indicate that considering only the hERG ion channel inhibition of the parent drug is potentially misleading, and the inclusion of active metabolite data and the influence of other ion channel currents should be considered to improve the prediction of potential cardiac toxicity. Mechanistic modelling can help bridge the gaps existing in the quantitative translation from preclinical cardiac safety assessment to clinical toxicology. Moreover, this study shows that the QST models, in combination with appropriate drug and systems parameters, can pave the way towards personalised safety assessment.

  12. Survival models for harvest management of mourning dove populations

    USGS Publications Warehouse

    Otis, D.L.

    2002-01-01

    Quantitative models of the relationship between annual survival and harvest rate of migratory game-bird populations are essential to science-based harvest management strategies. I used the best available band-recovery and harvest data for mourning doves (Zenaida macroura) to build a set of models based on different assumptions about compensatory harvest mortality. Although these models suffer from lack of contemporary data, they can be used in development of an initial set of population models that synthesize existing demographic data on a management-unit scale, and serve as a tool for prioritization of population demographic information needs. Credible harvest management plans for mourning dove populations will require a long-term commitment to population monitoring and iterative population analysis.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jager, Yetta; Efroymson, Rebecca Ann; Sublette, K.

    Quantitative tools are needed to evaluate the ecological effects of increasing petroleum production. In this article, we describe two stochastic models for simulating the spatial distribution of brine spills on a landscape. One model uses general assumptions about the spatial arrangement of spills and their sizes; the second model distributes spills by siting rectangular well complexes and conditioning spill probabilities on the configuration of pipes. We present maps of landscapes with spills produced by the two methods and compare the ability of the models to reproduce a specified spill area. A strength of the models presented here is their ability to extrapolate from the existing landscape to simulate landscapes with a higher (or lower) density of oil wells.

  14. Efficient Bayesian mixed model analysis increases association power in large cohorts

    PubMed Central

    Loh, Po-Ru; Tucker, George; Bulik-Sullivan, Brendan K; Vilhjálmsson, Bjarni J; Finucane, Hilary K; Salem, Rany M; Chasman, Daniel I; Ridker, Paul M; Neale, Benjamin M; Berger, Bonnie; Patterson, Nick; Price, Alkes L

    2014-01-01

    Linear mixed models are a powerful statistical tool for identifying genetic associations and avoiding confounding. However, existing methods are computationally intractable in large cohorts, and may not optimize power. All existing methods require time cost O(MN²) (where N = #samples and M = #SNPs) and implicitly assume an infinitesimal genetic architecture in which effect sizes are normally distributed, which can limit power. Here, we present a far more efficient mixed model association method, BOLT-LMM, which requires only a small number of O(MN)-time iterations and increases power by modeling more realistic, non-infinitesimal genetic architectures via a Bayesian mixture prior on marker effect sizes. We applied BOLT-LMM to nine quantitative traits in 23,294 samples from the Women’s Genome Health Study (WGHS) and observed significant increases in power, consistent with simulations. Theory and simulations show that the boost in power increases with cohort size, making BOLT-LMM appealing for GWAS in large cohorts. PMID:25642633

  15. BEopt-CA (Ex) -- A Tool for Optimal Integration of EE/DR/ES+PV in Existing California Homes. Cooperative Research and Development Final Report, CRADA Number CRD-11-429

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Craig

    Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.

  16. The Role of Excitons on Light Amplification in Lead Halide Perovskites.

    PubMed

    Lü, Quan; Wei, Haohan; Sun, Wenzhao; Wang, Kaiyang; Gu, Zhiyuan; Li, Jiankai; Liu, Shuai; Xiao, Shumin; Song, Qinghai

    2016-12-01

    The role of excitons in the light amplification of lead halide perovskites has been explored. Unlike the photoluminescence, the intensity of amplified spontaneous emission is partially suppressed at low temperature. The detailed analysis and experiments show that the inhibition is attributed to the existence of excitons, and a quantitative model has been built to explain the experimental observations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Imaging of the PAH Emission Bands in the Orion Bar

    NASA Technical Reports Server (NTRS)

    Bregman, Jesse; Harker, David; Rank, David; Temi, Pasqiale; Morrison, David (Technical Monitor)

    1994-01-01

    The infrared spectra of many planetary nebulae, HII regions, galactic nuclei, reflection nebulae, and WC stars are dominated by a set of narrow and broad features which for many years were called the "unidentified infrared bands". These bands have been attributed to several carbon-rich molecular species which all contain only carbon and hydrogen atoms, and fall into the class of PAH molecules or are conglomerates of PAH skeletons. If these bands are from PAHs, then PAHs contain 1-10% of the interstellar carbon, making them the most abundant molecular species in the interstellar medium after CO. From ground-based telescopes, we have studied the emission bands assigned to C-H bond vibrations in PAHs (3.3, 11.3 microns) in the Orion Bar region, and showed that their distribution and intensities are consistent with a quantitative PAH model. We have recently obtained spectral images of the Orion Bar from the KAO at 6.2 and 7.7 microns using a 128 x 128 Si:Ga array camera in order to study the C-C modes of the PAH molecules. We will show these new data along with our existing C-H mode data set, and make a quantitative comparison of the data with the existing PAH model.

  18. Secondary Interstellar Oxygen in the Heliosphere: Numerical Modeling and Comparison with IBEX-Lo Data

    NASA Astrophysics Data System (ADS)

    Baliukin, I. I.; Izmodenov, V. V.; Möbius, E.; Alexashov, D. B.; Katushkina, O. A.; Kucharek, H.

    2017-12-01

    Quantitative analysis of the interstellar heavy (oxygen and neon) atom fluxes obtained by the Interstellar Boundary Explorer (IBEX) suggests the existence of the secondary interstellar oxygen component. This component is formed near the heliopause due to charge exchange of interstellar oxygen ions with hydrogen atoms, as was predicted theoretically. A detailed quantitative analysis of the fluxes of interstellar heavy atoms is only possible with a model that takes into account both the filtration of primary and the production of secondary interstellar oxygen in the boundary region of the heliosphere as well as a detailed simulation of the motion of interstellar atoms inside the heliosphere. This simulation must take into account photoionization, charge exchange with the protons of the solar wind and solar gravitational attraction. This paper presents the results of modeling interstellar oxygen and neon atoms through the heliospheric interface and inside the heliosphere based on a three-dimensional kinetic-MHD model of the solar wind interaction with the local interstellar medium and a comparison of these results with the data obtained on the IBEX spacecraft.

  19. Biomechanics-based in silico medicine: the manifesto of a new science.

    PubMed

    Viceconti, Marco

    2015-01-21

    In this perspective article we discuss the role of contemporary biomechanics in the light of recent applications such as the development of the so-called Virtual Physiological Human technologies for physiology-based in silico medicine. In order to build Virtual Physiological Human (VPH) models, computer models that capture and integrate the complex systemic dynamics of living organisms across radically different space-time scales, we need to re-formulate a vast body of existing biology and physiology knowledge as quantitative hypotheses that can be expressed in mathematical terms. Once the predictive accuracy of these models is confirmed against controlled experiments and against clinical observations, we will have VPH models that can reliably predict certain quantitative changes in the health status of a given patient, but also, more importantly, we will have a theory, in the true meaning this word has in the scientific method. In this scenario, biomechanics plays a very important role: it is one of the few areas of life sciences where we attempt to build full mechanistic explanations based on quantitative observations; in other words, we investigate living organisms like physical systems. This is in our opinion a Copernican revolution, around which the scope of biomechanics should be re-defined. Thus, we propose a new definition for our research domain: "Biomechanics is the study of living organisms as mechanistic systems". Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. A spatial Bayesian network model to assess the benefits of early warning for urban flood risk to people

    NASA Astrophysics Data System (ADS)

    Balbi, Stefano; Villa, Ferdinando; Mojtahed, Vahid; Hegetschweiler, Karin Tessa; Giupponi, Carlo

    2016-06-01

    This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; and produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of (1) likelihood of non-fatal physical injury, (2) likelihood of post-traumatic stress disorder and (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the effect of improving an existing early warning system, taking into account the reliability, lead time and scope (i.e., coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event.
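
    The mechanics of such a model can be illustrated with a toy discrete Bayesian network evaluated by direct enumeration: improving the warning shifts the probability of exposure, which in turn shifts the probability of injury. The structure and all probabilities below are invented for illustration and are not taken from the study.

```python
# Toy illustration of the kind of calculation a Bayesian-network risk model performs:
# P(injury) with and without an improved early warning. The network structure and all
# probabilities below are invented for illustration, not taken from the study.
def p_injury(p_warned):
    """Probability of injury given the probability that a person is warned in time."""
    p_exposed = {True: 0.10, False: 0.45}   # P(exposed | warned), P(exposed | not warned)
    p_inj = {True: 0.08, False: 0.001}      # P(injury | exposed), P(injury | not exposed)
    total = 0.0
    for warned, pw in ((True, p_warned), (False, 1.0 - p_warned)):
        for exposed, pe in ((True, p_exposed[warned]), (False, 1.0 - p_exposed[warned])):
            total += pw * pe * p_inj[exposed]
    return total

baseline = p_injury(p_warned=0.5)   # existing warning system reaches half the population
improved = p_injury(p_warned=0.9)   # improved reliability, lead time and scope
print(f"P(injury) baseline: {baseline:.4f}, improved: {improved:.4f}, "
      f"relative reduction: {100 * (1 - improved / baseline):.1f}%")
```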

  1. Using quantitative disease dynamics as a tool for guiding response to avian influenza in poultry in the United States of America.

    PubMed

    Pepin, K M; Spackman, E; Brown, J D; Pabilonia, K L; Garber, L P; Weaver, J T; Kennedy, D A; Patyk, K A; Huyvaert, K P; Miller, R S; Franklin, A B; Pedersen, K; Bogich, T L; Rohani, P; Shriner, S A; Webb, C T; Riley, S

    2014-03-01

    Wild birds are the primary source of genetic diversity for influenza A viruses that eventually emerge in poultry and humans. Much progress has been made in the descriptive ecology of avian influenza viruses (AIVs), but contributions are less evident from quantitative studies (e.g., those including disease dynamic models). Transmission between host species, individuals and flocks has not been measured with sufficient accuracy to allow robust quantitative evaluation of alternate control protocols. We focused on the United States of America (USA) as a case study for determining the state of our quantitative knowledge of potential AIV emergence processes from wild hosts to poultry. We identified priorities for quantitative research that would build on existing tools for responding to AIV in poultry and concluded that the following knowledge gaps can be addressed with current empirical data: (1) quantification of the spatio-temporal relationships between AIV prevalence in wild hosts and poultry populations, (2) understanding how the structure of different poultry sectors impacts within-flock transmission, (3) determining mechanisms and rates of between-farm spread, and (4) validating current policy-decision tools with data. The modeling studies we recommend will improve our mechanistic understanding of potential AIV transmission patterns in USA poultry, leading to improved measures of accuracy and reduced uncertainty when evaluating alternative control strategies. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization

    PubMed Central

    Marai, G. Elisabeta

    2018-01-01

    Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process with the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements in activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550

  3. A comparison of major petroleum life cycle models

    EPA Pesticide Factsheets

    Many organizations have attempted to develop an accurate well-to-pump life cycle model of petroleum products in order to inform decision makers of the consequences of its use. Our paper studies five of these models, demonstrating the differences in their predictions and attempting to evaluate their data quality. Carbon dioxide well-to-pump emissions for gasoline showed a variation of 35 %, and other pollutants such as ammonia and particulate matter varied up to 100 %. Differences in allocation do not appear to explain differences in predictions. Effects of these deviations on well-to-wheels passenger vehicle and truck transportation life cycle models may be minimal for effects such as global warming potential (6 % spread), but for respiratory effects of criteria pollutants (41 % spread) and other impact categories, they can be significant. A data quality assessment of the models’ documentation revealed real differences between models in temporal and geographic representativeness, completeness, as well as transparency. Stakeholders may need to consider carefully the tradeoffs inherent when selecting a model to conduct life cycle assessments for systems that make heavy use of petroleum products. This is a qualitative and quantitative comparison of petroleum LCA models intended for an expert audience interested in better understanding the data quality of existing petroleum life cycle models and the quantitative differences between these models.

  4. Cycle frequency in standard Rock-Paper-Scissors games: Evidence from experimental economics

    NASA Astrophysics Data System (ADS)

    Xu, Bin; Zhou, Hai-Jun; Wang, Zhijian

    2013-10-01

    The Rock-Paper-Scissors (RPS) game is a widely used model system in game theory. Evolutionary game theory predicts the existence of persistent cycles in the evolutionary trajectories of the RPS game, but experimental evidence has remained rather weak. In this work, we performed laboratory experiments on the RPS game and analyzed the social-state evolutionary trajectories of twelve populations of N=6 players. We found strong evidence supporting the existence of persistent cycles. The mean cycling frequency was measured to be 0.029±0.009 period per experimental round. Our experimental observations can be quantitatively explained by a simple non-equilibrium model, namely the discrete-time logit dynamical process with a noise parameter. Our work therefore favors the evolutionary game theory over the classical game theory for describing the dynamical behavior of the RPS game.
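
    A simplified simulation of a noisy discrete-time logit choice process for a six-player RPS population is sketched below; the payoff matrix, noise parameter and rotation-based cycle measure are illustrative choices, not the exact specification used in the experiment.

```python
# Simplified sketch of a discrete-time logit choice process for a 6-player
# Rock-Paper-Scissors population, measuring cycling of the social state as net
# rotation around the simplex centroid; parameters and payoffs are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, rounds, lam = 6, 300, 2.0                      # players, rounds, logit noise parameter
# payoff[i, j]: payoff of action i against action j (0 = Rock, 1 = Paper, 2 = Scissors)
payoff = np.array([[0., -1., 1.],
                   [1., 0., -1.],
                   [-1., 1., 0.]])

def coords(s):
    """2-D coordinates of the social state (nR, nP, nS) relative to the simplex centroid."""
    return np.array([(s[1] - s[2]) / np.sqrt(2.0),
                     (2.0 * s[0] - s[1] - s[2]) / np.sqrt(6.0)])

state = rng.multinomial(N, [1 / 3] * 3)           # initial social state
prev, angle = coords(state), 0.0
for _ in range(rounds):
    expected = payoff @ (state / N)               # expected payoff of each action
    probs = np.exp(lam * expected)
    probs /= probs.sum()
    actions = rng.choice(3, size=N, p=probs)      # every player updates via logit choice
    state = np.bincount(actions, minlength=3)
    cur = coords(state)
    if np.linalg.norm(prev) > 0 and np.linalg.norm(cur) > 0:
        cross = prev[0] * cur[1] - prev[1] * cur[0]
        angle += np.arctan2(cross, float(np.dot(prev, cur)))
    prev = cur

print("estimated cycles per round:", angle / (2 * np.pi) / rounds)
```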

  5. Comprehensive Understanding of the Zipingpu Reservoir to the Ms8.0 Wenchuan Earthquake

    NASA Astrophysics Data System (ADS)

    Cheng, H.; Pang, Y. J.; Zhang, H.; Shi, Y.

    2014-12-01

    After the Wenchuan earthquake occurred, the question of whether the earthquake was triggered by impoundment of the Zipingpu Reservoir attracted wide attention in the international academic community. In addition to qualitative discussion, many scholars adopted quantitative analysis methods to calculate the stress changes, but their results differed and they drew very different conclusions. Here, we take the dispute among different teams over the quantitative calculations for the Zipingpu Reservoir as a starting point. To identify the key factors influencing the quantitative calculations and to characterize the uncertainties in the numerical simulations, we analyze the factors that may cause the differences. The preliminary results show that the calculation method (analytical or numerical), model dimension (2-D or 3-D), diffusion model, diffusion coefficient and focal mechanism are the main factors responsible for the differences, especially the diffusion coefficient of the fractured rock mass. The change in Coulomb failure stress at the epicenter of the Wenchuan earthquake obtained from a 2-D model is about 3 times that of a 3-D model. It is not reasonable to consider only the fault permeability (treating the permeability of the rock mass as infinite) or only a homogeneous, isotropic rock mass permeability (ignoring the fault permeability). Different focal mechanisms can also dramatically affect the calculated change in Coulomb failure stress at the epicenter, with differences of 2-7 times, and selecting different diffusion coefficients can change the calculated Coulomb failure stress by several hundred times. Given existing research indicating that the magnitude of the Coulomb failure stress change is on the order of several kPa, we cannot rule out the possibility that the Zipingpu Reservoir triggered the 2008 Wenchuan earthquake. However, because the background stress is unclear and the Coulomb failure stress change is small, we are also not certain that there is a connection between the reservoir and the earthquake. In future work, we should improve the model on the basis of field surveys and laboratory experiments and develop high-performance simulations.

  6. Exploring a taxonomy for aggression against women: can it aid conceptual clarity?

    PubMed

    Cook, Sarah; Parrott, Dominic

    2009-01-01

    The assessment of aggression against women is demanding primarily because assessment strategies do not share a common language to describe reliably the wide range of forms of aggression women experience. The lack of a common language impairs efforts to describe these experiences, understand causes and consequences of aggression against women, and develop effective intervention and prevention efforts. This review accomplishes two goals. First, it applies a theoretically and empirically based taxonomy to behaviors assessed by existing measurement instruments. Second, it evaluates whether the taxonomy provides a common language for the field. Strengths of the taxonomy include its ability to describe and categorize all forms of aggression found in existing quantitative measures. The taxonomy also classifies numerous examples of aggression discussed in the literature but notably absent from quantitative measures. Although we use existing quantitative measures as a starting place to evaluate the taxonomy, its use is not limited to quantitative methods. Implications for theory, research, and practice are discussed.

  7. Modeling the influence of a reduced equator-to-pole sea surface temperature gradient on the distribution of water isotopes in the Early/Middle Eocene

    NASA Astrophysics Data System (ADS)

    Speelman, Eveline N.; Sewall, Jacob O.; Noone, David; Huber, Matthew; von der Heydt, Anna; Damsté, Jaap Sinninghe; Reichart, Gert-Jan

    2010-09-01

    Proxy-based climate reconstructions suggest the existence of a strongly reduced equator-to-pole temperature gradient during the Azolla interval in the Early/Middle Eocene, compared to modern. Changes in the hydrological cycle, as a consequence of a reduced temperature gradient, are expected to be reflected in the isotopic composition of precipitation (δD, δ18O). The interpretation of water isotopic records to quantitatively reconstruct past precipitation patterns is, however, hampered by a lack of detailed information on changes in their spatial and temporal distribution. Using the isotope-enabled version of the National Center for Atmospheric Research (NCAR) atmospheric general circulation model, Community Atmosphere Model v.3 (isoCAM3), relationships between water isotopes and past climates can be simulated. Here we examine the influence of an imposed reduced meridional sea surface temperature gradient on the spatial distribution of precipitation and its isotopic composition in an Early/Middle Eocene setting. As a result of the applied forcings, the Eocene simulation predicts the occurrence of less depleted high latitude precipitation, with δD values ranging only between 0 and -140‰ (compared to present-day 0 to -300‰). Comparison with Early/Middle Eocene-age isotopic proxy data shows that the simulation accurately captures the main features of the spatial distribution of the isotopic composition of Early/Middle Eocene precipitation over land in conjunction with the aspects of the modeled Early/Middle Eocene climate. Hence, the included stable isotope module quantitatively supports the existence of a reduced meridional temperature gradient during this interval.

  8. From intuition to statistics in building subsurface structural models

    USGS Publications Warehouse

    Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.

    2011-01-01

    Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs and results are compared to prior manual palinspastic restorations and borehole data. This methodology is also useful for extending structural interpretations into other areas of limited resolution, such as subsalt in addition to extrapolating existing data into seismic data gaps. This technique can be used for rapid reservoir appraisal and potentially have other applications for seismic processing, well planning, and borehole stability analysis.

  9. A Conceptual Measurement Model for eHealth Readiness: a Team Based Perspective

    PubMed Central

    Phillips, James; Poon, Simon K.; Yu, Dan; Lam, Mary; Hines, Monique; Brunner, Melissa; Power, Emma; Keep, Melanie; Shaw, Tim; Togher, Leanne

    2017-01-01

    Despite the shift towards collaborative healthcare and the increase in the use of eHealth technologies, there does not currently exist a model for the measurement of eHealth readiness in interdisciplinary healthcare teams. This research aims to address this gap in the literature through the development of a three phase methodology incorporating qualitative and quantitative methods. We propose a conceptual measurement model consisting of operationalized themes affecting readiness across four factors: (i) Organizational Capabilities, (ii) Team Capabilities, (iii) Patient Capabilities, and (iv) Technology Capabilities. The creation of this model will allow for the measurement of the readiness of interdisciplinary healthcare teams to use eHealth technologies to improve patient outcomes. PMID:29854207

  10. Quantitative Studies on the Propagation and Extinction of Near-Limit Premixed Flames Under Normal and Microgravity

    NASA Technical Reports Server (NTRS)

    Dong, Y.; Spedding, G. R.; Egolfopoulos, F. N.; Miller, F. J.

    2003-01-01

    The main objective of this research is to introduce accurate fluid mechanics measurement diagnostics in the 2.2-s drop tower for the determination of the detailed flow field at the states of extinction. These results are important as they can then be compared with confidence against detailed numerical simulations, providing important insight into near-limit phenomena that are controlled by poorly understood kinetics and thermal radiation processes. Past qualitative studies did enhance our general understanding of the subject. However, quantitative studies are essential for the validation of existing models that can subsequently be used to describe near-limit phenomena that can initiate catastrophic events in micro- and/or reduced-gravity environments.

  11. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  12. Mapping of quantitative trait loci using the skew-normal distribution.

    PubMed

    Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos

    2007-11-01

    In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the previous model after data transformation. However, an appropriate transformation may not exist or may be difficult to find. Also this approach can raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM, and the resulting method is here denoted as skew-normal IM. This flexible model that includes the usual symmetric normal distribution as a special case is important, allowing continuous variation from normality to non-normality. In this paper we briefly introduce the main peculiarities of the skew-normal distribution. The maximum likelihood estimates of parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of the skew-normal IM is assessed via stochastic simulation. The results indicate that the skew-normal IM has higher power for QTL detection and better precision of QTL location as compared to standard IM and nonparametric IM.
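
    The key substitution, replacing the normal likelihood with a skew-normal one, can be illustrated at a single marker with scipy; the sketch below compares normal and skew-normal fits on simulated data and is not the full EM-based interval mapping procedure described above.

```python
# Single-marker illustration of the core idea: model the phenotype within each
# genotype group with a skew-normal instead of a normal distribution and compare
# log-likelihoods. Simulated data; not the full EM-based interval mapping method.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200
genotype = rng.integers(0, 2, size=n)                  # 0/1 marker genotype
# Simulated skewed phenotype with a genotype effect.
phenotype = stats.skewnorm.rvs(a=4, loc=genotype * 0.8, scale=1.0,
                               size=n, random_state=rng)

def group_loglik(dist, fit):
    """Sum of per-genotype-group log-likelihoods under a fitted distribution."""
    total = 0.0
    for g in (0, 1):
        y = phenotype[genotype == g]
        params = fit(y)
        total += dist.logpdf(y, *params).sum()
    return total

ll_normal = group_loglik(stats.norm, stats.norm.fit)
ll_skew = group_loglik(stats.skewnorm, stats.skewnorm.fit)
print(f"log-likelihood normal: {ll_normal:.1f}, skew-normal: {ll_skew:.1f}")
```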

  13. Closing the loop: modelling of heart failure progression from health to end-stage using a meta-analysis of left ventricular pressure-volume loops.

    PubMed

    Warriner, David R; Brown, Alistair G; Varma, Susheel; Sheridan, Paul J; Lawford, Patricia; Hose, David R; Al-Mohammad, Abdallah; Shi, Yubing

    2014-01-01

    The American Heart Association (AHA)/American College of Cardiology (ACC) guidelines for the classification of heart failure (HF) are descriptive but lack precise and objective measures which would assist in categorising such patients. Our aim was twofold: first, to demonstrate quantitatively the progression of HF through each stage using a meta-analysis of existing left ventricular (LV) pressure-volume (PV) loop data, and second, to use the LV PV loop data to create stage-specific HF models. A literature search yielded 31 papers with PV data, representing over 200 patients in different stages of HF. The raw pressure and volume data were extracted from the papers using a digitising software package and the means were calculated. The data demonstrated that, as HF progressed, stroke volume (SV) and ejection fraction (EF%) decreased while LV volumes increased. A 2-element lumped-parameter model was employed to model the mean loops, and the error between the modelled and mean loops was calculated, demonstrating a close fit. The only parameter that was consistently and statistically different across all the stages was the elastance (Emax). For the first time, the authors have created a visual and quantitative representation of the AHA/ACC stages of LVSD-HF, from normal to end-stage. The study demonstrates that robust, load-independent and reproducible parameters, such as elastance, can be used to categorise and model HF, complementing the existing classification. The modelled PV loops establish previously unknown physiological parameters for each AHA/ACC stage of LVSD-HF, such as LV elastance, and highlight that it is this parameter alone, in lumped-parameter models, that determines the severity of HF. Such information will enable cardiovascular modellers with an interest in HF to create more accurate models of the heart as it fails.
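
    The basic loop-derived quantities can be computed directly from digitised pressure-volume points, for example end-systolic elastance approximated as Emax = Pes/(Ves - V0). The sketch below uses illustrative numbers, not the meta-analysis means.

```python
# Sketch of deriving stroke volume, ejection fraction and an elastance estimate
# (Emax ~ Pes / (Ves - V0)) from digitised pressure-volume loop points; all
# numerical values here are illustrative, not the meta-analysis means.
def pv_loop_indices(v_ed, v_es, p_es, v0=10.0):
    """v_ed, v_es in mL; p_es in mmHg; v0 is an assumed volume-axis intercept."""
    sv = v_ed - v_es                     # stroke volume (mL)
    ef = 100.0 * sv / v_ed               # ejection fraction (%)
    emax = p_es / (v_es - v0)            # end-systolic elastance (mmHg/mL)
    return sv, ef, emax

# Illustrative healthy vs end-stage heart failure loops.
for label, (v_ed, v_es, p_es) in {"healthy": (120.0, 45.0, 110.0),
                                  "end-stage HF": (210.0, 160.0, 85.0)}.items():
    sv, ef, emax = pv_loop_indices(v_ed, v_es, p_es)
    print(f"{label:>12}: SV={sv:.0f} mL, EF={ef:.0f}%, Emax={emax:.2f} mmHg/mL")
```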

  14. An augmented classical least squares method for quantitative Raman spectral analysis against component information loss.

    PubMed

    Zhou, Yan; Cao, Hui

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. Raman spectral signals with low correlations to analyte concentrations were selected and used as substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using a data set recorded from an experiment determining analyte concentrations by Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was exploited to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and existing methods. Results indicated that the proposed method is effective at increasing the predictive robustness of the traditional CLS model against component information loss, and its predictive power is comparable to that of PLS or PCR.
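
    The underlying algebra can be sketched with numpy: classical least squares estimates pure-component spectra from known concentrations, and the augmented variant appends surrogate spectral signals to absorb the missing component information. The surrogate used below (a residual principal component) and the simulated matrices are illustrative stand-ins, not the authors' selection procedure.

```python
# Sketch of classical least squares (CLS) calibration and an "augmented" variant in
# which an extra spectral signal stands in for an unknown component; the matrices are
# random stand-ins illustrating the algebra, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_wave = 30, 400

# Calibration data: two known analyte concentrations plus one "unknown" component.
C_known = rng.uniform(0.1, 1.0, size=(n_samples, 2))
K_true = np.abs(rng.normal(size=(3, n_wave)))
C_all = np.hstack([C_known, rng.uniform(0.1, 1.0, size=(n_samples, 1))])
S = C_all @ K_true + 0.01 * rng.normal(size=(n_samples, n_wave))

# Plain CLS: estimate pure spectra from known concentrations only.
K_cls = np.linalg.lstsq(C_known, S, rcond=None)[0]            # (2, n_wave)

# Augmented CLS: append a surrogate spectral signal (here, the first residual
# principal component) to account for the missing component information.
residual = S - C_known @ K_cls
_, _, vt = np.linalg.svd(residual, full_matrices=False)
K_acls = np.vstack([K_cls, vt[:1]])                           # (3, n_wave)

s_new = C_all[:1] @ K_true                                     # a "new" spectrum
c_cls = np.linalg.lstsq(K_cls.T, s_new.T, rcond=None)[0].ravel()
c_acls = np.linalg.lstsq(K_acls.T, s_new.T, rcond=None)[0].ravel()
print("true:", C_known[0].round(3), "CLS:", c_cls.round(3), "ACLS:", c_acls[:2].round(3))
```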

  15. Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories

    NASA Astrophysics Data System (ADS)

    Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly

    The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In so doing, successful project health checks and monitoring for quality on software processes requires strong project management skills, well-built onshore-offshore coordination, and often needs regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.

  16. 76 FR 38719 - Interim Notice of Funding Availability for the Department of Transportation's National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-01

    ... emissions, (applicants are encouraged to provide quantitative information regarding expected reductions in...). Applicants are encouraged to provide quantitative information that validates the existence of substantial... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...

  17. EPS-LASSO: Test for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits.

    PubMed

    Xu, Chao; Fang, Jian; Shen, Hui; Wang, Yu-Ping; Deng, Hong-Wen

    2018-01-25

    Extreme phenotype sampling (EPS) is a broadly-used design to identify candidate genetic factors contributing to the variation of quantitative traits. By enriching the signals in extreme phenotypic samples, EPS can boost the association power compared to random sampling. Most existing statistical methods for EPS examine the genetic factors individually, although many quantitative traits have multiple genetic factors underlying their variation. It is desirable to model the joint effects of genetic factors, which may increase the power and identify novel quantitative trait loci under EPS. The joint analysis of genetic data in high-dimensional situations requires specialized techniques, e.g., the least absolute shrinkage and selection operator (LASSO). Although there are extensive research and application related to LASSO, the statistical inference and testing for the sparse model under EPS remain unknown. We propose a novel sparse model (EPS-LASSO) with hypothesis test for high-dimensional regression under EPS based on a decorrelated score function. The comprehensive simulation shows EPS-LASSO outperforms existing methods with stable type I error and FDR control. EPS-LASSO can provide a consistent power for both low- and high-dimensional situations compared with the other methods dealing with high-dimensional situations. The power of EPS-LASSO is close to other low-dimensional methods when the causal effect sizes are small and is superior when the effects are large. Applying EPS-LASSO to a transcriptome-wide gene expression study for obesity reveals 10 significant body mass index associated genes. Our results indicate that EPS-LASSO is an effective method for EPS data analysis, which can account for correlated predictors. The source code is available at https://github.com/xu1912/EPSLASSO. hdeng2@tulane.edu. Supplementary data are available at Bioinformatics online. © The Author (2018). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
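
    The sampling design itself is easy to sketch: simulate a continuous trait, retain only the phenotypic extremes, and fit a LASSO. The scikit-learn example below illustrates the design only; it does not implement the decorrelated-score inference that defines EPS-LASSO.

```python
# Sketch of extreme phenotype sampling followed by an ordinary LASSO fit with
# scikit-learn; this illustrates the sampling design only, not the decorrelated-score
# inference of EPS-LASSO, and the simulated data are purely illustrative.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 5000, 200
X = rng.normal(size=(n, p))                       # genotype-like predictors
beta = np.zeros(p); beta[:5] = 0.3                # five causal variants
y = X @ beta + rng.normal(size=n)                 # continuous trait

# Extreme phenotype sampling: keep the lower and upper 10% of the trait distribution.
lo, hi = np.quantile(y, [0.10, 0.90])
keep = (y <= lo) | (y >= hi)

model = LassoCV(cv=5, random_state=0).fit(X[keep], y[keep])
selected = np.flatnonzero(model.coef_ != 0)
print(f"{keep.sum()} extreme samples, selected predictors: {selected[:10]}")
```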

  18. A general model for the scaling of offspring size and adult size.

    PubMed

    Falster, Daniel S; Moles, Angela T; Westoby, Mark

    2008-09-01

    Understanding evolutionary coordination among different life-history traits is a key challenge for ecology and evolution. Here we develop a general quantitative model predicting how offspring size should scale with adult size by combining a simple model for life-history evolution with a frequency-dependent survivorship model. The key innovation is that larger offspring are afforded three different advantages during ontogeny: higher survivorship per time, a shortened juvenile phase, and advantage during size-competitive growth. In this model, it turns out that size-asymmetric advantage during competition is the factor driving evolution toward larger offspring sizes. For simplified and limiting cases, the model is shown to produce the same predictions as the previously existing theory on which it is founded. The explicit treatment of different survival advantages has biologically important new effects, mainly through an interaction between total maternal investment in reproduction and the duration of competitive growth. This goes on to explain alternative allometries between log offspring size and log adult size, as observed in mammals (slope = 0.95) and plants (slope = 0.54). Further, it suggests how these differences relate quantitatively to specific biological processes during recruitment. In these ways, the model generalizes across previous theory and provides explanations for some differences between major taxa.

  19. Plausible combinations: An improved method to evaluate the covariate structure of Cormack-Jolly-Seber mark-recapture models

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; McDonald, Trent L.; Amstrup, Steven C.

    2013-01-01

    Mark-recapture models are extensively used in quantitative population ecology, providing estimates of population vital rates, such as survival, that are difficult to obtain using other methods. Vital rates are commonly modeled as functions of explanatory covariates, adding considerable flexibility to mark-recapture models, but also increasing the subjectivity and complexity of the modeling process. Consequently, model selection and the evaluation of covariate structure remain critical aspects of mark-recapture modeling. The difficulties involved in model selection are compounded in Cormack-Jolly-Seber models because they are composed of separate sub-models for survival and recapture probabilities, which are conceptualized independently even though their parameters are not statistically independent. The construction of models as combinations of sub-models, together with multiple potential covariates, can lead to a large model set. Although desirable, estimation of the parameters of all models may not be feasible. Strategies to search a model space and base inference on a subset of all models exist and enjoy widespread use. However, even though the methods used to search a model space can be expected to influence parameter estimation, the assessment of covariate importance, and therefore the ecological interpretation of the modeling results, the performance of these strategies has received limited investigation. We present a new strategy for searching the space of a candidate set of Cormack-Jolly-Seber models and explore its performance relative to existing strategies using computer simulation. The new strategy provides an improved assessment of the importance of covariates and covariate combinations used to model survival and recapture probabilities, while requiring only a modest increase in the number of models on which inference is based in comparison to existing techniques.

  20. Atmospheric Effects of Subsonic Aircraft: Interim Assessment Report of the Advanced Subsonic Technology Program

    NASA Technical Reports Server (NTRS)

    Friedl, Randall R. (Editor)

    1997-01-01

    This first interim assessment of the subsonic assessment (SASS) project attempts to summarize concisely the status of our knowledge concerning the impacts of present and future subsonic aircraft fleets. It also highlights the major areas of scientific uncertainty, through review of existing data bases and model-based sensitivity studies. In view of the need for substantial improvements in both model formulations and experimental databases, this interim assessment cannot provide confident numerical predictions of aviation impacts. However, a number of quantitative estimates are presented, which provide some guidance to policy makers.

  1. Imaging 2D optical diffuse reflectance in skeletal muscle

    NASA Astrophysics Data System (ADS)

    Ranasinghesagara, Janaka; Yao, Gang

    2007-04-01

    We discovered a unique pattern of optical reflectance from fresh prerigor skeletal muscles, which can not be described using existing theories. A numerical fitting function was developed to quantify the equiintensity contours of acquired reflectance images. Using this model, we studied the changes of reflectance profile during stretching and rigor process. We found that the prominent anisotropic features diminished after rigor completion. These results suggested that muscle sarcomere structures played important roles in modulating light propagation in whole muscle. When incorporating the sarcomere diffraction in a Monte Carlo model, we showed that the resulting reflectance profiles quantitatively resembled the experimental observation.

  2. 1, 2, 3, 4: infusing quantitative literacy into introductory biology.

    PubMed

    Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    El-Atwani, O.; Norris, S. A.; Ludwig, K.

    Several proposed mechanisms and theoretical models exist concerning nanostructure evolution on III-V semiconductors (particularly GaSb) via ion beam irradiation. However, making quantitative contact between experiment on the one hand and model-parameter-dependent predictions from different theories on the other is usually difficult. In this study, we take a different approach and provide an experimental investigation with a range of targets (GaSb, GaAs, GaP) and ion species (Ne, Ar, Kr, Xe) to determine new parametric trends regarding nanostructure evolution. Concurrently, atomistic simulations using the binary collision approximation over the same ion/target combinations were performed to determine parametric trends in several quantities related to existing models. A comparison of experimental and numerical trends reveals that the two are broadly consistent under the assumption that instabilities are driven by a chemical instability based on phase separation. Furthermore, the atomistic simulations and a survey of material thermodynamic properties suggest that a plausible microscopic mechanism for this process is an ion-enhanced mobility associated with energy deposition by collision cascades.

  4. Impact of diet on the design of waste processors in CELSS

    NASA Technical Reports Server (NTRS)

    Waleh, Ahmad; Kanevsky, Valery; Nguyen, Thoi K.; Upadhye, Ravi; Wydeven, Theodore

    1991-01-01

    The preliminary results of a design analysis for a waste processor which employs existing technologies and takes into account the constraints of human diet are presented. The impact of diet is determined by using a model and an algorithm developed for the control and management of diet in a Controlled Ecological Life Support System (CELSS). A material and energy balance model for thermal oxidation of waste is developed which is consistent with both physical/chemical methods of incineration and supercritical water oxidation. The two models yield quantitative analysis of the diet and waste streams and the specific design parameters for waste processors, respectively. The results demonstrate that existing technologies can meet the demands of waste processing, but the choice and design of the processors or processing methods will be sensitive to the constraints of diet. The numerical examples are chosen to display the nature and extent of the gap in the available experiment information about CELSS requirements.

  5. Reassessing Pliocene temperature gradients

    NASA Astrophysics Data System (ADS)

    Tierney, J. E.

    2017-12-01

    With CO2 levels similar to present, the Pliocene Warm Period (PWP) is one of our best analogs for climate change in the near future. Temperature proxy data from the PWP describe dramatically reduced zonal and meridional temperature gradients that have proved difficult to reproduce with climate model simulations. Recently, debate has emerged regarding the interpretation of the proxies used to infer Pliocene temperature gradients; these interpretations affect the magnitude of inferred change and the degree of inconsistency with existing climate model simulations of the PWP. Here, I revisit the issue using Bayesian proxy forward modeling and prediction that propagates known uncertainties in the Mg/Ca, UK'37, and TEX86 proxy systems. These new spatiotemporal predictions are quantitatively compared to PWP simulations to assess probabilistic agreement. Results show generally good agreement between existing Pliocene simulations from the PlioMIP ensemble and SST proxy data, suggesting that exotic changes in the ocean-atmosphere are not needed to explain the Pliocene climate state. Rather, the spatial changes in SST during the Pliocene are largely consistent with elevated CO2 forcing.

  6. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    PubMed

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to GitHub, and (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
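
    As an illustration of the object-level parallelization idea (the QIFE itself is MATLAB code distributed via GitHub and Docker, so the sketch below is only a Python analogue with a placeholder feature function):

    ```python
    # Illustrative Python analogue of object-level parallelization (one worker per
    # tumor/object); NOT the QIFE implementation, which is MATLAB/Docker-based.
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def extract_features(volume):
        """Placeholder radiomics stage: a few toy intensity/size features."""
        voxels = volume[volume > 0]
        return {
            "n_voxels": int(voxels.size),
            "mean_intensity": float(voxels.mean()),
            "std_intensity": float(voxels.std()),
        }

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        tumors = [rng.random((32, 32, 32)) for _ in range(8)]   # stand-in volumes

        # Object-level parallelism: each tumor is an independent work item, so
        # speed-up is roughly linear until memory or core count becomes limiting.
        with ProcessPoolExecutor(max_workers=4) as pool:
            features = list(pool.map(extract_features, tumors))
        print(features[0])
    ```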

  7. 77 FR 4863 - Notice of Funding Availability for the Department of Transportation's National Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-31

    ... quantitative information regarding expected reductions in emissions of CO2 or fuel consumption as a result of... encouraged to provide quantitative information that validates the existence of substantial transportation... quantitative and qualitative measures. Therefore, applicants for TIGER Discretionary Grants are generally...

  8. 75 FR 30460 - Notice of Funding Availability for the Department of Transportation's National Infrastructure...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-01

    ... provide quantitative information regarding expected reductions in emissions of CO2 or fuel consumption as... provide quantitative information that validates the existence of substantial transportation-related costs... infrastructure investments on systematic analysis of expected benefits and costs, including both quantitative and...

  9. Cement-based materials' characterization using ultrasonic attenuation

    NASA Astrophysics Data System (ADS)

    Punurai, Wonsiri

    The quantitative nondestructive evaluation (NDE) of cement-based materials is a critical area of research that is leading to advances in the health monitoring and condition assessment of the civil infrastructure. Ultrasonic NDE has been implemented with varying levels of success to characterize cement-based materials with complex microstructure and damage. A major issue with the application of ultrasonic techniques to characterize cement-based materials is their inherent inhomogeneity at multiple length scales. Ultrasonic waves propagating in these materials exhibit a high degree of attenuation losses, making quantitative interpretations difficult. Physically, these attenuation losses are a combination of internal friction in a viscoelastic material (ultrasonic absorption), and the scattering losses due to the material heterogeneity. The objective of this research is to use ultrasonic attenuation to characterize the microstructure of heterogeneous cement-based materials. The study considers a real, but simplified cement-based material, cement paste---a common bonding matrix of all cement-based composites. Cement paste consists of Portland cement and water but does not include aggregates. First, this research presents the findings of a theoretical study that uses a set of existing acoustics models to quantify the scattered ultrasonic wavefield from a known distribution of entrained air voids. These attenuation results are then coupled with experimental measurements to develop an inversion procedure that directly predicts the size and volume fraction of entrained air voids in a cement paste specimen. Optical studies verify the accuracy of the proposed inversion scheme. These results demonstrate the effectiveness of using attenuation to measure the average size, volume fraction of entrained air voids and the existence of additional larger entrapped air voids in hardened cement paste. Finally, coherent and diffuse ultrasonic waves are used to develop a direct relationship between attenuation and water to cement (w/c) ratio. A phenomenological model based on the existence of fluid-filled capillary voids is used to help explain the experimentally observed behavior. Overall this research shows the potential of using ultrasonic attenuation to quantitatively characterize cement paste. The absorption and scattering losses can be related to the individual microstructural elements of hardened cement paste. By taking a fundamental, mechanics-based approach, it should be possible to add additional components such as scattering by aggregates or even microcracks in a systematic fashion and eventually build a realistic model for ultrasonic wave propagation study for concrete.

  10. The Impact of School Climate on Student Achievement in the Middle Schools of the Commonwealth of Virginia: A Quantitative Analysis of Existing Data

    ERIC Educational Resources Information Center

    Bergren, David Alexander

    2014-01-01

    This quantitative study was designed to be an analysis of the relationship between school climate and student achievement through the creation of an index of climate-factors (SES, discipline, attendance, and school size) for which publicly available data existed. The index that was formed served as a proxy measure of climate; it was analyzed…

  11. A Quantitative Human Spacecraft Design Evaluation Model for Assessing Crew Accommodation and Utilization

    NASA Astrophysics Data System (ADS)

    Fanchiang, Christine

    Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission from commercial space tourism, to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle into cutting edge rocketry with the assumption that the astronauts could be trained and will adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process, and ensure that the crewmembers are well-integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here is done at a very fundamental level starting with identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. While the model is the primary tangible product from this research, the more interesting outcome of this work is the structure of the framework and what it tells future researchers in terms of where the gaps and limitations exist for developing a better framework. It also identifies metrics that can now be collected as part of future validation efforts for the model.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haney, Thomas Jay

    This report documents the Data Quality Objectives (DQOs) developed for the Idaho National Laboratory (INL) Site ambient air surveillance program. The development of the DQOs was based on the seven-step process recommended “for systematic planning to generate performance and acceptance criteria for collecting environmental data” (EPA 2006). The process helped to determine the type, quantity, and quality of data needed to meet current regulatory requirements and to follow U.S. Department of Energy guidance for environmental surveillance air monitoring design. It also considered the current air monitoring program that has existed at the INL Site since the 1950s. The development of the DQOs involved the application of the atmospheric dispersion model CALPUFF to identify likely contamination dispersion patterns at and around the INL Site using site-specific meteorological data. Model simulations were used to quantitatively assess the probable frequency of detection of airborne radionuclides released by INL Site facilities using existing and proposed air monitors.

  13. ANN-based calibration model of FTIR used in transformer online monitoring

    NASA Astrophysics Data System (ADS)

    Li, Honglei; Liu, Xian-yong; Zhou, Fangjie; Tan, Kexiong

    2005-02-01

    Recently, chromatography columns and gas sensors have been used in online monitoring devices for dissolved gases in transformer oil. But some disadvantages still exist in these devices: consumption of carrier gas, the need for calibration, etc. Since FTIR offers high accuracy, consumes no carrier gas, and requires no calibration, the researchers studied the application of FTIR in such a monitoring device. "Flow gas method" experiments were designed, and spectra of mixtures composed of different gases were collected with a BOMEM MB104 FTIR spectrometer. A key difficulty in the application of FTIR is that the absorbance spectra of the three key fault gases, C2H4, CH4, and C2H6, overlap severely at 2700~3400 cm-1. Because the absorbance law is no longer appropriate, a nonlinear calibration model based on a BP ANN was set up for the quantitative analysis. The peak absorbance heights of C2H4, CH4, and C2H6 were adopted as quantitative features, and all the data were normalized before training the ANN. Computational results show that the calibration model can effectively eliminate the cross-interference in the measurement.
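
    As a rough illustration of this kind of nonlinear calibration, the sketch below maps three overlapping absorbance features to three gas concentrations with a small feed-forward network; the data are synthetic, and scikit-learn's MLPRegressor stands in for the BP ANN described in the record.

    ```python
    # Minimal sketch of a nonlinear (back-propagation style) calibration that maps
    # overlapping absorbance features to gas concentrations. Data are synthetic and
    # the network is a scikit-learn stand-in for the BP ANN described above.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import MinMaxScaler

    rng = np.random.default_rng(1)
    n = 300
    conc = rng.uniform(0, 100, size=(n, 3))          # ppm of C2H4, CH4, C2H6 (toy)

    # Toy overlapping response: each peak height mixes contributions from all
    # three gases plus a mild nonlinearity, mimicking band overlap at 2700-3400 cm-1.
    mix = np.array([[1.0, 0.4, 0.3],
                    [0.3, 1.0, 0.5],
                    [0.2, 0.6, 1.0]])
    absorbance = (conc @ mix.T) ** 0.9 + rng.normal(0, 1.0, size=(n, 3))

    # Normalize features before training, as in the record above.
    x_scaler, y_scaler = MinMaxScaler(), MinMaxScaler()
    X = x_scaler.fit_transform(absorbance)
    Y = y_scaler.fit_transform(conc)

    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
    net.fit(X[:200], Y[:200])
    pred = y_scaler.inverse_transform(net.predict(X[200:]))
    print("mean abs. error (ppm):", np.abs(pred - conc[200:]).mean(axis=0))
    ```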

  14. The integration of quantitative information with an intelligent decision support system for residential energy retrofits

    NASA Astrophysics Data System (ADS)

    Mo, Yunjeong

    The purpose of this research is to support the development of an intelligent Decision Support System (DSS) by integrating quantitative information with expert knowledge in order to facilitate effective retrofit decision-making. To achieve this goal, the Energy Retrofit Decision Process Framework is analyzed. Expert system shell software, a retrofit measure cost database, and energy simulation software are needed for developing the DSS; Exsys Corvid, the NREM database and BEopt were chosen for implementing an integration model. This integration model demonstrates the holistic function of a residential energy retrofit system for existing homes, by providing a prioritized list of retrofit measures with cost information, energy simulation and expert advice. The users, such as homeowners and energy auditors, can acquire all of the necessary retrofit information from this unified system without having to explore several separate systems. The integration model plays the role of a prototype for the finalized intelligent decision support system. It implements all of the necessary functions for the finalized DSS, including integration of the database, energy simulation and expert knowledge.

  15. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT, and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
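
    The core construction can be sketched compactly: expand the genetic effect as a smooth function of variant position on a small basis, and test the whole region with an F-statistic against a covariate-only null. The sketch below uses simulated data and a simple polynomial basis rather than the richer functional bases of the paper, and it illustrates a single-study region test, not the meta-analysis machinery.

    ```python
    # Sketch of a gene-level functional linear model test: the genetic effect
    # beta(t) is expanded on a small basis of variant position t, and the region
    # is tested with an F-statistic against a covariate-only null model.
    # Data and the (polynomial) basis are illustrative, not the paper's choices.
    import numpy as np
    from scipy.stats import f as f_dist

    rng = np.random.default_rng(2)
    n, m, K = 800, 25, 4                          # subjects, variants, basis size
    pos = np.sort(rng.uniform(0, 1, m))           # variant positions scaled to [0,1]
    G = rng.binomial(2, rng.uniform(0.05, 0.4, m), size=(n, m)).astype(float)
    X = rng.normal(size=(n, 2))                   # covariates (e.g. age, sex)
    y = X @ np.array([0.2, -0.1]) + 0.35 * G[:, m // 2] + rng.normal(size=n)

    B = pos[:, None] ** np.arange(K)              # m x K basis evaluated at positions
    Z = G @ B                                     # proxy for integral G_i(t) beta(t) dt

    def rss(design, y):
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        return np.sum((y - design @ beta) ** 2)

    ones = np.ones((n, 1))
    rss0 = rss(np.hstack([ones, X]), y)           # null: covariates only
    rss1 = rss(np.hstack([ones, X, Z]), y)        # full: + functional genetic term
    df1, df2 = K, n - (1 + X.shape[1] + K)
    F = ((rss0 - rss1) / df1) / (rss1 / df2)
    print("F =", round(F, 2), "p =", f_dist.sf(F, df1, df2))
    ```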

  16. Improving power and robustness for detecting genetic association with extreme-value sampling design.

    PubMed

    Chen, Hua Yun; Li, Mingyao

    2011-12-01

    Extreme-value sampling design that samples subjects with extremely large or small quantitative trait values is commonly used in genetic association studies. Samples in such designs are often treated as "cases" and "controls" and analyzed using logistic regression. Such a case-control analysis ignores the potential dose-response relationship between the quantitative trait and the underlying trait locus and thus may lead to loss of power in detecting genetic association. An alternative approach to analyzing such data is to model the dose-response relationship by a linear regression model. However, parameter estimation from this model can be biased, which may lead to inflated type I errors. We propose a robust and efficient approach that takes into consideration both the biased sampling design and the potential dose-response relationship. Extensive simulations demonstrate that the proposed method is more powerful than the traditional logistic regression analysis and more robust than the linear regression analysis. We applied our method to the analysis of a candidate gene association study on high-density lipoprotein cholesterol (HDL-C), which includes study subjects with extremely high or low HDL-C levels. Using our method, we identified several SNPs showing stronger evidence of association with HDL-C than the traditional case-control logistic regression analysis. Our results suggest that it is important to appropriately model the quantitative traits and to adjust for the biased sampling when a dose-response relationship exists in extreme-value sampling designs. © 2011 Wiley Periodicals, Inc.
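
    The design and the two naive analyses it invites can be illustrated with a small simulation. The sketch below contrasts the case-control (logistic) and linear dose-response analyses on extreme-sampled data with invented effect sizes; it does not implement the robust estimator proposed in the record.

    ```python
    # Sketch contrasting two naive analyses of an extreme-value sampling design:
    # (1) case-control logistic regression on dichotomized extremes, and
    # (2) linear regression of the trait on genotype within the selected sample.
    # Neither is the robust method of the record above; this only illustrates the
    # design and the dose-response signal it discards or distorts.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n_pop = 20000
    maf = 0.3
    g = rng.binomial(2, maf, n_pop)               # additive genotype 0/1/2
    trait = 0.15 * g + rng.normal(size=n_pop)     # quantitative trait, e.g. HDL-C

    # Sample the upper and lower 10% tails only.
    lo, hi = np.quantile(trait, [0.10, 0.90])
    keep = (trait <= lo) | (trait >= hi)
    g_s, t_s = g[keep], trait[keep]
    case = (t_s >= hi).astype(int)

    design = sm.add_constant(g_s.astype(float))
    logit_p = sm.Logit(case, design).fit(disp=0).pvalues[1]
    ols_p = sm.OLS(t_s, design).fit().pvalues[1]
    print(f"logistic (case-control) p = {logit_p:.2e}")
    print(f"linear (dose-response)  p = {ols_p:.2e}  (estimates biased by selection)")
    ```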

  17. Probabilistic quantitative microbial risk assessment model of norovirus from wastewater irrigated vegetables in Ghana using genome copies and fecal indicator ratio conversion for estimating exposure dose.

    PubMed

    Owusu-Ansah, Emmanuel de-Graft Johnson; Sampson, Angelina; Amponsah, Samuel K; Abaidoo, Robert C; Dalsgaard, Anders; Hald, Tine

    2017-12-01

    The need to replace the commonly applied fecal indicator conversion ratio (an assumption of 1:10^-5 virus to fecal indicator organism) in Quantitative Microbial Risk Assessment (QMRA) with models based on quantitative data on the virus of interest has gained prominence due to the different physical and environmental factors that might influence the reliability of using indicator organisms in microbial risk assessment. The challenges facing analytical studies on virus enumeration (genome copies or particles) have contributed to the already existing lack of data in QMRA modelling. This study attempts to fit a QMRA model to genome copies of norovirus data. The model estimates the risk of norovirus infection from the intake of vegetables irrigated with wastewater from different sources. The results were compared to the results of a corresponding model using the fecal indicator conversion ratio to estimate the norovirus count. In all scenarios of using different water sources, the application of the fecal indicator conversion ratio underestimated the norovirus disease burden, measured by the Disability Adjusted Life Years (DALYs), when compared to results using the genome copies norovirus data. In some cases the difference was >2 orders of magnitude. All scenarios using genome copies met the 10^-4 DALY per person per year threshold for consumption of vegetables irrigated with wastewater, although these results are considered to be highly conservative risk estimates. The fecal indicator conversion ratio model of stream-water and drain-water sources of wastewater achieved the 10^-6 DALY per person per year threshold, which tends to indicate an underestimation of health risk when compared to using genome copies for estimating the dose. Copyright © 2017 Elsevier B.V. All rights reserved.
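
    The dose-to-DALY chain underlying such a QMRA can be sketched as a toy Monte Carlo; every parameter value below (genome copies, serving size, dose-response constants, illness probability, DALY weight) is a placeholder rather than a value from the study.

    ```python
    # Toy Monte Carlo sketch of the QMRA chain: norovirus concentration on
    # irrigated vegetables -> ingested dose -> infection risk -> annual DALY burden.
    # Every number below is a placeholder, not a value from the study.
    import numpy as np

    rng = np.random.default_rng(4)
    n_sim = 100_000

    log10_gc_per_g = rng.normal(1.5, 0.8, n_sim)       # genome copies per gram produce
    grams_per_serving = rng.normal(15, 5, n_sim).clip(1, None)
    dose = 10 ** log10_gc_per_g * grams_per_serving    # genome copies per serving

    # Approximate beta-Poisson dose-response (alpha, beta are placeholders).
    alpha, beta = 0.04, 0.055
    p_inf = 1 - (1 + dose / beta) ** (-alpha)

    servings_per_year = 100
    p_inf_year = 1 - (1 - p_inf) ** servings_per_year
    p_ill_given_inf = 0.7
    daly_per_case = 9e-4                               # placeholder severity weight

    daly_py = p_inf_year * p_ill_given_inf * daly_per_case
    print("median DALY per person-year:", np.median(daly_py))
    print("fraction of simulations above the 1e-4 DALY threshold:",
          float(np.mean(daly_py > 1e-4)))
    ```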

  18. Initial Description of a Quantitative, Cross-Species (Chimpanzee-Human) Social Responsiveness Measure

    ERIC Educational Resources Information Center

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.

    2011-01-01

    Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…

  19. 75 FR 81632 - Australia Beef Imports Approved for the Electronic Certification System (eCERT)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ..., 2011, the export certification requirement for imports of beef from Australia subject to quantitative...: Background There are existing quantitative restraints on beef from Australia pursuant to U.S. Note 3... quantitative quota restrictions beginning January 3, 2011. Such imports that are entered, or withdrawn from...

  20. Congruent climate-related genecological responses from molecular markers and quantitative traits for western white pine (Pinus monticola)

    Treesearch

    Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim

    2009-01-01

    Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...

  1. Engineering challenges of BioNEMS: the integration of microfluidics, micro- and nanodevices, models and external control for systems biology.

    PubMed

    Wikswo, J P; Prokop, A; Baudenbacher, F; Cliffel, D; Csukas, B; Velkovsky, M

    2006-08-01

    Systems biology, i.e. quantitative, postgenomic, postproteomic, dynamic, multiscale physiology, addresses in an integrative, quantitative manner the shockwave of genetic and proteomic information using computer models that may eventually have 10^6 dynamic variables with non-linear interactions. Historically, single biological measurements are made over minutes, suggesting the challenge of specifying 10^6 model parameters. Except for fluorescence and micro-electrode recordings, most cellular measurements have inadequate bandwidth to discern the time course of critical intracellular biochemical events. Micro-array expression profiles of thousands of genes cannot determine quantitative dynamic cellular signalling and metabolic variables. Major gaps must be bridged between the computational vision and experimental reality. The analysis of cellular signalling dynamics and control requires, first, micro- and nano-instruments that measure simultaneously multiple extracellular and intracellular variables with sufficient bandwidth; secondly, the ability to open existing internal control and signalling loops; thirdly, external BioMEMS micro-actuators that provide high bandwidth feedback and externally addressable intracellular nano-actuators; and, fourthly, real-time, closed-loop, single-cell control algorithms. The unravelling of the nested and coupled nature of cellular control loops requires simultaneous recording of multiple single-cell signatures. Externally controlled nano-actuators, needed to effect changes in the biochemical, mechanical and electrical environment both outside and inside the cell, will provide a major impetus for nanoscience.

  2. Predicting perceived visual complexity of abstract patterns using computational measures: The influence of mirror symmetry on complexity perception

    PubMed Central

    Leder, Helmut

    2017-01-01

    Visual complexity is relevant for many areas ranging from improving usability of technical displays or websites up to understanding aesthetic experiences. Therefore, many attempts have been made to relate objective properties of images to perceived complexity in artworks and other images. It has been argued that visual complexity is a multidimensional construct consisting mainly of two dimensions: a quantitative dimension that increases complexity through the number of elements, and a structural dimension representing order, which is negatively related to complexity. The objective of this work is to study human perception of visual complexity utilizing two large independent sets of abstract patterns. A wide range of computational measures of complexity was calculated, further combined using linear models as well as machine learning (random forests), and compared with data from human evaluations. Our results confirm the adequacy of existing two-factor models of perceived visual complexity consisting of a quantitative and a structural factor (in our case mirror symmetry) for both of our stimulus sets. In addition, a non-linear transformation of mirror symmetry, giving more influence to small deviations from symmetry, greatly increased explained variance. Thus, we again demonstrate the multidimensional nature of human complexity perception and present comprehensive quantitative models of the visual complexity of abstract patterns, which might be useful for future experiments and applications. PMID:29099832
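
    A compact stand-in for this kind of analysis is sketched below: a linear model and a random forest predict simulated complexity ratings from an element count and a mirror-symmetry score, with a hypothetical nonlinear symmetry transform that emphasizes small departures from perfect symmetry. The data, the transform, and the feature set are all invented for illustration.

    ```python
    # Sketch: linear model vs. random forest predicting rated complexity from a
    # quantitative feature (number of elements) and mirror symmetry, with a
    # nonlinear symmetry transform that weights small departures from perfect
    # symmetry more heavily. All data and the transform are synthetic stand-ins.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    n = 400
    n_elements = rng.integers(5, 200, n)
    symmetry = rng.uniform(0, 1, n)                   # 1 = perfect mirror symmetry

    # Simulated ratings: complexity rises with element count, falls with order.
    rating = 0.02 * n_elements - 2.5 * symmetry ** 4 + rng.normal(0, 0.4, n)

    sym_nl = 1 - (1 - symmetry) ** 0.3                # expands region near symmetry = 1
    X_raw = np.column_stack([n_elements, symmetry])
    X_nl = np.column_stack([n_elements, sym_nl])

    for name, X in [("raw symmetry", X_raw), ("transformed symmetry", X_nl)]:
        lm = cross_val_score(LinearRegression(), X, rating, cv=5, scoring="r2").mean()
        rf = cross_val_score(RandomForestRegressor(n_estimators=200, random_state=0),
                             X, rating, cv=5, scoring="r2").mean()
        print(f"{name:>20s}:  linear R2 = {lm:.2f}   random forest R2 = {rf:.2f}")
    ```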

  3. A spatial Bayesian network model to assess the benefits of early warning for urban flood risk to people

    NASA Astrophysics Data System (ADS)

    Balbi, S.; Villa, F.; Mojtahed, V.; Hegetschweiler, K. T.; Giupponi, C.

    2015-10-01

    This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of: (1) likelihood of non-fatal physical injury; (2) likelihood of post-traumatic stress disorder; (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the benefits of improving an existing Early Warning System, taking into account the reliability, lead-time and scope (i.e. coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event: about 75 % of fatalities, 25 % of injuries and 18 % of post-traumatic stress disorders could be avoided.

  4. Informing Environmental Water Management Decisions: Using Conditional Probability Networks to Address the Information Needs of Planning and Implementation Cycles.

    PubMed

    Horne, Avril C; Szemis, Joanna M; Webb, J Angus; Kaur, Simranjit; Stewardson, Michael J; Bond, Nick; Nathan, Rory

    2018-03-01

    One important aspect of adaptive management is the clear and transparent documentation of hypotheses, together with the use of predictive models (complete with any assumptions) to test those hypotheses. Documentation of such models can improve the ability to learn from management decisions and supports dialog between stakeholders. A key challenge is how best to represent the existing scientific knowledge to support decision-making. Such challenges are currently emerging in the field of environmental water management in Australia, where managers are required to prioritize the delivery of environmental water on an annual basis, using a transparent and evidence-based decision framework. We argue that the development of models of ecological responses to environmental water use needs to support both the planning and implementation cycles of adaptive management. Here we demonstrate an approach based on the use of Conditional Probability Networks to translate existing ecological knowledge into quantitative models that include temporal dynamics to support adaptive environmental flow management. It equally extends to other applications where knowledge is incomplete, but decisions must still be made.

  5. Informing Environmental Water Management Decisions: Using Conditional Probability Networks to Address the Information Needs of Planning and Implementation Cycles

    NASA Astrophysics Data System (ADS)

    Horne, Avril C.; Szemis, Joanna M.; Webb, J. Angus; Kaur, Simranjit; Stewardson, Michael J.; Bond, Nick; Nathan, Rory

    2018-03-01

    One important aspect of adaptive management is the clear and transparent documentation of hypotheses, together with the use of predictive models (complete with any assumptions) to test those hypotheses. Documentation of such models can improve the ability to learn from management decisions and supports dialog between stakeholders. A key challenge is how best to represent the existing scientific knowledge to support decision-making. Such challenges are currently emerging in the field of environmental water management in Australia, where managers are required to prioritize the delivery of environmental water on an annual basis, using a transparent and evidence-based decision framework. We argue that the development of models of ecological responses to environmental water use needs to support both the planning and implementation cycles of adaptive management. Here we demonstrate an approach based on the use of Conditional Probability Networks to translate existing ecological knowledge into quantitative models that include temporal dynamics to support adaptive environmental flow management. It equally extends to other applications where knowledge is incomplete, but decisions must still be made.

  6. The Dynamics of Human Body Weight Change

    PubMed Central

    Chow, Carson C.; Hall, Kevin D.

    2008-01-01

    An imbalance between energy intake and energy expenditure will lead to a change in body weight (mass) and body composition (fat and lean masses). A quantitative understanding of the processes involved, which currently remains lacking, will be useful in determining the etiology and treatment of obesity and other conditions resulting from prolonged energy imbalance. Here, we show that a mathematical model of the macronutrient flux balances can capture the long-term dynamics of human weight change; all previous models are special cases of this model. We show that the generic dynamic behavior of body composition for a clamped diet can be divided into two classes. In the first class, the body composition and mass are determined uniquely. In the second class, the body composition can exist at an infinite number of possible states. Surprisingly, perturbations of dietary energy intake or energy expenditure can give identical responses in both model classes, and existing data are insufficient to distinguish between these two possibilities. Nevertheless, this distinction has important implications for the efficacy of clinical interventions that alter body composition and mass. PMID:18369435
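
    In the spirit of this description, the sketch below integrates a minimal two-compartment (fat/lean) energy-balance model under a clamped intake; the partition rule and all parameter values are illustrative placeholders, not the authors' macronutrient flux model.

    ```python
    # Minimal two-compartment (fat mass F, lean mass L) energy-balance sketch in
    # the spirit of the record above. It is NOT the authors' macronutrient flux
    # model; the partition rule and every parameter value are illustrative.
    rho_F, rho_L = 39.5e3, 7.6e3          # energy density of fat / lean tissue (kJ/kg)

    def expenditure(F, L):
        """Toy resting + activity expenditure (kJ/day)."""
        return 1000 + 95 * L + 13 * F

    def simulate(intake_kj_per_day, F0=20.0, L0=55.0, days=3650, dt=1.0):
        F, L = F0, L0
        for _ in range(int(days / dt)):
            imbalance = intake_kj_per_day - expenditure(F, L)
            p = 10.4 / (10.4 + F)          # Forbes-like partition toward lean mass
            F += dt * (1 - p) * imbalance / rho_F
            L += dt * p * imbalance / rho_L
        return F, L

    for intake in (9000, 10000, 11000):    # clamped intake, kJ/day
        F, L = simulate(intake)
        print(f"intake {intake} kJ/day -> body weight after 10 y: {F + L:.1f} kg")
    ```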

  7. The epistemological status of general circulation models

    NASA Astrophysics Data System (ADS)

    Loehle, Craig

    2018-03-01

    Forecasts of both likely anthropogenic effects on climate and consequent effects on nature and society are based on large, complex software tools called general circulation models (GCMs). Forecasts generated by GCMs have been used extensively in policy decisions related to climate change. However, the relation between underlying physical theories and results produced by GCMs is unclear. In the case of GCMs, many discretizations and approximations are made, and simulating Earth system processes is far from simple and currently leads to some results with unknown energy balance implications. Statistical testing of GCM forecasts for degree of agreement with data would facilitate assessment of fitness for use. If model results need to be put on an anomaly basis due to model bias, then both visual and quantitative measures of model fit depend strongly on the reference period used for normalization, making testing problematic. Epistemology is here applied to problems of statistical inference during testing, the relationship between the underlying physics and the models, the epistemic meaning of ensemble statistics, problems of spatial and temporal scale, the existence or not of an unforced null for climate fluctuations, the meaning of existing uncertainty estimates, and other issues. Rigorous reasoning entails carefully quantifying levels of uncertainty.

  8. Probabilistic modeling of discourse-aware sentence processing.

    PubMed

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  9. Current Challenges in the First Principle Quantitative Modelling of the Lower Hybrid Current Drive in Tokamaks

    NASA Astrophysics Data System (ADS)

    Peysson, Y.; Bonoli, P. T.; Chen, J.; Garofalo, A.; Hillairet, J.; Li, M.; Qian, J.; Shiraiwa, S.; Decker, J.; Ding, B. J.; Ekedahl, A.; Goniche, M.; Zhai, X.

    2017-10-01

    The Lower Hybrid (LH) wave is widely used in existing tokamaks for tailoring the current density profile or extending the pulse duration to steady-state regimes. Its high efficiency makes it particularly attractive for a fusion reactor, leading to its consideration for this purpose in the ITER tokamak. Nevertheless, although the basics of the LH wave in tokamak plasmas are well known, quantitative modeling of experimental observations based on first principles remains a highly challenging exercise, despite the considerable numerical effort expended so far. In this context, a rigorous methodology must be applied in the simulations to identify the minimum number of physical mechanisms that must be considered to reproduce experimental shot-to-shot observations as well as scalings (density, power spectrum). Based on recent simulations carried out for the EAST, Alcator C-Mod and Tore Supra tokamaks, the state of the art in LH modeling is reviewed. The capability of fast electron bremsstrahlung, internal inductance li, and LH-driven current at zero loop voltage to jointly constrain LH simulations is discussed, as well as the need for further improvements (diagnostics, codes, LH model) for robust interpretative and predictive simulations.

  10. Overview: early history of crop growth and photosynthesis modeling.

    PubMed

    El-Sharkawy, Mabrouk A

    2011-02-01

    As in industrial and engineering systems, there is a need to quantitatively study and analyze the many constituents of complex natural biological systems as well as agro-ecosystems via research-based mechanistic modeling. This objective is normally addressed by developing mathematical descriptions of multilevel biological processes that provide biologists a means to quantitatively integrate experimental research findings, which might lead to a better understanding of whole systems and their interactions with surrounding environments. Aided by the computational capacity of the computer technology then available, pioneering cropping-system simulations were carried out in the second half of the 20th century by several research groups across continents. This overview summarizes those initial pioneering efforts to simulate plant growth and photosynthesis of crop canopies, focusing on the discovery of gaps that exist in current scientific knowledge. Examples are given for those gaps where experimental research was needed to improve the validity and application of the constructed models, so that their benefit to mankind was enhanced. Such research necessitates close collaboration among experimentalists and model builders while adopting a multidisciplinary/inter-institutional approach. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  11. Finite-Size Scaling of a First-Order Dynamical Phase Transition: Adaptive Population Dynamics and an Effective Model

    NASA Astrophysics Data System (ADS)

    Nemoto, Takahiro; Jack, Robert L.; Lecomte, Vivien

    2017-03-01

    We analyze large deviations of the time-averaged activity in the one-dimensional Fredrickson-Andersen model, both numerically and analytically. The model exhibits a dynamical phase transition, which appears as a singularity in the large deviation function. We analyze the finite-size scaling of this phase transition numerically, by generalizing an existing cloning algorithm to include a multicanonical feedback control: this significantly improves the computational efficiency. Motivated by these numerical results, we formulate an effective theory for the model in the vicinity of the phase transition, which accounts quantitatively for the observed behavior. We discuss potential applications of the numerical method and the effective theory in a range of more general contexts.

  12. 75 FR 68468 - List of Fisheries for 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...

  13. Mathematical model investigation of long-term transport of ocean-dumped sewage sludge related to remote sensing

    NASA Technical Reports Server (NTRS)

    Kuo, C. Y.; Modena, T. D.

    1979-01-01

    An existing, three-dimensional, Eulerian-Lagrangian finite-difference model was modified and used to examine the transport processes of dumped sewage sludge in the New York Bight. Both in situ and laboratory data were utilized in an attempt to approximate model inputs such as mean current speed, horizontal diffusion coefficients, particle size distributions, and specific gravities. The results presented are a quantitative description of the fate of a negatively buoyant sewage sludge plume resulting from continuous and instantaneous barge releases. Concentrations of the sludge near the surface were compared qualitatively with those remotely sensed. Laboratory study was performed to investigate the behavior of sewage sludge dumping in various ambient density conditions.

  14. Microstructure development in Kolmogorov, Johnson-Mehl, and Avrami nucleation and growth kinetics

    NASA Astrophysics Data System (ADS)

    Pineda, Eloi; Crespo, Daniel

    1999-08-01

    A statistical model with the ability to evaluate the microstructure developed in nucleation and growth kinetics is built in the framework of the Kolmogorov, Johnson-Mehl, and Avrami theory. A populational approach is used to compute the observed grain-size distribution. The impingement process which delays grain growth is analyzed, and the effective growth rate of each population is estimated considering the previous grain history. The proposed model is integrated for a wide range of nucleation and growth protocols, including constant nucleation, pre-existing nuclei, and intermittent nucleation with interface or diffusion-controlled grain growth. The results are compared with Monte Carlo simulations, giving quantitative agreement even in cases where previous models fail.
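
    For the textbook limits of the protocols mentioned (pre-existing nuclei versus constant nucleation, with interface-controlled growth), the transformed fraction follows the classical KJMA expressions, sketched below; the populational grain-size model of the record is not reproduced here, and all numerical values are illustrative.

    ```python
    # Classical KJMA transformed fraction X(t) = 1 - exp(-X_ext(t)) for two of the
    # nucleation protocols mentioned above, assuming interface-controlled growth
    # at constant rate G in 3D. This is only the textbook limit; the populational
    # grain-size model of the record is not reproduced here.
    import numpy as np

    G = 1e-9            # growth rate (m/s), illustrative
    t = np.linspace(0, 2e4, 200)

    # Pre-existing nuclei with number density N0 (m^-3): extended volume ~ t^3.
    N0 = 1e18
    X_site = 1 - np.exp(-(4 * np.pi / 3) * N0 * (G * t) ** 3)

    # Constant nucleation rate I (m^-3 s^-1): extended volume ~ t^4 (Avrami n = 4).
    I = 1e14
    X_const = 1 - np.exp(-(np.pi / 3) * I * G ** 3 * t ** 4)

    for X, label in [(X_site, "pre-existing nuclei (n=3)"),
                     (X_const, "constant nucleation (n=4)")]:
        t_half = t[np.searchsorted(X, 0.5)]
        print(f"{label}: time to 50% transformed ~ {t_half:.0f} s")
    ```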

  15. Tropical Pacific moisture variability: Its detection, synoptic structure and consequences in the general circulation

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1990-01-01

    Satellite data analysis tools are developed and implemented for the diagnosis of atmospheric circulation systems over the tropical Pacific Ocean. The tools include statistical multi-variate procedures, a multi-spectral radiative transfer model, and the global spectral forecast model at NMC. Data include in-situ observations; satellite observations from VAS (moisture, infrared and visible), the NOAA polar orbiters (including TIROS Operational Vertical Sounder (TOVS) multi-channel sounding data and OLR grids), and the scanning multichannel microwave radiometer (SMMR); and European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. A primary goal is a better understanding of the relation between synoptic structures of the area, particularly tropical plumes, and the general circulation, especially the Hadley circulation. A second goal is the definition of the quantitative structure and behavior of all Pacific tropical synoptic systems. Finally, strategies are examined for extracting new and additional information from existing satellite observations. Although moisture structure is emphasized, thermal patterns are also analyzed. Both horizontal and vertical structures are studied and objective quantitative results are emphasized.

  16. Big data to smart data in Alzheimer's disease: The brain health modeling initiative to foster actionable knowledge.

    PubMed

    Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane

    2016-09-01

    Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated in a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Field Assessment of Energy Audit Tools for Retrofit Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, J.; Bohac, D.; Nelson, C.

    2013-07-01

    This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home’s energy performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Rating systems based on energy performance models, the focus of this report, can establish a home’s achievable energy efficiency potential and provide a quantitative assessment of energy savings after retrofits are completed, although their accuracy needs to be verified by actual measurement or billing data. Ratings can also show homeowners where they stand compared to their neighbors, thus creating social pressure to conform to or surpass others. This project field-tested three different building performance models of varying complexity, in order to assess their value as rating systems in the context of a residential retrofit program: Home Energy Score, SIMPLE, and REM/Rate.

  18. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    PubMed

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  19. Setting priorities in health research using the model proposed by the World Health Organization: development of a quantitative methodology using tuberculosis in South Africa as a worked example.

    PubMed

    Hacking, Damian; Cleary, Susan

    2016-02-09

    Setting priorities is important in health research given the limited resources available for research. Various guidelines exist to assist in the priority setting process; however, priority setting still faces significant challenges such as the clear ranking of identified priorities. The World Health Organization (WHO) proposed a Disability Adjusted Life Year (DALY)-based model to rank priorities by research area (basic, health systems and biomedical) by dividing the DALYs into 'unavertable with existing interventions', 'avertable with improved efficiency' and 'avertable with existing but non-cost-effective interventions', respectively. However, the model has conceptual flaws and no clear methodology for its construction. Therefore, the aim of this paper was to amend the model to address these flaws, and develop a clear methodology by using tuberculosis in South Africa as a worked example. An amended model was constructed to represent total DALYs as the product of DALYs per person and absolute burden of disease. These figures were calculated for all countries from WHO datasets. The lowest figures achieved by any country were assumed to represent 'unavertable with existing interventions' if extrapolated to South Africa. The ratio of 'cost per patient treated' (adjusted for purchasing power and outcome weighted) between South Africa and the best country was used to calculate the 'avertable with improved efficiency' section. Finally, 'avertable with existing but non-cost-effective interventions' was calculated using Disease Control Priorities Project efficacy data, and the ratio between the best intervention and South Africa's current intervention, irrespective of cost. The amended model shows that South Africa has a tuberculosis burden of 1,009,837.3 DALYs; 0.009% of DALYs are unavertable with existing interventions and 96.3% of DALYs could be averted with improvements in efficiency. Of the remaining DALYs, a further 56.9% could be averted with existing but non-cost-effective interventions. The amended model was successfully constructed using limited data sources. The generalizability of the data used is the main limitation of the model. More complex formulas are required to deal with such potential confounding variables; however, the results act as starting point for development of a more robust model.
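
    The amended partition can be expressed as a short arithmetic sketch. The figures, and the exact way the cost and efficacy ratios enter, are illustrative placeholders standing in for the WHO and Disease Control Priorities Project inputs used in the paper.

    ```python
    # Arithmetic sketch of the amended DALY partition described above. All input
    # figures are invented placeholders, and the way the ratios enter is one
    # illustrative reading, not the paper's exact formulas.
    total_dalys = 1_000_000.0            # country's burden for the disease (placeholder)

    # "Unavertable with existing interventions": the burden the country would still
    # carry if it matched the best DALY figures observed in any country.
    best_possible_dalys = 100.0          # extrapolated best-country figure (placeholder)
    unavertable = best_possible_dalys

    # "Avertable with improved efficiency": scale the remaining burden by the
    # outcome-weighted, purchasing-power-adjusted cost-per-patient ratio between
    # the best-performing country and this country.
    cost_ratio_best_to_country = 0.04    # best country treats ~25x more per unit cost
    avertable_efficiency = (total_dalys - unavertable) * (1 - cost_ratio_best_to_country)

    # "Avertable with existing but non-cost-effective interventions": apply the
    # efficacy ratio of the current intervention relative to the best existing one.
    efficacy_ratio_current_to_best = 0.45
    remaining = total_dalys - unavertable - avertable_efficiency
    avertable_non_cost_effective = remaining * (1 - efficacy_ratio_current_to_best)

    for label, value in [("unavertable", unavertable),
                         ("avertable via efficiency", avertable_efficiency),
                         ("avertable via non-cost-effective interventions",
                          avertable_non_cost_effective)]:
        print(f"{label:>48s}: {value:>12,.1f} DALYs ({100 * value / total_dalys:.1f}%)")
    ```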

  20. The Role of Introductory Geosciences in Students' Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Manduca, C.; Baer, E. M.

    2006-12-01

    Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are truthfully rife with data, making introductory geoscience topics excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty to increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, trouble-shooting advice and examples of quantitative activities. The goal in this section is to provide faculty with material that can be readily incorporated into existing introductory geoscience courses. In addition, participants at the workshop (http://serc.carleton.edu/quantskills/workshop06/index.html) submitted and modified more than 20 activities and model courses (with syllabi) designed to use best practices for helping introductory geoscience students to become quantitatively literate. We present insights from the workshop and other sources for a framework that can aid in increasing quantitative literacy of students from a variety of backgrounds in the introductory geoscience classroom.

  1. Integrating Quantitative Thinking into an Introductory Biology Course Improves Students' Mathematical Reasoning in Biological Contexts

    ERIC Educational Resources Information Center

    Hester, Susan; Buxner, Sanlyn; Elfring, Lisa; Nagy, Lisa

    2014-01-01

    Recent calls for improving undergraduate biology education have emphasized the importance of students learning to apply quantitative skills to biological problems. Motivated by students' apparent inability to transfer their existing quantitative skills to biological contexts, we designed and taught an introductory molecular and cell biology course…

  2. 12 CFR Appendix A to Part 1310 - Financial Stability Oversight Council Guidance for Nonbank Financial Company Determinations

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... its review. In the first stage of the process, the Council will apply six uniform quantitative... quantitative thresholds using information available through existing public and regulatory sources, nonbank..., rather than applying a broadly applicable quantitative metric. The Council believes that the threat a...

  3. 12 CFR Appendix A to Part 1310 - Financial Stability Oversight Council Guidance for Nonbank Financial Company Determinations

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... its review. In the first stage of the process, the Council will apply six uniform quantitative... quantitative thresholds using information available through existing public and regulatory sources, nonbank..., rather than applying a broadly applicable quantitative metric. The Council believes that the threat a...

  4. Phenomenological model for coupled multi-axial piezoelectricity

    NASA Astrophysics Data System (ADS)

    Wei, Yuchen; Pellegrino, Sergio

    2018-03-01

    A quantitative calibration of an existing phenomenological model for polycrystalline ferroelectric ceramics is presented. The model relies on remnant strain and polarization as independent variables. Innovative experimental and numerical model identification procedures are developed for the characterization of the coupled electro-mechanical, multi-axial nonlinear constitutive law. Experiments were conducted on thin PZT-5A4E plates subjected to cross-thickness electric field. Unimorph structures with different thickness ratios between PZT-5A4E plate and substrate were tested, to subject the piezo plates to coupled electro-mechanical fields. Material state histories in electric field-strain-polarization space and stress-strain-polarization space were recorded. An optimization procedure is employed for the determination of the model parameters, and the calibrated constitutive law predicts both the uncoupled and coupled experimental observations accurately.

  5. Cumulative Risk and Impact Modeling on Environmental Chemical and Social Stressors.

    PubMed

    Huang, Hongtai; Wang, Aolin; Morello-Frosch, Rachel; Lam, Juleen; Sirota, Marina; Padula, Amy; Woodruff, Tracey J

    2018-03-01

    The goal of this review is to identify cumulative modeling methods used to evaluate combined effects of exposures to environmental chemicals and social stressors. The specific review question is: What are the existing quantitative methods used to examine the cumulative impacts of exposures to environmental chemical and social stressors on health? There has been an increase in literature that evaluates combined effects of exposures to environmental chemicals and social stressors on health using regression models; very few studies applied other data mining and machine learning techniques to this problem. The majority of studies we identified used regression models to evaluate combined effects of multiple environmental and social stressors. With proper study design and appropriate modeling assumptions, additional data mining methods may be useful to examine combined effects of environmental and social stressors.

  6. 1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology

    PubMed Central

    Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965

  7. Information-theoretic model comparison unifies saliency metrics

    PubMed Central

    Kümmerer, Matthias; Wallis, Thomas S. A.; Bethge, Matthias

    2015-01-01

    Learning the properties of an image associated with human gaze placement is important both for understanding how biological systems explore the environment and for computer vision applications. There is a large literature on quantitative eye movement models that seeks to predict fixations from images (sometimes termed “saliency” prediction). A major problem known to the field is that existing model comparison metrics give inconsistent results, causing confusion. We argue that the primary reason for these inconsistencies is because different metrics and models use different definitions of what a “saliency map” entails. For example, some metrics expect a model to account for image-independent central fixation bias whereas others will penalize a model that does. Here we bring saliency evaluation into the domain of information by framing fixation prediction models probabilistically and calculating information gain. We jointly optimize the scale, the center bias, and spatial blurring of all models within this framework. Evaluating existing metrics on these rephrased models produces almost perfect agreement in model rankings across the metrics. Model performance is separated from center bias and spatial blurring, avoiding the confounding of these factors in model comparison. We additionally provide a method to show where and how models fail to capture information in the fixations on the pixel level. These methods are readily extended to spatiotemporal models of fixation scanpaths, and we provide a software package to facilitate their use. PMID:26655340
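
    The core of the information-gain framing can be illustrated with a small sketch: treat a saliency map as a probability distribution over pixels and score it by the average log-likelihood advantage, in bits per fixation, that it provides over a baseline model at the observed fixation locations. The example below uses a uniform baseline and simulated fixations, and it omits the centre-bias fitting and spatial blurring optimized in the paper; the map and data are placeholders.

```python
import numpy as np

def information_gain(saliency, fixations, baseline=None, eps=1e-12):
    """Information gain (bits per fixation) of a saliency map, treated as a
    probability distribution over pixels, relative to a baseline model.
    The default baseline is uniform; the published method additionally fits
    centre bias and spatial blurring, which this sketch omits."""
    p = saliency / saliency.sum()
    if baseline is None:
        baseline = np.full_like(p, 1.0 / p.size)
    rows, cols = zip(*fixations)
    return float(np.mean(np.log2(p[rows, cols] + eps)
                         - np.log2(baseline[rows, cols] + eps)))

# Toy example: a centre-weighted Gaussian "model" evaluated on fixations
# drawn near the image centre (all values simulated).
rng = np.random.default_rng(0)
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
model_map = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 10.0 ** 2))
fix = np.clip(rng.normal(32, 8, size=(200, 2)), 0, 63).astype(int)
print(f"IG over uniform baseline: {information_gain(model_map, fix):.2f} bits/fixation")
```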

  8. Modeling of Continuum Manipulators Using Pythagorean Hodograph Curves.

    PubMed

    Singh, Inderjeet; Amara, Yacine; Melingui, Achille; Mani Pathak, Pushparaj; Merzouki, Rochdi

    2018-05-10

    Research on continuum manipulators is increasingly developing in the context of bionic robotics because of their many advantages over conventional rigid manipulators. Due to their soft structure, they have inherent flexibility, which makes controlling them with high performance a major challenge. Before elaborating a control strategy for such robots, it is essential first to reconstruct the behavior of the robot by developing an approximate behavioral model. This model can be kinematic or dynamic depending on the operating conditions of the robot itself. Kinematically, two types of modeling methods exist to describe the robot's behavior: quantitative (model-based) methods and qualitative (learning-based) methods. In kinematic modeling of continuum manipulators, the assumption of constant curvature is often made to simplify the model formulation. In this work, a quantitative modeling method is proposed, based on Pythagorean hodograph (PH) curves. The aim is to obtain a three-dimensional reconstruction of the shape of the continuum manipulator with variable curvature, allowing the calculation of its inverse kinematic model (IKM). The PH-based kinematic modeling of continuum manipulators compares favorably with other kinematic modeling methods in terms of position accuracy, shape reconstruction, and time/cost of model calculation, for two cases: free-load manipulation and variable-load manipulation. This modeling method is applied to the compact bionic handling assistant (CBHA) manipulator for validation. The results are compared with other IKMs developed for the CBHA manipulator.

  9. Developing a Multiplexed Quantitative Cross-Linking Mass Spectrometry Platform for Comparative Structural Analysis of Protein Complexes.

    PubMed

    Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan

    2016-10-18

    Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.

  10. IWGT report on quantitative approaches to genotoxicity risk ...

    EPA Pesticide Factsheets

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose–response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clast

  11. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to simultaneously detect the isotopically labeled peptides in a single instrument run and so guarantee relative quantitation for a large number of peptides without introducing any variation caused by separate experiments. However, only a few approaches are available for assessing protein ratios, and none of the existing algorithms pays much attention to proteins having only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summarization using classification-based methodologies, such as Gaussian mixture models fitted with the EM algorithm, their Bayesian counterpart, and K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs the best among others in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers through real SILAC experimental data. The developed approach is applicable regardless of how many peptide hits a protein has, rescuing many proteins that would otherwise be removed. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
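
    As a rough illustration of the classification-based idea, the sketch below fits a three-component Gaussian mixture (EM-based, via scikit-learn) to simulated log2 SILAC protein ratios and assigns each protein to a down-regulated, unchanged or up-regulated class. It stands in for neither the Bayesian nor the PSO-refined variants described above, and all data are simulated placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulated log2(heavy/light) protein ratios: most proteins unchanged,
# a minority up- or down-regulated (placeholder data, not a real SILAC set).
ratios = np.concatenate([
    rng.normal(0.0, 0.25, 800),    # unchanged
    rng.normal(1.5, 0.30, 100),    # up-regulated
    rng.normal(-1.5, 0.30, 100),   # down-regulated
]).reshape(-1, 1)

# EM-fitted three-component Gaussian mixture, as in the classification-based approach.
gmm = GaussianMixture(n_components=3, random_state=0).fit(ratios)
labels = gmm.predict(ratios)

# Order the components by their means so the labels read down / unchanged / up.
order = np.argsort(gmm.means_.ravel())
names = {order[0]: "down", order[1]: "unchanged", order[2]: "up"}
for k in order:
    print(f"{names[k]:>9}: mean={gmm.means_[k, 0]:+.2f}, n={(labels == k).sum()}")
```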

  12. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    NASA Astrophysics Data System (ADS)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of data preprocessing. Baseline drift is a widespread phenomenon, generated by fluctuations in laser energy, inhomogeneity of sample surfaces and background noise, and it has attracted the interest of many researchers. Most of the prevalent algorithms need to preset some key parameters, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS, such as the sparsity of spectral peaks and the low-pass filtered feature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The proposed technique utilizes a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is used to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process, so as to ensure the convergence of the algorithm. To validate the proposed method, the concentration analysis of Chromium (Cr), Manganese (Mn) and Nickel (Ni) contained in 23 certified high alloy steel samples is assessed by using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the proposed method has better accuracy in quantitative analysis than other methods and fully demonstrates its adaptive ability.
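
    The idea of an asymmetric penalty that treats points above the baseline (spectral peaks) differently from points below can be illustrated with a generic asymmetric least-squares baseline estimator (an Eilers-style sketch). This is not the paper's exact convex-optimization formulation, and the synthetic spectrum and parameter values below are placeholders.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least-squares baseline (generic Eilers-style sketch).

    lam controls smoothness; p is the asymmetry weight: points above the
    current baseline (spectral peaks) get weight p, points below get 1 - p.
    """
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, -1, -2], shape=(n, n - 2))  # 2nd differences
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.spdiags(w, 0, n, n)
        z = spsolve(W + lam * D @ D.T, w * y)
        w = np.where(y > z, p, 1 - p)
    return z

# Synthetic LIBS-like spectrum: sparse peaks on a slowly drifting baseline.
x = np.linspace(0, 1, 1000)
baseline_true = 0.5 + 0.3 * x + 0.2 * np.sin(3 * x)
peaks = sum(a * np.exp(-0.5 * ((x - c) / 0.004) ** 2)
            for a, c in [(2.0, 0.2), (1.5, 0.55), (3.0, 0.8)])
y = baseline_true + peaks + 0.01 * np.random.default_rng(1).normal(size=x.size)

corrected = y - als_baseline(y)
print(f"median residual baseline after correction: {np.median(corrected):.3f}")
```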

  13. The spread of gossip in American schools

    NASA Astrophysics Data System (ADS)

    Lind, P. G.; da Silva, L. R.; Andrade, J. S., Jr.; Herrmann, H. J.

    2007-06-01

    Gossip is defined as a rumor which specifically targets one individual and essentially only propagates within that individual's friendship connections. How fast and how far gossip can spread is assessed quantitatively for the first time in this study. For that purpose we introduce the "spread factor" and study it on empirical networks of school friendships as well as on various models for social connections. We discover that there exists an ideal number of friendship connections an individual should have to minimize the danger of gossip propagation.
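
    A minimal sketch of the kind of quantity involved is given below: assuming the gossip starts with one friend of the targeted individual and spreads only along friendship ties among that individual's friends, the spread factor is computed here as the fraction of those friends eventually reached. The exact published definition and normalisation may differ, and the toy network is a stand-in for the school friendship data.

```python
import networkx as nx

def spread_factor(G, victim, originator):
    """Fraction of the victim's friends reached by gossip that starts at
    `originator` (a friend of the victim) and propagates only along
    friendship ties between the victim's friends. This is an assumed
    reading of the 'spread factor', not necessarily the published one."""
    friends = set(G.neighbors(victim))
    sub = G.subgraph(friends)                       # gossip stays among friends
    reached = nx.node_connected_component(sub, originator)
    return len(reached) / len(friends)

# Toy friendship network (placeholder, not the empirical school data).
G = nx.karate_club_graph()
victim = 0
for originator in list(G.neighbors(victim))[:3]:
    print(f"victim {victim}, start {originator}: "
          f"spread factor {spread_factor(G, victim, originator):.2f}")
```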

  14. Hypercuboidal renormalization in spin foam quantum gravity

    NASA Astrophysics Data System (ADS)

    Bahr, Benjamin; Steinhaus, Sebastian

    2017-06-01

    In this article, we apply background-independent renormalization group methods to spin foam quantum gravity. It is aimed at extending and elucidating the analysis of a companion paper, in which the existence of a fixed point in the truncated renormalization group flow for the model was reported. Here, we repeat the analysis with various modifications and find that both qualitative and quantitative features of the fixed point are robust in this setting. We also go into details about the various approximation schemes employed in the analysis.

  15. Surface colour photometry of galaxies with Schmidt telescopes.

    NASA Technical Reports Server (NTRS)

    Wray, J. D.

    1972-01-01

    A method is described which owes its practicality to the capability of Schmidt telescopes to record a number of galaxy images on a single plate and to the existence of high speed computer controlled area-scanning precision microdensitometers such as the Photometric Data Systems model 1010. The method of analysis results in quantitative color-index information which is displayed in a manner that allows any user to effectively study the morphological properties of the distribution of color-index in galaxies.

  16. Study on index system of GPS interference effect evaluation

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Zeng, Fangling; Zhao, Yuan; Zeng, Ruiqi

    2018-05-01

    Evaluation of satellite navigation interference effects is a key technology in navigation countermeasure research. To accurately evaluate the interference level and anti-jamming capability of GPS receivers, this paper builds on existing research on navigation interference effect evaluation to construct an index system for GPS receiver effectiveness evaluation at four levels (signal acquisition, tracking, demodulation and positioning/timing) and establishes a model for each index. These indexes can accurately and quantitatively describe the interference effect at all levels.

  17. The Tropical Rainfall Measuring Mission: An Overview

    NASA Technical Reports Server (NTRS)

    Kummerow, Christian; Hong, Ye

    1999-01-01

    The importance of quantitative knowledge of tropical rainfall, its associated latent heating and variability is summarized in the context of the global hydrologic cycle. Much of the tropics is covered by oceans. What land exists is covered largely by rainforests that are only thinly populated. The only way to adequately measure the global tropical rainfall for climate and general circulation models is from space. To address these issues, the TRMM satellite was launched in Nov. 1997. It has been operating successfully ever since.

  18. Using Weighted Entropy to Rank Chemicals in Quantitative High Throughput Screening Experiments

    PubMed Central

    Shockley, Keith R.

    2014-01-01

    Quantitative high throughput screening (qHTS) experiments can simultaneously produce concentration-response profiles for thousands of chemicals. In a typical qHTS study, a large chemical library is subjected to a primary screen in order to identify candidate hits for secondary screening, validation studies or prediction modeling. Different algorithms, usually based on the Hill equation logistic model, have been used to classify compounds as active or inactive (or inconclusive). However, observed concentration-response activity relationships may not adequately fit a sigmoidal curve. Furthermore, it is unclear how to prioritize chemicals for follow-up studies given the large uncertainties that often accompany parameter estimates from nonlinear models. Weighted Shannon entropy can address these concerns by ranking compounds according to profile-specific statistics derived from estimates of the probability mass distribution of response at the tested concentration levels. This strategy can be used to rank all tested chemicals in the absence of a pre-specified model structure or the approach can complement existing activity call algorithms by ranking the returned candidate hits. The weighted entropy approach was evaluated here using data simulated from the Hill equation model. The procedure was then applied to a chemical genomics profiling data set interrogating compounds for androgen receptor agonist activity. PMID:24056003
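
    The ranking idea can be sketched briefly: normalize the (weighted) absolute responses across the tested concentrations into a probability mass distribution and compute its Shannon entropy, so that flat, inactive profiles score high entropy while concentrated, active profiles score low entropy. The weighting scheme and example profiles below are illustrative assumptions rather than the published statistic.

```python
import numpy as np

def weighted_entropy(responses, weights=None):
    """Weighted Shannon entropy of a concentration-response profile.

    `responses` are activity readings across increasing concentrations.
    The profile is turned into a probability mass distribution over the
    tested concentrations; the weighting scheme here is an illustrative
    choice, not necessarily the published one."""
    r = np.abs(np.asarray(responses, dtype=float))
    if weights is None:
        weights = np.ones_like(r)
    p = (weights * r) / np.sum(weights * r)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

flat_profile = [1, 2, 1, 3, 2, 1, 2, 1]          # essentially inactive
hill_profile = [0, 0, 1, 5, 40, 80, 95, 100]     # strong concentration response
print(f"flat profile: {weighted_entropy(flat_profile):.2f} bits")
print(f"hill profile: {weighted_entropy(hill_profile):.2f} bits")
```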

  19. An analytical model to predict interstitial lubrication of cartilage in migrating contact areas.

    PubMed

    Moore, A C; Burris, D L

    2014-01-03

    For nearly a century, articular cartilage has been known for its exceptional tribological properties. For nearly as long, there have been research efforts to elucidate the responsible mechanisms for application in biomimetic bearings. It is now widely accepted that interstitial fluid pressurization is the primary mechanism responsible for the unusual lubrication and load bearing properties of cartilage. Although the biomechanics community has developed elegant mathematical theories describing the coupling of solid and fluid (biphasic) mechanics and its role in interstitial lubrication, quantitative gaps in our understanding of cartilage tribology have inhibited our ability to predict how tribological conditions and material properties impact tissue function. This paper presents an analytical model of the interstitial lubrication of biphasic materials under migrating contact conditions. Although finite element and other numerical models of cartilage mechanics exist, they typically neglect the important role of the collagen network and are limited to a specific set of input conditions, which limits general applicability. The simplified approach taken in this work aims to capture the broader underlying physics as a starting point for further model development. In agreement with existing literature, the model indicates that a large Peclet number, Pe, is necessary for effective interstitial lubrication. It also predicts that the tensile modulus must be large relative to the compressive modulus. This explains why hydrogels and other biphasic materials do not provide significant interstitial pressure under high Pe conditions. The model quantitatively agrees with in-situ measurements of interstitial load support and the results have interesting implications for tissue engineering and osteoarthritis problems. This paper suggests that a low tensile modulus (from chondromalacia or local collagen rupture after impact, for example) may disrupt interstitial pressurization, increase shear stresses, and activate a condition of progressive surface damage as a potential precursor of osteoarthritis. © 2013 Elsevier Ltd. All rights reserved.

  20. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
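
    For readers unfamiliar with the two statistics, the sketch below computes a box-counting fractal dimension and a gliding-box lacunarity for a binary image. It is a minimal stand-in for the confocal image pipeline described above: the synthetic image, box sizes and window size are arbitrary placeholders.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Box-counting fractal dimension of a binary image (minimal sketch)."""
    counts = []
    for s in sizes:
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append((blocks.sum(axis=(1, 3)) > 0).sum())   # occupied boxes
    slope, _ = np.polyfit(np.log(1 / np.array(sizes)), np.log(counts), 1)
    return slope

def lacunarity(img, box=8):
    """Gliding-box lacunarity: second moment / squared mean of box masses."""
    masses = sliding_window_view(img, (box, box)).sum(axis=(2, 3)).ravel()
    return masses.var() / masses.mean() ** 2 + 1

# Synthetic binary pattern as a stand-in for a thresholded confocal scar image.
rng = np.random.default_rng(0)
img = (rng.random((256, 256)) < 0.2).astype(int)
print(f"fractal dimension ~ {box_counting_dimension(img):.2f}")
print(f"lacunarity        ~ {lacunarity(img):.2f}")
```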

  1. QSAR prediction of additive and non-additive mixture toxicities of antibiotics and pesticide.

    PubMed

    Qin, Li-Tang; Chen, Yu-Han; Zhang, Xin; Mo, Ling-Yun; Zeng, Hong-Hu; Liang, Yan-Peng

    2018-05-01

    Antibiotics and pesticides may exist as a mixture in the real environment. The combined effect of a mixture can be either additive or non-additive (synergism and antagonism). However, no effective approach exists for predicting the synergistic and antagonistic toxicities of mixtures. In this study, we developed a quantitative structure-activity relationship (QSAR) model for the toxicities (half effect concentration, EC50) of 45 binary and multi-component mixtures composed of two antibiotics and four pesticides. The acute toxicities of single compounds and mixtures toward Aliivibrio fischeri were tested. A genetic algorithm was used to obtain the optimized model with three theoretical descriptors. Various internal and external validation techniques yielded a coefficient of determination of 0.9366 and a root mean square error of 0.1345 for the QSAR model, which predicted the toxicities of the 45 mixtures exhibiting additive, synergistic, and antagonistic effects. Compared with the traditional concentration addition and independent action models, the QSAR model exhibited an advantage in predicting mixture toxicity. Thus, the presented approach may be able to fill the gaps in predicting non-additive toxicities of binary and multi-component mixtures. Copyright © 2018 Elsevier Ltd. All rights reserved.
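
    To make the general workflow concrete, the sketch below fits a generic three-descriptor multiple linear regression QSAR and evaluates it by leave-one-out cross-validation. The descriptors and response values are simulated placeholders, and the published model additionally used a genetic algorithm for descriptor selection, which is omitted here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)

# Placeholder data: 45 mixtures x 3 theoretical descriptors, response log(1/EC50).
X = rng.normal(size=(45, 3))
y = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.15, 45)

model = LinearRegression()
y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())   # external-style check
model.fit(X, y)

print(f"R^2 (fit):  {r2_score(y, model.predict(X)):.3f}")
print(f"Q^2 (LOO):  {r2_score(y, y_loo):.3f}")
print(f"RMSE (LOO): {np.sqrt(mean_squared_error(y, y_loo)):.3f}")
```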

  2. The stock-flow model of spatial data infrastructure development refined by fuzzy logic.

    PubMed

    Abdolmajidi, Ehsan; Harrie, Lars; Mansourian, Ali

    2016-01-01

    The system dynamics technique has been demonstrated to be a suitable method for modeling and simulating the development of spatial data infrastructures (SDI). An SDI is a collaborative effort to manage and share spatial data at different political and administrative levels. It comprises various dynamically interacting quantitative and qualitative (linguistic) variables. To incorporate linguistic variables and their joint effects in an SDI-development model more effectively, we suggest employing fuzzy logic. Not all fuzzy models are able to model the dynamic behavior of SDIs properly. Therefore, this paper aims to investigate different fuzzy models and their suitability for modeling SDIs. To that end, two inference and two defuzzification methods were used for the fuzzification of the joint effect of two variables in an existing SDI model. The results show that the Average-Average inference and Center of Area defuzzification can better model the dynamics of SDI development.
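
    A highly simplified sketch of the kind of computation involved is given below: two linguistic variables are combined by averaging their membership degrees (an illustrative stand-in for the Average-Average inference) and the aggregated output set is defuzzified with the centre-of-area method. The membership functions, rules and input values are placeholders, not those of the SDI model in the paper.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function (works on scalars and arrays)."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def joint_effect(x1, x2):
    """Crisp joint effect of two linguistic variables on a 0-1 scale.

    The AND of each rule's antecedents is taken as the *average* of the two
    membership degrees, and the aggregated output set is defuzzified with
    the centre-of-area method. Both choices are illustrative stand-ins."""
    z = np.linspace(0.0, 1.0, 201)                         # output universe
    low_out, high_out = trimf(z, -0.5, 0, 0.5), trimf(z, 0.5, 1, 1.5)

    # Rule 1: x1 low AND x2 low  -> effect low
    w_low = (trimf(x1, -0.5, 0, 0.5) + trimf(x2, -0.5, 0, 0.5)) / 2
    # Rule 2: x1 high AND x2 high -> effect high
    w_high = (trimf(x1, 0.5, 1, 1.5) + trimf(x2, 0.5, 1, 1.5)) / 2

    agg = np.maximum(np.minimum(w_low, low_out), np.minimum(w_high, high_out))
    return float(np.sum(z * agg) / np.sum(agg))            # centre of area

print(f"joint_effect(0.2, 0.3) = {joint_effect(0.2, 0.3):.2f}")
print(f"joint_effect(0.9, 0.7) = {joint_effect(0.9, 0.7):.2f}")
```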

  3. Mapping the function of neuronal ion channels in model and experiment

    PubMed Central

    Podlaski, William F; Seeholzer, Alexander; Groschner, Lukas N; Miesenböck, Gero; Ranjan, Rajnish; Vogels, Tim P

    2017-01-01

    Ion channel models are the building blocks of computational neuron models. Their biological fidelity is therefore crucial for the interpretation of simulations. However, the number of published models, and the lack of standardization, make the comparison of ion channel models with one another and with experimental data difficult. Here, we present a framework for the automated large-scale classification of ion channel models. Using annotated metadata and responses to a set of voltage-clamp protocols, we assigned 2378 models of voltage- and calcium-gated ion channels coded in NEURON to 211 clusters. The IonChannelGenealogy (ICGenealogy) web interface provides an interactive resource for the categorization of new and existing models and experimental recordings. It enables quantitative comparisons of simulated and/or measured ion channel kinetics, and facilitates field-wide standardization of experimentally-constrained modeling. DOI: http://dx.doi.org/10.7554/eLife.22152.001 PMID:28267430

  4. The ACCE method: an approach for obtaining quantitative or qualitative estimates of residual confounding that includes unmeasured confounding

    PubMed Central

    Smith, Eric G.

    2015-01-01

    Background:  Nonrandomized studies typically cannot account for confounding from unmeasured factors.  Method:  A method is presented that exploits the recently-identified phenomenon of  “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors.  Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure.  Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results:  Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met.  Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations:  Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions:  To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward.  The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226

  5. Hemisphere Asymmetry of Response to Pharmacologic Treatment in an Alzheimer's Disease Mouse Model.

    PubMed

    Manousopoulou, Antigoni; Saito, Satoshi; Yamamoto, Yumi; Al-Daghri, Nasser M; Ihara, Masafumi; Carare, Roxana O; Garbis, Spiros D

    2016-01-01

    The aim of this study was to examine hemisphere asymmetry of response to pharmacologic treatment in an Alzheimer's disease mouse model using cilostazol as a chemical stimulus. Eight-month-old mice were assigned to vehicle or cilostazol treatment for three months and hemispheres were analyzed using quantitative proteomics. Bioinformatics interpretation showed that following treatment, aggregation of blood platelets significantly decreased in the right hemisphere whereas neurodegeneration significantly decreased and synaptic transmission increased in the left hemisphere only. Our study provides novel evidence on cerebral laterality of pharmacologic activity, with important implications in deciphering regional pharmacodynamic effects of existing drugs thus uncovering novel hemisphere-specific therapeutic targets.

  6. Emerging Patient-Driven Health Care Models: An Examination of Health Social Networks, Consumer Personalized Medicine and Quantified Self-Tracking

    PubMed Central

    Swan, Melanie

    2009-01-01

    A new class of patient-driven health care services is emerging to supplement and extend traditional health care delivery models and empower patient self-care. Patient-driven health care can be characterized as having an increased level of information flow, transparency, customization, collaboration and patient choice and responsibility-taking, as well as quantitative, predictive and preventive aspects. The potential exists to both improve traditional health care systems and expand the concept of health care through new services. This paper examines three categories of novel health services: health social networks, consumer personalized medicine and quantified self-tracking. PMID:19440396

  7. The use of virtual environments for percentage view analysis.

    PubMed

    Schofield, Damian; Cox, Christopher J B

    2005-09-01

    It is recognised that Visual Impact Assessment (VIA), unlike many other aspects of Environmental Impact Assessments (EIA), relies less upon measurement than upon experience and judgement. Hence, a more structured and consistent approach towards VIA is needed, one that reduces the amount of bias and subjectivity. For proposed developments, there are very few quantitative techniques for the evaluation of visibility, and these existing methods can be highly inaccurate and time consuming. Percentage view changes are one of the few quantitative techniques, and the use of computer technology can reduce the inaccuracy and the time spent evaluating the visibility of either existing or proposed developments. For over 10 years, research work undertaken by the authors at the University of Nottingham has employed Computer Graphics (CG) and Virtual Reality (VR) in civilian and industrial contexts for environmental planning, design visualisation, accident reconstruction, risk analysis, data visualisation and training simulators. This paper describes a method to quantitatively assess the visual impact of proposed developments on the landscape using CG techniques. This method allows the determination of accurate percentage view changes with the use of a computer-generated model of the environment and the application of specialist software developed at the University of Nottingham. The principles are easy to understand and therefore planners, authorisation agencies and members of the public can use and understand the results. A case study is shown to demonstrate the application and the capabilities of the technology.

  8. Modular modelling with Physiome standards

    PubMed Central

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

    Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole-cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233

  9. Analysis of cell division patterns in the Arabidopsis shoot apical meristem

    DOE PAGES

    Shapiro, Bruce E.; Tobin, Cory; Mjolsness, Eric; ...

    2015-03-30

    The stereotypic pattern of cell shapes in the Arabidopsis shoot apical meristem (SAM) suggests that strict rules govern the placement of new walls during cell division. When a cell in the SAM divides, a new wall is built that connects existing walls and divides the cytoplasm of the daughter cells. Because features that are determined by the placement of new walls, such as cell size, shape, and number of neighbors, are highly regular, rules must exist for maintaining such order. Here we present a quantitative model of these rules that incorporates different observed features of cell division. Each feature is incorporated into a "potential function" that contributes a single term to a total analog of potential energy. New cell walls are predicted to occur at locations where the potential function is minimized. Quantitative terms that represent the well-known historical rules of plant cell division, such as those given by Hofmeister, Errera, and Sachs, are developed and evaluated against observed cell divisions in the epidermal layer (L1) of Arabidopsis thaliana SAM. The method is general enough to allow additional terms for nongeometric properties such as internal concentration gradients and mechanical tensile forces.

  10. Field Assessment of Energy Audit Tools for Retrofit Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, J.; Bohac, D.; Nelson, C.

    2013-07-01

    This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home's asset performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Home rating systems can help motivate homeowners in several ways. Ratings can clearly communicate a home's achievable energy efficiency potential, provide a quantitative assessment of energy savings after retrofits are completed, and show homeowners how they rate compared to their neighbors, thus creating an incentive to conform to a social standard. An important consideration is how rating tools for the retrofit market will integrate with existing home energy service programs. For residential programs that target energy savings only, home visits should be focused on key efficiency measures for that home. In order to gain wide adoption, a rating tool must be easily integrated into the field process, demonstrate consistency and reasonable accuracy to earn the trust of home energy technicians, and have a low monetary cost and time hurdle for homeowners. Along with the Home Energy Score, this project also evaluated the energy modeling performance of SIMPLE and REM/Rate.

  11. Mathematical modelling and quantitative methods.

    PubMed

    Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G

    2002-01-01

    The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.

  12. Large explosive basaltic eruptions at Katla volcano, Iceland: Fragmentation, grain size and eruption dynamics

    NASA Astrophysics Data System (ADS)

    Schmith, Johanne; Höskuldsson, Ármann; Holm, Paul Martin; Larsen, Guðrún

    2018-04-01

    Katla volcano in Iceland produces hazardous large explosive basaltic eruptions on a regular basis, but very little quantitative data for future hazard assessments exist. Here details on fragmentation mechanism and eruption dynamics are derived from a study of deposit stratigraphy with detailed granulometry and grain morphology analysis, granulometric modeling, componentry and the new quantitative regularity index model of fragmentation mechanism. We show that magma/water interaction is important in the ash generation process, but to a variable extent. By investigating the large explosive basaltic eruptions from 1755 and 1625, we document that eruptions of similar size and magma geochemistry can have very different fragmentation dynamics. Our models show that fragmentation in the 1755 eruption was a combination of magmatic degassing and magma/water-interaction with the most magma/water-interaction at the beginning of the eruption. The fragmentation of the 1625 eruption was initially also a combination of both magmatic and phreatomagmatic processes, but magma/water-interaction diminished progressively during the later stages of the eruption. However, intense magma/water interaction was reintroduced during the final stages of the eruption dominating the fine fragmentation at the end. This detailed study of fragmentation changes documents that subglacial eruptions have highly variable interaction with the melt water showing that the amount and access to melt water changes significantly during eruptions. While it is often difficult to reconstruct the progression of eruptions that have no quantitative observational record, this study shows that integrating field observations and granulometry with the new regularity index can form a coherent model of eruption evolution.

  13. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography.

    PubMed

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-09-01

    Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.

  14. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography

    PubMed Central

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-01-01

    Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037

  15. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.

    2010-01-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI. PMID:20862194
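
    As a purely illustrative sketch of the exposure-simulation step, the code below draws per-otter foraging-pit counts, oil-encounter probabilities and per-event PAH doses, sums them into annual doses for 500,000 simulated otters, extracts the 99.9% quantile, and compares it to a chronic toxicity reference value. All distributions and numbers are placeholders, not the study's parameterisation.

```python
import numpy as np

rng = np.random.default_rng(42)
n_otters = 500_000

# Placeholder exposure model: each simulated otter digs a random number of
# foraging pits per year; a small random fraction intersect SSOR patches,
# each delivering a lognormally distributed PAH dose (mg/kg body weight).
pits_per_year = rng.poisson(lam=15_000, size=n_otters)
p_encounter = rng.beta(a=2, b=2_000, size=n_otters)               # prob. a pit hits SSOR
pits_with_oil = rng.binomial(pits_per_year, p_encounter)
dose_per_pit = rng.lognormal(mean=-9.0, sigma=1.0, size=n_otters)  # mg/kg per event
annual_dose = pits_with_oil * dose_per_pit

q999 = np.quantile(annual_dose, 0.999)   # most highly exposed individuals
trv = 1.0                                 # placeholder chronic toxicity reference value
print(f"99.9% quantile annual dose: {q999:.4f} mg/kg")
print(f"hazard quotient (dose / TRV): {q999 / trv:.3f}")
```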

  16. Using representations in geometry: a model of students' cognitive and affective performance

    NASA Astrophysics Data System (ADS)

    Panaoura, Areti

    2014-05-01

    Self-efficacy beliefs in mathematics, as a dimension of the affective domain, are related with students' performance on solving tasks and mainly on overcoming cognitive obstacles. The present study investigated the interrelations of cognitive performance on geometry and young students' self-efficacy beliefs about using representations for solving geometrical tasks. The emphasis was on confirming a theoretical model for the primary-school and secondary-school students and identifying the differences and similarities for the two ages. A quantitative study was developed and data were collected from 1086 students in Grades 5-8. Confirmatory factor analysis affirmed the existence of a coherent model of affective dimensions about the use of representations for understanding the geometrical concepts, which becomes more stable across the educational levels.

  17. A Quantitative Model for the Dynamics of Serum Prostate-Specific Antigen as a Marker for Cancerous Growth

    PubMed Central

    Swanson, Kristin R.; True, Lawrence D.; Lin, Daniel W.; Buhler, Kent R.; Vessella, Robert; Murray, James D.

    2001-01-01

    Prostate-specific antigen (PSA) is an enzyme produced by both normal and cancerous prostate epithelial cells. Although PSA is the most widely used serum marker to detect and follow patients with prostatic adenocarcinoma, there are certain anomalies in the values of serum levels of PSA that are not understood. We developed a mathematical model for the dynamics of serum levels of PSA as a function of the tumor volume. Our model results show good agreement with experimental observations and provide an explanation for the existence of significant prostatic tumor mass despite a low serum PSA. This result can be very useful in enhancing the use of serum PSA levels as a marker for cancer growth. PMID:11395397
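
    A minimal sketch of a model of this general type is shown below: serum PSA is produced by both healthy prostate tissue and an exponentially growing tumour and is cleared at a first-order rate. The formulation and every parameter value are illustrative assumptions, not the calibrated model from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameters (illustrative only, not the paper's calibrated values).
beta_h = 0.05    # PSA production per unit healthy prostate volume [ng/ml per cm^3 per day]
beta_c = 3.0     # production per unit tumour volume (cancerous tissue leaks more PSA)
gamma = 0.35     # first-order serum clearance rate [1/day]
rho = 0.004      # tumour growth rate [1/day]
V_h = 20.0       # healthy epithelial volume [cm^3]
V_c0 = 0.05      # initial tumour volume [cm^3]

def rhs(t, y):
    psa = y[0]
    V_c = V_c0 * np.exp(rho * t)                      # exponentially growing tumour
    return [beta_h * V_h + beta_c * V_c - gamma * psa]

# Start from the healthy-tissue steady state and integrate over three years.
sol = solve_ivp(rhs, (0, 3 * 365), [beta_h * V_h / gamma], dense_output=True)
for t in (0, 365, 2 * 365, 3 * 365):
    psa = sol.sol(t)[0]
    V_c = V_c0 * np.exp(rho * t)
    print(f"t = {t / 365:.0f} y: tumour {V_c:6.2f} cm^3, serum PSA {psa:6.1f} ng/ml")
```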

  18. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  19. A quantitative experiment on the fountain effect in superfluid helium

    NASA Astrophysics Data System (ADS)

    Amigó, M. L.; Herrera, T.; Neñer, L.; Peralta Gavensky, L.; Turco, F.; Luzuriaga, J.

    2017-09-01

    Superfluid helium, a state of matter existing at low temperatures, shows many remarkable properties. One example is the so-called fountain effect, where a heater can produce a jet of helium. This converts heat into mechanical motion: a machine with no moving parts, but one that works only below 2 K. Allen and Jones first demonstrated the effect in 1938, but their work was basically qualitative. We now present data from a quantitative version of the experiment. We have measured the heat supplied, the temperature and the height of the jet produced. We also develop equations, based on the two-fluid model of superfluid helium, that give a satisfactory fit to the data. The experiment has been performed by advanced undergraduate students at our home institution, and illustrates in a vivid way some of the striking properties of the superfluid state.

  20. Near quantitative agreement of model free DFT- MD predictions with XAFS observations of the hydration structure of highly charged transition metal ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fulton, John L.; Bylaska, Eric J.; Bogatko, Stuart A.

    DFT-MD simulations (PBE96 and PBE0) with MD-XAFS scattering calculations (FEFF9) show near quantitative agreement with new and existing XAFS measurements for a comprehensive series of transition metal ions which interact with their hydration shells via complex mechanisms (high spin, covalency, charge transfer, etc.). This work was supported by the U.S. Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is operated for the U.S. DOE by Battelle. A portion of the research was performed using EMSL, a national scientific user facility sponsored by the U.S. DOE's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory.

  1. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    PubMed

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.

  2. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Combinational Reasoning of Quantitative Fuzzy Topological Relations for Simple Fuzzy Regions

    PubMed Central

    Liu, Bo; Li, Dajun; Xia, Yuanping; Ruan, Jian; Xu, Lili; Wu, Huanyi

    2015-01-01

    In recent years, formalization and reasoning of topological relations have become a hot topic as a means to generate knowledge about the relations between spatial objects at the conceptual and geometrical levels. These mechanisms have been widely used in spatial data query, spatial data mining, evaluation of equivalence and similarity in a spatial scene, as well as for consistency assessment of the topological relations of multi-resolution spatial databases. The concept of computational fuzzy topological space is applied to simple fuzzy regions to solve fuzzy topological relations efficiently and more accurately. Extending the existing research and improving upon previous work, this paper presents a new method to describe fuzzy topological relations between simple spatial regions in Geographic Information Sciences (GIS) and Artificial Intelligence (AI). First, we propose a new definition for simple fuzzy line segments and simple fuzzy regions based on the computational fuzzy topology. Then, based on the new definitions, we propose a combinational reasoning method to compute the topological relations between simple fuzzy regions. This study finds that there are (1) 23 different topological relations between a simple crisp region and a simple fuzzy region and (2) 152 different topological relations between two simple fuzzy regions. Finally, we discuss several examples to demonstrate the validity of the new method; comparisons with existing fuzzy models show that the proposed method is more expressive and can compute relations that the existing models cannot. PMID:25775452

  4. Bringing computational models of bone regeneration to the clinic.

    PubMed

    Carlier, Aurélie; Geris, Liesbet; Lammens, Johan; Van Oosterwyck, Hans

    2015-01-01

    Although the field of bone regeneration has experienced great advancements in the last decades, integrating all the relevant, patient-specific information into a personalized diagnosis and optimal treatment remains a challenging task due to the large number of variables that affect bone regeneration. Computational models have the potential to cope with this complexity and to improve the fundamental understanding of the bone regeneration processes as well as to predict and optimize the patient-specific treatment strategies. However, the current use of computational models in daily orthopedic practice is very limited or inexistent. We have identified three key hurdles that limit the translation of computational models of bone regeneration from bench to bed side. First, there exists a clear mismatch between the scope of the existing and the clinically required models. Second, most computational models are confronted with limited quantitative information of insufficient quality thereby hampering the determination of patient-specific parameter values. Third, current computational models are only corroborated with animal models, whereas a thorough (retrospective and prospective) assessment of the computational model will be crucial to convince the health care providers of the capabilities thereof. These challenges must be addressed so that computational models of bone regeneration can reach their true potential, resulting in the advancement of individualized care and reduction of the associated health care costs. © 2015 Wiley Periodicals, Inc.

  5. Ion beam nanopatterning of III-V semiconductors: Consistency of experimental and simulation trends within a chemistry-driven theory

    DOE PAGES

    El-Atwani, O.; Norris, S. A.; Ludwig, K.; ...

    2015-12-16

    Several proposed mechanisms and theoretical models exist concerning nanostructure evolution on III-V semiconductors (particularly GaSb) via ion beam irradiation. However, making quantitative contact between experiment on the one hand and model-parameter-dependent predictions from different theories on the other is usually difficult. In this study, we take a different approach and provide an experimental investigation with a range of targets (GaSb, GaAs, GaP) and ion species (Ne, Ar, Kr, Xe) to determine new parametric trends regarding nanostructure evolution. Concurrently, atomistic simulations using the binary collision approximation over the same ion/target combinations were performed to determine parametric trends in several quantities related to existing models. A comparison of experimental and numerical trends reveals that the two are broadly consistent under the assumption that instabilities are driven by a chemical instability based on phase separation. Furthermore, the atomistic simulations and a survey of material thermodynamic properties suggest that a plausible microscopic mechanism for this process is an ion-enhanced mobility associated with energy deposition by collision cascades.

  6. A model for evaluating the social performance of construction waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan Hongping, E-mail: hpyuan2005@gmail.com

    Highlights: Scant attention is paid to the social performance of construction waste management (CWM). We develop a model for assessing the social performance of CWM. With the model, the social performance of CWM can be quantitatively simulated. - Abstract: The existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), but far less attention has been paid to its social performance. This study therefore develops a model for quantitatively evaluating the social performance of CWM by using a system dynamics (SD) approach. Firstly, major variables affecting the social performance of CWM are identified and a holistic system for assessing the social performance of CWM is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model through the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonable in reflecting the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM for the project investigated. Furthermore, the model exhibits great potential to function as an experimental platform for dynamically evaluating the effects of management measures on improving the social performance of CWM in construction projects.

  7. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    PubMed

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null hypothesis, which avoids the costly computation of inversions. Additional speed-ups come from rearranging the order in which the different "omics" layers are iterated over and from implementing the algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/.
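
    The correlation-based shortcut that makes pulver fast is not reproduced in this record. As a reference point only, a brute-force sketch of the quantity it screens for (the p-value of the interaction term in the model y ~ x + z + x*z) might look as follows in Python with numpy and statsmodels; the variable names and toy data are illustrative, not from the KORA study.

        import numpy as np
        import statsmodels.api as sm

        def interaction_pvalue(y, x, z):
            # Design matrix: intercept, x, z, and the x*z interaction term.
            design = sm.add_constant(np.column_stack([x, z, x * z]))
            fit = sm.OLS(y, design).fit()
            return fit.pvalues[3]  # column 3 holds the interaction coefficient

        # Toy data: a SNP-like variable, a methylation-like variable, a metabolite outcome.
        rng = np.random.default_rng(0)
        snp = rng.integers(0, 3, size=500).astype(float)
        meth = rng.normal(size=500)
        metab = 0.2 * snp + 0.3 * meth + 0.4 * snp * meth + rng.normal(size=500)
        print(interaction_pvalue(metab, snp, meth))

    Fitting one model per (SNP, methylation, metabolite) triple in this way is exactly what becomes prohibitive at genome scale, which is the bottleneck the package addresses.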

  8. The Role of Extracellular Vesicles in Metastasis

    DTIC Science & Technology

    2017-10-01

    quantitative characterization of each cancerous ESV subpopulation’s role in cargo transfer. Specifically, we aim to (1) optimize an existing...the first quantitative data on which ESV subpopulations (exosomes, MVs, or oncosomes) manipulate the tumor microenvironment, the ESV cargo transferred...without cellular contaminants or without damaging the cargo. A second hindrance has been the lack of quantitative methods for measuring very small

  9. Assessment of the Casualty Risk of Multiple Meteorological Hazards in China

    PubMed Central

    Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang

    2016-01-01

    A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks of 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales. PMID:26901210

  10. Assessment of the Casualty Risk of Multiple Meteorological Hazards in China.

    PubMed

    Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang

    2016-02-17

    A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks of 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales.
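
    The information-diffusion estimator itself is not spelled out in this record. A minimal sketch of the commonly used normal (Gaussian) diffusion variant is given below; the bandwidth rule, grid, and casualty counts are placeholders rather than values from the study.

        import numpy as np

        def information_diffusion(samples, grid, h=None):
            # Spread the information carried by each observation over the grid of
            # monitoring points with a Gaussian kernel, then normalise.
            x = np.asarray(samples, dtype=float)
            u = np.asarray(grid, dtype=float)
            if h is None:
                # Placeholder bandwidth; the original diffusion method prescribes
                # sample-size dependent coefficients not reproduced here.
                h = 1.42 * (x.max() - x.min()) / max(len(x) - 1, 1)
            f = np.exp(-((x[:, None] - u[None, :]) ** 2) / (2.0 * h ** 2))
            f /= f.sum(axis=1, keepdims=True)   # each sample carries one unit of information
            q = f.sum(axis=0)
            return q / q.sum()                  # estimated probability at each monitoring point

        # Hypothetical annual casualty counts and monitoring points.
        casualties = [3, 0, 12, 7, 1, 25, 4, 9, 0, 16]
        u = np.linspace(0, 30, 31)
        p = information_diffusion(casualties, u)
        exceedance = p[::-1].cumsum()[::-1]     # P(X >= u), from which return periods are read off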

  11. Target-based drug discovery for β-globin disorders: drug target prediction using quantitative modeling with hybrid functional Petri nets.

    PubMed

    Mehraei, Mani; Bashirov, Rza; Tüzmen, Şükrü

    2016-10-01

    Recent molecular studies provide important clues into the treatment of β-thalassemia, sickle-cell anaemia and other β-globin disorders, revealing that increased production of fetal hemoglobin, which is normally suppressed in adulthood, can ameliorate the severity of these diseases. In this paper, we present a novel approach to drug prediction for β-globin disorders. Our approach is centered upon quantitative modeling of interactions in the human fetal-to-adult hemoglobin switch network using hybrid functional Petri nets. In accordance with the reverse pharmacology approach, we pose a hypothesis regarding modulation of specific protein targets that induce γ-globin and consequently fetal hemoglobin. Comparison of simulation results for the proposed strategy with those obtained for already existing drugs shows that our strategy is optimal, as it leads to the highest level of γ-globin induction and thereby has potential beneficial therapeutic effects on β-globin disorders. Simulation results enable verification of model coherence, demonstrating that it is consistent with qPCR data available for known strategies and/or drugs.

  12. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

    Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches in evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654

  13. Interpretation and mapping of geological features using mobile devices for 3D outcrop modelling

    NASA Astrophysics Data System (ADS)

    Buckley, Simon J.; Kehl, Christian; Mullins, James R.; Howell, John A.

    2016-04-01

    Advances in 3D digital geometric characterisation have resulted in widespread adoption in recent years, with photorealistic models utilised for interpretation, quantitative and qualitative analysis, as well as education, in an increasingly diverse range of geoscience applications. Topographic models created using lidar and photogrammetry, optionally combined with imagery from sensors such as hyperspectral and thermal cameras, are now becoming commonplace in geoscientific research. Mobile devices (tablets and smartphones) are maturing rapidly to become powerful field computers capable of displaying and interpreting 3D models directly in the field. With increasingly high-quality digital image capture, combined with on-board sensor pose estimation, mobile devices are, in addition, a source of primary data, which can be employed to enhance existing geological models. Adding supplementary image textures and 2D annotations to photorealistic models is therefore a desirable next step to complement conventional field geoscience. This contribution reports on research into field-based interpretation and conceptual sketching on images and photorealistic models on mobile devices, motivated by the desire to utilise digital outcrop models to generate high quality training images (TIs) for multipoint statistics (MPS) property modelling. Representative training images define sedimentological concepts and spatial relationships between elements in the system, which are subsequently modelled using artificial learning to populate geocellular models. Photorealistic outcrop models are underused sources of quantitative and qualitative information for generating TIs, explored further in this research by linking field and office workflows through the mobile device. Existing textured models are loaded to the mobile device, allowing rendering in a 3D environment. Because interpretation in 2D is more familiar and comfortable for users, the developed application allows new images to be captured with the device's digital camera, and an interface is available for annotating (interpreting) the image using lines and polygons. Image-to-geometry registration is then performed using a developed algorithm, initialised using the coarse pose from the on-board orientation and positioning sensors. The annotations made on the captured images are then available in the 3D model coordinate system for overlay and export. This workflow allows geologists to make interpretations and conceptual models in the field, which can then be linked to and refined in office workflows for later MPS property modelling.

  14. New insights on plant phenological response to temperature revealed from long-term widespread observations in China.

    PubMed

    Zhang, Haicheng; Liu, Shuguang; Regnier, Pierre; Yuan, Wenping

    2018-05-01

    Constraints of temperature on spring plant phenology are closely related to plant growth, vegetation dynamics, and ecosystem carbon cycle. However, the effects of temperature on leaf onset, especially for winter chilling, are still not well understood. Using long-term, widespread in situ phenology observations collected over China for multiple plant species, this study analyzes the quantitative response of leaf onset to temperature, and compares empirical findings with existing theories and modeling approaches, as implemented in 18 phenology algorithms. Results show that the growing degree days (GDD) required for leaf onset vary distinctly among plant species and geographical locations as well as at organizational levels (species and community), pointing to diverse adaptation strategies. Chilling durations (CHD) needed for releasing bud dormancy decline monotonously from cold to warm areas with very limited interspecies variations. Results also reveal that winter chilling is a crucial component of phenology models, and its effect is better captured with an index that accounts for the inhomogeneous effectiveness of low temperature to chilling rate than with the conventional CHD index. The impact of spring warming on leaf onset is nonlinear, better represented by a logistical function of temperature than by the linear function currently implemented in biosphere models. The optimized base temperatures for thermal accumulation and the optimal chilling temperatures are species-dependent and average at 6.9 and 0.2°C, respectively. Overall, plants' chilling requirement is not a constant, and more chilling generally results in less requirement of thermal accumulation for leaf onset. Our results clearly demonstrate multiple deficiencies of the parameters (e.g., base temperature) and algorithms (e.g., method for calculating GDD) in conventional phenology models to represent leaf onset. Therefore, this study not only advances our mechanistic and quantitative understanding of temperature controls on leaf onset but also provides critical information for improving existing phenology models. © 2017 John Wiley & Sons Ltd.
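
    For orientation, the two temperature-response forms contrasted in the record (a conventional linear growing-degree-day accumulation versus a logistic forcing rate) can be sketched as below; the 6.9 degrees C base temperature is the species average reported above, while the steepness and saturation parameters of the logistic form are illustrative placeholders.

        import numpy as np

        def forcing_linear(tmean, t_base=6.9):
            # Conventional growing degree days: accumulate daily mean temperature above the base.
            return np.cumsum(np.maximum(np.asarray(tmean, dtype=float) - t_base, 0.0))

        def forcing_logistic(tmean, t_half=6.9, k=0.5, r_max=10.0):
            # Logistic daily forcing rate that saturates at r_max (k and r_max are placeholders).
            t = np.asarray(tmean, dtype=float)
            return np.cumsum(r_max / (1.0 + np.exp(-k * (t - t_half))))

        # Toy winter-to-spring series; leaf onset is the first day the accumulated
        # forcing exceeds a (chilling-dependent) requirement.
        daily_t = np.concatenate([np.full(60, 2.0), np.linspace(2.0, 18.0, 60)])
        requirement = 150.0
        onset_day = int(np.argmax(forcing_logistic(daily_t) >= requirement))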

  15. A Review of Flood Loss Models as Basis for Harmonization and Benchmarking

    PubMed Central

    Kreibich, Heidi; Franco, Guillermo; Marechal, David

    2016-01-01

    Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning their quality. This requires an assessment of the validity and robustness of loss models as it affects prioritization and investment decisions in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Due to a lack of detailed and reliable flood loss data, first order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. Benchmarking checks whether the models are informed by existing data and knowledge and whether the assumptions made in the models align with that knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before these benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss (or flood vulnerability) relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date, containing nearly a thousand vulnerability functions. These functions are highly heterogeneous and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal. As an example, this paper presents an approach for a quantitative comparison of disparate models via reduction to the joint input variables of all models. Harmonization of models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. Possibilities and challenges are discussed that exist in model harmonization and the application of the inventory in a benchmarking framework. PMID:27454604

  16. A Review of Flood Loss Models as Basis for Harmonization and Benchmarking.

    PubMed

    Gerl, Tina; Kreibich, Heidi; Franco, Guillermo; Marechal, David; Schröter, Kai

    2016-01-01

    Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning their quality. This requires an assessment of the validity and robustness of loss models as it affects prioritization and investment decisions in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Due to a lack of detailed and reliable flood loss data, first order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. Benchmarking checks whether the models are informed by existing data and knowledge and whether the assumptions made in the models align with that knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before these benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss (or flood vulnerability) relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date, containing nearly a thousand vulnerability functions. These functions are highly heterogeneous and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal. As an example, this paper presents an approach for a quantitative comparison of disparate models via reduction to the joint input variables of all models. Harmonization of models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. Possibilities and challenges are discussed that exist in model harmonization and the application of the inventory in a benchmarking framework.

  17. Sensitivity analysis of bi-layered ceramic dental restorations.

    PubMed

    Zhang, Zhongpu; Zhou, Shiwei; Li, Qing; Li, Wei; Swain, Michael V

    2012-02-01

    The reliability and longevity of ceramic prostheses have become a major concern. Existing studies have focused on critical issues from clinical perspectives, but more research is needed on fundamental science and fabrication issues to ensure the longevity and durability of ceramic prostheses. The aim of this paper was to explore, in a quantitative way, how sensitive the thermal and mechanical responses (changes in temperature and thermal residual stress) of bi-layered ceramic systems and crown models are to perturbations of the chosen design variables (e.g., layer thickness and heat transfer coefficient). In this study, three bi-layered ceramic models with different geometries are considered: (i) a simple bi-layered plate, (ii) a simple bi-layered triangle, and (iii) an axisymmetric bi-layered crown. The layer thickness and convective heat transfer coefficient (or cooling rate) are more influential for the porcelain-fused-on-zirconia substrate models. The resultant sensitivities indicate the critical importance of the heat transfer coefficient and of the core-to-veneer thickness ratio for the temperature distributions and residual stresses in each model. The findings provide a quantitative basis for assessing the effects of fabrication uncertainties and optimizing the design of ceramic prostheses. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  18. Numerical modeling of flow focusing: Quantitative characterization of the flow regimes

    NASA Astrophysics Data System (ADS)

    Mamet, V.; Namy, P.; Dedulle, J.-M.

    2017-09-01

    Among droplet generation technologies, the flow focusing technique is a major process due to its control, stability, and reproducibility. In this process, one fluid (the continuous phase) interacts with another one (the dispersed phase) to create small droplets. Experimental assays in the literature on gas-liquid flow focusing have shown that different jet regimes can be obtained depending on the operating conditions. However, the underlying physical phenomena remain unclear, especially mechanical interactions between the fluids and the oscillation phenomenon of the liquid. In this paper, based on published studies, a numerical diphasic model has been developed to take into consideration the mechanical interaction between phases, using the Cahn-Hilliard method to monitor the interface. Depending on the liquid/gas inputs and the geometrical parameters, various regimes can be obtained, from a steady state regime to an unsteady one with liquid oscillation. In the dispersed phase, the model enables us to compute the evolution of fluid flow, both in space (size of the recirculation zone) and in time (period of oscillation). The transition between unsteady and stationary regimes is assessed in relation to liquid and gas dimensionless numbers, showing the existence of critical thresholds. This model successfully highlights, qualitatively and quantitatively, the influence of the geometry of the nozzle, in particular, its inner diameter.

  19. Fast Identification of Biological Pathways Associated with a Quantitative Trait Using Group Lasso with Overlaps

    PubMed Central

    Silver, Matt; Montana, Giovanni

    2012-01-01

    Where causal SNPs (single nucleotide polymorphisms) tend to accumulate within biological pathways, the incorporation of prior pathways information into a statistical model is expected to increase the power to detect true associations in a genetic association study. Most existing pathways-based methods rely on marginal SNP statistics and do not fully exploit the dependence patterns among SNPs within pathways. We use a sparse regression model, with SNPs grouped into pathways, to identify causal pathways associated with a quantitative trait. Notable features of our “pathways group lasso with adaptive weights” (P-GLAW) algorithm include the incorporation of all pathways in a single regression model, an adaptive pathway weighting procedure that accounts for factors biasing pathway selection, and the use of a bootstrap sampling procedure for the ranking of important pathways. P-GLAW takes account of the presence of overlapping pathways and uses a novel combination of techniques to optimise model estimation, making it fast to run, even on whole genome datasets. In a comparison study with an alternative pathways method based on univariate SNP statistics, our method demonstrates high sensitivity and specificity for the detection of important pathways, showing the greatest relative gains in performance where marginal SNP effect sizes are small. PMID:22499682
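
    The exact P-GLAW update is not given in this record; overlapping pathways are commonly handled by duplicating shared SNPs so that the groups become disjoint, after which the core group-lasso step is the block soft-thresholding (proximal) operator sketched below. The group indices and coefficients are toy values, and whether P-GLAW uses precisely this duplication scheme is an assumption.

        import numpy as np

        def group_soft_threshold(beta, groups, lam):
            # Proximal update for a (non-overlapping) group lasso penalty: each pathway's
            # coefficient block is shrunk toward zero, and zeroed out entirely when its
            # Euclidean norm falls below lam.
            out = np.array(beta, dtype=float)
            for idx in groups:
                block = out[idx]
                norm = np.linalg.norm(block)
                out[idx] = 0.0 if norm <= lam else block * (1.0 - lam / norm)
            return out

        # Toy example: six SNP coefficients grouped into two disjoint "pathways".
        beta = np.array([0.9, 0.1, -0.2, 0.05, 0.0, 0.4])
        groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
        print(group_soft_threshold(beta, groups, lam=0.3))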

  20. Synthetic cannabinoids: In silico prediction of the cannabinoid receptor 1 affinity by a quantitative structure-activity relationship model.

    PubMed

    Paulke, Alexander; Proschak, Ewgenij; Sommer, Kai; Achenbach, Janosch; Wunder, Cora; Toennes, Stefan W

    2016-03-14

    The number of new synthetic psychoactive compounds increases steadily. Among these psychoactive compounds, the synthetic cannabinoids (SCBs) are the most popular and serve as a substitute for herbal cannabis. More than 600 of these substances already exist. For some SCBs the in vitro cannabinoid receptor 1 (CB1) affinity is known, but for the majority it is unknown. A quantitative structure-activity relationship (QSAR) model was developed which allows the determination of the SCBs' affinity to CB1 (expressed as the binding constant Ki) without reference substances. The chemically advanced template search descriptor was used for vector representation of the compound structures. The similarity between two molecules was calculated using the Feature-Pair Distribution Similarity. The Ki values were calculated using the Inverse Distance Weighting method. The prediction model was validated using a cross-validation procedure. The predicted Ki values of some new SCBs ranged from 20 (considerably higher affinity to CB1 than THC) to 468 (considerably lower affinity to CB1 than THC). The present QSAR model can serve as a simple, fast and cheap tool to get a first hint of the biological activity of new synthetic cannabinoids or of other new psychoactive compounds. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
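
    For illustration, the Inverse Distance Weighting step named above reduces to a weighted average of known binding constants; the distances, exponent, and reference Ki values below are hypothetical, and the actual descriptor-space distance in the study is the feature-pair distribution similarity.

        import numpy as np

        def idw_ki(distances, ki_train, power=2.0, eps=1e-12):
            # Inverse-distance-weighted estimate of a binding constant from the Ki values
            # of the most similar training compounds.
            d = np.asarray(distances, dtype=float)
            ki = np.asarray(ki_train, dtype=float)
            if np.any(d < eps):                    # an (almost) identical compound is known
                return float(ki[d < eps].mean())
            w = 1.0 / d ** power
            return float(np.sum(w * ki) / np.sum(w))

        # Hypothetical distances to three reference cannabinoids with known Ki values.
        print(idw_ki([0.12, 0.30, 0.55], [20.0, 150.0, 468.0]))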

  1. Structure/activity relationships for biodegradability and their role in environmental assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boethling, R.S.

    1994-12-31

    Assessment of biodegradability is an important part of the review process for both new and existing chemicals under the Toxic Substances Control Act. It is often necessary to estimate biodegradability because experimental data are unavailable. Structure/biodegradability relationships (SBR) are a means to this end. Quantitative SBR have been developed, but this approach has not been very useful because they apply only to a few narrowly defined classes of chemicals. In response to the need for more widely applicable methods, multivariate analysis has been used to develop biodegradability classification models. For example, recent efforts have produced four new models. Two calculate the probability of rapid biodegradation and can be used for classification; the other two models allow semi-quantitative estimation of primary and ultimate biodegradation rates. All are based on multiple regressions against 36 preselected substructures plus molecular weight. Such efforts have been fairly successful by statistical criteria, but in general are hampered by a lack of large and consistent datasets. Knowledge-based expert systems may represent the next step in the evolution of SBR. In principle such systems need not be as severely limited by imperfect datasets. However, the codification of expert knowledge and reasoning is a critical prerequisite. Results of knowledge acquisition exercises and modeling based on them will also be described.

  2. A label field fusion bayesian model and its penalized maximum rand estimator for image segmentation.

    PubMed

    Mignotte, Max

    2010-06-01

    This paper presents a novel segmentation approach based on a Markov random field (MRF) fusion model which aims at combining several segmentation results associated with simpler clustering models in order to achieve a more reliable and accurate segmentation result. The proposed fusion model is derived from the recently introduced probabilistic Rand measure for comparing one segmentation result to one or more manual segmentations of the same image. This non-parametric measure allows us to derive an appealing fusion model of label fields, easily expressed as a Gibbs distribution, or as a nonstationary MRF model defined on a complete graph. Concretely, this Gibbs energy model encodes the set of binary constraints, in terms of pairs of pixel labels, provided by each of the segmentation results to be fused. Combined with a prior distribution, this energy-based Gibbs model also allows for definition of an interesting penalized maximum probabilistic rand estimator with which the fusion of simple, quickly estimated segmentation results appears as an interesting alternative to the complex segmentation models existing in the literature. This fusion framework has been successfully applied on the Berkeley image database. The experiments reported in this paper demonstrate that the proposed method is efficient in terms of visual evaluation and quantitative performance measures and performs well compared to the best existing state-of-the-art segmentation methods recently proposed in the literature.

  3. Discrete diffraction managed solitons: Threshold phenomena and rapid decay for general nonlinearities

    NASA Astrophysics Data System (ADS)

    Choi, Mi-Ran; Hundertmark, Dirk; Lee, Young-Ran

    2017-10-01

    We prove a threshold phenomenon for the existence/non-existence of energy minimizing solitary solutions of the diffraction management equation for strictly positive and zero average diffraction. Our methods allow for a large class of nonlinearities (they are, for example, allowed to change sign) and require only the weakest possible condition on the local diffraction profile, namely that it be locally integrable. The solutions are found as minimizers of a nonlinear and nonlocal variational problem which is translation invariant. There exists a critical threshold λcr such that minimizers for this variational problem exist if their power is bigger than λcr and no minimizers exist with power less than the critical threshold. We also give simple criteria for the finiteness and strict positivity of the critical threshold. Our proof of existence of minimizers is rather direct and avoids the use of Lions' concentration compactness argument. Furthermore, we give precise quantitative lower bounds on the exponential decay rate of the diffraction management solitons, which confirm the physical heuristic prediction for the asymptotic decay rate. Moreover, for ground state solutions, these bounds give a quantitative lower bound for the divergence of the exponential decay rate in the limit of vanishing average diffraction. For zero average diffraction, we prove quantitative bounds which show that the solitons decay much faster than exponentially. Our results considerably extend and strengthen the results of Hundertmark and Lee [J. Nonlinear Sci. 22, 1-38 (2012) and Commun. Math. Phys. 309(1), 1-21 (2012)].

  4. Stability basin estimates fall risk from observed kinematics, demonstrated on the Sit-to-Stand task.

    PubMed

    Shia, Victor; Moore, Talia Yuki; Holmes, Patrick; Bajcsy, Ruzena; Vasudevan, Ram

    2018-04-27

    The ability to quantitatively measure stability is essential to ensuring the safety of locomoting systems. While the response to perturbation directly reflects the stability of a motion, this experimental method puts human subjects at risk. Unfortunately, existing indirect methods for estimating stability from unperturbed motion have been shown to have limited predictive power. This paper leverages recent advances in dynamical systems theory to accurately estimate the stability of human motion without requiring perturbation. This approach relies on kinematic observations of a nominal Sit-to-Stand motion to construct an individual-specific dynamic model, input bounds, and feedback control that are then used to compute the set of perturbations from which the model can recover. This set, referred to as the stability basin, was computed for 14 individuals, and was able to successfully differentiate between less and more stable Sit-to-Stand strategies for each individual with greater accuracy than existing methods. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  6. Balancing on tightropes and slacklines

    PubMed Central

    Paoletti, P.; Mahadevan, L.

    2012-01-01

    Balancing on a tightrope or a slackline is an example of a neuromechanical task where the whole body both drives and responds to the dynamics of the external environment, often on multiple timescales. Motivated by a range of neurophysiological observations, here we formulate a minimal model for this system and use optimal control theory to design a strategy for maintaining an upright position. Our analysis of the open and closed-loop dynamics shows the existence of an optimal rope sag where balancing requires minimal effort, consistent with qualitative observations and suggestive of strategies for optimizing balancing performance while standing and walking. Our consideration of the effects of nonlinearities, potential parameter coupling and delays on the overall performance shows that although these factors change the results quantitatively, the existence of an optimal strategy persists. PMID:22513724

  7. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways. Calculus, calculus-based physics, chemistry, statistics, programming and linear algebra were viewed as important course preparation for a successful graduate experience. A set of recommendations for departments and for new community resources includes ideas for infusing quantitative reasoning throughout the undergraduate experience and mechanisms for learning from successful experiments in both geoscience and mathematics. A full list of participants, summaries of the meeting discussion and recommendations are available at http://serc.carleton.edu/quantskills/winter06/index.html. These documents, crafted by a small but diverse group, can serve as a starting point for broader community discussion of the quantitative preparation of future geoscience graduate students.

  8. Incorporating learning goals about modeling into an upper-division physics laboratory experiment

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Finkelstein, Noah; Lewandowski, H. J.

    2014-09-01

    Implementing a laboratory activity involves a complex interplay among learning goals, available resources, feedback about the existing course, best practices for teaching, and an overall philosophy about teaching labs. Building on our previous work, which described a process of transforming an entire lab course, we now turn our attention to how an individual lab activity on the polarization of light was redesigned to include a renewed emphasis on one broad learning goal: modeling. By using this common optics lab as a concrete case study of a broadly applicable approach, we highlight many aspects of the activity development and show how modeling is used to integrate sophisticated conceptual and quantitative reasoning into the experimental process through the various aspects of modeling: constructing models, making predictions, interpreting data, comparing measurements with predictions, and refining models. One significant outcome is a natural way to integrate an analysis and discussion of systematic error into a lab activity.
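
    The record does not state which physical model the polarization activity uses; assuming the canonical choice, Malus's law, the modeling cycle described above (construct a model, predict, compare with measurements, refine) can be sketched as follows, with the offset, background, and noise levels as stand-ins for real lab data.

        import numpy as np

        def malus(theta_deg, i0=1.0, theta0_deg=0.0, background=0.0):
            # Malus's law with an angular offset and a constant background term.
            th = np.radians(np.asarray(theta_deg, dtype=float) - theta0_deg)
            return i0 * np.cos(th) ** 2 + background

        angles = np.arange(0.0, 181.0, 10.0)                 # analyzer angles (degrees)
        predicted = malus(angles)                            # idealized model prediction
        rng = np.random.default_rng(1)
        measured = (malus(angles, i0=1.0, theta0_deg=5.0, background=0.02)
                    + rng.normal(0.0, 0.01, angles.size))    # stand-in for measurements
        residuals = measured - predicted  # systematic structure here motivates refining the model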

  9. A new adaptive L1-norm for optimal descriptor selection of high-dimensional QSAR classification model for anti-hepatitis C virus activity of thiourea derivatives.

    PubMed

    Algamal, Z Y; Lee, M H

    2017-01-01

    A high-dimensional quantitative structure-activity relationship (QSAR) classification model typically contains a large number of irrelevant and redundant descriptors. In this paper, a new descriptor selection method for QSAR classification model estimation is proposed by adding a new weight inside the L1-norm. The experimental results of classifying the anti-hepatitis C virus activity of thiourea derivatives demonstrate that the proposed descriptor selection method performs effectively and competitively compared with other existing penalized methods in terms of classification performance on both the training and the testing datasets. Moreover, the results of the stability test and the applicability domain analysis indicate a robust QSAR classification model. It is evident from the results that the developed QSAR classification model could conceivably be employed for further high-dimensional QSAR classification studies.
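
    The specific weight added inside the L1-norm is not given in this record. A generic adaptive (weighted) L1 construction, sketched below under that assumption, rescales each descriptor by a data-driven weight so that descriptors with small initial coefficients are penalised more heavily; the toy data and parameter values are illustrative only.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def adaptive_l1_fit(X, y, gamma=1.0, C=1.0, eps=1e-6):
            # Step 1: an initial L2-penalised fit supplies per-descriptor weights.
            init = LogisticRegression(penalty="l2", max_iter=1000).fit(X, y)
            w = 1.0 / (np.abs(init.coef_.ravel()) + eps) ** gamma
            # Step 2: a weighted L1 penalty is equivalent to rescaling the columns,
            # fitting an ordinary L1 model, and mapping the coefficients back.
            clf = LogisticRegression(penalty="l1", C=C, solver="liblinear",
                                     max_iter=1000).fit(X / w, y)
            return clf.coef_.ravel() / w    # coefficients on the original descriptor scale

        # Toy usage: 40 compounds, 200 descriptors, binary activity labels.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 200))
        y = (X[:, 0] - 0.8 * X[:, 3] + 0.1 * rng.normal(size=40) > 0).astype(int)
        selected = np.flatnonzero(adaptive_l1_fit(X, y) != 0)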

  10. Decision analytic models for Alzheimer's disease: state of the art and future directions.

    PubMed

    Cohen, Joshua T; Neumann, Peter J

    2008-05-01

    Decision analytic policy models for Alzheimer's disease (AD) enable researchers and policy makers to investigate questions about the costs and benefits of a wide range of existing and potential screening, testing, and treatment strategies. Such models permit analysts to compare existing alternatives, explore hypothetical scenarios, and test the strength of underlying assumptions in an explicit, quantitative, and systematic way. Decision analytic models can best be viewed as complementing clinical trials both by filling knowledge gaps not readily addressed by empirical research and by extrapolating beyond the surrogate markers recorded in a trial. We identified and critiqued 13 distinct AD decision analytic policy models published since 1997. Although existing models provide useful insights, they also have a variety of limitations. (1) They generally characterize disease progression in terms of cognitive function and do not account for other distinguishing features, such as behavioral symptoms, functional performance, and the emotional well-being of AD patients and caregivers. (2) Many describe disease progression in terms of a limited number of discrete states, thus constraining the level of detail that can be used to characterize both changes in patient status and the relationships between disease progression and other factors, such as residential status, that influence outcomes of interest. (3) They have focused almost exclusively on evaluating drug treatments, thus neglecting other disease management strategies and combinations of pharmacologic and nonpharmacologic interventions. Future AD models should facilitate more realistic and compelling evaluations of various interventions to address the disease. An improved model will allow decision makers to better characterize the disease, to better assess the costs and benefits of a wide range of potential interventions, and to better evaluate the incremental costs and benefits of specific interventions used in conjunction with other disease management strategies.

  11. Random network model of electrical conduction in two-phase rock

    NASA Astrophysics Data System (ADS)

    Fuji-ta, Kiyoshi; Seki, Masayuki; Ichiki, Masahiro

    2018-05-01

    We developed a cell-type lattice model to clarify the interconnected conductivity mechanism of two-phase rock. We quantified electrical conduction networks in rock and evaluated electrical conductivity models of the two-phase interaction. Considering the existence ratio of conductive and resistive cells in the model, we generated natural matrix cells simulating a natural mineral distribution pattern, using Mersenne Twister random numbers. The most important and prominent feature of the model simulation is a drastic increase in the pseudo-conductivity index for conductor ratio R > 0.22. This index in the model increased from 10^-4 to 10^0 between R = 0.22 and 0.9, a change of four orders of magnitude. We compared our model responses with results from previous model studies. Although the pseudo-conductivity computed by the model differs slightly from that of the previous model, model responses can account for the conductivity change. Our modeling is thus effective for quantitatively estimating the degree of interconnection of rock and minerals.
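
    As a toy illustration of the lattice construction (not the paper's model), the sketch below fills a square lattice at a given conductor ratio with Mersenne Twister random numbers and tests whether a conductive cluster bridges two opposite faces. A plain 2D square lattice with 4-connectivity percolates near R of roughly 0.59 rather than the R of about 0.22 reported above, so the threshold itself is not expected to match; only the random-lattice construction and interconnection test are illustrated.

        import numpy as np
        from scipy.ndimage import label

        def spans(conductor_ratio, n=200, seed=12345):
            # Fill an n-by-n cell lattice using NumPy's legacy RandomState (Mersenne
            # Twister) and test whether any 4-connected cluster of conductive cells
            # bridges the top and bottom faces.
            rng = np.random.RandomState(seed)
            grid = rng.random_sample((n, n)) < conductor_ratio
            labels, _ = label(grid)
            top = labels[0][labels[0] > 0]
            bottom = labels[-1][labels[-1] > 0]
            return bool(np.intersect1d(top, bottom).size)

        for r in (0.15, 0.30, 0.45, 0.60, 0.75):
            print(r, spans(r))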

  12. Automated feature extraction and spatial organization of seafloor pockmarks, Belfast Bay, Maine, USA

    USGS Publications Warehouse

    Andrews, Brian D.; Brothers, Laura L.; Barnhardt, Walter A.

    2010-01-01

    Seafloor pockmarks occur worldwide and may represent millions of m^3 of continental shelf erosion, but few numerical analyses of their morphology and spatial distribution exist. We introduce a quantitative definition of pockmark morphology and, based on this definition, propose a three-step geomorphometric method to identify and extract pockmarks from high-resolution swath bathymetry. We apply this GIS-implemented approach to 25 km^2 of bathymetry collected in the Belfast Bay, Maine USA pockmark field. Our model extracted 1767 pockmarks and found a linear pockmark depth-to-diameter ratio for pockmarks field-wide. Mean pockmark depth is 7.6 m and mean diameter is 84.8 m. Pockmark distribution is non-random, and nearly half of the field's pockmarks occur in chains. The most prominent chains are oriented semi-normal to the steepest gradient in Holocene sediment thickness. A descriptive model yields field-wide spatial statistics indicating that pockmarks are distributed in non-random clusters. Results enable quantitative comparison of pockmarks in fields worldwide as well as similar concave features, such as impact craters, dolines, or salt pools.

  13. Impacts of Fluid Dynamics Simulation in Study of Nasal Airflow Physiology and Pathophysiology in Realistic Human Three-Dimensional Nose Models

    PubMed Central

    Lee, Heow Peuh; Gordon, Bruce R.

    2012-01-01

    During the past decades, numerous computational fluid dynamics (CFD) studies, constructed from CT or MRI images, have simulated human nasal models. As compared to rhinomanometry and acoustic rhinometry, which provide quantitative information only of nasal airflow, resistance, and cross sectional areas, CFD enables additional measurements of airflow passing through the nasal cavity that help visualize the physiologic impact of alterations in intranasal structures. Therefore, it becomes possible to quantitatively measure, and visually appreciate, the airflow pattern (laminar or turbulent), velocity, pressure, wall shear stress, particle deposition, and temperature changes at different flow rates, in different parts of the nasal cavity. The effects of both existing anatomical factors, as well as post-operative changes, can be assessed. With recent improvements in CFD technology and computing power, there is a promising future for CFD to become a useful tool in planning, predicting, and evaluating outcomes of nasal surgery. This review discusses the possibilities and potential impacts, as well as technical limitations, of using CFD simulation to better understand nasal airflow physiology. PMID:23205221

  14. Quantitative analysis and prediction of G-quadruplex forming sequences in double-stranded DNA

    PubMed Central

    Kim, Minji; Kreig, Alex; Lee, Chun-Ying; Rube, H. Tomas; Calvert, Jacob; Song, Jun S.; Myong, Sua

    2016-01-01

    G-quadruplex (GQ) is a four-stranded DNA structure that can be formed in guanine-rich sequences. GQ structures have been proposed to regulate diverse biological processes including transcription, replication, translation and telomere maintenance. Recent studies have demonstrated the existence of GQ DNA in live mammalian cells and a significant number of potential GQ forming sequences in the human genome. We present a systematic and quantitative analysis of GQ folding propensity on a large set of 438 GQ forming sequences in double-stranded DNA by integrating fluorescence measurement, single-molecule imaging and computational modeling. We find that short minimum loop length and the thymine base are two main factors that lead to high GQ folding propensity. Linear and Gaussian process regression models further validate that the GQ folding potential can be predicted with high accuracy based on the loop length distribution and the nucleotide content of the loop sequences. Our study provides important new parameters that can inform the evaluation and classification of putative GQ sequences in the human genome. PMID:27095201

  15. Comparing models of the periodic variations in spin-down and beamwidth for PSR B1828-11

    NASA Astrophysics Data System (ADS)

    Ashton, G.; Jones, D. I.; Prix, R.

    2016-05-01

    We build a framework using tools from Bayesian data analysis to evaluate models explaining the periodic variations in spin-down and beamwidth of PSR B1828-11. The available data consist of the time-averaged spin-down rate, which displays a distinctive double-peaked modulation, and measurements of the beamwidth. Two concepts exist in the literature that are capable of explaining these variations; we formulate predictive models from these and quantitatively compare them. The first concept is phenomenological and stipulates that the magnetosphere undergoes periodic switching between two metastable states as first suggested by Lyne et al. The second concept, precession, was first considered as a candidate for the modulation of B1828-11 by Stairs et al. We quantitatively compare models built from these concepts using a Bayesian odds ratio. Because the phenomenological switching model itself was informed by these data in the first place, it is difficult to specify appropriate parameter-space priors that can be trusted for an unbiased model comparison. Therefore, we first perform a parameter estimation using the spin-down data, and then use the resulting posterior distributions as priors for model comparison on the beamwidth data. We find that a precession model with a simple circular Gaussian beam geometry fails to appropriately describe the data, while allowing for a more general beam geometry provides a good fit to the data. The resulting odds between the precession model (with a general beam geometry) and the switching model are estimated as 10^(2.7±0.5) in favour of the precession model.

  16. From nucleation to coalescence of Cu2O islands during in situ oxidation of Cu(001)

    NASA Astrophysics Data System (ADS)

    Yang, J. C.; Evan, D.; Tropia, L.

    2002-07-01

    The nucleation, growth, and coalescence of Cu2O islands due to oxidation of Cu(001) films were visualized by in situ ultrahigh-vacuum transmission electron microscopy. We have previously demonstrated that the nucleation and initial growth of copper oxides is dominated by oxygen surface diffusion. These surface models have been extended to quantitatively represent the coalescence behavior of copper oxidation in the framework of the Johnson-Mehl-Avrami-Kolmogorov theory. An excellent agreement exists between the experimental data of nucleation to coalescence with the surface model. The implication could be an alternate paradigm for passivation and oxidation, since classic theories assume uniform film growth.
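
    As a rough illustration of the JMAK framework mentioned above, the sketch below fits the Avrami expression X(t) = 1 - exp(-k t^n) to hypothetical oxide-coverage data; the numbers are invented and only the functional form comes from the theory.

      import numpy as np
      from scipy.optimize import curve_fit

      def jmak(t, k, n):
          # Fraction of the surface covered by coalescing oxide islands
          return 1.0 - np.exp(-k * t ** n)

      t = np.linspace(0.1, 60.0, 30)                  # oxidation time, arbitrary units
      rng = np.random.default_rng(2)
      coverage = jmak(t, 0.002, 2.0) + rng.normal(0, 0.02, t.size)   # synthetic data

      (k_fit, n_fit), _ = curve_fit(jmak, t, coverage, p0=[0.01, 1.5])
      print(f"fitted rate constant k = {k_fit:.4f}, Avrami exponent n = {n_fit:.2f}")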

  17. XDesign: an open-source software package for designing X-ray imaging phantoms and experiments.

    PubMed

    Ching, Daniel J; Gürsoy, Doğa

    2017-03-01

    The development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  18. Development of working hypotheses linking management of the Missouri River to population dynamics of Scaphirhynchus albus (pallid sturgeon)

    USGS Publications Warehouse

    Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.

    2016-01-20

    The initial set of candidate hypotheses provides a useful starting point for quantitative modeling and adaptive management of the river and species. We anticipate that hypotheses will change from the set of working management hypotheses as adaptive management progresses. Importantly, hypotheses that have been filtered out of our multistep process are not permanently discarded: they are archived, and if the existing hypotheses prove inadequate to explain observed population dynamics, new hypotheses can be created or filtered hypotheses can be reinstated.

  19. XDesign: An open-source software package for designing X-ray imaging phantoms and experiments

    DOE PAGES

    Ching, Daniel J.; Gürsoy, Doğa

    2017-02-21

    Here, the development of new methods or utilization of current X-ray computed tomography methods is impeded by the substantial amount of expertise required to design an X-ray computed tomography experiment from beginning to end. In an attempt to make material models, data acquisition schemes and reconstruction algorithms more accessible to researchers lacking expertise in some of these areas, a software package is described here which can generate complex simulated phantoms and quantitatively evaluate new or existing data acquisition schemes and image reconstruction algorithms for targeted applications.

  20. Quantitative determination of wool in textile by near-infrared spectroscopy and multivariate models.

    PubMed

    Chen, Hui; Tan, Chao; Lin, Zan

    2018-08-05

    The wool content in textiles is a key quality index, and its quantitative analysis is important because adulteration is common in both raw and finished textiles. Conventional methods can be complicated, destructive, time-consuming, and environmentally unfriendly, so developing a quick, easy-to-use and green alternative is of interest. The work focuses on exploring the feasibility of combining near-infrared (NIR) spectroscopy with several partial least squares (PLS)-based algorithms and the elastic component regression (ECR) algorithm for measuring wool content in textile. A total of 108 cloth samples with wool content ranging from 0% to 100% (w/w) were collected, all of them compositions actually found on the market. The dataset was divided equally into training and test sets for developing and validating calibration models. When using local PLS, the original spectral axis was split into 20 sub-intervals. No obvious difference in performance was seen among the local PLS models. The ECR model is comparable or superior to the other models owing to its flexibility, i.e., it forms a transition between PCR and PLS. ECR combined with the NIR technique may thus be a potential method for determining wool content in textile products, and it might have regulatory advantages by avoiding time-consuming and environmentally unfriendly chemical analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
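
    The calibration idea above (NIR spectra in, wool percentage out, via PLS) can be sketched as follows in Python; the spectra are simulated and the component count is an arbitrary choice, so this is an illustration rather than the paper's procedure.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)
      n_samples, n_wavelengths = 108, 200
      wool = rng.uniform(0, 100, n_samples)           # wool content, % (w/w)
      basis = rng.normal(0, 1, n_wavelengths)         # a single "wool" spectral signature (assumed)
      spectra = np.outer(wool / 100.0, basis) + rng.normal(0, 0.05, (n_samples, n_wavelengths))

      X_train, X_test, y_train, y_test = train_test_split(spectra, wool, test_size=0.5, random_state=0)
      pls = PLSRegression(n_components=5).fit(X_train, y_train)
      print("test-set R^2:", pls.score(X_test, y_test))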

  1. On the complex relationship between energy expenditure and longevity: Reconciling the contradictory empirical results with a simple theoretical model.

    PubMed

    Hou, Chen; Amunugama, Kaushalya

    2015-07-01

    The relationship between energy expenditure and longevity has been a central theme in aging studies. Empirical studies have yielded controversial results, which cannot be reconciled by existing theories. In this paper, we present a simple theoretical model based on first principles of energy conservation and allometric scaling laws. The model takes into considerations the energy tradeoffs between life history traits and the efficiency of the energy utilization, and offers quantitative and qualitative explanations for a set of seemingly contradictory empirical results. We show that oxidative metabolism can affect cellular damage and longevity in different ways in animals with different life histories and under different experimental conditions. Qualitative data and the linearity between energy expenditure, cellular damage, and lifespan assumed in previous studies are not sufficient to understand the complexity of the relationships. Our model provides a theoretical framework for quantitative analyses and predictions. The model is supported by a variety of empirical studies, including studies on the cellular damage profile during ontogeny; the intra- and inter-specific correlations between body mass, metabolic rate, and lifespan; and the effects on lifespan of (1) diet restriction and genetic modification of growth hormone, (2) the cold and exercise stresses, and (3) manipulations of antioxidant. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  2. An ecohydrologic model for a shallow groundwater urban environment.

    PubMed

    Arden, Sam; Ma, Xin Cissy; Brown, Mark

    2014-01-01

    The urban environment is a patchwork of natural and artificial surfaces that results in complex interactions with and impacts to natural hydrologic cycles. Evapotranspiration is a major hydrologic flow that is often altered through urbanization, although the mechanisms of change are sometimes difficult to tease out due to difficulty in effectively simulating soil-plant-atmosphere interactions. This paper introduces a simplified yet realistic model that is a combination of existing surface runoff and ecohydrology models designed to increase the quantitative understanding of complex urban hydrologic processes. Results demonstrate that the model is capable of simulating the long-term variability of major hydrologic fluxes as a function of impervious surface, temperature, water table elevation, canopy interception, soil characteristics, precipitation and complex mechanisms of plant water uptake. These understandings have potential implications for holistic urban water system management.

  3. Human eyeball model reconstruction and quantitative analysis.

    PubMed

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important for diagnosing eye diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involved image segmentation, registration, B-spline surface fitting and subdivision surface fitting, none of which required manual interaction. From the high-resolution resultant models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we proposed two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface, respectively. The experiment results showed that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects, which can potentially be used for eye disease diagnosis.

  4. Perspectives on Non-Animal Alternatives for Assessing Sensitization Potential in Allergic Contact Dermatitis

    PubMed Central

    Sharma, Nripen S.; Jindal, Rohit; Mitra, Bhaskar; Lee, Serom; Li, Lulu; Maguire, Tim J.; Schloss, Rene; Yarmush, Martin L.

    2014-01-01

    Skin sensitization remains a major environmental and occupational health hazard. Animal models have been used as the gold standard method of choice for estimating chemical sensitization potential. However, a growing international drive and consensus for minimizing animal usage have prompted the development of in vitro methods to assess chemical sensitivity. In this paper, we examine existing approaches including in silico models, cell and tissue based assays for distinguishing between sensitizers and irritants. The in silico approaches that have been discussed include Quantitative Structure Activity Relationships (QSAR) and QSAR based expert models that correlate chemical molecular structure with biological activity and mechanism based read-across models that incorporate compound electrophilicity. The cell and tissue based assays rely on an assortment of mono and co-culture cell systems in conjunction with 3D skin models. Given the complexity of allergen induced immune responses, and the limited ability of existing systems to capture the entire gamut of cellular and molecular events associated with these responses, we also introduce a microfabricated platform that can capture all the key steps involved in allergic contact sensitivity. Finally, we describe the development of an integrated testing strategy comprised of two or three tier systems for evaluating sensitization potential of chemicals. PMID:24741377

  5. Telehealth as gatekeeper: policy implications for geography and scope of services.

    PubMed

    Kraetschmer, Nancy M; Deber, Raisa B; Dick, Paul; Jennett, Penny

    2009-09-01

    Why, despite enthusiasm, is telehealth still a relatively minor part of healthcare delivery in many health systems? We examined two less-considered policy issues: (1) the scope of services being offered by telehealth and how this matches existing arrangements for insured services; and (2) how the ability of telehealth services to minimize barriers associated with geography is dealt with in a system organized and financed on geographical boundaries. Fifty-three semistructured interviews with key stakeholders involved in the management of 43 Canadian telehealth programs were conducted. In addition, quantitative activity data were analyzed from 33 telehealth programs. Two telehealth approaches emerged: telephone-based (N = 3) and video-conferencing-based (N = 40). Most programs reflected, rather than superseded, existing geographical boundaries; with the technology being used, the videoconferencing models imposed significant barriers to unfettered access by outlying communities because they required sites to acquire expensive technology, be affiliated with an existing telehealth network, and schedule visits in advance. In consequence, much activity was administrative and educational, rather than clinical, and often extended beyond the set of mandatory insured services. Despite high hopes that telehealth would improve access to care for rural/remote areas, gatekeeping inherent in certain telehealth systems imposes barriers to unfettered use by rural/remote areas, although it does facilitate other valued activities. Policy approaches are needed to promote a closer match between the expectations for telehealth and the realities reflected by many existing models.

  6. Self-consistent approach for neutral community models with speciation

    NASA Astrophysics Data System (ADS)

    Haegeman, Bart; Etienne, Rampal S.

    2010-03-01

    Hubbell's neutral model provides a rich theoretical framework to study ecological communities. By incorporating both ecological and evolutionary time scales, it allows us to investigate how communities are shaped by speciation processes. The speciation model in the basic neutral model is particularly simple, describing speciation as a point-mutation event occurring at the birth of a single individual. The stationary species abundance distribution of the basic model, which can be solved exactly, fits empirical species abundance distributions surprisingly well. More realistic speciation models have been proposed, such as the random-fission model in which new species appear by splitting up existing species. However, no analytical solution is available for these models, impeding quantitative comparison with data. Here, we present a self-consistent approximation method for neutral community models with various speciation modes, including random fission. We derive explicit formulas for the stationary species abundance distribution, which agree very well with simulations. We expect that our approximation method will be useful to study other speciation processes in neutral community models as well.

  7. Assessing pesticide risks to threatened and endangered species using population models: Findings and recommendations from a CropLife America Science Forum.

    PubMed

    Forbes, V E; Brain, R; Edwards, D; Galic, N; Hall, T; Honegger, J; Meyer, C; Moore, D R J; Nacci, D; Pastorok, R; Preuss, T G; Railsback, S F; Salice, C; Sibly, R M; Tenhumberg, B; Thorbek, P; Wang, M

    2015-07-01

    This brief communication reports on the main findings and recommendations from the 2014 Science Forum organized by CropLife America. The aim of the Forum was to gain a better understanding of the current status of population models and how they could be used in ecological risk assessments for threatened and endangered species potentially exposed to pesticides in the United States. The Forum panelists' recommendations are intended to assist the relevant government agencies with implementation of population modeling in future endangered species risk assessments for pesticides. The Forum included keynote presentations that provided an overview of current practices, highlighted the findings of a recent National Academy of Sciences report and its implications, reviewed the main categories of existing population models and the types of risk expressions that can be produced as model outputs, and provided examples of how population models are currently being used in different legislative contexts. The panel concluded that models developed for listed species assessments should provide quantitative risk estimates, incorporate realistic variability in environmental and demographic factors, integrate complex patterns of exposure and effects, and use baseline conditions that include present factors that have caused the species to be listed (e.g., habitat loss, invasive species) or have resulted in positive management action. Furthermore, the panel advocates for the formation of a multipartite advisory committee to provide best available knowledge and guidance related to model implementation and use, to address such needs as more systematic collection, digitization, and dissemination of data for listed species; consideration of the newest developments in good modeling practice; comprehensive review of existing population models and their applicability for listed species assessments; and development of case studies using a few well-tested models for particular species to demonstrate proof of concept. To advance our common goals, the panel recommends the following as important areas for further research and development: quantitative analysis of the causes of species listings to guide model development; systematic assessment of the relative role of toxicity versus other factors in driving pesticide risk; additional study of how interactions between density dependence and pesticides influence risk; and development of pragmatic approaches to assessing indirect effects of pesticides on listed species. © 2015 SETAC.

  8. A strategic management model for evaluation of health, safety and environmental performance.

    PubMed

    Abbaspour, Majid; Toutounchian, Solmaz; Roayaei, Emad; Nassiri, Parvin

    2012-05-01

    Strategic health, safety, and environmental management system (HSE-MS) involves systematic and cooperative planning in each phase of the lifecycle of a project to ensure that interaction among the industry group, client, contractor, stakeholder, and host community exists with the highest level of health, safety, and environmental standard performances. Therefore, it seems necessary to assess the HSE-MS performance of contractor(s) by a comparative strategic management model with the aim of continuous improvement. The present Strategic Management Model (SMM) has been illustrated by a case study, and the results show that the model is a suitable management tool for decision making in a contract environment, especially in oil and gas fields, based on accepted international standards and within the framework of the Deming management cycle. To develop this model, a data bank has been created, which includes statistical data calculated by converting qualitative HSE performance data into quantitative values. Based on this, the structure of the model has been formed by defining HSE performance indicators according to the HSE-MS model. In total, 178 indicators have been selected and grouped into four attributes. Model output provides quantitative measures of HSE-MS performance as a percentage of an ideal level, with a maximum possible score for each attribute. Defining the strengths and weaknesses of the contractor(s) is another capability of this model. In addition, the model provides a ranking that could be used as the basis for decision making at the contractors' pre-qualification phase or during the execution of the project.

  9. Revised planetary protection policy for solar system exploration.

    PubMed

    DeVincenzi, D L; Stabekis, P D

    1984-01-01

    In order to control contamination of planets by terrestrial microorganisms and organic constituents, U.S. planetary missions have been governed by a planetary protection (or planetary quarantine) policy which has changed little since 1972. This policy has recently been reviewed in light of new information obtained from planetary exploration during the past decade and because of changes to, or uncertainties in, some parameters used in the existing quantitative approach. On the basis of this analysis, a revised planetary protection policy with the following key features is proposed: deemphasizing the use of mathematical models and quantitative analyses; establishing requirements for target planet/mission type (i.e., orbiter, lander, etc.) combinations; considering sample return missions a separate category; simplifying documentation; and imposing implementing procedures (i.e., trajectory biasing, cleanroom assembly, spacecraft sterilization, etc.) by exception, i.e., only if the planet/mission combination warrants such controls.

  10. Animal versus human oral drug bioavailability: Do they correlate?

    PubMed Central

    Musther, Helen; Olivares-Morales, Andrés; Hatley, Oliver J.D.; Liu, Bo; Rostami Hodjegan, Amin

    2014-01-01

    Oral bioavailability is a key consideration in development of drug products, and the use of preclinical species in predicting bioavailability in human has long been debated. In order to clarify whether any correlation between human and animal bioavailability exists, an extensive analysis of the published literature data was conducted. Due to the complex nature of bioavailability calculations, inclusion criteria were applied to ensure integrity of the data. A database of 184 compounds was assembled. Linear regression for the reported compounds indicated no strong or predictive correlations to human data for all species, individually and combined. The lack of correlation in this extended dataset highlights that animal bioavailability is not quantitatively predictive of bioavailability in human. Although qualitative (high/low bioavailability) indications might be possible, models taking into account species-specific factors that may affect bioavailability are recommended for developing quantitative prediction. PMID:23988844
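
    A minimal sketch of the kind of correlation test described above, using invented bioavailability values for 184 hypothetical compounds:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      animal_F = rng.uniform(5, 95, 184)              # oral bioavailability (%) in a preclinical species
      human_F = np.clip(0.3 * animal_F + rng.normal(0, 25, 184), 0, 100)   # weakly related, noisy

      res = stats.linregress(animal_F, human_F)
      print(f"slope = {res.slope:.2f}, r^2 = {res.rvalue ** 2:.2f}, p = {res.pvalue:.3g}")
      # A low r^2 here would echo the paper's conclusion that animal bioavailability
      # is not quantitatively predictive of human bioavailability.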

  11. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-06-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups; then it constructs the Voronoi diagram between the two groups using the triangular network; after this, the normal of each Voronoi edge is calculated and the quantitative expression of the direction relations is constructed; finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.

  12. An approach to computing direction relations between separated object groups

    NASA Astrophysics Data System (ADS)

    Yan, H.; Wang, Z.; Li, J.

    2013-09-01

    Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups, and then it constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
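
    A simplified Python sketch of the core idea above (Voronoi edges between the two groups, edge normals, then a qualitative direction) is given below; the point coordinates and the eight-sector mapping are assumptions made for illustration.

      import numpy as np
      from scipy.spatial import Voronoi

      group_a = np.array([[0.0, 0.0], [1.0, 0.2], [0.5, 1.0]])
      group_b = np.array([[4.0, 3.0], [5.0, 3.5], [4.5, 4.2]])
      points = np.vstack([group_a, group_b])
      labels = np.array([0] * len(group_a) + [1] * len(group_b))

      vor = Voronoi(points)
      normals = []
      for p, q in vor.ridge_points:            # pairs of input points sharing a Voronoi edge
          if labels[p] != labels[q]:           # keep only edges separating the two groups
              a, b = (p, q) if labels[p] == 0 else (q, p)
              d = points[b] - points[a]        # the edge normal points from group A toward group B
              normals.append(d / np.linalg.norm(d))

      mean_dir = np.mean(normals, axis=0)
      angle = np.degrees(np.arctan2(mean_dir[1], mean_dir[0])) % 360
      sectors = ["east", "northeast", "north", "northwest", "west", "southwest", "south", "southeast"]
      print("quantitative direction (deg):", round(angle, 1),
            "-> qualitative:", sectors[int(((angle + 22.5) % 360) // 45)])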

  13. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  14. Using Uncertainty Quantification to Guide Development and Improvements of a Regional-Scale Model of the Coastal Lowlands Aquifer System Spanning Texas, Louisiana, Mississippi, Alabama and Florida

    NASA Astrophysics Data System (ADS)

    Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.

    2017-12-01

    Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of an UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to the process of deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess in a quantitative framework the effect model package and property improvements have on the ability to represent past-system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and guide next steps in model development prior to rigorous history matching by using PEST++ parameter estimation code.
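
    The FOSM step mentioned above can be illustrated with a small linear-propagation sketch: prior parameter uncertainty is mapped through sensitivities (Jacobians) to a forecast, and conditioning on observations shrinks it. All matrices below are invented placeholders, not CLAS model quantities.

      import numpy as np

      rng = np.random.default_rng(5)
      n_par, n_obs = 4, 6
      C_prior = np.diag([1.0, 0.5, 2.0, 0.8])         # prior parameter covariance
      J_obs = rng.normal(0, 1, (n_obs, n_par))        # sensitivities of observations to parameters
      R = 0.1 * np.eye(n_obs)                         # observation noise covariance
      j_fcst = np.array([0.3, -1.2, 0.4, 0.9])        # sensitivity of one forecast (e.g. base flow)

      prior_var = j_fcst @ C_prior @ j_fcst           # forecast variance before history matching
      S = J_obs @ C_prior @ J_obs.T + R
      C_post = C_prior - C_prior @ J_obs.T @ np.linalg.solve(S, J_obs @ C_prior)
      post_var = j_fcst @ C_post @ j_fcst
      print(f"forecast std dev: prior {prior_var ** 0.5:.2f} -> posterior {post_var ** 0.5:.2f}")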

  15. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    PubMed

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance determination (GPR ARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
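
    As an illustration of the ARD idea described above (one kernel length-scale per descriptor, with short length-scales flagging influential descriptors), the following Python sketch uses synthetic permeability data; the descriptor set and coefficients are assumptions, and the original work used MatLab rather than this code.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(6)
      n = 120
      logP = rng.uniform(-2, 5, n)
      mp = rng.uniform(50, 250, n)                    # melting point, deg C
      hbd = rng.integers(0, 6, n)                     # hydrogen bond donor count
      mw = rng.uniform(100, 500, n)                   # an (assumed) weakly relevant descriptor
      X = np.column_stack([logP, mp, hbd, mw])
      log_kp = -2.7 + 0.7 * logP - 0.006 * mp - 0.3 * hbd + rng.normal(0, 0.2, n)

      X_std = (X - X.mean(0)) / X.std(0)              # standardize so length-scales are comparable
      kernel = RBF(length_scale=np.ones(4)) + WhiteKernel(0.05)
      gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_std, log_kp)
      print("fitted length-scales (smaller = more relevant):", gpr.kernel_.k1.length_scale)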

  16. Models of Jovian decametric radiation. [astronomical models of decametric waves

    NASA Technical Reports Server (NTRS)

    Smith, R. A.

    1975-01-01

    A critical review is presented of theoretical models of Jovian decametric radiation, with particular emphasis on the Io-modulated emission. The problem is divided into three broad aspects: (1) the mechanism coupling Io's orbital motion to the inner exosphere, (2) the consequent instability mechanism by which electromagnetic waves are amplified, and (3) the subsequent propagation of the waves in the source region and the Jovian plasmasphere. At present there exists no comprehensive theory that treats all of these aspects quantitatively within a single framework. Acceleration of particles by plasma sheaths near Io is proposed as an explanation for the coupling mechanism, while most of the properties of the emission may be explained in the context of cyclotron instability of a highly anisotropic distribution of streaming particles.

  17. Error Discounting in Probabilistic Category Learning

    PubMed Central

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    Some current theories of probabilistic categorization assume that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report two probabilistic-categorization experiments that investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less responsive to errors, and learning was slowed for some time after the feedback shift. Both results are indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning. PMID:21355666
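
    A toy sketch of error discounting follows: a delta-rule learner whose learning rate decays over trials tracks a feedback probability that shifts late in training more sluggishly than a learner with a fixed rate. The parameter values are arbitrary and the model is far simpler than those fitted in the study.

      import numpy as np

      rng = np.random.default_rng(7)
      trials = 600
      p_reward = np.where(np.arange(trials) < 400, 0.8, 0.2)   # feedback probability shifts at trial 400
      feedback = rng.random(trials) < p_reward

      def run(discount):
          v, alpha0 = 0.5, 0.3
          history = []
          for t, r in enumerate(feedback):
              alpha = alpha0 / (1.0 + discount * t)   # discounting: learning rate shrinks with training
              v += alpha * (float(r) - v)             # delta-rule update toward the observed feedback
              history.append(v)
          return np.array(history)

      no_discount, with_discount = run(0.0), run(0.02)
      print("estimate 50 trials after the shift:",
            round(no_discount[450], 2), "(no discounting) vs", round(with_discount[450], 2), "(discounting)")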

  18. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  19. Analysis and improvement measures of flight delay in China

    NASA Astrophysics Data System (ADS)

    Zang, Yuhang

    2017-03-01

    First, this paper establishes a principal component regression model to analyze the data quantitatively: principal component analysis is used to extract three principal component factors of flight delays, and the least squares method is then used to fit the regression equation over these factors by substitution. The analysis shows that the main cause of flight delays is the airlines themselves, followed by weather and traffic. Targeting the controllable aspect, traffic flow control, an adaptive genetic queuing model is then established for the runway terminal area. An optimization method is developed for fifteen aircraft landing on three runways, based on Beijing Capital International Airport; comparing the results with the existing FCFS (first-come, first-served) algorithm demonstrates the superiority of the model.
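
    A minimal principal component regression sketch in Python is shown below; the delay "factors" are simulated stand-ins, so this only illustrates the PCA-then-least-squares structure, not the paper's dataset.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(8)
      n = 300
      airline = rng.normal(0, 1, n)                   # airline-related factor (assumed)
      weather = rng.normal(0, 1, n)
      traffic = 0.6 * weather + rng.normal(0, 1, n)   # correlated with weather
      X = np.column_stack([airline, weather, traffic, rng.normal(0, 1, (n, 3))])
      delay = 3.0 * airline + 1.5 * weather + 1.0 * traffic + rng.normal(0, 1, n)

      pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(X, delay)
      print("R^2 with three principal components:", round(pcr.score(X, delay), 2))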

  20. Cellular interface morphologies in directional solidification. III - The effects of heat transfer and solid diffusivity

    NASA Technical Reports Server (NTRS)

    Ungar, Lyle H.; Bennett, Mark J.; Brown, Robert A.

    1985-01-01

    The shape and stability of two-dimensional finite-amplitude cellular interfaces arising during directional solidification are compared for several solidification models that account differently for latent heat released at the interface, unequal thermal conductivities of melt and solid, and solute diffusivity in the solid. Finite-element analysis and computer-implemented perturbation methods are used to analyze the families of steadily growing cellular forms that evolve from the planar state. In all models a secondary bifurcation between different families of finite-amplitude cells exists that halves the spatial wavelength of the stable interface. The quantitative location of this transition is very dependent on the details of the model. Large amounts of solute diffusion in the solid retard the growth of large-amplitude cells.

  1. Numerical modelling of instantaneous plate tectonics

    NASA Technical Reports Server (NTRS)

    Minster, J. B.; Haines, E.; Jordan, T. H.; Molnar, P.

    1974-01-01

    Assuming lithospheric plates to be rigid, 68 spreading rates, 62 fracture zone trends, and 106 earthquake slip vectors are systematically inverted to obtain a self-consistent model of instantaneous relative motions for eleven major plates. The inverse problem is linearized and solved iteratively by a maximum-likelihood procedure. Because the uncertainties in the data are small, Gaussian statistics are shown to be adequate. The use of a linear theory permits (1) the calculation of the uncertainties in the various angular velocity vectors caused by uncertainties in the data, and (2) quantitative examination of the distribution of information within the data set. The existence of a self-consistent model satisfying all the data is strong justification of the rigid plate assumption. Slow movement between North and South America is shown to be resolvable.

  2. Climate Shocks and Migration: An Agent-Based Modeling Approach.

    PubMed

    Entwisle, Barbara; Williams, Nathalie E; Verdery, Ashton M; Rindfuss, Ronald R; Walsh, Stephen J; Malanson, George P; Mucha, Peter J; Frizzelle, Brian G; McDaniel, Philip M; Yao, Xiaozheng; Heumann, Benjamin W; Prasartkul, Pramote; Sawangdee, Yothin; Jampaklay, Aree

    2016-09-01

    This is a study of migration responses to climate shocks. We construct an agent-based model that incorporates dynamic linkages between demographic behaviors, such as migration, marriage, and births, and agriculture and land use, which depend on rainfall patterns. The rules and parameterization of our model are empirically derived from qualitative and quantitative analyses of a well-studied demographic field site, Nang Rong district, Northeast Thailand. With this model, we simulate patterns of migration under four weather regimes in a rice economy: 1) a reference, 'normal' scenario; 2) seven years of unusually wet weather; 3) seven years of unusually dry weather; and 4) seven years of extremely variable weather. Results show relatively small impacts on migration. Experiments with the model show that existing high migration rates and strong selection factors, which are unaffected by climate change, are likely responsible for the weak migration response.

  3. Climate Shocks and Migration: An Agent-Based Modeling Approach

    PubMed Central

    Entwisle, Barbara; Williams, Nathalie E.; Verdery, Ashton M.; Rindfuss, Ronald R.; Walsh, Stephen J.; Malanson, George P.; Mucha, Peter J.; Frizzelle, Brian G.; McDaniel, Philip M.; Yao, Xiaozheng; Heumann, Benjamin W.; Prasartkul, Pramote; Sawangdee, Yothin; Jampaklay, Aree

    2016-01-01

    This is a study of migration responses to climate shocks. We construct an agent-based model that incorporates dynamic linkages between demographic behaviors, such as migration, marriage, and births, and agriculture and land use, which depend on rainfall patterns. The rules and parameterization of our model are empirically derived from qualitative and quantitative analyses of a well-studied demographic field site, Nang Rong district, Northeast Thailand. With this model, we simulate patterns of migration under four weather regimes in a rice economy: 1) a reference, ‘normal’ scenario; 2) seven years of unusually wet weather; 3) seven years of unusually dry weather; and 4) seven years of extremely variable weather. Results show relatively small impacts on migration. Experiments with the model show that existing high migration rates and strong selection factors, which are unaffected by climate change, are likely responsible for the weak migration response. PMID:27594725

  4. Finding identifiable parameter combinations in nonlinear ODE models and the rational reparameterization of their input-output equations.

    PubMed

    Meshkat, Nicolette; Anderson, Chris; Distefano, Joseph J

    2011-09-01

    When examining the structural identifiability properties of dynamic system models, some parameters can take on an infinite number of values and yet yield identical input-output data. These parameters and the model are then said to be unidentifiable. Finding identifiable combinations of parameters with which to reparameterize the model provides a means for quantitatively analyzing the model and computing solutions in terms of the combinations. In this paper, we revisit and explore the properties of an algorithm for finding identifiable parameter combinations using Gröbner Bases and prove useful theoretical properties of these parameter combinations. We prove a set of M algebraically independent identifiable parameter combinations can be found using this algorithm and that there exists a unique rational reparameterization of the input-output equations over these parameter combinations. We also demonstrate application of the procedure to a nonlinear biomodel. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Oceanic Fluxes of Mass, Heat and Freshwater: A Global Estimate and Perspective

    NASA Technical Reports Server (NTRS)

    MacDonald, Alison Marguerite

    1995-01-01

    Data from fifteen globally distributed, modern, high resolution, hydrographic oceanic transects are combined in an inverse calculation using large scale box models. The models provide estimates of the global meridional heat and freshwater budgets and are used to examine the sensitivity of the global circulation, both inter- and intra-basin exchange rates, to a variety of external constraints provided by estimates of Ekman, boundary current and throughflow transports. A solution is found which is consistent with both the model physics and the global data set, despite a twenty five year time span and a lack of seasonal consistency among the data. The overall pattern of the global circulation suggested by the models is similar to that proposed in previously published local studies and regional reviews. However, significant qualitative and quantitative differences exist. These differences are due both to the model definition and to the global nature of the data set.

  6. Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products

    USGS Publications Warehouse

    Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan

    2017-01-01

    The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction developed with our collaborators in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area by not only improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available, case-history database we are compiling along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.

  7. Effects of Inventory Bias on Landslide Susceptibility Calculations

    NASA Technical Reports Server (NTRS)

    Stanley, T. A.; Kirschbaum, D. B.

    2017-01-01

    Many landslide inventories are known to be biased, especially inventories for large regions such as Oregon's SLIDO or NASA's Global Landslide Catalog. These biases must affect the results of empirically derived susceptibility models to some degree. We evaluated the strength of the susceptibility model distortion from postulated biases by truncating an unbiased inventory. We generated a synthetic inventory from an existing landslide susceptibility map of Oregon, then removed landslides from this inventory to simulate the effects of reporting biases likely to affect inventories in this region, namely population and infrastructure effects. Logistic regression models were fitted to the modified inventories. Then the process of biasing a susceptibility model was repeated with SLIDO data. We evaluated each susceptibility model with qualitative and quantitative methods. Results suggest that the effects of landslide inventory bias on empirical models should not be ignored, even if those models are, in some cases, useful. We suggest fitting models in well-documented areas and extrapolating across the study region as a possible approach to modeling landslide susceptibility with heavily biased inventories.

  8. Effects of Inventory Bias on Landslide Susceptibility Calculations

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas; Kirschbaum, Dalia B.

    2017-01-01

    Many landslide inventories are known to be biased, especially inventories for large regions such as Oregon's SLIDO or NASA's Global Landslide Catalog. These biases must affect the results of empirically derived susceptibility models to some degree. We evaluated the strength of the susceptibility model distortion from postulated biases by truncating an unbiased inventory. We generated a synthetic inventory from an existing landslide susceptibility map of Oregon, then removed landslides from this inventory to simulate the effects of reporting biases likely to affect inventories in this region, namely population and infrastructure effects. Logistic regression models were fitted to the modified inventories. Then the process of biasing a susceptibility model was repeated with SLIDO data. We evaluated each susceptibility model with qualitative and quantitative methods. Results suggest that the effects of landslide inventory bias on empirical models should not be ignored, even if those models are, in some cases, useful. We suggest fitting models in well-documented areas and extrapolating across the study region as a possible approach to modelling landslide susceptibility with heavily biased inventories.

  9. Using Bayesian regression to test hypotheses about relationships between parameters and covariates in cognitive models.

    PubMed

    Boehm, Udo; Steingroever, Helen; Wagenmakers, Eric-Jan

    2018-06-01

    An important tool in the advancement of cognitive science are quantitative models that represent different cognitive variables in terms of model parameters. To evaluate such models, their parameters are typically tested for relationships with behavioral and physiological variables that are thought to reflect specific cognitive processes. However, many models do not come equipped with the statistical framework needed to relate model parameters to covariates. Instead, researchers often revert to classifying participants into groups depending on their values on the covariates, and subsequently comparing the estimated model parameters between these groups. Here we develop a comprehensive solution to the covariate problem in the form of a Bayesian regression framework. Our framework can be easily added to existing cognitive models and allows researchers to quantify the evidential support for relationships between covariates and model parameters using Bayes factors. Moreover, we present a simulation study that demonstrates the superiority of the Bayesian regression framework to the conventional classification-based approach.
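
    The core of such a framework, quantifying evidence for a covariate-parameter relationship with a Bayes factor, can be sketched with a Savage-Dickey density ratio on a grid; the data, the unit-normal prior on the standardized slope and the fixed residual standard deviation below are all assumptions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(9)
      n = 40
      covariate = rng.normal(0, 1, n)                       # e.g. a physiological measure per participant
      parameter = 0.4 * covariate + rng.normal(0, 1, n)     # participant-level cognitive-model parameter

      beta = np.linspace(-3, 3, 2001)
      prior = stats.norm.pdf(beta, 0, 1)                    # prior on the standardized slope
      loglike = np.array([np.sum(stats.norm.logpdf(parameter, b * covariate, 1.0)) for b in beta])
      post = prior * np.exp(loglike - loglike.max())
      post /= post.sum() * (beta[1] - beta[0])              # normalize the gridded posterior

      bf10 = stats.norm.pdf(0, 0, 1) / np.interp(0.0, beta, post)   # prior density / posterior density at 0
      print("Bayes factor for a nonzero slope (BF10):", round(bf10, 2))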

  10. Quantitative and predictive model of kinetic regulation by E. coli TPP riboswitches

    PubMed Central

    Guedich, Sondés; Puffer-Enders, Barbara; Baltzinger, Mireille; Hoffmann, Guillaume; Da Veiga, Cyrielle; Jossinet, Fabrice; Thore, Stéphane; Bec, Guillaume; Ennifar, Eric; Burnouf, Dominique; Dumas, Philippe

    2016-01-01

    Riboswitches are non-coding elements upstream or downstream of mRNAs that, upon binding of a specific ligand, regulate transcription and/or translation initiation in bacteria, or alternative splicing in plants and fungi. We have studied thiamine pyrophosphate (TPP) riboswitches regulating translation of thiM operon and transcription and translation of thiC operon in E. coli, and that of THIC in the plant A. thaliana. For all, we ascertained an induced-fit mechanism involving initial binding of the TPP followed by a conformational change leading to a higher-affinity complex. The experimental values obtained for all kinetic and thermodynamic parameters of TPP binding imply that the regulation by A. thaliana riboswitch is governed by mass-action law, whereas it is of kinetic nature for the two bacterial riboswitches. Kinetic regulation requires that the RNA polymerase pauses after synthesis of each riboswitch aptamer to leave time for TPP binding, but only when its concentration is sufficient. A quantitative model of regulation highlighted how the pausing time has to be linked to the kinetic rates of initial TPP binding to obtain an ON/OFF switch in the correct concentration range of TPP. We verified the existence of these pauses and the model prediction on their duration. Our analysis also led to quantitative estimates of the respective efficiency of kinetic and thermodynamic regulations, which shows that kinetically regulated riboswitches react more sharply to concentration variation of their ligand than thermodynamically regulated riboswitches. This rationalizes the interest of kinetic regulation and confirms empirical observations that were obtained by numerical simulations. PMID:26932506

  11. Quantitative and predictive model of kinetic regulation by E. coli TPP riboswitches.

    PubMed

    Guedich, Sondés; Puffer-Enders, Barbara; Baltzinger, Mireille; Hoffmann, Guillaume; Da Veiga, Cyrielle; Jossinet, Fabrice; Thore, Stéphane; Bec, Guillaume; Ennifar, Eric; Burnouf, Dominique; Dumas, Philippe

    2016-01-01

    Riboswitches are non-coding elements upstream or downstream of mRNAs that, upon binding of a specific ligand, regulate transcription and/or translation initiation in bacteria, or alternative splicing in plants and fungi. We have studied thiamine pyrophosphate (TPP) riboswitches regulating translation of thiM operon and transcription and translation of thiC operon in E. coli, and that of THIC in the plant A. thaliana. For all, we ascertained an induced-fit mechanism involving initial binding of the TPP followed by a conformational change leading to a higher-affinity complex. The experimental values obtained for all kinetic and thermodynamic parameters of TPP binding imply that the regulation by A. thaliana riboswitch is governed by mass-action law, whereas it is of kinetic nature for the two bacterial riboswitches. Kinetic regulation requires that the RNA polymerase pauses after synthesis of each riboswitch aptamer to leave time for TPP binding, but only when its concentration is sufficient. A quantitative model of regulation highlighted how the pausing time has to be linked to the kinetic rates of initial TPP binding to obtain an ON/OFF switch in the correct concentration range of TPP. We verified the existence of these pauses and the model prediction on their duration. Our analysis also led to quantitative estimates of the respective efficiency of kinetic and thermodynamic regulations, which shows that kinetically regulated riboswitches react more sharply to concentration variation of their ligand than thermodynamically regulated riboswitches. This rationalizes the interest of kinetic regulation and confirms empirical observations that were obtained by numerical simulations.

  12. What Good Are Statistics that Don't Generalize?

    ERIC Educational Resources Information Center

    Shaffer, David Williamson; Serlin, Ronald C.

    2004-01-01

    Quantitative and qualitative inquiry are sometimes portrayed as distinct and incompatible paradigms for research in education. Approaches to combining qualitative and quantitative research typically "integrate" the two methods by letting them co-exist independently within a single research study. Here we describe intra-sample statistical analysis…

  13. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  14. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  15. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  16. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  17. 40 CFR 125.95 - As an owner or operator of a Phase II existing facility, what must I collect and submit when I...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...

  18. 19 CFR 206.14 - Contents of petition.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...; (e) Data showing injury. Quantitative data indicating the nature and extent of injury to the domestic... maintain existing levels of expenditures for research and development; (iii) The extent to which the U.S... adjustment to import competition. (i) Imports from NAFTA countries. Quantitative data indicating the share of...

  19. 19 CFR 206.14 - Contents of petition.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...; (e) Data showing injury. Quantitative data indicating the nature and extent of injury to the domestic... maintain existing levels of expenditures for research and development; (iii) The extent to which the U.S... adjustment to import competition. (i) Imports from NAFTA countries. Quantitative data indicating the share of...

  20. 19 CFR 206.14 - Contents of petition.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...; (e) Data showing injury. Quantitative data indicating the nature and extent of injury to the domestic... maintain existing levels of expenditures for research and development; (iii) The extent to which the U.S... adjustment to import competition. (i) Imports from NAFTA countries. Quantitative data indicating the share of...

  1. 19 CFR 206.14 - Contents of petition.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...; (e) Data showing injury. Quantitative data indicating the nature and extent of injury to the domestic... maintain existing levels of expenditures for research and development; (iii) The extent to which the U.S... adjustment to import competition. (i) Imports from NAFTA countries. Quantitative data indicating the share of...

  2. 19 CFR 206.14 - Contents of petition.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; (e) Data showing injury. Quantitative data indicating the nature and extent of injury to the domestic... maintain existing levels of expenditures for research and development; (iii) The extent to which the U.S... adjustment to import competition. (i) Imports from NAFTA countries. Quantitative data indicating the share of...

  3. Large scale landslide susceptibility assessment using the statistical methods of logistic regression and BSA - study case: the sub-basin of the small Niraj (Transylvania Depression, Romania)

    NASA Astrophysics Data System (ADS)

    Roşca, S.; Bilaşco, Ş.; Petrea, D.; Fodorean, I.; Vescan, I.; Filip, S.; Măguţ, F.-L.

    2015-11-01

    The existence of a large number of GIS models for the identification of landslide occurrence probability makes the selection of a specific one difficult. The present study focuses on the application of two quantitative models, the logistic model and the BSA model, and the comparative analysis of their results aims at identifying the most suitable one. The territory corresponding to the Niraj Mic Basin (87 km2) is an area characterised by a wide variety of landforms, with varied morphometric, morphographical and geological characteristics, as well as by a high complexity of land use types where active landslides exist. This is why it serves as the test area for applying the two models and comparing the results. The complexity of the input data is illustrated by 16 factors, represented as 72 dummy variables, which were analysed on the basis of their importance within the model structures. Testing the statistical significance of each variable reduced the number of dummy variables to 12 considered significant for the test area within the logistic model, whereas for the BSA model all the variables were employed. The predictive ability of the models was tested by computing the area under the ROC curve, which indicated good accuracy (AUROC = 0.86 for the testing area) and predictability of the logistic model (AUROC = 0.63 for the validation area).
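
    The logistic-regression-plus-AUROC workflow used above can be sketched as follows; the predictors, coefficients and landslide labels are synthetic, so the numbers will not match the Niraj Mic results.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(10)
      n = 5000
      slope = rng.uniform(0, 35, n)                   # slope angle, degrees
      land_use = rng.integers(0, 4, n)                # coded land-use classes (dummy-style factor)
      lithology = rng.integers(0, 3, n)
      X = np.column_stack([slope, land_use, lithology])
      logit = -4 + 0.12 * slope + 0.5 * (land_use == 2) + 0.8 * (lithology == 1)
      landslide = rng.random(n) < 1 / (1 + np.exp(-logit))

      X_tr, X_te, y_tr, y_te = train_test_split(X, landslide, test_size=0.3, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      print("AUROC on held-out cells:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 2))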

  4. Quantitative phenomenological model of the BOLD contrast mechanism

    NASA Astrophysics Data System (ADS)

    Dickson, John D.; Ash, Tom W. J.; Williams, Guy B.; Sukstanskii, Alexander L.; Ansorge, Richard E.; Yablonskiy, Dmitriy A.

    2011-09-01

    Different theoretical models of the BOLD contrast mechanism are used for many applications including BOLD quantification (qBOLD) and vessel size imaging, both in health and disease. Each model simplifies the system under consideration, making approximations about the structure of the blood vessel network and diffusion of water molecules through inhomogeneities in the magnetic field created by deoxyhemoglobin-containing blood vessels. In this study, Monte-Carlo methods are used to simulate the BOLD MR signal generated by diffusing water molecules in the presence of long, cylindrical blood vessels. Using these simulations we introduce a new, phenomenological model that is far more accurate over a range of blood oxygenation levels and blood vessel radii than existing models. This model could be used to extract physiological parameters of the blood vessel network from experimental data in BOLD-based experiments. We use our model to establish ranges of validity for the existing analytical models of Yablonskiy and Haacke, Kiselev and Posse, Sukstanskii and Yablonskiy (extended to the case of arbitrary time in the spin echo sequence) and Bauer et al. (extended to the case of randomly oriented cylinders). Although these models are shown to be accurate in the limits of diffusion under which they were derived, none of them is accurate for the whole physiological range of blood vessel radii and blood oxygenation levels. We also show the extent of systematic errors that are introduced due to the approximations of these models when used for BOLD signal quantification.

  5. Evolution and polymorphism in the multilocus Levene model with no or weak epistasis.

    PubMed

    Bürger, Reinhard

    2010-09-01

    Evolution and the maintenance of polymorphism under the multilocus Levene model with soft selection are studied. The number of loci and alleles, the number of demes, the linkage map, and the degree of dominance are arbitrary, but epistasis is absent or weak. We prove that, without epistasis and under mild, generic conditions, every trajectory converges to a stationary point in linkage equilibrium. Consequently, the equilibrium and stability structure can be determined by investigating the much simpler gene-frequency dynamics on the linkage-equilibrium manifold. For a haploid species an analogous result is shown. For weak epistasis, global convergence to quasi-linkage equilibrium is established. As an application, the maintenance of multilocus polymorphism is explored if the degree of dominance is intermediate at every locus and epistasis is absent or weak. If there are at least two demes, then arbitrarily many multiallelic loci can be maintained polymorphic at a globally asymptotically stable equilibrium. Because this holds for an open set of parameters, such equilibria are structurally stable. If the degree of dominance is not only intermediate but also deme independent, and loci are diallelic, an open set of parameters yielding an internal equilibrium exists only if the number of loci is strictly less than the number of demes. Otherwise, a fully polymorphic equilibrium exists only nongenerically, and if it exists, it consists of a manifold of equilibria. Its dimension is determined. In the absence of genotype-by-environment interaction, however, a manifold of equilibria occurs for an open set of parameters. In this case, the equilibrium structure is not robust to small deviations from no genotype-by-environment interaction. In a quantitative-genetic setting, the assumptions of no epistasis and intermediate dominance are equivalent to assuming that in every deme directional selection acts on a trait that is determined additively, i.e., by nonepistatic loci with dominance. Some of our results are exemplified in this quantitative-genetic context. Copyright 2010 Elsevier Inc. All rights reserved.

  6. Marginal iodide deficiency and thyroid function: dose-response analysis for quantitative pharmacokinetic modeling.

    PubMed

    Gilbert, M E; McLanahan, E D; Hedge, J; Crofton, K M; Fisher, J W; Valentín-Blasini, L; Blount, B C

    2011-04-28

    Severe iodine deficiency (ID) results in adverse health outcomes and remains a benchmark for understanding the effects of developmental hypothyroidism. The implications of marginal ID, however, remain less well known. The current study examined the relationship between graded levels of ID in rats and serum thyroid hormones, thyroid iodine content, and urinary iodide excretion. The goals of this study were to provide parametric and dose-response information for development of a quantitative model of the thyroid axis. Female Long Evans rats were fed casein-based diets containing varying iodine (I) concentrations for 8 weeks. Diets were created by adding 975, 200, 125, 25, or 0 μg/kg I to the base diet (~25 μg I/kg chow) to produce 5 nominal I levels, ranging from excess (basal+added I, Treatment 1: 1000 μg I/kg chow) to deficient (Treatment 5: 25 μg I/kg chow). Food intake and body weight were monitored throughout, and on 2 consecutive days each week over the 8-week exposure period animals were placed in metabolism cages to capture urine. Food intake, water intake, and body weight gain did not differ among treatment groups. Serum T4 was dose-dependently reduced relative to Treatment 1, with significant declines (19 and 48%) at the two lowest I groups, and no significant changes in serum T3 or TSH were detected. Increases in thyroid weight and decreases in thyroidal and urinary iodide content were observed as a function of decreasing I in the diet. Data were compared with predictions from a recently published biologically based dose-response (BBDR) model for ID. Relative to model predictions, female Long Evans rats under the conditions of this study appeared more resilient to low I intake. These results challenge existing models and provide essential information for development of quantitative BBDR models for ID during pregnancy and lactation. Published by Elsevier Ireland Ltd.
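    To make the dose-response idea concrete, the following sketch fits a simple saturating (Emax-type) curve of serum T4 against dietary iodine; the numbers are invented placeholders, not the study data, and the functional form is only one of several that a BBDR model might use.

```python
# Illustrative sketch only: fit a saturating (Emax-type) dose-response curve of
# serum T4 against dietary iodine. All numbers are hypothetical, not study data.
import numpy as np
from scipy.optimize import curve_fit

iodine = np.array([25.0, 50.0, 150.0, 225.0, 1000.0])  # nominal µg I/kg chow
serum_t4 = np.array([2.1, 3.3, 3.8, 4.0, 4.1])         # hypothetical µg/dL

def emax_curve(dose, e0, emax, ed50):
    """Response rises from e0 toward e0 + emax as dose saturates."""
    return e0 + emax * dose / (ed50 + dose)

params, _ = curve_fit(emax_curve, iodine, serum_t4, p0=[2.0, 2.0, 50.0])
print("E0, Emax, ED50 =", np.round(params, 2))
```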

  7. Quantitative extraction of the bedrock exposure rate based on unmanned aerial vehicle data and Landsat-8 OLI image in a karst environment

    NASA Astrophysics Data System (ADS)

    Wang, Hongyan; Li, Qiangzi; Du, Xin; Zhao, Longcai

    2017-12-01

    In the karst regions of southwest China, rocky desertification is one of the most serious problems in land degradation. The bedrock exposure rate is an important index to assess the degree of rocky desertification in karst regions. Because of the inherent merits of macro-scale, frequency, efficiency, and synthesis, remote sensing is a promising method to monitor and assess karst rocky desertification on a large scale. However, actual measurement of the bedrock exposure rate is difficult, and existing remote-sensing methods cannot directly be exploited to extract the bedrock exposure rate owing to the high complexity and heterogeneity of karst environments. Therefore, using unmanned aerial vehicle (UAV) and Landsat-8 Operational Land Imager (OLI) data for Xingren County, Guizhou Province, a method for quantitative extraction of the bedrock exposure rate based on multi-scale remote-sensing data was developed. Firstly, we used an object-oriented method to carry out accurate classification of UAV images. From the results of rock extraction, the bedrock exposure rate was calculated at the 30 m grid scale. Some of the calculated samples were used as training data; the remainder were used for model validation. Secondly, in each grid the band reflectivity of Landsat-8 OLI data was extracted and a variety of rock and vegetation indices (e.g., NDVI and SAVI) were calculated. Finally, a network model was established to extract the bedrock exposure rate. The correlation coefficient of the network model was 0.855, that of the validation model was 0.677, and the root mean square error of the validation model was 0.073. This method is valuable for wide-scale estimation of the bedrock exposure rate in karst environments. Using the quantitative inversion model, a distribution map of the bedrock exposure rate in Xingren County was obtained.
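    For readers unfamiliar with the indices mentioned above, the snippet below shows how NDVI and SAVI are computed from Landsat-8 OLI red and near-infrared reflectances; the pixel values are placeholders rather than data from the study area.

```python
# Sketch of the spectral indices mentioned above, from Landsat-8 OLI surface
# reflectance (band 4 = red, band 5 = near-infrared). Pixel values are placeholders.
import numpy as np

red = np.array([0.12, 0.20, 0.31])  # hypothetical red reflectance
nir = np.array([0.35, 0.28, 0.33])  # hypothetical NIR reflectance

ndvi = (nir - red) / (nir + red)

L = 0.5                             # soil/canopy background adjustment factor
savi = (1 + L) * (nir - red) / (nir + red + L)

print("NDVI:", np.round(ndvi, 2))
print("SAVI:", np.round(savi, 2))
```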

  8. Distress in significant others of patients with chronic fatigue syndrome: A systematic review of the literature.

    PubMed

    Harris, Kamelia; Band, Rebecca J; Cooper, Hazel; Macintyre, Vanessa G; Mejia, Anilena; Wearden, Alison J

    2016-11-01

    The objective of this study was to systematically review existing empirical research assessing levels and correlates of distress in significant others of patients with chronic fatigue syndrome/myalgic encephalomyelitis (CFS/ME). Systematic searches in CINAHL, Web of Science and PsycINFO were conducted in August 2014. The search was repeated in January 2015 to check for newly published articles. Studies published in English with quantitative, qualitative, or mixed designs exploring distress, poor subjective health, poor mental health, reduced quality of life and well-being, and symptoms of depression and anxiety in significant others (>18 years) of children and adults with CFS/ME were included. Quality appraisal of included studies was carried out. Quantitative and qualitative studies were summarized separately. Six articles met eligibility criteria. Two quantitative studies with significant others of adult patients, and one quantitative and two mixed-methods studies with significant others of child patients showed moderate to high levels of distress. One qualitative study (adult patients) found minimal evidence of distress and that acceptance of CFS/ME was related to better adjustment. In the quantitative and mixed-methods studies, significant others who attributed some level of responsibility for symptoms to the patient, or who were female, or whose partners had poorer mental health, had higher levels of distress. The small number of studies to date, the contrary evidence from a qualitative study, and the limited data available on levels of distress in significant others of patients with CFS/ME mean that our conclusion that distress levels are elevated is provisional. We recommend that future qualitative studies focus on this particular topic. Further longitudinal studies exploring correlates of distress within the context of a predictive theoretical model would be helpful. Statement of contribution What is already known on this subject? Chronic fatigue syndrome (CFS/ME) entails considerable economic, social, and personal costs. Uncertainties exist around diagnosis and management. This may lead to particular difficulties for significant others trying to support patients. What does this study add? Few studies have examined distress and its correlates in significant others of people with CFS/ME. Significant others report elevated levels of distress on quantitative measures. © 2016 The British Psychological Society.

  9. A synthesis of sedimentary records of Australian environmental change during the last 2000 years

    NASA Astrophysics Data System (ADS)

    Tyler, J. J.; Karoly, D. J.; Gell, P.; Goodwin, I. D.

    2013-12-01

    Our understanding of Southern Hemispheric climate variability on multidecadal to multicentennial timescales is limited by a scarcity of quantitative, highly resolved climate records, a problem which is particularly manifest in Australia. To date there are no quantitative, annually resolved records from within continental Australia which extend further back in time than the most recent c. 300 years [Neukom and Gergis, 2012; PAGES 2k Consortium, 2013]. By contrast, a number of marine, lake, peat and speleothem sedimentary records exist, some of which span multiple millennia at sub-decadal resolution. Here we report a database of existing sedimentary records of environmental change in Australia [Freeman et al., 2011], of which 25 have sample resolutions < 100 years/sample and which span > 500 years in duration. The majority of these records are located in southeastern Australia, providing an invaluable resource with which to examine regional-scale climate and environmental change. Although most of the records cannot be quantitatively related to climate variability, Empirical Orthogonal Function analysis, coupled with Monte Carlo iterative age modelling, demonstrates coherent patterns of environmental and ecological change. This coherency, as well as comparisons with a limited number of quantitative records, suggests that regional hydroclimatic changes were responsible for the observed patterns. Here, we discuss the implications of these findings with respect to Southern Hemisphere climate during the last 2000 years. In addition, we review the progress and potential of ongoing research in the region. References: Freeman, R., I. D. Goodwin, and T. Donovan (2011), Paleoclimate data synthesis and data base for the reconstruction of climate variability and impacts in NSW over the past 2000 years, Climate Futures Technical Report, 1/2011, 50 pages. Neukom, R., and J. Gergis (2012), Southern Hemisphere high-resolution palaeoclimate records of the last 2000 years, Holocene, 22(5), 501-524, doi:10.1177/0959683611427335. PAGES 2k Consortium (2013), Continental-scale temperature variability during the past two millennia, Nature Geoscience, 6, 339-346.
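    A minimal sketch of the kind of Empirical Orthogonal Function (EOF) analysis referred to above is given below, using PCA on synthetic proxy records interpolated to a common time axis; it is illustrative only and does not reproduce the Monte Carlo age-modelling step.

```python
# Rough sketch of an EOF analysis (via PCA) of proxy records on a common time axis.
# The synthetic series below stand in for the sedimentary records described above.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
time = np.arange(0, 2000, 20)              # common time axis in years
shared = np.sin(2 * np.pi * time / 500.0)  # a shared "hydroclimate" signal

# 25 records = shared signal plus record-specific noise; shape (time, records)
records = shared[:, None] + 0.5 * rng.normal(size=(time.size, 25))
records -= records.mean(axis=0)            # centre each record

pca = PCA(n_components=3).fit(records)
pc1 = pca.transform(records)[:, 0]         # leading temporal pattern (EOF-1 scores)
print("variance explained:", np.round(pca.explained_variance_ratio_, 2))
print("first PC values:", np.round(pc1[:3], 2))
```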

  10. Genome Scale Modeling in Systems Biology: Algorithms and Resources

    PubMed Central

    Najafi, Ali; Bidkhori, Gholamreza; Bozorgmehr, Joseph H.; Koch, Ina; Masoudi-Nejad, Ali

    2014-01-01

    In recent years, in silico studies and trial simulations have complemented experimental procedures. A model is a description of a system, and a system is any collection of interrelated objects; an object, moreover, is some elemental unit upon which observations can be made but whose internal structure either does not exist or is ignored. Therefore, any network analysis approach is critical for successful quantitative modeling of biological systems. This review highlights some of the most popular and important modeling algorithms, tools, and emerging standards for representing, simulating and analyzing cellular networks in five sections. We also try to illustrate these concepts by means of simple examples and appropriate images and graphs. Overall, systems biology aims for a holistic description and understanding of biological processes by an integration of analytical experimental approaches along with synthetic computational models. In fact, biological networks have been developed as a platform for integrating information from high- to low-throughput experiments for the analysis of biological systems. We provide an overview of all processes used in modeling and simulating biological networks in such a way that they can become easily understandable for researchers with both biological and mathematical backgrounds. Consequently, given the complexity of generated experimental data and cellular networks, it is no surprise that researchers have turned to computer simulation and the development of more theory-based approaches to augment and assist in the development of a fully quantitative understanding of cellular dynamics. PMID:24822031

  11. Development of quantitative exposure data for a pooled exposure-response analysis of 10 silica cohorts.

    PubMed

    Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa

    2002-08-01

    Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.

  12. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    PubMed

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed at 12-25 kDa mass range with quantitation data presented. Linearity, bias and other metrics are presented along with recommendations made on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  13. Nonstandard Work Schedules and Partnership Quality: Quantitative and Qualitative Findings

    ERIC Educational Resources Information Center

    Mills, Melinda; Taht, Kadri

    2010-01-01

    This article questions existing findings and provides new evidence about the consequences of nonstandard work schedules on partnership quality. Using quantitative couple data from The Netherlands Kinship Panel Study (NKPS) (N = 3,016) and semistructured qualitative interviews (N = 34), we found that, for women, schedules with varying hours…

  14. Conceptions and Practices of Assessment: A Case of Teachers Representing Improvement Conception

    ERIC Educational Resources Information Center

    Azis, Astuti

    2015-01-01

    Despite numerous quantitative studies on teachers' conceptions and practices of assessment, little research exists regarding the unique assessment environment of Indonesia. This study uses both quantitative and qualitative data to examine how Indonesian junior high school teachers understand assessment and how their conceptions of assessment…

  15. Dissociative Conceptual and Quantitative Problem Solving Outcomes across Interactive Engagement and Traditional Format Introductory Physics

    ERIC Educational Resources Information Center

    McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.

    2016-01-01

    The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE based relative to traditional lecture-based physics classes. The present…

  16. Servant Leadership and Its Impact on Classroom Climate and Student Achievement

    ERIC Educational Resources Information Center

    Mulligan, Daniel F.

    2016-01-01

    The purpose of this quantitative research was to see to what degree a relationship existed between servant leadership, classroom climate, and student achievement in a collegiate environment. This was a quantitative, correlational study. The foundational theories for this research included servant leadership and organizational climate that pertain…

  17. Accelerated Colorimetric Micro-assay for Screening Mold Inhibitors

    Treesearch

    Carol A. Clausen; Vina W. Yang

    2014-01-01

    Rapid quantitative laboratory test methods are needed to screen potential antifungal agents. Existing laboratory test methods are relatively time consuming, may require specialized test equipment and rely on subjective visual ratings. A quantitative, colorimetric micro-assay has been developed that uses XTT tetrazolium salt to metabolically assess mold spore...

  18. A Quantitative Assessment of Test Anxiety and Human-Animal Interaction in College Students

    ERIC Educational Resources Information Center

    Dluzynski, Jessica L.

    2017-01-01

    Existing research on human-animal interactions has established that engaging with an animal may reduce anxiety-like behaviors (Acheson et al., 2013; Sobota, Mihara, Forrest, Featherstone, & Siegel, 2015; Yates, 2012) and lower physiological response in stressful situations (Campo & Uchino, 2013). This quantitative research study expanded…

  19. Model Selection in Historical Research Using Approximate Bayesian Computation

    PubMed Central

    Rubio-Campillo, Xavier

    2016-01-01

    Formal Models and History: Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to re-evaluate hypotheses formulated decades ago and still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties are based on the complexities of modelling social interaction, and the methodological issues raised by the evaluation of formal models against data with low sample size, high variance and strong fragmentation. Case Study: This work examines an alternate approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester’s laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian Computation is then used to infer both parameter values and model selection via Bayes Factors. Impact: Results indicate decisive evidence favouring the new fatigue model. The interpretation of both parameter estimations and model selection provides new insights into the factors guiding the evolution of warfare. At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. PMID:26730953
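    The toy sketch below shows the flavour of Approximate Bayesian Computation (rejection sampling) applied to a Lanchester square-law battle model; the priors, tolerance, and "observed" outcome are invented for illustration, and the actual study compared four model variants via Bayes factors.

```python
# Toy sketch of ABC rejection sampling for a Lanchester square-law battle model.
# Priors, tolerance and the "observed" outcome are invented; the study itself
# compared four model variants with Bayes factors rather than this simple scheme.
import numpy as np

rng = np.random.default_rng(2)

def simulate_battle(red0, blue0, r_eff, b_eff, dt=0.01):
    """Euler steps of dR/dt = -b_eff*B, dB/dt = -r_eff*R until one side breaks."""
    R, B = red0, blue0
    while R > 0.05 * red0 and B > 0.05 * blue0:
        R, B = R - dt * b_eff * B, B - dt * r_eff * R
    return max(R, 0.0), max(B, 0.0)

observed = simulate_battle(1000, 800, r_eff=0.9, b_eff=1.1)  # pretend "data"

accepted = []
for _ in range(20000):
    r_eff, b_eff = rng.uniform(0.1, 2.0, size=2)             # flat priors
    sim = simulate_battle(1000, 800, r_eff, b_eff)
    distance = np.hypot(sim[0] - observed[0], sim[1] - observed[1])
    if distance < 25.0:                                      # tolerance epsilon
        accepted.append((r_eff, b_eff))

if accepted:
    print("accepted draws:", len(accepted))
    print("posterior mean (r_eff, b_eff):", np.round(np.mean(accepted, axis=0), 2))
```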

  20. How Many Wolves (Canis lupus) Fit into Germany? The Role of Assumptions in Predictive Rule-Based Habitat Models for Habitat Generalists

    PubMed Central

    Fechter, Dominik; Storch, Ilse

    2014-01-01

    Due to legislative protection, many species, including large carnivores, are currently recolonizing Europe. To address the impending human-wildlife conflicts in advance, predictive habitat models can be used to determine potentially suitable habitat and areas likely to be recolonized. As field data are often limited, quantitative rule based models or the extrapolation of results from other studies are often the techniques of choice. Using the wolf (Canis lupus) in Germany as a model for habitat generalists, we developed a habitat model based on the location and extent of twelve existing wolf home ranges in Eastern Germany, current knowledge on wolf biology, different habitat modeling techniques and various input data to analyze ten different input parameter sets and address the following questions: (1) How do a priori assumptions and different input data or habitat modeling techniques affect the abundance and distribution of potentially suitable wolf habitat and the number of wolf packs in Germany? (2) In a synthesis across input parameter sets, what areas are predicted to be most suitable? (3) Are existing wolf pack home ranges in Eastern Germany consistent with current knowledge on wolf biology and habitat relationships? Our results indicate that depending on which assumptions on habitat relationships are applied in the model and which modeling techniques are chosen, the amount of potentially suitable habitat estimated varies greatly. Depending on a priori assumptions, Germany could accommodate between 154 and 1769 wolf packs. The locations of the existing wolf pack home ranges in Eastern Germany indicate that wolves are able to adapt to areas densely populated by humans, but are limited to areas with low road densities. Our analysis suggests that predictive habitat maps in general, should be interpreted with caution and illustrates the risk for habitat modelers to concentrate on only one selection of habitat factors or modeling technique. PMID:25029506

  1. A new theory of plant-microbe nutrient competition resolves inconsistencies between observations and model predictions.

    PubMed

    Zhu, Qing; Riley, William J; Tang, Jinyun

    2017-04-01

    Terrestrial plants assimilate anthropogenic CO2 through photosynthesis and synthesizing new tissues. However, sustaining these processes requires plants to compete with microbes for soil nutrients, which therefore calls for an appropriate understanding and modeling of nutrient competition mechanisms in Earth System Models (ESMs). Here, we survey existing plant-microbe competition theories and their implementations in ESMs. We found no consensus regarding the representation of nutrient competition and that observational and theoretical support for current implementations is weak. To reconcile this situation, we applied the Equilibrium Chemistry Approximation (ECA) theory to plant-microbe nitrogen competition in a detailed grassland 15N tracer study and found that competition theories in current ESMs fail to capture observed patterns, whereas the ECA prediction simplifies the complex nature of nutrient competition and quantitatively matches the 15N observations. Since plant carbon dynamics are strongly modulated by soil nutrient acquisition, we conclude that (1) predicted nutrient limitation effects on terrestrial carbon accumulation by existing ESMs may be biased and (2) our ECA-based approach may improve predictions by mechanistically representing plant-microbe nutrient competition. © 2016 by the Ecological Society of America.
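    As a hedged sketch of the competition kinetics invoked above, the function below implements a commonly cited first-order form of the ECA kinetics (after Tang and Riley, 2013), in which each consumer-substrate flux is scaled by a shared denominator summing substrate and consumer abundances over their affinities; the exact formulation used in the study may differ, and all parameter values here are illustrative.

```python
# Hedged sketch of a first-order ECA competition form (after Tang and Riley, 2013):
# each consumer-substrate complex is scaled by a shared denominator summing
# substrates and consumers over their affinities. Parameter values are illustrative.
import numpy as np

def eca_uptake(consumers, substrates, K, vmax):
    """consumers (m,), substrates (n,), K and vmax (m, n); returns uptake fluxes (m, n)."""
    E = np.asarray(consumers, float)
    S = np.asarray(substrates, float)
    # denominator: 1 + sum_k S_k/K_ik + sum_m E_m/K_mj, for consumer i and substrate j
    denom = (1.0
             + (S / K).sum(axis=1, keepdims=True)
             + (E[:, None] / K).sum(axis=0, keepdims=True))
    complexes = E[:, None] * S[None, :] / (K * denom)
    return vmax * complexes

K = np.array([[5.0, 10.0],   # plant affinities for NH4+, NO3- (half-saturation)
              [1.0, 2.0]])   # microbial affinities (lower K = stronger competitor)
vmax = np.array([[0.5, 0.4],
                 [1.0, 0.8]])
flux = eca_uptake(consumers=[2.0, 1.0], substrates=[3.0, 6.0], K=K, vmax=vmax)
print(np.round(flux, 3))     # row 0: plant uptake; row 1: microbial uptake
```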

  2. Human Birth Weight and Reproductive Immunology: Testing for Interactions between Maternal and Offspring KIR and HLA-C Genes.

    PubMed

    Clark, Michelle M; Chazara, Olympe; Sobel, Eric M; Gjessing, Håkon K; Magnus, Per; Moffett, Ashley; Sinsheimer, Janet S

    2016-01-01

    Maternal and offspring cell contact at the site of placentation presents a plausible setting for maternal-fetal genotype (MFG) interactions affecting fetal growth. We test hypotheses regarding killer cell immunoglobulin-like receptor (KIR) and HLA-C MFG effects on human birth weight by extending the quantitative MFG (QMFG) test. Until recently, association testing for MFG interactions had limited applications. To improve the ability to test for these interactions, we developed the extended QMFG test, a linear mixed-effect model that can use multi-locus genotype data from families. We demonstrate the extended QMFG test's statistical properties. We also show that if an offspring-only model is fit when MFG effects exist, associations can be missed or misattributed. Furthermore, imprecisely modeling the effects of both KIR and HLA-C could result in a failure to replicate if these loci's allele frequencies differ among populations. To further illustrate the extended QMFG test's advantages, we apply the extended QMFG test to a UK cohort study and the Norwegian Mother and Child Cohort (MoBa) study. We find a significant KIR-HLA-C interaction effect on birth weight. More generally, the QMFG test can detect genetic associations that may be missed by standard genome-wide association studies for quantitative traits. © 2017 S. Karger AG, Basel.

  3. Characterization of 3D joint space morphology using an electrostatic model (with application to osteoarthritis)

    NASA Astrophysics Data System (ADS)

    Cao, Qian; Thawait, Gaurav; Gang, Grace J.; Zbijewski, Wojciech; Reigel, Thomas; Brown, Tyler; Corner, Brian; Demehri, Shadpour; Siewerdsen, Jeffrey H.

    2015-02-01

    Joint space morphology can be indicative of the risk, presence, progression, and/or treatment response of disease or trauma. We describe a novel methodology of characterizing joint space morphology in high-resolution 3D images (e.g. cone-beam CT (CBCT)) using a model based on elementary electrostatics that overcomes a variety of basic limitations of existing 2D and 3D methods. The method models each surface of a joint as a conductor at fixed electrostatic potential and characterizes the intra-articular space in terms of the electric field lines resulting from the solution of Gauss’ Law and the Laplace equation. As a test case, the method was applied to discrimination of healthy and osteoarthritic subjects (N = 39) in 3D images of the knee acquired on an extremity CBCT system. The method demonstrated improved diagnostic performance (area under the receiver operating characteristic curve, AUC > 0.98) compared to simpler methods of quantitative measurement and qualitative image-based assessment by three expert musculoskeletal radiologists (AUC = 0.87, p-value = 0.007). The method is applicable to simple (e.g. the knee or elbow) or multi-axial joints (e.g. the wrist or ankle) and may provide a useful means of quantitatively assessing a variety of joint pathologies.
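    A conceptual sketch of the electrostatic analogy (not the authors' implementation) follows: the two articular surfaces are treated as conductors at fixed potentials and the Laplace equation is relaxed on a small 2D grid; field lines through the joint space would then follow the gradient of the converged potential. Geometry and grid size are arbitrary.

```python
# Conceptual sketch, not the authors' implementation: hold two articular surfaces at
# fixed potentials and relax the Laplace equation between them by Jacobi iteration.
import numpy as np

n = 60
phi = np.zeros((n, n))
top = np.full(n, 10)         # row index of the "upper" bone surface in each column
bottom = np.full(n, 45)      # row index of the "lower" bone surface in each column
bottom[20:40] = 40           # a bump that narrows the joint space locally

cols = np.arange(n)
for _ in range(5000):        # Jacobi relaxation of the interior
    phi = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                  np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
    phi[top, cols] = 1.0     # upper surface held at potential 1
    phi[bottom, cols] = 0.0  # lower surface held at potential 0

print("potential midway through the joint space:", round(phi[27, 30], 3))
```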

  4. Emissions Prediction and Measurement for Liquid-Fueled TVC Combustor with and without Water Injection

    NASA Technical Reports Server (NTRS)

    Brankovic, A.; Ryder, R. C., Jr.; Hendricks, R. C.; Liu, N.-S.; Shouse, D. T.; Roquemore, W. M.

    2005-01-01

    An investigation is performed to evaluate the performance of a computational fluid dynamics (CFD) tool for the prediction of the reacting flow in a liquid-fueled combustor that uses water injection for control of pollutant emissions. The experiment consists of a multisector, liquid-fueled combustor rig operated at different inlet pressures and temperatures, and over a range of fuel/air and water/fuel ratios. Fuel can be injected directly into the main combustion airstream and into the cavities. Test rig performance is characterized by combustor exit quantities such as temperature and emissions measurements using rakes and overall pressure drop from upstream plenum to combustor exit. Visualization of the flame is performed using gray scale and color still photographs and high-frame-rate videos. CFD simulations are performed utilizing a methodology that includes computer-aided design (CAD) solid modeling of the geometry, parallel processing over networked computers, and graphical and quantitative post-processing. Physical models include liquid fuel droplet dynamics and evaporation, with combustion modeled using a hybrid finite-rate chemistry model developed for Jet-A fuel. CFD and experimental results are compared for cases with cavity-only fueling, while numerical studies of cavity and main fueling were also performed. Predicted and measured trends in combustor exit temperature, CO and NOx are in general agreement at the different water/fuel loading rates, although quantitative differences exist between the predictions and measurements.

  5. Quantifying reactive transport processes governing arsenic mobility in a Bengal Delta aquifer

    NASA Astrophysics Data System (ADS)

    Rawson, Joey; Neidhardt, Harald; Siade, Adam; Berg, Michael; Prommer, Henning

    2017-04-01

    Over the last few decades significant progress has been made to characterize the extent and severity of groundwater arsenic pollution in S/SE Asia, and to understand the underlying geochemical processes. However, comparably little effort has been made to merge the findings from this research into quantitative frameworks that allow for a process-based quantitative analysis of observed arsenic behavior and predictions of its future fate. Therefore, this study developed and tested field-scale numerical modelling approaches to represent the primary and secondary geochemical processes associated with the reductive dissolution of Fe-oxy(hydr)oxides and the concomitant release of sorbed arsenic. We employed data from an in situ field experiment in the Bengal Delta Plain, which investigated the influence of labile organic matter (sucrose) on the mobility of Fe, Mn, and As. The data collected during the field experiment were used to guide our model development and to constrain the model parameterisation. Our results show that sucrose oxidation coupled to the reductive dissolution of Fe-oxy(hydr)oxides was accompanied by multiple secondary geochemical reactions that are not easily and uniquely identifiable and quantifiable. Those secondary reactions can explain the disparity between the observed Fe and As behavior. Our modelling results suggest that a significant fraction of the released As is scavenged through (co-)precipitation with newly formed Fe-minerals, specifically magnetite, rather than through sorption to pre-existing and freshly precipitated iron minerals.

  6. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    PubMed

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of these quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMMs can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
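    For orientation, one familiar latent-scale expression (a standard textbook form, illustrative only and far simpler than the observed-scale quantities the QGglmm package computes) is the GLMM heritability with a link-specific residual variance added to the denominator:

```latex
h^2_{\mathrm{latent}} \;=\; \frac{\sigma^2_{A}}{\sigma^2_{A} + \sigma^2_{E} + \sigma^2_{\mathrm{link}}},
\qquad
\sigma^2_{\mathrm{link}} =
\begin{cases}
\pi^2/3 & \text{(logit link)}\\
1 & \text{(probit link)}
\end{cases}
```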

  7. Improved Protein Arrays for Quantitative Systems Analysis of the Dynamics of Signaling Pathway Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chin-Rang

    Astronauts and workers in nuclear plants who are repeatedly exposed to low doses of ionizing radiation (IR, <10 cGy) are likely to incur specific changes in signal transduction and gene expression in various tissues of their body. Remarkable advances in high-throughput genomics and proteomics technologies enable researchers to broaden their focus from examining single gene/protein kinetics to better understanding global gene/protein expression profiling and biological pathway analyses, namely Systems Biology. An ultimate goal of systems biology is to develop dynamic mathematical models of interacting biological systems capable of simulating living systems in a computer. This Glue Grant complements Dr. Boothman’s existing DOE grant (No. DE-FG02-06ER64186) entitled “The IGF1/IGF-1R-MAPK-Secretory Clusterin (sCLU) Pathway: Mediator of a Low Dose IR-Inducible Bystander Effect” by developing sensitive and quantitative proteomic technology suitable for low-dose radiobiology research. An improved version of a quantitative protein array platform utilizing linear Quantum dot signaling for systematically measuring protein levels and phosphorylation states for systems biology modeling is presented. The signals are amplified by a confocal laser Quantum dot scanner, resulting in ~1000-fold more sensitivity than traditional Western blots, and show good linearity that is not achievable with HRP-amplified signals. This improved protein array technology is therefore suitable for detecting the weak responses elicited by low-dose radiation. Software was developed to facilitate the quantitative readout of signaling network activities. Kinetics of EGFRvIII mutant signaling were analyzed to quantify cross-talk between EGFR and other signaling pathways.

  8. On a viscous critical-stress model of martensitic phase transitions

    NASA Astrophysics Data System (ADS)

    Weatherwax, John; Vaynblat, Dimitri; Bruno, Oscar; Rosales, Ruben

    2007-09-01

    The solid-to-solid phase transitions that result from shock loading of certain materials, such as the graphite-to-diamond transition and the α-ɛ transition in iron, have long been subjects of a substantial theoretical and experimental literature. Recently a model for such transitions was introduced which, based on a critical-stress (CS) condition and without use of fitting parameters, accounts quantitatively for existing observations in a number of systems [Bruno and Vaynblat, Proc. R. Soc. London, Ser. A 457, 2871 (2001)]. While the results of the CS model match the main features of the available experimental data, disagreements in some details between the predictions of this model and experiment, attributable to the idealized character of the CS model, do exist. In this article we present a version of the CS model, the viscous CS model (vCS), as well as a numerical method for its solution. This model and the corresponding solver result in a much improved overall CS modeling capability. The innovations we introduce include (1) enhancement of the model by inclusion of viscous phase-transition effects, together with a numerical solver that allows for a fully rigorous treatment of both (2) the rarefaction fans (which had previously been approximated by "rarefaction discontinuities") and (3) the viscous phase-transition effects that are part of the vCS model. In particular we show that the vCS model accounts accurately for the well-known "gradual" rises in the α-ɛ transition which, in the original CS model, were somewhat crudely approximated as jump discontinuities.

  9. Novel flood risk assessment framework for rapid decision making

    NASA Astrophysics Data System (ADS)

    Valyrakis, Manousos; Koursari, Eftychia; Solley, Mark

    2016-04-01

    The impacts of catastrophic flooding, have significantly increased over the last few decades. This is due to primarily the increased urbanisation in ever-expanding mega-cities as well as due to the intensification both in magnitude and frequency of extreme hydrologic events. Herein a novel conceptual framework is presented that incorporates the use of real-time information to inform and update low dimensionality hydraulic models, to allow for rapid decision making towards preventing loss of life and safeguarding critical infrastructure. In particular, a case study from the recent UK floods in the area of Whitesands (Dumfries), is presented to demonstrate the utility of this approach. It is demonstrated that effectively combining a wealth of readily available qualitative information (such as crowdsourced visual documentation or using live data from sensing techniques), with existing quantitative data, can help appropriately update hydraulic models and reduce modelling uncertainties in future flood risk assessments. This approach is even more useful in cases where hydraulic models are limited, do not exist or were not needed before unpredicted dynamic modifications to the river system took place (for example in the case of reduced or eliminated hydraulic capacity due to blockages). The low computational cost and rapid assessment this framework offers, render it promising for innovating in flood management.

  10. Transient thermal analysis for radioactive liquid mixing operations in a large-scaled tank

    DOE PAGES

    Lee, S. Y.; Smith, III, F. G.

    2014-07-25

    A transient heat balance model was developed to assess the impact of a Submersible Mixer Pump (SMP) on radioactive liquid temperature during the process of waste mixing and removal for the high-level radioactive materials stored in Savannah River Site (SRS) tanks. The model results will be mainly used to determine the SMP design impacts on the waste tank temperature during operations and to develop a specification for a new SMP design to replace existing longshaft mixer pumps used during waste removal. The present model was benchmarked against the test data obtained by the tank measurement to examine the quantitative thermal response of the tank and to establish the reference conditions of the operating variables under no SMP operation. The results showed that the model predictions agreed with the test data of the waste temperatures within about 10%.

  11. Species distributions models in wildlife planning: agricultural policy and wildlife management in the great plains

    USGS Publications Warehouse

    Fontaine, Joseph J.; Jorgensen, Christopher; Stuber, Erica F.; Gruber, Lutz F.; Bishop, Andrew A.; Lusk, Jeffrey J.; Zach, Eric S.; Decker, Karie L.

    2017-01-01

    We know economic and social policy has implications for ecosystems at large, but the consequences for a given geographic area or specific wildlife population are more difficult to conceptualize and communicate. Species distribution models, which extrapolate species-habitat relationships across ecological scales, are capable of predicting population changes in distribution and abundance in response to management and policy, and thus, are an ideal means for facilitating proactive management within a larger policy framework. To illustrate the capabilities of species distribution modeling in scenario planning for wildlife populations, we projected an existing distribution model for ring-necked pheasants (Phasianus colchicus) onto a series of alternative future landscape scenarios for Nebraska, USA. Based on our scenarios, we qualitatively and quantitatively estimated the effects of agricultural policy decisions on pheasant populations across Nebraska, in specific management regions, and at wildlife management areas. 

  12. Source-term development for a contaminant plume for use by multimedia risk assessment models

    NASA Astrophysics Data System (ADS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    2000-02-01

    Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.

  13. Optical time-of-flight and absorbance imaging of biologic media.

    PubMed

    Benaron, D A; Stevenson, D K

    1993-03-05

    Imaging the interior of living bodies with light may assist in the diagnosis and treatment of a number of clinical problems, which include the early detection of tumors and hypoxic cerebral injury. An existing picosecond time-of-flight and absorbance (TOFA) optical system has been used to image a model biologic system and a rat. Model measurements confirmed TOFA principles in systems with a high degree of photon scattering; rat images, which were constructed from the variable time delays experienced by a fixed fraction of early-arriving transmitted photons, revealed identifiable internal structure. A combination of light-based quantitative measurement and TOFA localization may have applications in continuous, noninvasive monitoring for structural imaging and spatial chemometric analysis in humans.

  14. Optical Time-of-Flight and Absorbance Imaging of Biologic Media

    NASA Astrophysics Data System (ADS)

    Benaron, David A.; Stevenson, David K.

    1993-03-01

    Imaging the interior of living bodies with light may assist in the diagnosis and treatment of a number of clinical problems, which include the early detection of tumors and hypoxic cerebral injury. An existing picosecond time-of-flight and absorbance (TOFA) optical system has been used to image a model biologic system and a rat. Model measurements confirmed TOFA principles in systems with a high degree of photon scattering; rat images, which were constructed from the variable time delays experienced by a fixed fraction of early-arriving transmitted photons, revealed identifiable internal structure. A combination of light-based quantitative measurement and TOFA localization may have applications in continuous, noninvasive monitoring for structural imaging and spatial chemometric analysis in humans.

  15. Analysis of the Sheltered Instruction Observation Protocol Model on Academic Performance of English Language Learners

    NASA Astrophysics Data System (ADS)

    Ingram, Sandra W.

    This quantitative comparative descriptive study involved analyzing archival data from end-of-course (EOC) test scores in biology of English language learners (ELLs) taught or not taught using the sheltered instruction observation protocol (SIOP) model. The study includes descriptions and explanations of the benefits of the SIOP model to ELLs, especially in content area subjects such as biology. Researchers have shown that ELLs in high school lag behind their peers in academic achievement in content area subjects. Much of the research on the SIOP model took place in elementary and middle school, and more research was necessary at the high school level. This study involved analyzing student records from archival data to describe and explain if the SIOP model had an effect on the EOC test scores of ELLs taught or not taught using it. The sample consisted of 527 Hispanic students (283 females and 244 males) from Grades 9-12. An independent sample t-test determined if a significant difference existed in the mean EOC test scores of ELLs taught using the SIOP model as opposed to ELLs not taught using the SIOP model. The results indicated that a significant difference existed between EOC test scores of ELLs taught using the SIOP model and ELLs not taught using the SIOP model (p = .02). A regression analysis indicated a significant difference existed in the academic performance of ELLs taught using the SIOP model in high school science, controlling for free and reduced-price lunch (p = .001) in predicting passing scores on the EOC test in biology at the school level. The data analyzed for free and reduced-price lunch together with SIOP data indicated that both together were not significant (p = .175) for predicting passing scores on the EOC test in high school biology. Future researchers should repeat the study with student-level data as opposed to school-level data, and data should span at least three years.
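    The core comparison described above can be sketched as an independent-samples t-test; the scores below are fabricated placeholders rather than the study's archival data.

```python
# Minimal sketch of the comparison described above: an independent-samples t-test
# on end-of-course scores for two groups. Scores are fabricated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
siop_scores = rng.normal(74, 10, size=283)      # hypothetical ELLs taught with SIOP
non_siop_scores = rng.normal(71, 10, size=244)  # hypothetical comparison group

t_stat, p_value = stats.ttest_ind(siop_scores, non_siop_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```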

  16. Statistical surrogate models for prediction of high-consequence climate change.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.

  17. A new medical image segmentation model based on fractional order differentiation and level set

    NASA Astrophysics Data System (ADS)

    Chen, Bo; Huang, Shan; Xie, Feifei; Li, Lihong; Chen, Wensheng; Liang, Zhengrong

    2018-03-01

    Segmenting medical images is still a challenging task for both traditional local and global methods because image intensity is inhomogeneous. In this paper, two contributions are made: (i) on the one hand, a new hybrid model is proposed for medical image segmentation, which is built based on fractional order differentiation, level set description and curve evolution; and (ii) on the other hand, three popular definitions of fractional order differentiation, namely the Fourier-domain, Grünwald-Letnikov (G-L) and Riemann-Liouville (R-L) definitions, are investigated and compared through experimental results. Because fractional order differentiation enhances the high-frequency features of images and preserves their low-frequency features in a nonlinear manner, one fractional order differentiation definition is used in our hybrid model to perform segmentation of inhomogeneous images. The proposed hybrid model also integrates fractional order differentiation, fractional order gradient magnitude and difference image information. The widely used Dice similarity coefficient metric is employed to quantitatively evaluate the segmentation results. Firstly, experimental results demonstrated that only a slight difference exists among the three expressions of Fourier-domain, G-L, and R-L fractional order differentiation. This outcome supports our selection of one of the three definitions for our hybrid model. Secondly, further experiments were performed to compare our hybrid segmentation model with other existing segmentation models. A noticeable gain was achieved by our hybrid model in segmenting intensity-inhomogeneous images.
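    The Dice similarity coefficient used above for quantitative evaluation is straightforward to compute; a small sketch with toy binary masks follows.

```python
# Sketch of the Dice similarity coefficient: DSC = 2|A ∩ B| / (|A| + |B|) for a
# predicted mask A and a reference mask B. Masks below are toy examples.
import numpy as np

def dice_coefficient(pred, ref):
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    return 2.0 * intersection / (pred.sum() + ref.sum())

pred = np.zeros((64, 64), dtype=int)
pred[10:40, 10:40] = 1                    # toy predicted segmentation
ref = np.zeros((64, 64), dtype=int)
ref[12:42, 12:42] = 1                     # toy reference (ground-truth) mask
print("Dice:", round(dice_coefficient(pred, ref), 3))
```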

  18. A quantitative approach to assessing the efficacy of occupant protection programs: A case study from Montana.

    PubMed

    Manlove, Kezia; Stanley, Laura; Peck, Alyssa

    2015-10-01

    Quantitative evaluation of vehicle occupant protection programs is critical for ensuring efficient government resource allocation, but few methods exist for conducting evaluation across multiple programs simultaneously. Here we present an analysis of occupant protection efficacy in the state of Montana. This approach relies on seat belt compliance rates as measured by the National Occupant Protection Usage Survey (NOPUS). A hierarchical logistic regression model is used to estimate the impacts of four Montana Department of Transportation (MDT)-funded occupant protection programs used in the state of Montana, following adjustment for a suite of potential confounders. Activity from two programs, Buckle Up coalitions and media campaigns, are associated with increased seat belt use in Montana, whereas the impact of another program, Selective Traffic Enforcement, is potentially masked by other program activity. A final program, Driver's Education, is not associated with any shift in seat belt use. This method allows for a preliminary quantitative estimation of program impacts without requiring states to obtain any new seat belt use data. This approach provides states a preliminary look at program impacts, and a means for carefully planning future program allocation and investigation. Copyright © 2015 Elsevier Ltd. All rights reserved.
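    A simplified, non-hierarchical sketch of the kind of adjusted logistic regression described above is given below; the program variables, confounder, and data are invented for illustration, and the study itself fitted a hierarchical model with additional terms.

```python
# Simplified, non-hierarchical sketch: relate observed seat belt use to local program
# activity while adjusting for one confounder. All variables and data are invented;
# the study itself used a hierarchical logistic regression with more terms.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "media_campaign": rng.integers(0, 2, n),  # county had an active media campaign
    "coalition": rng.integers(0, 2, n),       # Buckle Up coalition active locally
    "rural": rng.integers(0, 2, n),           # confounder: rural vs urban site
})
logit_p = -0.2 + 0.4 * df.media_campaign + 0.3 * df.coalition - 0.5 * df.rural
df["belted"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

fit = smf.logit("belted ~ media_campaign + coalition + rural", data=df).fit(disp=0)
print(np.round(np.exp(fit.params), 2))        # odds ratios per program / covariate
```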

  19. Development and application of a DNA microarray-based yeast two-hybrid system

    PubMed Central

    Suter, Bernhard; Fontaine, Jean-Fred; Yildirimman, Reha; Raskó, Tamás; Schaefer, Martin H.; Rasche, Axel; Porras, Pablo; Vázquez-Álvarez, Blanca M.; Russ, Jenny; Rau, Kirstin; Foulle, Raphaele; Zenkner, Martina; Saar, Kathrin; Herwig, Ralf; Andrade-Navarro, Miguel A.; Wanker, Erich E.

    2013-01-01

    The yeast two-hybrid (Y2H) system is the most widely applied methodology for systematic protein–protein interaction (PPI) screening and the generation of comprehensive interaction networks. We developed a novel Y2H interaction screening procedure using DNA microarrays for high-throughput quantitative PPI detection. Applying a global pooling and selection scheme to a large collection of human open reading frames, proof-of-principle Y2H interaction screens were performed for the human neurodegenerative disease proteins huntingtin and ataxin-1. Using systematic controls for unspecific Y2H results and quantitative benchmarking, we identified and scored a large number of known and novel partner proteins for both huntingtin and ataxin-1. Moreover, we show that this parallelized screening procedure and the global inspection of Y2H interaction data are uniquely suited to define specific PPI patterns and their alteration by disease-causing mutations in huntingtin and ataxin-1. This approach takes advantage of the specificity and flexibility of DNA microarrays and of the existence of solid-related statistical methods for the analysis of DNA microarray data, and allows a quantitative approach toward interaction screens in human and in model organisms. PMID:23275563

  20. Correlation of quantitative computed tomographic subchondral bone density and ash density in horses.

    PubMed

    Drum, M G; Les, C M; Park, R D; Norrdin, R W; McIlwraith, C W; Kawcak, C E

    2009-02-01

    The purpose of this study was to compare subchondral bone density obtained using quantitative computed tomography with ash density values from intact equine joints, and to determine if there are measurable anatomic variations in mean subchondral bone density. Five adult equine metacarpophalangeal joints were scanned with computed tomography (CT), disarticulated, and four 1-cm(3) regions of interest (ROI) cut from the distal third metacarpal bone. Bone cubes were ashed, and percent mineralization and ash density were recorded. Three-dimensional models were created of the distal third metacarpal bone from CT images. Four ROIs were measured on the distal aspect of the third metacarpal bone at axial and abaxial sites of the medial and lateral condyles for correlation with ash samples. Overall correlations of mean quantitative CT (QCT) density with ash density (r=0.82) and percent mineralization (r=0.93) were strong. There were significant differences between abaxial and axial ROIs for mean QCT density, percent bone mineralization and ash density (p<0.05). QCT appears to be a good measure of bone density in equine subchondral bone. Additionally, differences existed between axial and abaxial subchondral bone density in the equine distal third metacarpal bone.
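    The headline statistic above is a Pearson correlation between paired density measurements; a toy illustration with hypothetical values follows.

```python
# Toy illustration of the correlation reported above: Pearson r between mean QCT
# density and ash density over regions of interest. Values are hypothetical.
import numpy as np
from scipy import stats

qct_density = np.array([620, 710, 655, 840, 790, 730, 690, 760])          # mg/cm^3
ash_density = np.array([0.58, 0.66, 0.60, 0.79, 0.72, 0.69, 0.63, 0.71])  # g/cm^3

r, p = stats.pearsonr(qct_density, ash_density)
print(f"r = {r:.2f}, p = {p:.3f}")
```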

  1. Adaptive Grouping Cloud Model Shuffled Frog Leaping Algorithm for Solving Continuous Optimization Problems

    PubMed Central

    Liu, Haorui; Yi, Fengyan; Yang, Heli

    2016-01-01

    The shuffled frog leaping algorithm (SFLA) easily falls into a local optimum when solving multi-optimum function optimization problems, which impacts accuracy and convergence speed. Therefore, this paper presents a grouped SFLA for solving continuous optimization problems, combined with the cloud model's capability of transforming between qualitative and quantitative representations. The algorithm divides the definition domain into several groups and gives each group a set of frogs. Frogs of each region search within their memeplex, and during the search the algorithm uses an "elite strategy" to update the location information of the existing elite frogs through the cloud model algorithm. This narrows the search space and effectively mitigates entrapment in local optima; thus convergence speed and accuracy can be significantly improved. The results of computer simulation confirm this conclusion. PMID:26819584

  2. Estimating Fallout Building Attributes from Architectural Features and Global Earthquake Model (GEM) Building Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, Michael B.; Kane, Staci R.

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building’s protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

  3. Models of Mars' atmosphere (1974)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Atmospheric models for support of design and mission planning of space vehicles that are to orbit the planet Mars, enter its atmosphere, or land on the surface are presented. Quantitative data for the Martian atmosphere were obtained from Earth-based observations and from spacecraft that have orbited Mars or passed within several planetary radii. These data were used in conjunction with existing theories of planetary atmospheres to predict other characteristics of the Martian atmosphere. Earth-based observations provided information on the composition, temperature, and optical properties of Mars with rather coarse spatial resolution, whereas spacecraft measurements yielded data on composition, temperature, pressure, density, and atmospheric structure with moderately good spatial resolution. The models provide the temperature, pressure, and density profiles required to perform basic aerodynamic analyses. The profiles are supplemented by computed values of viscosity, specific heat, and speed of sound.

  4. Stochastic availability analysis of operational data systems in the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Issa, T. N.

    1991-01-01

    Existing availability models of standby redundant systems consider only an operator's performance and its interaction with the hardware performance. In the case of operational data systems in the Deep Space Network (DSN), in addition to an operator-system interface, a controller reconfigures the system and links a standby unit into the network data path upon failure of the operating unit. A stochastic (Markovian) process technique is used to model and analyze the availability performance, and the occurrence of degradation due to partial failures is quantitatively incorporated into the model. Exact expressions for the steady-state availability and proportion of degraded performance measures are derived for the systems under study. The interaction among the hardware, operator, and controller performance parameters and that interaction's effect on data availability are evaluated and illustrated for an operational data processing system.
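
    A hedged sketch of a steady-state availability calculation for a small continuous-time Markov (Markovian) model with up, degraded, and down states; the transition rates are illustrative placeholders, not the DSN system's parameters.

        # Hedged sketch: steady-state availability of a 3-state continuous-time Markov model.
        # Rates are illustrative, not calibrated to any DSN data system.
        import numpy as np

        lam_partial, lam_full = 0.02, 0.005   # partial / full failure rates (per hour)
        mu_deg, mu_down = 0.5, 0.1            # repair rates from degraded / down (per hour)

        # Generator matrix Q; rows are from-states (0 = up, 1 = degraded, 2 = down)
        Q = np.array([
            [-(lam_partial + lam_full), lam_partial, lam_full],
            [mu_deg, -(mu_deg + lam_full), lam_full],
            [mu_down, 0.0, -mu_down],
        ])

        # Solve pi @ Q = 0 with sum(pi) = 1 by appending the normalisation constraint.
        A = np.vstack([Q.T, np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("steady-state availability (up or degraded):", pi[0] + pi[1])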

  5. Compound analysis via graph kernels incorporating chirality.

    PubMed

    Brown, J B; Urata, Takashi; Tamura, Takeyuki; Arai, Midori A; Kawabata, Takeo; Akutsu, Tatsuya

    2010-12-01

    High accuracy is paramount when predicting biochemical characteristics using Quantitative Structure-Property Relationships (QSPRs). Although existing graph-theoretic kernel methods combined with machine learning techniques are efficient for QSPR model construction, they cannot distinguish topologically identical chiral compounds, which often exhibit different biological characteristics. In this paper, we propose a new method that extends the recently developed tree pattern graph kernel to accommodate stereoisomers. We show that Support Vector Regression (SVR) with a chiral graph kernel is useful for target property prediction by demonstrating its application to a set of human vitamin D receptor ligands currently under consideration for their potential anti-cancer effects.
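
    A brief sketch of the modelling step named above: Support Vector Regression with a precomputed kernel matrix, which is how a graph (or chiral graph) kernel would typically be plugged into scikit-learn. The kernel values and target property here are synthetic stand-ins, not an actual chiral graph kernel.

        # Hedged sketch: SVR with a precomputed kernel (scikit-learn supports kernel='precomputed').
        # The kernel and target property are synthetic; a real application would supply the
        # (chiral) graph kernel matrix between compounds instead.
        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        X = rng.normal(size=(30, 5))                  # placeholder compound descriptors
        K = X @ X.T                                   # any positive semi-definite kernel matrix works
        y = 2.0 * X[:, 0] + rng.normal(0, 0.1, 30)    # synthetic target property

        model = SVR(kernel="precomputed", C=10.0).fit(K, y)
        K_test = X[:5] @ X.T                          # kernel between "test" and training compounds
        print(model.predict(K_test))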

  6. Landau instability and mobility edges of the interacting one-dimensional Bose gas in weak random potentials

    NASA Astrophysics Data System (ADS)

    Cherny, Alexander Yu; Caux, Jean-Sébastien; Brand, Joachim

    2018-01-01

    We study the frictional force exerted on the trapped, interacting 1D Bose gas under the influence of a moving random potential. Specifically we consider weak potentials generated by optical speckle patterns with finite correlation length. We show that repulsive interactions between bosons lead to a superfluid response and suppression of frictional force, which can inhibit the onset of Anderson localisation. We perform a quantitative analysis of the Landau instability based on the dynamic structure factor of the integrable Lieb-Liniger model and demonstrate the existence of effective mobility edges.

  7. Spatiotemporal dynamics of oscillatory cellular patterns in three-dimensional directional solidification.

    PubMed

    Bergeon, N; Tourret, D; Chen, L; Debierre, J-M; Guérin, R; Ramirez, A; Billia, B; Karma, A; Trivedi, R

    2013-05-31

    We report results of directional solidification experiments conducted on board the International Space Station and quantitative phase-field modeling of those experiments. The experiments image for the first time in situ the spatially extended dynamics of three-dimensional cellular array patterns formed under microgravity conditions where fluid flow is suppressed. Experiments and phase-field simulations reveal the existence of oscillatory breathing modes with time periods of several tens of minutes. Oscillating cells are usually noncoherent due to array disorder, with the exception of small areas where the array structure is regular and stable.

  8. Predictive model of thrombospondin-1 and vascular endothelial growth factor in breast tumor tissue.

    PubMed

    Rohrs, Jennifer A; Sulistio, Christopher D; Finley, Stacey D

    2016-01-01

    Angiogenesis, the formation of new blood capillaries from pre-existing vessels, is a hallmark of cancer. Thus far, strategies for reducing tumor angiogenesis have focused on inhibiting pro-angiogenic factors, while less is known about the therapeutic effects of mimicking the actions of angiogenesis inhibitors. Thrombospondin-1 (TSP1) is an important endogenous inhibitor of angiogenesis that has been investigated as an anti-angiogenic agent. TSP1 impedes the growth of new blood vessels in many ways, including crosstalk with pro-angiogenic factors. Due to the complexity of TSP1 signaling, a predictive systems biology model would provide quantitative understanding of the angiogenic balance in tumor tissue. Therefore, we have developed a molecular-detailed, mechanistic model of TSP1 and vascular endothelial growth factor (VEGF), a promoter of angiogenesis, in breast tumor tissue. The model predicts the distribution of the angiogenic factors in tumor tissue, revealing that TSP1 is primarily in an inactive, cleaved form due to the action of proteases, rather than bound to its cellular receptors or to VEGF. The model also predicts the effects of enhancing TSP1's interactions with its receptors and with VEGF. To provide additional predictions that can guide the development of new anti-angiogenic drugs, we simulate administration of exogenous TSP1 mimetics that bind specific targets. The model predicts that the CD47-binding TSP1 mimetic dramatically decreases the ratio of receptor-bound VEGF to receptor-bound TSP1, in favor of anti-angiogenesis. Thus, we have established a model that provides a quantitative framework to study the response to TSP1 mimetics.

  9. Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain

    PubMed Central

    Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.

    2011-01-01

    We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568

  10. A test for selection employing quantitative trait locus and mutation accumulation data.

    PubMed

    Rice, Daniel P; Townsend, Jeffrey P

    2012-04-01

    Evolutionary biologists attribute much of the phenotypic diversity observed in nature to the action of natural selection. However, for many phenotypic traits, especially quantitative phenotypic traits, it has been challenging to test for the historical action of selection. An important challenge for biologists studying quantitative traits, therefore, is to distinguish between traits that have evolved under the influence of strong selection and those that have evolved neutrally. Most existing tests for selection employ molecular data, but selection also leaves a mark on the genetic architecture underlying a trait. In particular, the distribution of quantitative trait locus (QTL) effect sizes and the distribution of mutational effects together provide information regarding the history of selection. Despite the increasing availability of QTL and mutation accumulation data, such data have not yet been effectively exploited for this purpose. We present a model of the evolution of QTL and employ it to formulate a test for historical selection. To provide a baseline for neutral evolution of the trait, we estimate the distribution of mutational effects from mutation accumulation experiments. We then apply a maximum-likelihood-based method of inference to estimate the range of selection strengths under which such a distribution of mutations could generate the observed QTL. Our test thus represents the first integration of population genetic theory and QTL data to measure the historical influence of selection.

  11. On normality, ethnicity, and missing values in quantitative trait locus mapping

    PubMed Central

    Labbe, Aurélie; Wormald, Hanna

    2005-01-01

    Background This paper deals with the detection of significant linkage for quantitative traits using a variance components approach. Microsatellite markers were obtained for the Genetic Analysis Workshop 14 Collaborative Study on the Genetics of Alcoholism data. Ethnic heterogeneity, highly skewed quantitative measures, and a high rate of missing values are all present in this dataset and well known to impact upon linkage analysis. This makes it a good candidate for investigation. Results As expected, we observed a number of changes in LOD scores, especially for chromosomes 1, 7, and 18, along with the three factors studied. A dramatic example of such changes can be found in chromosome 7. Highly significant linkage to one of the quantitative traits became insignificant when a proper normalizing transformation of the trait was used and when analysis was carried out on an ethnically homogeneous subset of the original pedigrees. Conclusion In agreement with existing literature, transforming a trait to ensure normality using a Box-Cox transformation is highly recommended in order to avoid false-positive linkages. Furthermore, pedigrees should be sorted by ethnic groups and analyses should be carried out separately. Finally, one should be aware that the inclusion of covariates with a high rate of missing values reduces considerably the number of subjects included in the model. In such a case, the loss in power may be large. Imputation methods are then recommended. PMID:16451664
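
    A short sketch of the normalizing step recommended above: applying a Box-Cox transformation to a skewed quantitative trait with scipy before linkage analysis. The simulated trait values are illustrative only.

        # Hedged sketch: Box-Cox normalisation of a skewed trait (illustrative data, not GAW14).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        trait = rng.lognormal(mean=1.0, sigma=0.8, size=500)    # skewed trait, illustrative

        transformed, lam = stats.boxcox(trait)                  # requires strictly positive values
        print("estimated lambda:", round(lam, 3))
        print("skewness before/after:", round(stats.skew(trait), 2), round(stats.skew(transformed), 2))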

  12. Estimating Hydrologic Fluxes, Crop Water Use, and Agricultural Land Area in China using Data Assimilation

    NASA Astrophysics Data System (ADS)

    Smith, Tiziana; McLaughlin, Dennis B.; Hoisungwan, Piyatida

    2016-04-01

    Crop production has significantly altered the terrestrial environment by changing land use and by altering the water cycle through both co-opted rainfall and surface water withdrawals. As the world's population continues to grow and individual diets become more resource-intensive, the demand for food - and the land and water necessary to produce it - will continue to increase. High-resolution quantitative data about water availability, water use, and agricultural land use are needed to develop sustainable water and agricultural planning and policies. However, existing data covering large areas with high resolution are susceptible to errors and can be physically inconsistent. China is an example of a large area where food demand is expected to increase and a lack of data clouds the resource management dialogue. Some assert that China will have insufficient land and water resources to feed itself, posing a threat to global food security if they seek to increase food imports. Others believe resources are plentiful. Without quantitative data, it is difficult to discern if these concerns are realistic or overly dramatized. This research presents a quantitative approach using data assimilation techniques to characterize hydrologic fluxes, crop water use (defined as crop evapotranspiration), and agricultural land use at 0.5 by 0.5 degree resolution and applies the methodology in China using data from around the year 2000. The approach uses the principles of water balance and of crop water requirements to assimilate existing data with a least-squares estimation technique, producing new estimates of water and land use variables that are physically consistent while minimizing differences from measured data. We argue that this technique for estimating water fluxes and agricultural land use can provide a useful basis for resource management modeling and policy, both in China and around the world.
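
    A hedged, much-simplified analogue of the assimilation idea described above: reconcile noisy flux estimates with a water-balance constraint by weighted least squares. The variables, priors, and uncertainties are illustrative assumptions, not the study's data.

        # Hedged sketch: adjust prior flux estimates for one grid cell so they are close to the
        # measurements (weighted by uncertainty) while satisfying P - ET - R - W = 0.
        # All numbers are illustrative.
        import numpy as np
        from scipy.optimize import lsq_linear

        prior = np.array([600.0, 420.0, 150.0, 60.0])   # P, ET, runoff, withdrawal (mm/yr)
        sigma = np.array([30.0, 50.0, 20.0, 10.0])      # assumed measurement uncertainties

        # Stack the weighted identity rows with a heavily weighted balance-constraint row.
        A = np.vstack([np.diag(1.0 / sigma), 1e3 * np.array([[1.0, -1.0, -1.0, -1.0]])])
        b = np.concatenate([prior / sigma, [0.0]])
        res = lsq_linear(A, b)
        print("adjusted fluxes:", res.x, "balance residual:", res.x[0] - res.x[1:].sum())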

  13. Leadership Styles at Middle- and Early-College Programs: A Quantitative Descriptive Correlational Study

    ERIC Educational Resources Information Center

    Berksteiner, Earl J.

    2013-01-01

    The purpose of this quantitative descriptive correlational study was to determine if associations existed between middle- and early-college (MEC) principals' leadership styles, teacher motivation, and teacher satisfaction. MEC programs were programs designed to assist high school students who were not served well in a traditional setting (Middle…

  14. The Untapped Promise of Secondary Data Sets in International and Comparative Education Policy Research

    ERIC Educational Resources Information Center

    Chudagr, Amita; Luschei, Thomas F.

    2016-01-01

    The objective of this commentary is to call attention to the feasibility and importance of large-scale, systematic, quantitative analysis in international and comparative education research. We contend that although many existing databases are under- or unutilized in quantitative international-comparative research, these resources present the…

  15. Group Projects and Civic Engagement in a Quantitative Literacy Course

    ERIC Educational Resources Information Center

    Dewar, Jacqueline; Larson, Suzanne; Zachariah, Thomas

    2011-01-01

    We describe our approach to incorporating a civic engagement component into a quantitative literacy (QL) course and the resulting gains in student learning, confidence, and awareness of local civic issues. We revised the existing QL course by including semester-long group projects involving local community issues that students could investigate…

  16. Identification and confirmation of chemical residues by chromatography-mass spectrometry and other techniques

    USDA-ARS?s Scientific Manuscript database

    A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...

  17. APA Reporting Standards in Quantitative Research Dissertations from an Online EdD Program

    ERIC Educational Resources Information Center

    Salgado, Griselle

    2013-01-01

    This study was an investigation of the reporting practices in dissertations with quantitative research designs produced by students enrolled in an online Doctor of Education (EdD) program, one that follows the American Psychological Association (APA) standards for reporting research. Limited, empirical information exists about the competencies in…

  18. Measurements in quantitative research: how to select and report on research instruments.

    PubMed

    Hagan, Teresa L

    2014-07-01

    Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.

  19. Leadership for School Numeracy: How School Leaders' Knowledge and Attitudes Impact Student Mathematics Achievement

    ERIC Educational Resources Information Center

    Walker-Glenn, Michelle Lynn

    2010-01-01

    Although most high schools espouse school-wide literacy initiatives, few schools place equal emphasis on numeracy, or quantitative literacy. This lack of attention to quantitative skills is ironic in light of documented deficiencies in student mathematics achievement. While significant research exists regarding best practices for mathematics…

  20. A Quantitative Study into the Information Technology Project Portfolio Practice: The Impact on Information Technology Project Deliverables

    ERIC Educational Resources Information Center

    Yu, Wei

    2013-01-01

    This dissertation applied the quantitative approach to the data gathered from online survey questionnaires regarding the three objects: Information Technology (IT) Portfolio Management, IT-Business Alignment, and IT Project Deliverables. By studying this data, this dissertation uncovered the underlying relationships that exist between the…

  1. How to simulate pedestrian behaviors in seismic evacuation for vulnerability reduction of existing buildings

    NASA Astrophysics Data System (ADS)

    Quagliarini, Enrico; Bernardini, Gabriele; D'Orazio, Marco

    2017-07-01

    Understanding and representing how individuals behave in earthquake emergencies is essential for assessing the impact of vulnerability reduction strategies on existing buildings in seismic areas. In fact, interactions between individuals and the scenario (modified by the earthquake occurrence) are very important for understanding the possible additional risks to people, especially during the evacuation phase. The current approach is based on "qualitative" aspects, aimed at defining best-practice guidelines for Civil Protection and populations. On the contrary, a "quantitative" description of human response and evacuation motion in such conditions is urgently needed. Hence, this work defines rules for pedestrians' earthquake evacuation in urban scenarios, taking advantage of previous results from real-world evacuation analyses. In particular, a motion law for pedestrians is defined by modifying the Social Force model equation. The proposed model could be used for evaluating individuals' evacuation process and thus for defining operative strategies for reducing interferences in critical parts of the urban fabric (e.g., interventions on particular buildings, definition of evacuation strategies, projects for city districts).
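
    A minimal sketch of one velocity update in a basic Social Force model, the equation the authors modify for earthquake evacuation; parameters are generic textbook-style values, not the calibrated values from the paper.

        # Hedged sketch: one step of a basic Social Force update (driving force toward a goal
        # plus exponential repulsion from other pedestrians). Parameters are generic placeholders.
        import numpy as np

        def social_force_step(pos, vel, goal, others, dt=0.1, v0=1.3, tau=0.5, A=2.0, B=0.3):
            """Return updated (position, velocity) for one pedestrian."""
            e = (goal - pos) / np.linalg.norm(goal - pos)        # desired direction of motion
            f_drive = (v0 * e - vel) / tau                       # relaxation toward desired velocity
            f_social = np.zeros(2)
            for q in others:                                     # repulsion from other pedestrians
                d = pos - q
                dist = np.linalg.norm(d)
                f_social += A * np.exp(-dist / B) * d / dist
            vel = vel + (f_drive + f_social) * dt
            return pos + vel * dt, vel

        p, v = social_force_step(np.array([0.0, 0.0]), np.array([0.0, 0.0]),
                                 goal=np.array([10.0, 0.0]),
                                 others=[np.array([1.0, 0.5])])
        print(p, v)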

  2. CarbonSAFE Illinois - Macon County

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whittaker, Steve

    CarbonSAFE Illinois is a feasibility study to develop an established geologic storage complex in Macon County, Illinois, for commercial-scale storage of industrially sourced CO2. Feasibility activities are focused on the Mt. Simon Storage Complex; a step-out well will be drilled near existing storage sites (i.e., the Midwest Geological Sequestration Consortium’s Illinois Basin – Decatur Project and the Illinois Industrial Carbon Capture and Storage Project) to further establish commercial viability of this complex and to evaluate EOR potential in a co-located oil-field trend. The Archer Daniels Midland facility (ethanol plant), City Water, Light, and Power in Springfield, Illinois (coal-fired power station), and other regional industries are potential sources of anthropogenic CO2 for storage at this complex. Site feasibility will be evaluated through drilling results, static and dynamic modeling, and quantitative risk assessment. Both studies will entail stakeholder engagement, consideration of infrastructure requirements, existing policy, and business models. Project data will help calibrate the National Risk Assessment Partnership (NRAP) Toolkit to better understand the risks of commercial-scale carbon storage.

  3. In silico designing of power conversion efficient organic lead dyes for solar cells using today's innovative approaches to assure renewable energy for the future

    NASA Astrophysics Data System (ADS)

    Kar, Supratik; Roy, Juganta K.; Leszczynski, Jerzy

    2017-06-01

    Advances in solar cell technology require the design of new organic dye sensitizers for dye-sensitized solar cells with high power conversion efficiency, to circumvent the disadvantages of silicon-based solar cells. In silico studies, including quantitative structure-property relationship analysis combined with quantum chemical analysis, were employed to understand the primary electron transfer mechanism and photo-physical properties of 273 arylamine organic dyes from 11 diverse chemical families specific to the iodine electrolyte. The direct quantitative structure-property relationship models enable identification of the essential electronic and structural attributes necessary for quantifying the molecular prerequisites of the 11 classes of arylamine organic dyes responsible for high power conversion efficiency of dye-sensitized solar cells. Tetrahydroquinoline, N,N'-dialkylaniline and indoline have been the least explored classes of arylamine organic dyes for dye-sensitized solar cells. Therefore, the properties identified from the corresponding quantitative structure-property relationship models of these classes were employed in the design of "lead dyes". Subsequently, a series of electrochemical and photo-physical parameters were computed for the designed dyes to check the variables required for electron flow in dye-sensitized solar cells. The combined computational techniques yielded seven promising lead dyes for each of the three chemical classes considered. Significant increments (130, 183, and 46%) in predicted power conversion efficiency were observed relative to the existing dye with the highest experimental power conversion efficiency for tetrahydroquinoline, N,N'-dialkylaniline and indoline, respectively, while maintaining the required electrochemical parameters.

  4. A novel statistical method for quantitative comparison of multiple ChIP-seq datasets.

    PubMed

    Chen, Li; Wang, Chi; Qin, Zhaohui S; Wu, Hao

    2015-06-15

    ChIP-seq is a powerful technology to measure the protein binding or histone modification strength in the whole genome scale. Although there are a number of methods available for single ChIP-seq data analysis (e.g. 'peak detection'), rigorous statistical method for quantitative comparison of multiple ChIP-seq datasets with the considerations of data from control experiment, signal to noise ratios, biological variations and multiple-factor experimental designs is under-developed. In this work, we develop a statistical method to perform quantitative comparison of multiple ChIP-seq datasets and detect genomic regions showing differential protein binding or histone modification. We first detect peaks from all datasets and then union them to form a single set of candidate regions. The read counts from IP experiment at the candidate regions are assumed to follow Poisson distribution. The underlying Poisson rates are modeled as an experiment-specific function of artifacts and biological signals. We then obtain the estimated biological signals and compare them through the hypothesis testing procedure in a linear model framework. Simulations and real data analyses demonstrate that the proposed method provides more accurate and robust results compared with existing ones. An R software package ChIPComp is freely available at http://web1.sph.emory.edu/users/hwu30/software/ChIPComp.html. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
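
    A hedged sketch of the modelling idea described above for a single candidate region: a Poisson GLM with a condition effect on IP read counts and library size as an offset. This is an illustration in statsmodels, not the ChIPComp implementation, and the counts are invented.

        # Hedged sketch: Poisson regression of IP read counts at one candidate region on
        # condition, with log library size as offset. Counts and depths are illustrative.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        df = pd.DataFrame({
            "count": [85, 92, 40, 37],                 # IP read counts at the region
            "condition": [1, 1, 0, 0],                 # two conditions, two replicates each
            "libsize": [1.2e7, 1.1e7, 1.3e7, 1.0e7],   # sequencing depth per sample
        })
        model = smf.glm("count ~ condition", data=df,
                        family=sm.families.Poisson(),
                        offset=np.log(df.libsize)).fit()
        print(model.summary())   # the condition coefficient is the log fold-change in binding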

  5. Five-year forward view: lessons from emergency care at the extremes of age.

    PubMed

    Minhas, J S; Minhas, D; Coats, T; Banerjee, J; Roland, D

    2018-03-01

    Objective: The progressive rise in demand on NHS emergency care resources is partly attributable to increases in attendances of children and older people. A quality gap exists in the care provision for the old and the young. The Five Year Forward View suggested new models of care but noted that the "answer is not one-size-fits-all". This article discusses the urgent need for person-centred outcome measures to bridge the gap that exists between demand and provision. Design: This review is based on evidence gathered from literature searching across several platforms using a variety of search terms to account for the obvious heterogeneity, drawing on key 'think-tank' evidence. Settings: Qualitative and quantitative studies examining approaches to caring for individuals at the extremes of age. Participants: Individuals at the extremes of age (infants and older people). Main Outcome Measures: Understanding similarities and disparities in the care of individuals at the extremes of age in emergency and non-emergency contexts. Results: There exist several similarities and disparities in the care of individuals at the extremes of age. Given the increasing burden of ill health on the economy, the challenges of managing patients at the extremes of age in emergency settings must be acknowledged and systems built to accommodate the traits these individuals exhibit. Conclusion: Commissioners of services must optimise the models of care delivery by appreciating the similarities and differences between care requirements in these two large groups seeking emergency care.

  6. A flexible and robust approach for segmenting cell nuclei from 2D microscopy images using supervised learning and template matching

    PubMed Central

    Chen, Cheng; Wang, Wei; Ozolek, John A.; Rohde, Gustavo K.

    2013-01-01

    We describe a new supervised learning-based template matching approach for segmenting cell nuclei from microscopy images. The method uses examples selected by a user for building a statistical model which captures the texture and shape variations of the nuclear structures from a given dataset to be segmented. Segmentation of subsequent, unlabeled, images is then performed by finding the model instance that best matches (in the normalized cross correlation sense) the local neighborhood in the input image. We demonstrate the application of our method to segmenting nuclei from a variety of imaging modalities, and quantitatively compare our results to several other methods. Quantitative results using both simulated and real image data show that, while certain methods may work well for certain imaging modalities, our software is able to obtain high accuracy across several imaging modalities studied. Results also demonstrate that, relative to several existing methods, the template-based method we propose presents increased robustness in the sense of better handling variations in illumination, variations in texture from different imaging modalities, providing smoother and more accurate segmentation borders, as well as better handling of cluttered nuclei. PMID:23568787
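
    A small illustration of the matching criterion named above, normalized cross-correlation template matching with scikit-image; the learned statistical shape/texture model from the paper is not reproduced here, and the image is random.

        # Hedged sketch: normalized cross-correlation template matching with scikit-image.
        # The image is random noise and the "nucleus template" is just a crop of it.
        import numpy as np
        from skimage.feature import match_template

        rng = np.random.default_rng(3)
        image = rng.random((200, 200))
        template = image[60:90, 100:130].copy()        # pretend this is a nucleus template

        ncc = match_template(image, template)          # normalized cross-correlation map
        ij = np.unravel_index(np.argmax(ncc), ncc.shape)
        print("best match (row, col):", ij, "score:", ncc.max())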

  7. Mapping of epistatic quantitative trait loci in four-way crosses.

    PubMed

    He, Xiao-Hong; Qin, Hongde; Hu, Zhongli; Zhang, Tianzhen; Zhang, Yuan-Ming

    2011-01-01

    Four-way crosses (4WC) involving four different inbred lines often appear in plant and animal commercial breeding programs. Direct mapping of quantitative trait loci (QTL) in these commercial populations is both economical and practical. However, the existing statistical methods for mapping QTL in a 4WC population are built on the single-QTL genetic model. This simple genetic model fails to take into account QTL interactions, which play an important role in the genetic architecture of complex traits. In this paper, therefore, we attempted to develop a statistical method to detect epistatic QTL in 4WC populations. Conditional probabilities of QTL genotypes, computed by the multi-point single locus method, were used to sample the genotypes of all putative QTL in the entire genome. The sampled genotypes were used to construct the design matrix for QTL effects. All QTL effects, including main and epistatic effects, were simultaneously estimated by the penalized maximum likelihood method. The proposed method was confirmed by a series of Monte Carlo simulation studies and a real data analysis in cotton. The new method will provide novel tools for the genetic dissection of complex traits, construction of QTL networks, and analysis of heterosis.
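
    A hedged, simplified analogue of estimating main and epistatic effects simultaneously: a penalized (lasso) regression over marker main effects and all pairwise interaction terms on simulated genotypes. This is not the paper's penalized maximum likelihood method for 4WC designs, only an illustration of the idea.

        # Hedged sketch: lasso over marker main effects plus pairwise interactions (epistasis analogue).
        # Genotypes and effects are simulated placeholders.
        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(4)
        n, m = 300, 20
        X = rng.integers(0, 2, size=(n, m)).astype(float)       # marker genotypes (0/1)
        y = 1.5 * X[:, 2] - 1.0 * X[:, 7] + 2.0 * X[:, 2] * X[:, 7] + rng.normal(0, 0.5, n)

        XX = PolynomialFeatures(degree=2, interaction_only=True, include_bias=False).fit_transform(X)
        fit = LassoCV(cv=5).fit(XX, y)
        print("non-zero estimated effects:", int(np.sum(fit.coef_ != 0)))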

  8. Should community health workers offer support healthcare services to survivors of sexual violence? a systematic review.

    PubMed

    Gatuguta, Anne; Katusiime, Barbra; Seeley, Janet; Colombini, Manuela; Mwanzo, Isaac; Devries, Karen

    2017-10-12

    Sexual violence is widespread, yet relatively few survivors receive healthcare or complete treatment. In low and middle-income countries, community health workers (CHWs) have the potential to provide support services to large numbers of survivors. The aim of this review was to document the role of CHWs in sexual violence services. We aimed to: 1) describe existing models of CHW services including characteristics of CHWs, services delivered and populations served; 2) explore the acceptability of CHWs' services to survivors and the feasibility of delivering such services; and 3) document the benefits and challenges of CHW-provided sexual violence services. Quantitative and qualitative studies reporting on CHWs and other community-level paraprofessional volunteer services for sexual violence were eligible for inclusion. CHWs and sexual violence were defined according to WHO criteria. The review was conducted according to the Preferred Reporting Items for Systematic reviews and Meta-Analyses guidelines. Quality of included studies was assessed using two quality assessment tools for quantitative studies and the methodology checklist of the National Institute for Health and Clinical Excellence for qualitative studies. Data were extracted and analysed separately for quantitative and qualitative studies and results integrated using a framework approach. Seven studies conducted in six countries (Democratic Republic of Congo, Rwanda, Burma, United States of America, Scotland, Israel) met the inclusion criteria. Different models of care had diverse CHW roles including awareness creation, identifying, educating and building relationships with survivors, psychosocial support and follow up. Although sociocultural factors may influence CHWs' performance and the willingness of survivors to use their services, studies often did not report on CHW characteristics. Few studies assessed the acceptability of CHWs to survivors or the feasibility of delivering services. However, participants mentioned a range of benefits including decreased incidence of violence, and CHWs being trusted, approachable, non-judgmental and compassionate. Challenges identified were high workload, confidentiality issues and community norms influencing performance. There is a dearth of research on CHW services for sexual violence. Findings suggest that involving CHWs may be beneficial, but potential challenges and harms related to CHW-provided services exist. None of the different models of CHW-provided care has been robustly evaluated for effects on patient outcomes. Further research to establish survivors' views on these services, and their effectiveness, is desperately needed.

  9. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    NASA Astrophysics Data System (ADS)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather on how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data according to the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight about why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study and were consistent with the existing literature are: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  10. Maintenance and Expansion: Modeling Material Stocks and Flows for Residential Buildings and Transportation Networks in the EU25.

    PubMed

    Wiedenhofer, Dominik; Steinberger, Julia K; Eisenmenger, Nina; Haas, Willi

    2015-08-01

    Material stocks are an important part of the social metabolism. Owing to long service lifetimes of stocks, they not only shape resource flows during construction, but also during use, maintenance, and at the end of their useful lifetime. This makes them an important topic for sustainable development. In this work, a model of stocks and flows for nonmetallic minerals in residential buildings, roads, and railways in the EU25, from 2004 to 2009 is presented. The changing material composition of the stock is modeled using a typology of 72 residential buildings, four road and two railway types, throughout the EU25. This allows for estimating the amounts of materials in in-use stocks of residential buildings and transportation networks, as well as input and output flows. We compare the magnitude of material demands for expansion versus those for maintenance of existing stock. Then, recycling potentials are quantitatively explored by comparing the magnitude of estimated input, waste, and recycling flows from 2004 to 2009 and in a business-as-usual scenario for 2020. Thereby, we assess the potential impacts of the European Waste Framework Directive, which strives for a significant increase in recycling. We find that in the EU25, consisting of highly industrialized countries, a large share of material inputs are directed at maintaining existing stocks. Proper management of existing transportation networks and residential buildings is therefore crucial for the future size of flows of nonmetallic minerals.

  11. Maintenance and Expansion: Modeling Material Stocks and Flows for Residential Buildings and Transportation Networks in the EU25

    PubMed Central

    Steinberger, Julia K.; Eisenmenger, Nina; Haas, Willi

    2015-01-01

    Summary Material stocks are an important part of the social metabolism. Owing to long service lifetimes of stocks, they not only shape resource flows during construction, but also during use, maintenance, and at the end of their useful lifetime. This makes them an important topic for sustainable development. In this work, a model of stocks and flows for nonmetallic minerals in residential buildings, roads, and railways in the EU25, from 2004 to 2009 is presented. The changing material composition of the stock is modeled using a typology of 72 residential buildings, four road and two railway types, throughout the EU25. This allows for estimating the amounts of materials in in‐use stocks of residential buildings and transportation networks, as well as input and output flows. We compare the magnitude of material demands for expansion versus those for maintenance of existing stock. Then, recycling potentials are quantitatively explored by comparing the magnitude of estimated input, waste, and recycling flows from 2004 to 2009 and in a business‐as‐usual scenario for 2020. Thereby, we assess the potential impacts of the European Waste Framework Directive, which strives for a significant increase in recycling. We find that in the EU25, consisting of highly industrialized countries, a large share of material inputs are directed at maintaining existing stocks. Proper management of existing transportation networks and residential buildings is therefore crucial for the future size of flows of nonmetallic minerals. PMID:27524878

  12. The ferromagnetic-spin glass transition in PdMn alloys: symmetry breaking of ferromagnetism and spin glass studied by a multicanonical method.

    PubMed

    Kato, Tomohiko; Saita, Takahiro

    2011-03-16

    The magnetism of Pd(1-x)Mn(x) is investigated theoretically. A localized spin model for Mn spins that interact with short-range antiferromagnetic interactions and long-range ferromagnetic interactions via itinerant d electrons is set up, with no adjustable parameters. A multicanonical Monte Carlo simulation, combined with a procedure of symmetry breaking, is employed to discriminate between the ferromagnetic and spin glass orders. The transition temperature and the low-temperature phase are determined from the temperature variation of the specific heat and the probability distributions of the ferromagnetic order parameter and the spin glass order parameter at different concentrations. The calculation results reveal that only the ferromagnetic phase exists at x < 0.02, that only the spin glass phase exists at x > 0.04, and that the two phases coexist at intermediate concentrations. This result agrees semi-quantitatively with experimental results.

  13. A quantitative investigation of the fracture pump-in/flowback test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plahn, S.V.; Nolte, K.G.; Miska, S.

    1995-12-31

    Fracture closure pressure is an important parameter for fracture treatment design and evaluation. The pump-in/flowback (PIFB) test is frequently used to estimate its magnitude. The test is attractive because bottomhole pressures during flowback develop a distinct and repeatable signature. This is in contrast to the pump-in/shut-in test where strong indications of fracture closure are rarely seen. Various techniques exist for extracting closure pressure from the flowback pressure response. Unfortunately, these procedures give different estimates for closure pressure and their theoretical bases are not well established. We present results that place the PIFB test on a more solid foundation. A numerical model is used to simulate the PIFB test and glean physical mechanisms contributing to the response. Based on our simulation results, we propose an interpretation procedure which gives better estimates for closure pressure than existing techniques.

  14. Pharmacologic studies in vulnerable populations: Using the pediatric experience.

    PubMed

    Zimmerman, Kanecia; Gonzalez, Daniel; Swamy, Geeta K; Cohen-Wolkowiez, Michael

    2015-11-01

    Historically, few data exist to guide dosing in children and pregnant women. Multiple barriers to inclusion of these vulnerable populations in clinical trials have led to this paucity of data. However, federal legislation targeted at pediatric therapeutics, innovative clinical trial design, use of quantitative clinical pharmacology methods, pediatric thought leadership, and collaboration have successfully overcome many existing barriers. This success has resulted in improved knowledge on pharmacokinetics, safety, and efficacy of therapeutics in children. To date, research in pregnant women has not been characterized by similar success. Wide gaps in knowledge remain despite the common use of therapeutics in pregnancy. Given the similar barriers to drug research and development in pediatric and pregnant populations, the route toward success in children may serve as a model for the advancement of drug development and appropriate drug administration in pregnant women. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Physical Justification for Negative Remanent Magnetization in Homogeneous Nanoparticles

    PubMed Central

    Gu, Shuo; He, Weidong; Zhang, Ming; Zhuang, Taisen; Jin, Yi; ElBidweihy, Hatem; Mao, Yiwu; Dickerson, James H.; Wagner, Michael J.; Torre, Edward Della; Bennett, Lawrence H.

    2014-01-01

    The phenomenon of negative remanent magnetization (NRM) has been observed experimentally in a number of heterogeneous magnetic systems and has been considered anomalous. The existence of NRM in homogeneous magnetic materials is still under debate, mainly due to the lack of compelling support from experimental data and a convincing theoretical explanation for its thermodynamic validation. Here we resolve the long-existing controversy by presenting experimental evidence and physical justification that NRM is real in a prototype homogeneous ferromagnetic nanoparticle, a europium sulfide nanoparticle. We provide novel insights into major and minor hysteresis behavior that illuminate the true nature of the observed inverted hysteresis and validate its thermodynamic permissibility and, for the first time, present counterintuitive magnetic aftereffect behavior that is consistent with the mechanism of magnetization reversal, possessing unique capability to identify NRM. The origin and conditions of NRM are explained quantitatively via a wasp-waist model, in combination with energy calculations. PMID:25183061

  16. Organizational culture of a psychiatric hospital and resilience of nursing workers.

    PubMed

    Rocha, Fernanda Ludmilla Rossi; Gaioli, Cheila Cristina Leonardo de Oliveira; Camelo, Silvia Helena Henriques; Mininel, Vivian Aline; Vegro, Thamiris Cavazzani

    2016-01-01

    The objective was to analyze the organizational culture of a psychiatric hospital and identify the resilience capacity of nursing workers. This was a quantitative study. For data collection, the Brazilian Instrument for Evaluation of Organizational Culture (IBACO - Instrumento Brasileiro para Avaliação da Cultura Organizacional) and the Resilience Scale (RS) were used. Participants reported the existence of centralization of power and devaluation of workers, despite recognizing the existence of collaboration at work and practices for improving interpersonal relations. In relation to resilience capacity, 50% of workers showed a high level and 42.9% a medium level of resilience. The correlation tests revealed negative values between the IBACO and RS domains, indicating that the lower the appreciation of individuals in the institution, the greater their capacity of resilience. The organizational values reflect a work organization model in the institution that devalues workers' needs and demands a greater capacity of resilience.

  17. Neuroimaging Feature Terminology: A Controlled Terminology for the Annotation of Brain Imaging Features.

    PubMed

    Iyappan, Anandhi; Younesi, Erfan; Redolfi, Alberto; Vrooman, Henri; Khanna, Shashank; Frisoni, Giovanni B; Hofmann-Apitius, Martin

    2017-01-01

    Ontologies and terminologies are used for interoperability of knowledge and data in a standard manner among interdisciplinary research groups. Existing imaging ontologies capture general aspects of the imaging domain as a whole such as methodological concepts or calibrations of imaging instruments. However, none of the existing ontologies covers the diagnostic features measured by imaging technologies in the context of neurodegenerative diseases. Therefore, the Neuro-Imaging Feature Terminology (NIFT) was developed to organize the knowledge domain of measured brain features in association with neurodegenerative diseases by imaging technologies. The purpose is to identify quantitative imaging biomarkers that can be extracted from multi-modal brain imaging data. This terminology attempts to cover measured features and parameters in brain scans relevant to disease progression. In this paper, we demonstrate the systematic retrieval of measured indices from literature and how the extracted knowledge can be further used for disease modeling that integrates neuroimaging features with molecular processes.

  18. Nordic in Nature: Friluftsliv and Environmental Connectedness

    ERIC Educational Resources Information Center

    Beery, Thomas H.

    2013-01-01

    This study explored the question of whether a relationship exists between the Nordic cultural idea of friluftsliv and the psychological construct of environmental connectedness (EC). This quantitative study employed a correlational design with existing data from the Swedish Outdoor Recreation in Change national survey. Results indicate that there…

  19. The transparency, reliability and utility of tropical rainforest land-use and land-cover change models.

    PubMed

    Rosa, Isabel M D; Ahmed, Sadia E; Ewers, Robert M

    2014-06-01

    Land-use and land-cover (LULC) change is one of the largest drivers of biodiversity loss and carbon emissions globally. We use the tropical rainforests of the Amazon, the Congo basin and South-East Asia as a case study to investigate spatial predictive models of LULC change. Current predictions differ in their modelling approaches, are highly variable and often poorly validated. We carried out a quantitative review of 48 modelling methodologies, considering model spatio-temporal scales, inputs, calibration and validation methods. In addition, we requested model outputs from each of the models reviewed and carried out a quantitative assessment of model performance for tropical LULC predictions in the Brazilian Amazon. We highlight existing shortfalls in the discipline and uncover three key points that need addressing to improve the transparency, reliability and utility of tropical LULC change models: (1) a lack of openness with regard to describing and making available the model inputs and model code; (2) the difficulties of conducting appropriate model validations; and (3) the difficulty that users of tropical LULC models face in obtaining the model predictions to help inform their own analyses and policy decisions. We further draw comparisons between tropical LULC change models in the tropics and the modelling approaches and paradigms in other disciplines, and suggest that recent changes in the climate change and species distribution modelling communities may provide a pathway that tropical LULC change modellers may emulate to further improve the discipline. Climate change models have exerted considerable influence over public perceptions of climate change and now impact policy decisions at all political levels. We suggest that tropical LULC change models have an equally high potential to influence public opinion and impact the development of land-use policies based on plausible future scenarios, but, to do that reliably may require further improvements in the discipline. © 2014 John Wiley & Sons Ltd.

  20. CFD modeling of two-stage ignition in a rapid compression machine: Assessment of zero-dimensional approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Gaurav; Raju, Mandhapati P.; Sung, Chih-Jen

    2010-07-15

    In modeling rapid compression machine (RCM) experiments, a zero-dimensional approach is commonly used along with an associated heat loss model. The adequacy of such an approach has not been validated for hydrocarbon fuels. The existence of multi-dimensional effects inside an RCM due to the boundary layer, roll-up vortex, non-uniform heat release, and piston crevice could result in deviation from the zero-dimensional assumption, particularly for hydrocarbons exhibiting two-stage ignition and strong thermokinetic interactions. The objective of this investigation is to assess the adequacy of the zero-dimensional approach in modeling RCM experiments under conditions of two-stage ignition and negative temperature coefficient (NTC) response. Computational fluid dynamics simulations are conducted for n-heptane ignition in an RCM and the validity of the zero-dimensional approach is assessed through comparisons over the entire NTC region. Results show that the zero-dimensional model based on the approach of 'adiabatic volume expansion' performs very well in predicting the first-stage ignition delays, although quantitative discrepancy for the prediction of the total ignition delays and pressure rise in the first-stage ignition is noted even when the roll-up vortex is suppressed and a well-defined homogeneous core is retained within an RCM. Furthermore, the discrepancy is pressure dependent and decreases as compressed pressure is increased. Also, as ignition response becomes single-stage at higher compressed temperatures, discrepancy from the zero-dimensional simulations reduces. Despite some quantitative discrepancy, the zero-dimensional modeling approach is deemed satisfactory from the viewpoint of the ignition delay simulation. (author)

  1. Experimental Design for Parameter Estimation of Gene Regulatory Networks

    PubMed Central

    Timmer, Jens

    2012-01-01

    Systems biology aims for building quantitative models to address unresolved issues in molecular biology. In order to describe the behavior of biological cells adequately, gene regulatory networks (GRNs) are intensively investigated. As the validity of models built for GRNs depends crucially on the kinetic rates, various methods have been developed to estimate these parameters from experimental data. For this purpose, it is favorable to choose the experimental conditions yielding maximal information. However, existing experimental design principles often rely on unfulfilled mathematical assumptions or become computationally demanding with growing model complexity. To solve this problem, we combined advanced methods for parameter and uncertainty estimation with experimental design considerations. As a showcase, we optimized three simulated GRNs in one of the challenges from the Dialogue for Reverse Engineering Assessment and Methods (DREAM). This article presents our approach, which was awarded the best performing procedure at the DREAM6 Estimation of Model Parameters challenge. For fast and reliable parameter estimation, local deterministic optimization of the likelihood was applied. We analyzed identifiability and precision of the estimates by calculating the profile likelihood. Furthermore, the profiles provided a way to uncover a selection of most informative experiments, from which the optimal one was chosen using additional criteria at every step of the design process. In conclusion, we provide a strategy for optimal experimental design and show its successful application on three highly nonlinear dynamic models. Although presented in the context of the GRNs to be inferred for the DREAM6 challenge, the approach is generic and applicable to most types of quantitative models in systems biology and other disciplines. PMID:22815723
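
    A compact sketch of the profile-likelihood idea described above, on a deliberately tiny stand-in model (one-parameter exponential decay) rather than the DREAM6 networks: fit by minimizing the negative log-likelihood, then scan the objective over a parameter grid to judge identifiability.

        # Hedged sketch: maximum-likelihood fit plus a likelihood profile for one parameter.
        # Model, noise level, and data are illustrative stand-ins, not the DREAM6 networks.
        import numpy as np
        from scipy.optimize import minimize_scalar

        t = np.linspace(0, 5, 20)
        true_k, sigma = 0.8, 0.05
        rng = np.random.default_rng(6)
        y = np.exp(-true_k * t) + rng.normal(0, sigma, t.size)   # simulated observations

        def neg_log_lik(k):
            resid = y - np.exp(-k * t)
            return 0.5 * np.sum((resid / sigma) ** 2)

        fit = minimize_scalar(neg_log_lik, bounds=(0.01, 5.0), method="bounded")
        # Profile: re-evaluate the objective on a grid around the optimum; the width of the
        # region within ~1.92 units (chi-square, 95%) of the minimum indicates identifiability.
        grid = np.linspace(0.5, 1.2, 50)
        profile = np.array([neg_log_lik(k) for k in grid])
        ci = grid[profile <= fit.fun + 1.92]
        print(f"k_hat = {fit.x:.3f}, approx. 95% CI [{ci.min():.3f}, {ci.max():.3f}]")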

  2. Elementary Writing Assessment Platforms: A Quantitative Examination of Online versus Offline Writing Performance of Fifth-Grade Students

    ERIC Educational Resources Information Center

    Heath, Vickie L.

    2013-01-01

    This quantitative study explored if significant differences exist between how fifth-grade students produce a written response to a narrative prompt using online versus offline writing platforms. The cultural and social trend of instructional and assessment writing paradigms in education is shifting to online writing platforms (National Assessment…

  3. Educational Outcomes of Synchronous and Asynchronous High School Students: A Quantitative Causal-Comparative Study of Online Algebra 1

    ERIC Educational Resources Information Center

    Berry, Sharon

    2017-01-01

    This study used a quantitative, causal-comparative design. It compared educational outcome data from online Algebra 1 courses to determine if a significant difference existed between synchronous and asynchronous students for end-of-course grades, state assessments scores, and student perceptions of their course. The study found that synchronous…

  4. Effects of Computer Programming on Students' Cognitive Performance: A Quantitative Synthesis.

    ERIC Educational Resources Information Center

    Liao, Yuen-Kuang Cliff

    A meta-analysis was performed to synthesize existing data concerning the effects of computer programing on cognitive outcomes of students. Sixty-five studies were located from three sources, and their quantitative data were transformed into a common scale--Effect Size (ES). The analysis showed that 58 (89%) of the study-weighted ESs were positive…
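
    A minimal sketch of how study effect sizes are typically pooled in a meta-analysis, using an inverse-variance weighted mean; the effect sizes and variances below are invented, not the 65 reviewed studies.

        # Hedged sketch: inverse-variance weighted pooling of study effect sizes (illustrative values).
        import numpy as np

        es = np.array([0.45, 0.10, 0.80, 0.32, 0.55])      # per-study effect sizes (ES)
        var = np.array([0.02, 0.05, 0.04, 0.01, 0.03])     # per-study sampling variances

        w = 1.0 / var
        pooled = np.sum(w * es) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        print(f"pooled ES = {pooled:.2f} (95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f})")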

  5. Co-Teaching in Middle School Classrooms: Quantitative Comparative Study of Special Education Student Assessment Performance

    ERIC Educational Resources Information Center

    Reese, De'borah Reese

    2017-01-01

    The purpose of this quantitative comparative study was to determine the existence or nonexistence of performance pass rate differences of special education middle school students on standardized assessments between pre and post co-teaching eras disaggregated by subject area and school. Co-teaching has altered classroom environments in many ways.…

  6. Reverse Brain Drain of South Asian IT Professionals: A Quantitative Repatriation Study

    ERIC Educational Resources Information Center

    Suppiah, Nithiyananthan

    2014-01-01

    The purpose of the present quantitative correlational study was to examine if a relationship existed between the RBD phenomenon and cultural, economic, or political factors of the native countries of South Asian IT professionals living in the United States. The study on reverse brain drain was conducted to explore a growing phenomenon in the…

  7. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  8. Assessing exposure to transformation products of soil-applied organic contaminants in surface water: comparison of model predictions and field data.

    PubMed

    Kern, Susanne; Singer, Heinz; Hollender, Juliane; Schwarzenbach, René P; Fenner, Kathrin

    2011-04-01

    Transformation products (TPs) of chemicals released to soil, for example, pesticides, are regularly detected in surface and groundwater with some TPs even dominating observed pesticide levels. Given the large number of TPs potentially formed in the environment, straightforward prioritization methods based on available data and simple, evaluative models are required to identify TPs with a high aquatic exposure potential. While different such methods exist, none of them has so far been systematically evaluated against field data. Using a dynamic multimedia, multispecies model for TP prioritization, we compared the predicted relative surface water exposure potential of pesticides and their TPs with experimental data for 16 pesticides and 46 TPs measured in a small river draining a Swiss agricultural catchment. Twenty TPs were determined quantitatively using solid-phase extraction liquid chromatography mass spectrometry (SPE-LC-MS/MS), whereas the remaining 26 TPs could only be detected qualitatively because of the lack of analytical reference standards. Accordingly, the two sets of TPs were used for quantitative and qualitative model evaluation, respectively. Quantitative comparison of predicted with measured surface water exposure ratios for 20 pairs of TPs and parent pesticides indicated agreement within a factor of 10, except for chloridazon-desphenyl and chloridazon-methyl-desphenyl. The latter two TPs were found to be present in elevated concentrations during baseflow conditions and in groundwater samples across Switzerland, pointing toward high concentrations in exfiltrating groundwater. A simple leaching relationship was shown to qualitatively agree with the observed baseflow concentrations and to thus be useful in identifying TPs for which the simple prioritization model might underestimate actual surface water concentrations. Application of the model to the 26 qualitatively analyzed TPs showed that most of those TPs categorized as exhibiting a high aquatic exposure potential could be confirmed to be present in the majority of water samples investigated. On the basis of these results, we propose a generally applicable, model-based approach to identify those TPs of soil-applied organic contaminants that exhibit a high aquatic exposure potential to prioritize them for higher-tier, experimental investigations.

  9. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.

    2012-01-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680
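
The core of such an assessment is a Monte Carlo, individual-based dose calculation compared against a toxicity reference value. The following minimal sketch mimics that structure with made-up exposure distributions and a made-up reference value; it is not the study's model or data, only an illustration of how per-individual assimilated doses and the resulting margin of safety can be computed.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500_000                                   # simulated individuals in one class

# Hypothetical exposure inputs (lognormal variability across individuals/sites).
prey_pah  = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=n)     # ng PAH/g prey
intake    = rng.normal(loc=150.0, scale=20.0, size=n).clip(min=0)  # g prey/day
sediment  = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)     # ng/day incidental
body_mass = rng.normal(loc=600.0, scale=50.0, size=n).clip(min=1)  # g body weight

# Assimilated daily dose per unit body mass (ng PAH / g body weight / day).
dose = (prey_pah * intake + sediment) / body_mass

TRV = 1.0e4   # hypothetical chronic toxicity reference value, same units
print("max individual dose : %.1f" % dose.max())
print("margin below TRV    : %.0f-fold" % (TRV / dose.max()))
```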

  10. Cognitive and Motivational Factors that Inspire Hispanic Female Students to Pursue STEM-Related Academic Programs that Lead to Careers in Science, Technology, Engineering, and Mathematics

    NASA Astrophysics Data System (ADS)

    Morel-Baker, Sonaliz

    Hispanics, and women in particular, continue to be underrepresented in the fields of science, technology, engineering, and mathematics (STEM). The purpose of this study was to analyze cognitive and motivational factors that inspired Hispanic female college students to major in STEM programs and aspire to academic success. This mixed methods study was conducted using both quantitative and qualitative data collection and analysis techniques in sequential phases. Quantitative data were collected through the use of the 80-item Honey and Mumford Learning Styles Questionnaire, which was focused on the students' learning styles and how they impact Hispanic female students upon engaging in a STEM-related curriculum. Qualitative data were collected during interviews focusing on factors that led students to select, participate in, and make a commitment to some aspect of a STEM-related program. The questions that were asked during the interviews were intended to examine whether the existence of role models and STEM initiatives motivates Hispanic female students to major in STEM-related academic programs and aspire to academic success. The participants in this study were undergraduate Hispanic female students majoring in STEM-related academic programs at a four-year university. The results indicate that the majority of the participants (88%) identified as reflectors, 4% as activists, 4% as theorists, and 4% as pragmatists. The results from the interviews suggested that the existence of role models (family members, educators, or STEM professionals) was a factor that motivated Hispanic females to major in STEM-related subjects and that exposure to STEM initiatives during K-12 education motivated Hispanic females to pursue a career in STEM.

  11. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information systems and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing-based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  12. Hierarchical lattice models of hydrogen-bond networks in water

    NASA Astrophysics Data System (ADS)

    Dandekar, Rahul; Hassanali, Ali A.

    2018-06-01

    We develop a graph-based model of the hydrogen-bond network in water, with a view toward quantitatively modeling the molecular-level correlational structure of the network. The networks formed are studied by constructing the model on two infinite-dimensional lattices. Our models are built bottom up, based on microscopic information coming from atomistic simulations, and we show that the predictions of the model are consistent with known results from ab initio simulations of liquid water. We show that simple entropic models can predict the correlations and clustering of local-coordination defects around tetrahedral waters observed in the atomistic simulations. We also find that orientational correlations between bonds are longer ranged than density correlations, determine the directional correlations within closed loops, and show that the patterns of water wires within these structures are also consistent with previous atomistic simulations. Our models show the existence of density and compressibility anomalies, as seen in the real liquid, and the phase diagram of these models is consistent with the singularity-free scenario previously proposed by Sastry and coworkers [Phys. Rev. E 53, 6144 (1996), 10.1103/PhysRevE.53.6144].

  13. Synthesizing the Effect of Building Condition Quality on Academic Performance

    ERIC Educational Resources Information Center

    Gunter, Tracey; Shao, Jing

    2016-01-01

    Since the late 1970s, researchers have examined the relationship between school building condition and student performance. Though many literature reviews have claimed that a relationship exists, no meta-analysis has quantitatively examined this literature. The purpose of this review was to synthesize the existing literature on the relationship…

  14. A GIS model-based assessment of the environmental distribution of gamma-hexachlorocyclohexane in European soils and waters.

    PubMed

    Vizcaíno, P; Pistocchi, A

    2010-10-01

    The MAPPE GIS-based multimedia model is used to produce a quantitative description of the behaviour of gamma-hexachlorocyclohexane (gamma-HCH) in Europe, with emphasis on continental surface waters. The model is found to reasonably reproduce gamma-HCH distributions and variations over the years in atmosphere and soil; for continental surface waters, concentrations were reasonably well predicted for the year 1995, when lindane was still used in agriculture, whereas for 2005, under the assumption of severe restrictions on use, the model yields substantial underestimation. Much better results were obtained when the same mode of release as in 1995 was assumed, supporting the conjecture that for gamma-HCH, emission data rather than model structure and parameterization may be responsible for erroneous concentration estimates. Future research should be directed at improving the quality of emission data. Joint interpretation of monitoring and modelling results highlights that lindane emissions in Europe, despite the marked decreasing trend, persist beyond the provisions of existing legislation. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  15. Atomistic and coarse-grained computer simulations of raft-like lipid mixtures.

    PubMed

    Pandit, Sagar A; Scott, H Larry

    2007-01-01

    Computer modeling can provide insights into the existence, structure, size, and thermodynamic stability of localized raft-like regions in membranes. However, the challenges in the construction and simulation of accurate models of heterogeneous membranes are great. The primary obstacle in modeling the lateral organization within a membrane is the relatively slow lateral diffusion rate for lipid molecules. Microsecond or longer time-scales are needed to fully model the formation and stability of a raft in a membrane. Atomistic simulations currently are not able to reach this scale, but they do provide quantitative information on the intermolecular forces and correlations that are involved in lateral organization. In this chapter, the steps needed to carry out and analyze atomistic simulations of hydrated lipid bilayers having heterogeneous composition are outlined. It is then shown how the data from a molecular dynamics simulation can be used to construct a coarse-grained model for the heterogeneous bilayer that can predict the lateral organization and stability of rafts at up to millisecond time-scales.

  16. Simulation of metastatic progression using a computer model including chemotherapy and radiation therapy.

    PubMed

    Bethge, Anja; Schumacher, Udo; Wedemann, Gero

    2015-10-01

    Despite considerable research efforts, the process of metastasis formation is still a subject of intense discussion, and even established models differ considerably in basic details and in the conclusions drawn from them. Mathematical and computational models add a new perspective to the research as they can quantitatively investigate the processes of metastasis and the effects of treatment. However, existing models look at only one treatment option at a time. We enhanced a previously developed computer model (called CaTSiT) that enables quantitative comparison of different metastasis formation models with clinical and experimental data, to include the effects of chemotherapy, external beam radiation, radioimmunotherapy and radioembolization. CaTSiT is based on a discrete event simulation procedure. The growth of the primary tumor and its metastases is modeled by a piecewise-defined growth function that describes the growth behavior of the primary tumor and metastases during various time intervals. The piecewise-defined growth function is composed of analytical functions describing the growth behavior of the tumor based on characteristics of the tumor, such as dormancy, or the effects of various therapies. The spreading of malignant cells into the blood is modeled by intravasation events, which are generated according to a rate function. Further events in the model describe the behavior of the released malignant cells until the formation of a new metastasis. The model is published under the GNU General Public License version 3. To demonstrate the application of the computer model, a case of a patient with a hepatocellular carcinoma and multiple metastases in the liver was simulated. Besides the untreated case, different treatments were simulated at two time points: one directly after diagnosis of the primary tumor and the other several months later. Except for early applied radioimmunotherapy, no treatment strategy was able to eliminate all metastases. These results emphasize the importance of early diagnosis and of proceeding with treatment even if no clinically detectable metastases are present at the time of diagnosis of the primary tumor. CaTSiT could be a valuable tool for quantitative investigation of the process of tumor growth and metastasis formation, including the effects of various treatment options. Copyright © 2015 Elsevier Inc. All rights reserved.
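
A central ingredient described above is the piecewise-defined growth function, with segments for free growth, dormancy, and therapy effects. The sketch below is not CaTSiT (which is a discrete event simulator built around this idea); it only illustrates how such a piecewise schedule can be composed from analytical growth segments, here a Gompertz law with hypothetical parameters and an assumed 90% cell kill from therapy.

```python
import numpy as np

def gompertz(t, x0, a=0.006, k=1e11):
    # Gompertz growth from size x0 over elapsed time t (cells, days).
    return k * np.exp(np.log(x0 / k) * np.exp(-a * t))

def piecewise_size(t, x0=1e6):
    """Tumour size over a piecewise schedule (all values hypothetical):
    0-200 d free growth, 200-260 d dormancy, 260+ d regrowth after a
    therapy-induced 90% cell kill."""
    if t < 200:
        return gompertz(t, x0)
    size_200 = gompertz(200, x0)
    if t < 260:
        return size_200                       # dormant: size held constant
    return gompertz(t - 260, 0.1 * size_200)  # regrowth from the surviving 10%

for day in (0, 100, 200, 259, 300, 500):
    print(f"day {day:>3}: {piecewise_size(day):.3e} cells")
```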

  17. Mullins effect in a filled elastomer under uniaxial tension

    DOE PAGES

    Maiti, A.; Small, W.; Gee, R. H.; ...

    2014-01-16

    Modulus softening and permanent set in filled polymeric materials due to cyclic loading and unloading, commonly known as the Mullins effect, can have a significant impact on their use as support cushions. The quantitative analysis of such behavior is essential to ensure the effectiveness of such materials in long-term deployment. In this work we combine existing ideas of filler-induced modulus enhancement, strain amplification, and irreversible deformation within a simple non-Gaussian constitutive model to quantitatively interpret recent measurements on a relevant PDMS-based elastomeric cushion. We also find that the experimental stress-strain data are consistent with the picture that during stretching (loading) two effects take place simultaneously: (1) the physical constraints (entanglements) initially present in the polymer network become disentangled, leading to a gradual decrease in the effective cross-link density, and (2) the effective filler volume fraction gradually decreases with increasing strain due to the irreversible pulling out of an initially occluded volume of the soft polymer domain.

  18. Fuzzy method of recognition of high molecular substances in evidence-based biology

    NASA Astrophysics Data System (ADS)

    Olevskyi, V. I.; Smetanin, V. T.; Olevska, Yu. B.

    2017-10-01

    Modern requirements for reliable results and high-quality research place mathematical methods of data analysis at the forefront. Because of this, evidence-based methods of processing experimental data have become increasingly popular in the biological sciences and medicine. Their basis is meta-analysis, a method for the quantitative generalization of a large number of randomized trials addressing the same specific problem, which are often contradictory and performed by different authors. It allows the most important trends and quantitative indicators in the data to be identified, proposed hypotheses to be verified, and new effects in the population genotype to be discovered. Existing methods for recognizing high molecular substances by gel electrophoresis of proteins under denaturing conditions rely on approximate comparison of electrophoregram contrast with a standard solution of known substances. We propose a fuzzy method for modeling experimental data to increase the accuracy and validity of findings in the detection of new proteins.

  19. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  20. Reviews and syntheses: guiding the evolution of the observing system for the carbon cycle through quantitative network design

    NASA Astrophysics Data System (ADS)

    Kaminski, Thomas; Rayner, Peter Julian

    2017-10-01

    Various observational data streams have been shown to provide valuable constraints on the state and evolution of the global carbon cycle. These observations have the potential to reduce uncertainties in past, current, and predicted natural and anthropogenic surface fluxes. In particular such observations provide independent information for verification of actions as requested by the Paris Agreement. It is, however, difficult to decide which variables to sample, and how, where, and when to sample them, in order to achieve an optimal use of the observational capabilities. Quantitative network design (QND) assesses the impact of a given set of existing or hypothetical observations in a modelling framework. QND has been used to optimise in situ networks and assess the benefit to be expected from planned space missions. This paper describes recent progress and highlights aspects that are not yet sufficiently addressed. It demonstrates the advantage of an integrated QND system that can simultaneously evaluate a multitude of observational data streams and assess their complementarity and redundancy.
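
At its core, QND asks how much a candidate observation set would shrink the posterior uncertainty of a target quantity. A minimal linear-Gaussian sketch of that calculation is given below, with hypothetical prior uncertainties, observation operators, and error variances standing in for a real carbon-cycle modelling framework.

```python
import numpy as np

# Prior uncertainty on two surface-flux parameters (hypothetical, arbitrary units).
P_prior = np.diag([1.0**2, 1.5**2])

# Jacobians mapping parameters to candidate observations, plus observation error variances.
networks = {
    "network A (3 in-situ sites)": (np.array([[1.0, 0.2],
                                               [0.8, 0.3],
                                               [0.1, 0.9]]), 0.5**2),
    "network B (satellite only)":  (np.array([[0.6, 0.6]]), 0.3**2),
}

target = np.array([1.0, 1.0])   # target quantity: total flux = sum of parameters

for name, (H, r) in networks.items():
    R = r * np.eye(H.shape[0])
    # Posterior covariance of the parameters (standard linear-Gaussian update).
    P_post = np.linalg.inv(np.linalg.inv(P_prior) + H.T @ np.linalg.inv(R) @ H)
    sigma_target = np.sqrt(target @ P_post @ target)
    print(f"{name}: posterior sigma of total flux = {sigma_target:.3f}")
```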

  1. A Lung Segmental Model of Chronic Pseudomonas Infection in Sheep

    PubMed Central

    Collie, David; Govan, John; Wright, Steven; Thornton, Elisabeth; Tennant, Peter; Smith, Sionagh; Doherty, Catherine; McLachlan, Gerry

    2013-01-01

    Background Chronic lung infection with Pseudomonas aeruginosa is a major contributor to morbidity, mortality and premature death in cystic fibrosis. A new paradigm for managing such infections is needed, as are relevant and translatable animal models to identify and test concepts. We sought to improve on limitations associated with existing models of infection in small animals through developing a lung segmental model of chronic Pseudomonas infection in sheep. Methodology/Principal Findings Using local lung instillation of P. aeruginosa suspended in agar beads we were able to demonstrate that such infection led to the development of a suppurative, necrotising and pyogranulomatous pneumonia centred on the instilled beads. No overt evidence of organ or systemic compromise was apparent in any animal during the course of infection. Infection persisted in the lungs of individual animals for as long as 66 days after initial instillation. Quantitative microbiology applied to bronchoalveolar lavage fluid derived from infected segments proved an insensitive index of the presence of significant infection in lung tissue (>10⁴ cfu/g). Conclusions/Significance The agar bead model of chronic P. aeruginosa lung infection in sheep is a relevant platform to investigate both the pathobiology of such infections as well as novel approaches to their diagnosis and therapy. Particular ethical benefits relate to the model in terms of refining existing approaches by compromising a smaller proportion of the lung with infection and facilitating longitudinal assessment by bronchoscopy, and also potentially reducing animal numbers through facilitating within-animal comparisons of differential therapeutic approaches. PMID:23874438

  2. A qualitative interpretation of 7 August 1972 impulsive phase flare H alpha line profiles

    NASA Technical Reports Server (NTRS)

    Canfield, R. C.

    1982-01-01

    The considered investigation shows that existing models of the formation of the H-alpha line during flares appear to provide clear qualitative evidence that heating of the H-alpha forming regions of the flare chromosphere in the bright H-alpha kernels observed during the impulsive phase of solar flares is not due primarily to heating by Coulomb collisions of a power-law distribution of 10-100 keV electrons with chromospheric material. It appears rather that some shorter-range process, involving possibly conduction or optically thick radiative transfer, is favored. Such a conclusion is clearly relevant to collisionless confinement modelling. However, much work remains to be done before there will be a basis for quantitatively testing the consistency of the considered picture with chromospheric diagnostics.

  3. Second-harmonic diffraction from holographic volume grating.

    PubMed

    Nee, Tsu-Wei

    2006-10-01

    The full polarization property of holographic volume-grating enhanced second-harmonic diffraction (SHD) is investigated theoretically. The nonlinear coefficient is derived from a simple atomic model of the material. By using a simple volume-grating model, the SHD fields and Mueller matrices are first derived. The SHD phase-mismatching effect for a thick sample is analytically investigated. This theory is justified by fitting with published experimental SHD data of thin-film samples. The SHD of an existing polymethyl methacrylate (PMMA) holographic 2-mm-thick volume-grating sample is investigated. This sample has two strong coupling linear diffraction peaks and five SHD peaks. The splitting of SHD peaks is due to the phase-mismatching effect. The detector sensitivity and laser power needed to measure these peak signals are quantitatively estimated.

  4. Implementation and Outcomes of Forensic Housing First Programs.

    PubMed

    Kriegel, Liat S; Henwood, Benjamin F; Gilmer, Todd P

    2016-01-01

    This mixed-method study used administrative data from 68 supportive housing programs and evaluative and qualitative site visit data from a subset of four forensic programs to (a) compare fidelity to the Housing First model and residential client outcomes between forensic and nonforensic programs and (b) investigate whether and how providers working in forensic programs can navigate competing Housing First principles and criminal justice mandates. Quantitative findings suggested that forensic programs were less likely to follow a harm reduction approach to substance use and clients in those programs were more likely to live in congregate settings. Qualitative findings suggested that an interplay of court involvement, limited resources, and risk environments influenced staff decisions regarding housing and treatment. Existing mental health and criminal justice collaborations necessitate adaptation to the Housing First model to accommodate client needs.

  5. Evaluation of a Postdischarge Call System Using the Logic Model.

    PubMed

    Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary

    2018-02-01

    This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.

  6. Mass and energy transfer across the Earth's magnetopause caused by vortex-induced reconnection: Mass and energy transfer by K-H vortex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, T. K. M.; Eriksson, S.; Hasegawa, H.

    When the interplanetary magnetic field (IMF) is strongly northward, a boundary layer that contains a considerable amount of plasma of magnetosheath origin is often observed along and earthward of the low-latitude magnetopause. Such a pre-existing boundary layer, with a higher density than observed in the adjacent magnetosphere, reduces the local Alfvén speed and allows the Kelvin-Helmholtz instability (KHI) to grow more strongly. We employ a three-dimensional fully kinetic simulation to model an event observed by the Magnetospheric Multiscale (MMS) mission in which the spacecraft detected substantial KH waves between a pre-existing boundary layer and the magnetosheath during strong northward IMF. Initial results of this simulation [Nakamura et al., 2017] have successfully demonstrated ion-scale signatures of magnetic reconnection induced by the non-linearly developed KH vortex, which are quantitatively consistent with MMS observations. Furthermore, we quantify the simulated mass and energy transfer processes driven by this vortex-induced reconnection (VIR) and show that during this particular MMS event (i) mass enters a new mixing layer formed by the VIR more efficiently from the pre-existing boundary layer side than from the magnetosheath side, (ii) mixed plasmas within the new mixing layer convect tailward along the magnetopause at more than half the magnetosheath flow speed, and (iii) energy dissipation in localized VIR dissipation regions results in a strong parallel electron heating within the mixing layer. Finally, the quantitative agreements between the simulation and MMS observations allow new predictions that elucidate how the mass and energy transfer processes occur near the magnetopause during strong northward IMF.

  7. Mass and energy transfer across the Earth's magnetopause caused by vortex-induced reconnection: Mass and energy transfer by K-H vortex

    DOE PAGES

    Nakamura, T. K. M.; Eriksson, S.; Hasegawa, H.; ...

    2017-10-23

    When the interplanetary magnetic field (IMF) is strongly northward, a boundary layer that contains a considerable amount of plasma of magnetosheath origin is often observed along and earthward of the low-latitude magnetopause. Such a pre-existing boundary layer, with a higher density than observed in the adjacent magnetosphere, reduces the local Alfvén speed and allows the Kelvin-Helmholtz instability (KHI) to grow more strongly. We employ a three-dimensional fully kinetic simulation to model an event observed by the Magnetospheric Multiscale (MMS) mission in which the spacecraft detected substantial KH waves between a pre-existing boundary layer and the magnetosheath during strong northward IMF. Initial results of this simulation [Nakamura et al., 2017] have successfully demonstrated ion-scale signatures of magnetic reconnection induced by the non-linearly developed KH vortex, which are quantitatively consistent with MMS observations. Furthermore, we quantify the simulated mass and energy transfer processes driven by this vortex-induced reconnection (VIR) and show that during this particular MMS event (i) mass enters a new mixing layer formed by the VIR more efficiently from the pre-existing boundary layer side than from the magnetosheath side, (ii) mixed plasmas within the new mixing layer convect tailward along the magnetopause at more than half the magnetosheath flow speed, and (iii) energy dissipation in localized VIR dissipation regions results in a strong parallel electron heating within the mixing layer. Finally, the quantitative agreements between the simulation and MMS observations allow new predictions that elucidate how the mass and energy transfer processes occur near the magnetopause during strong northward IMF.

  8. The transcription factor titration effect dictates level of gene expression.

    PubMed

    Brewster, Robert C; Weinert, Franz M; Garcia, Hernan G; Song, Dan; Rydenfelt, Mattias; Phillips, Rob

    2014-03-13

    Models of transcription are often built around a picture of RNA polymerase and transcription factors (TFs) acting on a single copy of a promoter. However, most TFs are shared between multiple genes with varying binding affinities. Beyond that, genes often exist at high copy number: in multiple identical copies on the chromosome, or on plasmids or viral vectors with copy numbers in the hundreds. Using a thermodynamic model, we characterize the interplay between TF copy number and the demand for that TF. We demonstrate the parameter-free predictive power of this model as a function of the copy number of the TF and the number and affinities of the available specific binding sites; such predictive control is important for the understanding of transcription and the desire to quantitatively design the output of genetic circuits. Finally, we use these experiments to dynamically measure plasmid copy number through the cell cycle. Copyright © 2014 Elsevier Inc. All rights reserved.
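
The titration effect arises because a finite pool of TF molecules is shared among competing binding sites. The sketch below is not the authors' thermodynamic model; it is a simpler mass-action picture of the same sharing effect, solving a mass balance for the free TF and reporting the occupancy of a hypothetical reporter promoter as the total TF copy number grows. All site counts and dissociation constants are invented.

```python
import numpy as np
from scipy.optimize import brentq

# Competing binding-site classes: (number of sites, effective dissociation constant).
site_classes = [(50, 20.0), (200, 300.0)]     # hypothetical competitor sites
reporter_kd = 20.0                            # hypothetical reporter promoter Kd

def bound(free, n_sites, kd):
    return n_sites * free / (kd + free)       # Langmuir occupancy per site class

def mass_balance(free, total_tf):
    consumed = sum(bound(free, n, kd) for n, kd in site_classes)
    consumed += bound(free, 1, reporter_kd)
    return free + consumed - total_tf         # zero when free + bound = total

for total_tf in (10, 50, 200, 1000):
    free = brentq(mass_balance, 1e-9, total_tf, args=(total_tf,))
    occ = free / (reporter_kd + free)
    print(f"TF copies={total_tf:>4}: reporter occupancy = {occ:.2f}")
```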

  9. Efficient kinetic method for fluid simulation beyond the Navier-Stokes equation.

    PubMed

    Zhang, Raoyang; Shan, Xiaowen; Chen, Hudong

    2006-10-01

    We present a further theoretical extension to the kinetic-theory-based formulation of the lattice Boltzmann method of Shan [J. Fluid Mech. 550, 413 (2006)]. In addition to the higher-order projection of the equilibrium distribution function and a sufficiently accurate Gauss-Hermite quadrature in the original formulation, a regularization procedure is introduced in this paper. This procedure ensures a consistent order of accuracy control over the nonequilibrium contributions in the Galerkin sense. Using this formulation, we construct a specific lattice Boltzmann model that accurately incorporates up to third-order hydrodynamic moments. Numerical evidence demonstrates that the extended model overcomes some major defects existing in conventionally known lattice Boltzmann models, so that fluid flows at finite Knudsen number Kn can be more quantitatively simulated. Results from force-driven Poiseuille flow simulations predict the Knudsen's minimum and the asymptotic behavior of flow flux at large Kn.

  10. Flavor physics without flavor symmetries

    NASA Astrophysics Data System (ADS)

    Buchmuller, Wilfried; Patel, Ketan M.

    2018-04-01

    We quantitatively analyze a quark-lepton flavor model derived from a six-dimensional supersymmetric theory with SO(10)×U(1) gauge symmetry, compactified on an orbifold with magnetic flux. Two bulk 16-plets charged under the U(1) provide the three quark-lepton generations whereas two uncharged 10-plets yield two Higgs doublets. At the orbifold fixed points mass matrices are generated with rank one or two. Moreover, the zero modes mix with heavy vectorlike split multiplets. The model possesses no flavor symmetries. Nevertheless, there exist a number of relations between Yukawa couplings, remnants of the underlying grand unified theory symmetry and the wave function profiles of the zero modes, which lead to a prediction of the light neutrino mass scale, mν1 ∼ 10⁻³ eV, and heavy Majorana neutrino masses in the range from 10¹² to 10¹⁴ GeV. The model successfully includes thermal leptogenesis.

  11. Stability and Hopf Bifurcation in a HIV-1 System with Multitime Delays

    NASA Astrophysics Data System (ADS)

    Zhao, Lingyan; Liu, Haihong; Yan, Fang

    In this paper, we propose a mathematical model for HIV-1 infection with three time delays. The model examines a viral therapy for controlling infection by using an engineered virus to selectively eliminate infected cells. In our model, the three time delays represent the latent period of the pathogen virus, the pathogen virus production period, and the recombinant (genetically modified) virus production period, respectively. Detailed theoretical analysis has demonstrated that the values of the three delays can affect the stability of equilibrium solutions and can also lead to Hopf bifurcation and oscillatory solutions of the system. Moreover, we give conditions for the existence of a stable positive equilibrium solution and of Hopf bifurcation. Further, the properties of the Hopf bifurcation are discussed. These theoretical results indicate that the delays play an important quantitative role in determining the dynamic behavior. Therefore, delays are very important and should not be neglected when controlling HIV-1 infections.

  12. Predicted and measured boundary layer refraction for advanced turboprop propeller noise

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Krejsa, Eugene A.

    1990-01-01

    Currently, boundary layer refraction limits the measurement of forward arc propeller noise on an acoustic plate in the NASA Lewis 8- by 6-Foot Supersonic Wind Tunnel. The use of a validated boundary layer refraction model to adjust the data could remove this limitation. An existing boundary layer refraction model is used to predict the refraction for cases where boundary layer refraction was measured. In general, the model exhibits the same qualitative behavior as the measured refraction. However, the prediction method does not show quantitative agreement with the data. In general, it overpredicts the amount of refraction for the far forward angles at axial Mach numbers of 0.85 and 0.80 and underpredicts the refraction at axial Mach numbers of 0.75 and 0.70. A more complete propeller source description is suggested as a way to improve the prediction method.

  13. A model of the primordial lunar atmosphere

    NASA Astrophysics Data System (ADS)

    Saxena, Prabal; Elkins-Tanton, Lindy; Petro, Noah; Mandell, Avi

    2017-09-01

    We create the first quantitative model for the early lunar atmosphere, coupled with a magma ocean crystallization model. Immediately after formation, the moon's surface was subject to a radiative environment that included contributions from the early Sun, a post-impact Earth that radiated like a mid-type M dwarf star, and a cooling global magma ocean. This radiative environment resulted in a largely Earth-side atmosphere on the Moon, ranging from ∼10⁴ to ∼10² pascals, composed of heavy volatiles (Na and SiO). This atmosphere persisted through lid formation and was additionally characterized by supersonic winds that transported significant quantities of moderate volatiles and likely generated magma ocean waves. The existence of this atmosphere may have influenced the distribution of some moderate volatiles and created temperature asymmetries which influenced ocean flow and cooling. Such asymmetries may characterize young, tidally locked rocky bodies with global magma oceans and subject to intense irradiation.

  14. A Model of the Primordial Lunar Atmosphere

    NASA Technical Reports Server (NTRS)

    Saxena, Prabal; Elkins-Tanton, Lindy; Petro, Noah; Mandell, Avi

    2017-01-01

    We create the first quantitative model for the early lunar atmosphere, coupled with a magma ocean crystallization model. Immediately after formation, the moon's surface was subject to a radiative environment that included contributions from the early Sun, a post-impact Earth that radiated like a mid-type M dwarf star, and a cooling global magma ocean. This radiative environment resulted in a largely Earth-side atmosphere on the Moon, ranging from approximately 10(exp 4) to approximately 10(exp 2) pascals, composed of heavy volatiles (Na and SiO). This atmosphere persisted through lid formation and was additionally characterized by supersonic winds that transported significant quantities of moderate volatiles and likely generated magma ocean waves. The existence of this atmosphere may have influenced the distribution of some moderate volatiles and created temperature asymmetries which influenced ocean flow and cooling. Such asymmetries may characterize young, tidally locked rocky bodies with global magma oceans and subject to intense irradiation.

  15. Attenuation of low-frequency underwater sound using an array of air-filled balloons and comparison to effective medium theory.

    PubMed

    Lee, Kevin M; Wilson, Preston S; Wochner, Mark S

    2017-12-01

    The ultimate goal of this work is to accurately predict the attenuation through a collection of large (on the order of 10-cm-radius) tethered encapsulated bubbles used in underwater noise abatement systems. Measurements of underwater sound attenuation were performed during a set of lake experiments, where a low-frequency compact electromechanical sound source was surrounded by different arrays of encapsulated bubbles with various individual bubbles sizes and void fractions. The measurements were compared with an existing predictive model [Church, J. Acoust. Soc. Am. 97, 1510-1521 (1995)] of the dispersion relation for linear propagation in liquid containing encapsulated bubbles. Although the model was originally intended to describe ultrasound contrast agents, it is evaluated here for large bubbles, and hence low frequencies, as a design tool for future underwater noise abatement systems, and there is good quantitative agreement between the observations and the model.
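
The comparison above relies on Church's dispersion relation for encapsulated bubbles. As a much simpler illustration of why even a small void fraction of air drastically alters low-frequency propagation, the sketch below evaluates the quasi-static Wood's-equation mixture sound speed; it neglects shell elasticity, resonance, and damping, all of which the Church model accounts for, so it should be read only as a mixture-trend illustration.

```python
import numpy as np

# Quasi-static (low-frequency) Wood's-equation sound speed for a bubbly liquid.
rho_w, c_w = 1000.0, 1480.0        # water density (kg/m^3) and sound speed (m/s)
rho_a, c_a = 1.2, 340.0            # air

def wood_speed(phi):
    # Volume-average the density and the compressibility of the two phases.
    rho_mix = phi * rho_a + (1 - phi) * rho_w
    compress = phi / (rho_a * c_a**2) + (1 - phi) / (rho_w * c_w**2)
    return 1.0 / np.sqrt(rho_mix * compress)

for phi in (0.0, 1e-4, 1e-3, 1e-2):
    print(f"void fraction {phi:g}: mixture sound speed ~ {wood_speed(phi):7.1f} m/s")
```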

  16. The topological Anderson insulator phase in the Kane-Mele model

    NASA Astrophysics Data System (ADS)

    Orth, Christoph P.; Sekera, Tibor; Bruder, Christoph; Schmidt, Thomas L.

    2016-04-01

    It has been proposed that adding disorder to a topologically trivial mercury telluride/cadmium telluride (HgTe/CdTe) quantum well can induce a transition to a topologically nontrivial state. The resulting state was termed topological Anderson insulator and was found in computer simulations of the Bernevig-Hughes-Zhang model. Here, we show that the topological Anderson insulator is a more universal phenomenon and also appears in the Kane-Mele model of topological insulators on a honeycomb lattice. We numerically investigate the interplay of the relevant parameters, and establish the parameter range in which the topological Anderson insulator exists. A staggered sublattice potential turns out to be a necessary condition for the transition to the topological Anderson insulator. For weak enough disorder, a calculation based on the lowest-order Born approximation reproduces quantitatively the numerical data. Our results thus considerably increase the number of candidate materials for the topological Anderson insulator phase.

  17. Sensitivity analysis of Repast computational ecology models with R/Repast.

    PubMed

    Prestes García, Antonio; Rodríguez-Patón, Alfonso

    2016-12-01

    Computational ecology is an emerging interdisciplinary discipline founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, individual-based modeling is particularly well suited for capturing the complex temporal and spatial dynamics, as well as the nonlinearities, arising in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights into the local mechanisms that generate observed global dynamics. Of course, no conclusions about model results can be taken seriously if they are based on a single model execution or are not analyzed carefully. Therefore, a sound methodology should always be used to underpin the interpretation of model results. Sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty on simulation output, and it should be incorporated into every work based on an in silico experimental setup. In this article, we present R/Repast, a GNU R package for running and analyzing Repast Simphony models, accompanied by two worked examples on how to perform global sensitivity analysis and how to interpret the results.
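
R/Repast itself is an R package for driving Repast Simphony models, so the snippet below is only a language-neutral illustration (in Python) of the underlying idea: sample the uncertain inputs, run the model many times, and compute a global sensitivity measure. The toy model and the use of standardized regression coefficients are assumptions for this sketch, not features of the package.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_model(x1, x2, x3):
    # Stand-in for a simulation output; x1 matters most, x3 not at all.
    return 3.0 * x1 + 1.0 * x2**2 + 0.0 * x3 + rng.normal(0, 0.1, x1.shape)

n = 2000
X = rng.uniform(0.0, 1.0, size=(n, 3))        # uniform uncertainty on three inputs
y = toy_model(X[:, 0], X[:, 1], X[:, 2])

# Standardized regression coefficients (SRC) as a simple global sensitivity index.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, src in zip(["x1", "x2", "x3"], coef):
    print(f"{name}: SRC = {src:+.2f}")
```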

  18. On the Development and Use of Large Chemical Similarity Networks, Informatics Best Practices and Novel Chemical Descriptors Towards Materials Quantitative Structure Property Relationships

    NASA Astrophysics Data System (ADS)

    Krein, Michael

    After decades of development and use in a variety of application areas, Quantitative Structure Property Relationships (QSPRs) and related descriptor-based statistical learning methods have achieved a level of infamy due to their misuse. The field is rife with past examples of overtrained models, overoptimistic performance assessment, and outright cheating in the form of explicitly removing data to fit models. These actions do not serve the community well, nor are they beneficial to future predictions based on established models. In practice, in order to select combinations of descriptors and machine learning methods that might work best, one must consider the nature and size of the training and test datasets, be aware of existing hypotheses about the data, and resist the temptation to bias structure representation and modeling to explicitly fit the hypotheses. The definition and application of these best practices is important for obtaining actionable modeling outcomes, and for setting user expectations of modeling accuracy when predicting the endpoint values of unknowns. A wide variety of statistical learning approaches, descriptor types, and model validation strategies are explored herein, with the goals of helping end users understand the factors involved in creating and using QSPR models effectively, and to better understand relationships within the data, especially by looking at the problem space from multiple perspectives. Molecular relationships are commonly envisioned in a continuous high-dimensional space of numerical descriptors, referred to as chemistry space. Descriptor and similarity metric choice influence the partitioning of this space into regions corresponding to local structural similarity. These regions, known as domains of applicability, are most likely to be successfully modeled by a QSPR. In Chapter 2, the network topology and scaling relationships of several chemistry spaces are thoroughly investigated. Chemistry spaces studied include the ZINC data set, a qHTS PubChem bioassay, as well as the protein binding sites from the PDB. The characteristics of these networks are compared and contrasted with those of the bioassay Structure Activity Landscape Index (SALI) subnetwork, which maps discontinuities or cliffs in the structure activity landscape. Mapping this newly generated information over underlying chemistry space networks generated using different descriptors demonstrates local modeling capacity and can guide the choice of better local representations of chemistry space. Chapter 2 introduces and demonstrates this novel concept, which also enables future work in visualization and interpretation of chemical spaces. Initially, it was discovered that there were no community-available tools to leverage best-practice ideas to comprehensively build, compare, and interpret QSPRs. The Yet Another Modeling System (YAMS) tool performs a series of balanced, rational decisions in dataset preprocessing and parameter/feature selection over a choice of modeling methods. To date, YAMS is the only community-available informatics tool that performs such decisions consistently between methods while also providing multiple model performance comparisons and detailed descriptor importance information. The focus of the tool is thus to convey rich information about model quality and predictions that help to "close the loop" between modeling and experimental efforts, for example, in tailoring nanocomposite properties. 
Polymer nanocomposites (PNC) are complex material systems encompassing many potential structures, chemistries, and self assembled morphologies that could significantly impact commercial and military applications. There is a strong desire to characterize and understand the tradespace of nanocomposites, to identify the important factors relating nanostructure to materials properties and determine an effective way to control materials properties at the manufacturing scale. Due to the complexity of the systems, existing design approaches rely heavily on trial-and-error learning. By leveraging existing experimental data, Materials Quantitative Structure-Property Relationships (MQSPRs) relate molecular structures to the polar and dispersive components of corresponding surface tensions. In turn, existing theories relate polymer and nanofiller polar and dispersive surface tension components to the dispersion state and interfacial polymer relaxation times. These quantities may, in the future, be used as input to continuum mechanics approaches shown able to predict the thermomechanical response of nanocomposites. For a polymer dataset and a particle dataset, multiple structural representations and descriptor sets are benchmarked, including a set of high performance surface-property descriptors developed as part of this work. The systematic variation of structural representations as part of the informatics approach reveals important insight in modeling polymers, and should become common practice when defining new problem spaces.
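
As a generic illustration of the validation discipline argued for above (hold out an external test set before any model selection, and tune hyperparameters only by cross-validation on the training data), the following sketch builds a small descriptor-based regression on synthetic data; it is not the YAMS tool, and the descriptor matrix is purely artificial.

```python
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 20))                          # synthetic molecular descriptors
y = X[:, 0] * 2.0 - X[:, 1] + rng.normal(0, 0.5, 300)   # synthetic property values

# Hold out an external test set BEFORE any model selection.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Select the regularization strength only by cross-validation on the training set.
pipe = make_pipeline(StandardScaler(), Ridge())
search = GridSearchCV(pipe, {"ridge__alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X_tr, y_tr)

print("CV R^2 (model selection):", round(search.best_score_, 3))
print("external test R^2       :", round(r2_score(y_te, search.predict(X_te)), 3))
```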

  19. Quantitative fibronectin to help decision-making in women with symptoms of preterm labour (QUIDS) part 1: Individual participant data meta-analysis and health economic analysis

    PubMed Central

    Wotherspoon, Lisa M; Boyd, Kathleen A; Morris, Rachel K; Jackson, Lesley; Chandiramani, Manju; David, Anna L; Khalil, Asma; Shennan, Andrew; Hodgetts Morton, Victoria; Lavender, Tina; Khan, Khalid; Harper-Clarke, Susan; Mol, Ben W; Riley, Richard D; Norrie, John; Norman, Jane E

    2018-01-01

    Introduction The aim of the QUIDS study is to develop a decision support tool for the management of women with symptoms and signs of preterm labour, based on a validated prognostic model using quantitative fetal fibronectin (qfFN) concentration, in combination with clinical risk factors. Methods and analysis The study will evaluate the Rapid fFN 10Q System (Hologic, Marlborough, Massachusetts) which quantifies fFN in a vaginal swab. In part 1 of the study, we will develop and internally validate a prognostic model using an individual participant data (IPD) meta-analysis of existing studies containing women with symptoms of preterm labour alongside fFN measurements and pregnancy outcome. An economic analysis will be undertaken to assess potential cost-effectiveness of the qfFN prognostic model. The primary endpoint will be the ability of the prognostic model to rule out spontaneous preterm birth within 7 days. Six eligible studies were identified by systematic review of the literature and five agreed to provide their IPD (n=5 studies, 1783 women and 139 events of preterm delivery within 7 days of testing). Ethics and dissemination The study is funded by the National Institute of Healthcare Research Health Technology Assessment (HTA 14/32/01). It has been approved by the West of Scotland Research Ethics Committee (16/WS/0068). PROSPERO registration number CRD42015027590. Version Protocol version 2, date 1 November 2016. PMID:29627817

  20. Quantitative assessment of carbon sequestration reduction induced by disturbances in temperate Eurasian steppe

    NASA Astrophysics Data System (ADS)

    Chen, Yizhao; Ju, Weimin; Groisman, Pavel; Li, Jianlong; Propastin, Pavel; Xu, Xia; Zhou, Wei; Ruan, Honghua

    2017-11-01

    The temperate Eurasian steppe (TES) is a region where various environmental, social, and economic stresses converge. Multiple types of disturbance exist widely across the landscape, and heavily influence carbon cycling in this region. However, a current quantitative assessment of the impact of disturbances on carbon sequestration is largely lacking. In this study, we combined the boreal ecosystem productivity simulator (BEPS), the Shiyomi grazing model, and the global fire model (Glob-FIRM) to investigate the impact of the two major types of disturbance in the TES (i.e. domestic grazing and fire) on regional carbon sequestration. Model performance was validated using satellite data and field observations. Model outputs indicate that disturbance has a significant impact on carbon sequestration at a regional scale. The annual total carbon lost due to disturbances was 7.8 TgC yr⁻¹, accounting for 14.2% of the total net ecosystem productivity (NEP). Domestic grazing plays the dominant role in terrestrial carbon consumption, accounting for 95% of the total carbon lost from the two disturbances. Carbon losses from both disturbances significantly increased from 1999 to 2008 (R² = 0.82, P < 0.001 for grazing; R² = 0.51, P < 0.05 for fire). Heavy domestic grazing in relatively barren grasslands substantially reduced carbon sequestration, particularly in the grasslands of Turkmenistan, Uzbekistan, and the far southwest of Inner Mongolia. This spatially-explicit information has potential implications for sustainable management of carbon sequestration in the vast grassland ecosystems.

  1. Further evaluation of quantitative structure--activity relationship models for the prediction of the skin sensitization potency of selected fragrance allergens.

    PubMed

    Patlewicz, Grace Y; Basketter, David A; Pease, Camilla K Smith; Wilson, Karen; Wright, Zoe M; Roberts, David W; Bernard, Guillaume; Arnau, Elena Giménez; Lepoittevin, Jean-Pierre

    2004-02-01

    Fragrance substances represent a very diverse group of chemicals; a proportion of them are associated with the ability to cause allergic reactions in the skin. Efforts to find substitute materials are hindered by the need to undertake animal testing for determining both skin sensitization hazard and potency. One strategy to avoid such testing is through an understanding of the relationships between chemical structure and skin sensitization, so-called structure-activity relationships. In recent work, we evaluated 2 groups of fragrance chemicals -- saturated aldehydes and alpha,beta-unsaturated aldehydes. Simple quantitative structure-activity relationship (QSAR) models relating the EC3 values [derived from the local lymph node assay (LLNA)] to physicochemical properties were developed for both sets of aldehydes. In the current study, we evaluated an additional group of carbonyl-containing compounds to test the predictive power of the developed QSARs and to extend their scope. The QSAR models were used to predict EC3 values of 10 newly selected compounds. Local lymph node assay data generated for these compounds demonstrated that the original QSARs were fairly accurate, but still required improvement. Development of these QSAR models has provided us with a better understanding of the potential mechanisms of action for aldehydes, and hence how to avoid or limit allergy. Knowledge generated from this work is being incorporated into new/improved rules for sensitization in the expert toxicity prediction system, deductive estimation of risk from existing knowledge (DEREK).

  2. A quantitative property-property relationship for the internal diffusion coefficients of organic compounds in solid materials.

    PubMed

    Huang, L; Fantke, P; Ernstoff, A; Jolliet, O

    2017-11-01

    Indoor releases of organic chemicals encapsulated in solid materials are major contributors to human exposures and are directly related to the internal diffusion coefficient in solid materials. Existing correlations to estimate the diffusion coefficient are only valid for a limited number of chemical-material combinations. This paper develops and evaluates a quantitative property-property relationship (QPPR) to predict diffusion coefficients for a wide range of organic chemicals and materials. We first compiled a training dataset of 1103 measured diffusion coefficients for 158 chemicals in 32 consolidated material types. Following a detailed analysis of the temperature influence, we developed a multiple linear regression model to predict diffusion coefficients as a function of chemical molecular weight (MW), temperature, and material type (adjusted R² of 0.93). The internal validations showed the model to be robust, stable and not a result of chance correlation. The external validation against two separate prediction datasets demonstrated the model has good predicting ability within its applicability domain (R²ext > 0.8), namely MW between 30 and 1178 g/mol and temperature between 4 and 180°C. By covering a much wider range of organic chemicals and materials, this QPPR facilitates high-throughput estimates of human exposures for chemicals encapsulated in solid materials. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
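
The published QPPR is, in essence, a multiple linear regression of the (log-scale) diffusion coefficient on molecular weight, temperature, and a categorical material type. The sketch below reproduces only that structure with a handful of invented data points and material labels; the fitted coefficients and applicability domain of the actual model are not reproduced here.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training rows: log10 D vs. MW, temperature, and material class.
df = pd.DataFrame({
    "MW":       [92, 128, 178, 92, 154, 222, 106, 178],
    "temp_K":   [296, 296, 296, 323, 296, 343, 310, 296],
    "material": ["PVC", "PVC", "PU", "PU", "wood", "wood", "PVC", "wood"],
    "log10_D":  [-11.2, -12.0, -13.5, -12.8, -12.9, -12.1, -11.6, -13.8],
})

model = make_pipeline(
    ColumnTransformer([
        ("mat", OneHotEncoder(handle_unknown="ignore"), ["material"]),
    ], remainder="passthrough"),               # MW and temperature stay numeric
    LinearRegression(),
)
X = df[["material", "MW", "temp_K"]]
model.fit(X, df["log10_D"])

# Predict D for a new chemical-material combination inside the training range.
new = pd.DataFrame({"material": ["PU"], "MW": [150], "temp_K": [296]})
print("predicted log10 D:", round(float(model.predict(new)[0]), 2))
```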

  3. In Vivo Determination of Body Composition in Zebrafish (Danio rerio) by Quantitative Magnetic Resonance

    PubMed Central

    Fowler, L. Adele; Dennis, Lacey N.; Barry, R. Jeff; Powell, Mickie L.; Watts, Stephen A.

    2016-01-01

    Zebrafish (Danio rerio) as a model research organism continues to expand its relevance and role in multiple research disciplines, with recent work directed toward models of metabolism, nutrition, and energetics. Multiple technologies exist to assess body composition in animal research models at various levels of detail (tissues/organs, body regions, and whole organism). The development and/or validation of body composition assessment tools can open new areas of research questions for a given organism. Using fish from a comparative nutrition study, quantitative magnetic resonance (QMR) assessment of whole body fat mass (FM) and fat-free mass (FFM) in live fish was performed. QMR measures from two cohorts (n = 26 and n = 27) were compared with chemical carcass analysis (CCA) of FM and FFM. QMR was significantly correlated with chemical carcass values (fat, p < 0.001; lean, p = 0.002), although QMR significantly overestimated FM (0.011 g; p < 0.0001) and underestimated FFM (−0.024 g; p < 0.0001) relative to CCA. In a separate cross-validation group of fish, prediction equations corrected carcass values for FM (p = 0.121) and FFM (p = 0.753). These results support the utilization of QMR—a nonlethal, nondestructive method—for cross-sectional or longitudinal body composition assessment outcomes in zebrafish. PMID:26974510

  4. Quantitative measurement of eyestrain on 3D stereoscopic display considering the eye foveation model and edge information.

    PubMed

    Heo, Hwan; Lee, Won Oh; Shin, Kwang Yong; Park, Kang Ryoung

    2014-05-15

    We propose a new method for measuring the degree of eyestrain on 3D stereoscopic displays using a glasses-type eye tracking device. Our study is novel in the following four ways: first, the circular area where a user's gaze position exists is defined based on the calculated gaze position and gaze estimation error. Within this circular area, the position where edge strength is maximized can be detected, and we determine this position as the gaze position that has a higher probability of being the correct one. Based on this gaze point, the eye foveation model is defined. Second, we quantitatively evaluate the correlation between the degree of eyestrain and the causal factors of visual fatigue, such as the degree of change of stereoscopic disparity (CSD), stereoscopic disparity (SD), frame cancellation effect (FCE), and edge component (EC) of the 3D stereoscopic display using the eye foveation model. Third, by comparing the eyestrain in conventional 3D video and experimental 3D sample video, we analyze the characteristics of eyestrain according to various factors and types of 3D video. Fourth, by comparing the eyestrain with or without the compensation of saccadic eye movements in 3D video, we analyze the characteristics of eyestrain according to the types of eye movements in 3D video. Experimental results show that the degree of CSD causes more eyestrain than other factors.

  5. Leverage principle of retardation signal in titration of double protein via chip moving reaction boundary electrophoresis.

    PubMed

    Zhang, Liu-Xia; Cao, Yi-Ren; Xiao, Hua; Liu, Xiao-Ping; Liu, Shao-Rong; Meng, Qing-Hua; Fan, Liu-Yin; Cao, Cheng-Xi

    2016-03-15

    In the present work we describe a simple, rapid and quantitative analytical method for the detection of different proteins present in biological samples. For this, we proposed the model of titration of double protein (TDP) and its associated leverage theory, which relies on the retardation signal of chip moving reaction boundary electrophoresis (MRBE). The leverage principle states that the product of the first protein content and its absolute retardation signal is equal to that of the second protein content and its absolute retardation signal. To support the model, we first gave a theoretical demonstration of the leverage principle. Then relevant experiments were conducted on the TDP-MRBE chip. The results revealed that (i) there was a leverage principle of retardation signal within the TDP of two pure proteins, and (ii) a lever also existed for two complex protein samples, demonstrating the validity of the TDP model and leverage theory in the MRBE chip. It was also shown that the proposed technique could provide a rapid and simple quantitative analysis of two protein samples in a mixture. Finally, we successfully applied the developed technique to the quantification of soymilk in adulterated infant formula. TDP-MRBE opens up a new window for detecting the adulteration ratio of a low-quality food (milk) blended into a high-quality one. Copyright © 2015 Elsevier B.V. All rights reserved.
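    A worked sketch of the lever rule stated above (the content of protein 1 times its absolute retardation signal equals the content of protein 2 times its absolute retardation signal). The signal values and the milk/soy labels below are hypothetical, not measurements from the paper.

    ```python
    def content_ratio(signal_1: float, signal_2: float) -> float:
        """Lever rule of TDP-MRBE: c1 * |RS1| = c2 * |RS2|,
        so the content ratio c1/c2 equals |RS2| / |RS1|."""
        return abs(signal_2) / abs(signal_1)

    # Hypothetical retardation signals for milk protein (1) and soy protein (2).
    rs_milk, rs_soy = -0.42, 0.28
    ratio = content_ratio(rs_milk, rs_soy)      # c_milk / c_soy
    soy_fraction = 1.0 / (1.0 + ratio)          # soy share of total protein
    print(f"milk:soy content ratio = {ratio:.2f}")
    print(f"estimated soy fraction = {soy_fraction:.2%}")
    ```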

  6. Computational simulation of extravehicular activity dynamics during a satellite capture attempt.

    PubMed

    Schaffner, G; Newman, D J; Robinson, S K

    2000-01-01

    A more quantitative approach to the analysis of astronaut extravehicular activity (EVA) tasks is needed because of their increasing complexity, particularly in preparation for the on-orbit assembly of the International Space Station. Existing useful EVA computer analyses produce either high-resolution three-dimensional computer images based on anthropometric representations or empirically derived predictions of astronaut strength based on lean body mass and the position and velocity of body joints but do not provide multibody dynamic analysis of EVA tasks. Our physics-based methodology helps fill the current gap in quantitative analysis of astronaut EVA by providing a multisegment human model and solving the equations of motion in a high-fidelity simulation of the system dynamics. The simulation work described here improves on the realism of previous efforts by including three-dimensional astronaut motion, incorporating joint stops to account for the physiological limits of range of motion, and incorporating use of constraint forces to model interaction with objects. To demonstrate the utility of this approach, the simulation is modeled on an actual EVA task, namely, the attempted capture of a spinning Intelsat VI satellite during STS-49 in May 1992. Repeated capture attempts by an EVA crewmember were unsuccessful because the capture bar could not be held in contact with the satellite long enough for the capture latches to fire and successfully retrieve the satellite.

  7. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    PubMed

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
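    A minimal sketch contrasting the two residue-counting conventions, under the assumption that each model reduces to counting dye-binding residues in a protein sequence; the example sequence is invented and no calibration against protein standards is attempted.

    ```python
    def binding_residues(sequence: str, include_his: bool) -> int:
        """Count residues assumed to bind Coomassie Brilliant Blue G-250.
        M1 counts Arg (R) and Lys (K); M2 additionally counts His (H)."""
        targets = set("RK") | ({"H"} if include_his else set())
        return sum(residue in targets for residue in sequence.upper())

    # Hypothetical protein sequence fragment.
    seq = "MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"
    m1 = binding_residues(seq, include_his=False)   # Arg + Lys (M1)
    m2 = binding_residues(seq, include_his=True)    # Arg + Lys + His (M2)
    print(f"M1 binding residues: {m1}, M2 binding residues: {m2}")
    ```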

  8. Predicting the chromatographic retention of polymers: application of the polymer model to poly(styrene/ethylacrylate)copolymers.

    PubMed

    Bashir, Mubasher A; Radke, Wolfgang

    2012-02-17

    The retention behavior of a range of statistical poly(styrene/ethylacrylate) copolymers is investigated in order to determine the possibility of predicting retention volumes of these copolymers based on a suitable chromatographic retention model. It was found that the composition of elution in gradient chromatography of the copolymers is closely related to the eluent composition at which, in isocratic chromatography, the transition from elution in adsorption mode to exclusion mode occurs. For homopolymers this transition takes place at a critical eluent composition at which the molar mass dependence of elution volume vanishes. Thus, similar critical eluent compositions can be defined for statistical copolymers. The existence of a critical eluent composition is further supported by the narrower peak width, indicating that the broad molar mass distribution of the samples does not contribute to the retention volume. It is shown that the existing retention model for homopolymers allows for correct quantitative predictions of retention volumes based on only three appropriate initial experiments. These initial experiments comprise one gradient run and two isocratic experiments, one at the composition of elution calculated from the first gradient run and a second at slightly higher eluent strength. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Effects of Drawing on Alpha Activity: A Quantitative EEG Study with Implications for Art Therapy

    ERIC Educational Resources Information Center

    Belkofer, Christopher M.; Van Hecke, Amy Vaughan; Konopka, Lukasz M.

    2014-01-01

    Little empirical evidence exists as to how materials used in art therapy affect the brain and its neurobiological functioning. This pre/post within-groups study utilized the quantitative electroencephalogram (qEEG) to measure residual effects in the brain after 20 minutes of drawing. EEG recordings were conducted before and after participants (N =…

  10. A Quantitative Study on Burnout for Teachers Who Work with Students Who Have Moderate to Severe Disabilities

    ERIC Educational Resources Information Center

    Dickerson, Elizabeth G.

    2017-01-01

    Purpose: The purpose of this quantitative research was to examine what relationships, if any, exist between the independent variable of burnout and dependent variables of job satisfaction for special education teachers who work with students who have moderate to severe disabilities ages 5 to 22 in a Southern California school district.…

  11. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    DTIC Science & Technology

    2017-08-09

    Report documentation page (SF 298) fragment: journal article, dates covered January 2015 – July 2017; title: Reproducibility of Quantitative Structural and Physiological MRI Measurements; performing organization: USAF School of Aerospace Medicine, Aeromedical Research Dept/FHOH, 2510 Fifth St.

  12. A Study to Formulate Quantitative Guidelines for the Audio-Visual Communications Field. Final Report.

    ERIC Educational Resources Information Center

    Faris, Gene; Sherman, Mendel

    Quantitative guidelines for use in determining the audiovisual (AV) needs of educational institutions were developed by the October 14-16, 1965 Seminar of the NDEA (National Defense Education Act), Faris-Sherman study. The guidelines that emerged were based in part on a review of past efforts and existing standards but primarily reflected the…

  13. Modeling Dynamics of Cell-to-Cell Variability in TRAIL-Induced Apoptosis Explains Fractional Killing and Predicts Reversible Resistance

    PubMed Central

    Bertaux, François; Stoma, Szymon; Drasdo, Dirk; Batt, Gregory

    2014-01-01

    Isogenic cells sensing identical external signals can take markedly different decisions. Such decisions often correlate with pre-existing cell-to-cell differences in protein levels. When not neglected in signal transduction models, these differences are accounted for in a static manner, by assuming randomly distributed initial protein levels. However, this approach ignores the a priori non-trivial interplay between signal transduction and the source of this cell-to-cell variability: temporal fluctuations of protein levels in individual cells, driven by noisy synthesis and degradation. Thus, modeling protein fluctuations, rather than their consequences on the initial population heterogeneity, would set the quantitative analysis of signal transduction on firmer grounds. Adopting this dynamical view on cell-to-cell differences amounts to recasting extrinsic variability into intrinsic noise. Here, we propose a generic approach to merge, in a systematic and principled manner, signal transduction models with stochastic protein turnover models. When applied to an established kinetic model of TRAIL-induced apoptosis, our approach markedly increased model prediction capabilities. One obtains a mechanistic explanation of yet-unexplained observations on fractional killing and non-trivial robust predictions of the temporal evolution of cell resistance to TRAIL in HeLa cells. Our results provide an alternative explanation to survival via induction of survival pathways, since no TRAIL-induced regulation is needed, and suggest that the short-lived anti-apoptotic protein Mcl1 exhibits large and rare fluctuations. More generally, our results highlight the importance of accounting for stochastic protein turnover to quantitatively understand signal transduction over extended durations, and imply that fluctuations of short-lived proteins deserve particular attention. PMID:25340343

  14. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
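    A minimal sketch of how the three metrics could be combined into a priority score for selecting scenarios for quantitative analysis. The scenarios, the 1-5 scales, and the scoring rule are assumptions for illustration; the paper does not prescribe a specific formula.

    ```python
    # Hypothetical hazard scenarios scored 1 (low) to 5 (high).
    scenarios = [
        {"name": "wake encounter on parallel approach", "severity": 5, "likelihood": 2, "difficulty": 2},
        {"name": "runway incursion",                     "severity": 5, "likelihood": 1, "difficulty": 4},
        {"name": "missed-approach conflict",             "severity": 3, "likelihood": 3, "difficulty": 1},
    ]

    def priority(s: dict) -> float:
        # Risk (severity x likelihood) discounted by how hard the scenario is
        # to model quantitatively; a higher score marks a better candidate
        # for quantitative analysis.
        return s["severity"] * s["likelihood"] / s["difficulty"]

    for s in sorted(scenarios, key=priority, reverse=True):
        print(f"{s['name']}: priority score {priority(s):.2f}")
    ```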

  15. Modeling the effect of blunt impact on mitochondrial function in cartilage: implications for development of osteoarthritis.

    PubMed

    Kapitanov, Georgi I; Ayati, Bruce P; Martin, James A

    2017-01-01

    Osteoarthritis (OA) is a disease characterized by degeneration of joint cartilage. It is associated with pain and disability and is the result of either age- and activity-related joint wear or an injury. Non-invasive treatment options are scarce and prevention and early intervention methods are practically non-existent. The modeling effort presented in this article is constructed based on an emerging biological hypothesis-post-impact oxidative stress leads to cartilage cell apoptosis and hence the degeneration observed with the disease. The objective is to quantitatively describe the loss of cell viability and function in cartilage after an injurious impact and identify the key parameters and variables that contribute to this phenomenon. We constructed a system of differential equations that tracks cell viability, mitochondrial function, and concentrations of reactive oxygen species (ROS), adenosine triphosphate (ATP), and glycosaminoglycans (GAG). The system was solved using MATLAB and the equations' parameters were fit to existing data using a particle swarm algorithm. The model fits well the available data for cell viability, ATP production, and GAG content. Local sensitivity analysis shows that the initial amount of ROS is the most important parameter. The model we constructed is a viable method for producing in silico studies and, with a few modifications together with data calibration and validation, may be a powerful predictive tool in the search for a non-invasive treatment for post-traumatic osteoarthritis.
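    A toy sketch of an ODE system in the same spirit, with viability, ROS, and ATP integrated by forward Euler to show the sensitivity of the outcome to the initial ROS level. The equations and parameter values are invented placeholders, not the model or the particle-swarm fit from the paper.

    ```python
    import numpy as np

    def simulate(ros0: float, days: float = 10.0, dt: float = 0.01) -> np.ndarray:
        """Toy post-impact dynamics: ROS decays and kills cells; ATP tracks viability."""
        steps = int(days / dt)
        viability, ros, atp = 1.0, ros0, 1.0
        k_ros_decay, k_kill, k_atp = 0.8, 0.6, 0.5
        history = np.empty((steps, 3))
        for i in range(steps):
            d_ros = -k_ros_decay * ros                 # ROS clearance
            d_viability = -k_kill * ros * viability    # ROS-driven cell death
            d_atp = k_atp * (viability - atp)          # ATP relaxes toward viability
            ros += dt * d_ros
            viability += dt * d_viability
            atp += dt * d_atp
            history[i] = (viability, ros, atp)
        return history

    for ros0 in (0.1, 0.5, 1.0):   # initial ROS after impact (arbitrary units)
        final_viability = simulate(ros0)[-1, 0]
        print(f"initial ROS {ros0:.1f} -> final viability {final_viability:.2f}")
    ```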

  16. Quantitative Characterization of Spurious Gibbs Waves in 45 CMIP5 Models

    NASA Astrophysics Data System (ADS)

    Geil, K. L.; Zeng, X.

    2014-12-01

    Gibbs oscillations appear in global climate models when representing fields, such as orography, that contain discontinuities or sharp gradients. It has been known for decades that the oscillations are associated with the transformation of the truncated spectral representation of a field to physical space and that the oscillations can also be present in global models that do not use spectral methods. The spurious oscillations are potentially detrimental to model simulations (e.g., over ocean) and this work provides a quantitative characterization of the Gibbs oscillations that appear across the Coupled Model Intercomparison Project Phase 5 (CMIP5) models. An ocean transect running through the South Pacific High toward the Andes is used to characterize the oscillations in ten different variables. These oscillations are found to be stationary and hence are not caused by (physical) waves in the atmosphere. We quantify the oscillation amplitude using the root mean square difference (RMSD) between the transect of a variable and its running mean (rather than the constant mean across the transect). We also compute the RMSD to interannual variability (IAV) ratio, which provides a relative measure of the oscillation amplitude. Of the variables examined, the largest RMSD values exist in the surface pressure field of spectral models, while the smallest RMSD values within the surface pressure field come from models that use finite difference (FD) techniques. Many spectral models have a surface pressure RMSD that is 2 to 15 times greater than IAV over the transect and an RMSD:IAV ratio greater than one for many other variables including surface temperature, incoming shortwave radiation at the surface, incoming longwave radiation at the surface, and total cloud fraction. In general, the FD models out-perform the spectral models, but not all the spectral models have large amplitude oscillations and there are a few FD models where the oscillations do appear. Finally, we present a brief comparison of the numerical methods of a select few models to better understand their Gibbs oscillations.
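    A minimal sketch of the amplitude metric described above, computing the RMSD between a synthetic one-dimensional transect and its running mean and comparing it with an interannual-variability estimate. The data, window length, and IAV value are assumptions for illustration, not CMIP5 output.

    ```python
    import numpy as np

    def running_mean(x: np.ndarray, window: int) -> np.ndarray:
        """Centered running mean with edge padding (window assumed odd)."""
        pad = window // 2
        padded = np.pad(x, pad, mode="edge")
        kernel = np.ones(window) / window
        return np.convolve(padded, kernel, mode="valid")[: len(x)]

    # Synthetic surface-pressure transect: smooth large-scale gradient plus a
    # spurious short-wavelength (Gibbs-like) ripple.
    n = 200
    s = np.linspace(0.0, 1.0, n)
    transect = 1015.0 + 5.0 * s + 0.8 * np.sin(2 * np.pi * 25 * s)

    smooth = running_mean(transect, window=15)
    rmsd = np.sqrt(np.mean((transect - smooth) ** 2))   # oscillation amplitude
    iav = 0.5   # hypothetical interannual variability along the transect (hPa)
    print(f"RMSD = {rmsd:.2f} hPa, RMSD/IAV = {rmsd / iav:.1f}")
    ```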

  17. Modified-hypernetted-chain determination of the phase diagram of rigid C60 molecules

    NASA Astrophysics Data System (ADS)

    Caccamo, C.

    1995-02-01

    The modified-hypernetted-chain theory is applied to the determination of the phase diagram of the Lennard-Jones (LJ) fluid, and of a model of C60 previously investigated [Phys. Rev. Lett. 71, 1200 (1993)] through molecular-dynamics (MD) simulation and a different theoretical approach. In the LJ case the agreement with available MD data is quantitative and superior to other theories. For C60, the phase diagram obtained is in quite good agreement with previous MD results: in particular, the theory confirms the existence of a liquid phase between 1600 and 1920 K, the estimated triple point and critical temperature, respectively.

  18. A two-level structure for advanced space power system automation

    NASA Technical Reports Server (NTRS)

    Loparo, Kenneth A.; Chankong, Vira

    1990-01-01

    The tasks to be carried out during the three-year project period are: (1) performing extensive simulation using existing mathematical models to build a specific knowledge base of the operating characteristics of space power systems; (2) carrying out the necessary basic research on hierarchical control structures, real-time quantitative algorithms, and decision-theoretic procedures; (3) developing a two-level automation scheme for fault detection and diagnosis, maintenance and restoration scheduling, and load management; and (4) testing and demonstration. The outlines of the proposed system structure that served as a master plan for this project, work accomplished, concluding remarks, and ideas for future work are also addressed.

  19. The value of health care information exchange and interoperability.

    PubMed

    Walker, Jan; Pan, Eric; Johnston, Douglas; Adler-Milstein, Julia; Bates, David W; Middleton, Blackford

    2005-01-01

    In this paper we assess the value of electronic health care information exchange and interoperability (HIEI) between providers (hospitals and medical group practices) and independent laboratories, radiology centers, pharmacies, payers, public health departments, and other providers. We have created an HIEI taxonomy and combined published evidence with expert opinion in a cost-benefit model. Fully standardized HIEI could yield a net value of $77.8 billion per year once fully implemented. Nonstandardized HIEI offers smaller positive financial returns. The clinical impact of HIEI for which quantitative estimates cannot yet be made would likely add further value. A compelling business case exists for national implementation of fully standardized HIEI.

  20. Downstream boundary effects on the frequency of self-excited oscillations in transonic diffuser flows

    NASA Astrophysics Data System (ADS)

    Hsieh, T.

    1986-10-01

    Investigations of downstream boundary effects on the frequency of self-excited oscillations in two-dimensional, separated transonic diffuser flows were conducted numerically by solving the compressible, Reynolds-averaged, thin-layer Navier-Stokes equations with two-equation turbulence models. It was found that the flow fields are very sensitive to the location of the downstream boundary. Extension of the diffuser downstream boundary significantly reduces the frequency and amplitude of oscillations for pressure, velocity, and shock. The existence of a suction slot in the experimental setup obscures the physical downstream boundary and therefore presents a difficulty for quantitative comparisons between computation and experiment.

  1. Accelerated discovery of metallic glasses through iteration of machine learning and high-throughput experiments

    PubMed Central

    Wolverton, Christopher; Hattrick-Simpers, Jason; Mehta, Apurva

    2018-01-01

    With more than a hundred elements in the periodic table, a large number of potential new materials exist to address the technological and societal challenges we face today; however, without some guidance, searching through this vast combinatorial space is frustratingly slow and expensive, especially for materials strongly influenced by processing. We train a machine learning (ML) model on previously reported observations, parameters from physiochemical theories, and make it synthesis method–dependent to guide high-throughput (HiTp) experiments to find a new system of metallic glasses in the Co-V-Zr ternary. Experimental observations are in good agreement with the predictions of the model, but there are quantitative discrepancies in the precise compositions predicted. We use these discrepancies to retrain the ML model. The refined model has significantly improved accuracy not only for the Co-V-Zr system but also across all other available validation data. We then use the refined model to guide the discovery of metallic glasses in two additional previously unreported ternaries. Although our approach of iterative use of ML and HiTp experiments has guided us to rapid discovery of three new glass-forming systems, it has also provided us with a quantitatively accurate, synthesis method–sensitive predictor for metallic glasses that improves performance with use and thus promises to greatly accelerate discovery of many new metallic glasses. We believe that this discovery paradigm is applicable to a wider range of materials and should prove equally powerful for other materials and properties that are synthesis path–dependent and that current physiochemical theories find challenging to predict. PMID:29662953

  2. Accelerated discovery of metallic glasses through iteration of machine learning and high-throughput experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Fang; Ward, Logan; Williams, Travis

    With more than a hundred elements in the periodic table, a large number of potential new materials exist to address the technological and societal challenges we face today; however, without some guidance, searching through this vast combinatorial space is frustratingly slow and expensive, especially for materials strongly influenced by processing. We train a machine learning (ML) model on previously reported observations, parameters from physiochemical theories, and make it synthesis method–dependent to guide high-throughput (HiTp) experiments to find a new system of metallic glasses in the Co-V-Zr ternary. Experimental observations are in good agreement with the predictions of the model, but there are quantitative discrepancies in the precise compositions predicted. We use these discrepancies to retrain the ML model. The refined model has significantly improved accuracy not only for the Co-V-Zr system but also across all other available validation data. We then use the refined model to guide the discovery of metallic glasses in two additional previously unreported ternaries. Although our approach of iterative use of ML and HiTp experiments has guided us to rapid discovery of three new glass-forming systems, it has also provided us with a quantitatively accurate, synthesis method–sensitive predictor for metallic glasses that improves performance with use and thus promises to greatly accelerate discovery of many new metallic glasses. We believe that this discovery paradigm is applicable to a wider range of materials and should prove equally powerful for other materials and properties that are synthesis path–dependent and that current physiochemical theories find challenging to predict.

  3. A dual-docking microfluidic cell migration assay (D2-Chip) for testing neutrophil chemotaxis and the memory effect.

    PubMed

    Yang, Ke; Wu, Jiandong; Xu, Guoqing; Xie, Dongxue; Peretz-Soroka, Hagit; Santos, Susy; Alexander, Murray; Zhu, Ling; Zhang, Michael; Liu, Yong; Lin, Francis

    2017-04-18

    Chemotaxis is a classic mechanism for guiding cell migration and an important topic in both fundamental cell biology and health sciences. Neutrophils are a widely used model to study eukaryotic cell migration and neutrophil chemotaxis itself can lead to protective or harmful immune actions to the body. While much has been learnt from past research about how neutrophils effectively navigate through a chemoattractant gradient, many interesting questions remain unclear. For example, while it is tempting to model neutrophil chemotaxis using the well-established biased random walk theory, the experimental proof was challenged by the cell's highly persistent migrating nature. A special experimental design is required to test the key predictions from the random walk model. Another question that has interested the cell migration community for decades concerns the existence of chemotactic memory and its underlying mechanism. Although chemotactic memory has been suggested in various studies, a clear quantitative experimental demonstration will improve our understanding of the migratory memory effect. Motivated by these questions, we developed a microfluidic cell migration assay (so-called dual-docking chip or D2-Chip) that can test both the biased random walk model and the memory effect for neutrophil chemotaxis on a single chip enabled by multi-region gradient generation and dual-region cell alignment. Our results provide experimental support for the biased random walk model and chemotactic memory for neutrophil chemotaxis. Quantitative data analyses provide new insights into neutrophil chemotaxis and memory by making connections to entropic disorder, cell morphology and oscillating migratory response.
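    A minimal sketch of the biased random walk picture the assay is designed to test, where each step blends a random direction with the chemoattractant gradient direction. The bias value, step count, and chemotactic-index definition are illustrative assumptions, not the authors' analysis code.

    ```python
    import math
    import random

    def biased_random_walk(steps: int, bias: float) -> tuple:
        """2-D biased random walk: each step mixes a random unit vector with the
        chemoattractant gradient direction (+x), weighted by `bias` in [0, 1].
        Returns the final (x, y) position in units of step length."""
        x = y = 0.0
        for _ in range(steps):
            theta = random.uniform(0.0, 2.0 * math.pi)
            dx = (1.0 - bias) * math.cos(theta) + bias * 1.0   # gradient along +x
            dy = (1.0 - bias) * math.sin(theta)
            norm = math.hypot(dx, dy) or 1.0
            x += dx / norm
            y += dy / norm
        return x, y

    random.seed(0)
    n_cells, n_steps = 500, 200
    displacements = [biased_random_walk(n_steps, bias=0.2)[0] for _ in range(n_cells)]
    # Chemotactic index: mean displacement along the gradient per unit path length.
    print("mean chemotactic index:", round(sum(displacements) / (n_cells * n_steps), 3))
    ```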

  4. Comparison of the applicability domain of a quantitative structure-activity relationship for estrogenicity with a large chemical inventory.

    PubMed

    Netzeva, Tatiana I; Gallegos Saliner, Ana; Worth, Andrew P

    2006-05-01

    The aim of the present study was to illustrate that it is possible and relatively straightforward to compare the domain of applicability of a quantitative structure-activity relationship (QSAR) model in terms of its physicochemical descriptors with a large inventory of chemicals. A training set of 105 chemicals with data for relative estrogenic gene activation, obtained in a recombinant yeast assay, was used to develop the QSAR. A binary classification model for predicting active versus inactive chemicals was developed using classification tree analysis and two descriptors with a clear physicochemical meaning (octanol-water partition coefficient, or log Kow, and the number of hydrogen bond donors, or n(Hdon)). The model demonstrated a high overall accuracy (90.5%), with a sensitivity of 95.9% and a specificity of 78.1%. The robustness of the model was evaluated using the leave-many-out cross-validation technique, whereas the predictivity was assessed using an artificial external test set composed of 12 compounds. The domain of the QSAR training set was compared with the chemical space covered by the European Inventory of Existing Commercial Chemical Substances (EINECS), as incorporated in the CDB-EC software, in the log Kow / n(Hdon) plane. The results showed that the training set and, therefore, the applicability domain of the QSAR model covers a small part of the physicochemical domain of the inventory, even though a simple method for defining the applicability domain (ranges in the descriptor space) was used. However, a large number of compounds are located within the narrow descriptor window.
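    A minimal sketch of a descriptor-range applicability domain check combined with a simple two-descriptor classification rule in the log Kow / n(Hdon) plane. The thresholds and ranges below are invented for illustration and are not the tree or the domain published in the paper.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Chemical:
        name: str
        log_kow: float   # octanol-water partition coefficient
        n_hdon: int      # number of hydrogen-bond donors

    # Applicability domain: descriptor ranges spanned by the training set
    # (placeholder values, not the ranges from the paper).
    DOMAIN = {"log_kow": (0.5, 7.5), "n_hdon": (0, 4)}

    def in_domain(c: Chemical) -> bool:
        lo_k, hi_k = DOMAIN["log_kow"]
        lo_h, hi_h = DOMAIN["n_hdon"]
        return lo_k <= c.log_kow <= hi_k and lo_h <= c.n_hdon <= hi_h

    def predict_active(c: Chemical) -> str:
        """Hypothetical two-split classification rule for estrogenic activity."""
        if not in_domain(c):
            return "outside applicability domain"
        return "active" if (c.log_kow > 3.0 and c.n_hdon <= 2) else "inactive"

    for chem in [Chemical("A", 4.2, 1), Chemical("B", 1.1, 3), Chemical("C", 9.0, 0)]:
        print(chem.name, predict_active(chem))
    ```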

  5. Self-efficacy and health-related quality of life in family carers of people with dementia: a systematic review

    PubMed Central

    Crellin, Nadia E.; Orrell, Martin; McDermott, Orii; Charlesworth, Georgina

    2014-01-01

    Objectives: This review aims to explore the role of self-efficacy (SE) in the health-related quality of life (QoL) of family carers of people with dementia. Methods: A systematic review of literature identified a range of qualitative and quantitative studies. Search terms related to caring, SE, and dementia. Narrative synthesis was adopted to synthesise the findings. Results: Twenty-two studies met the full inclusion criteria, these included 17 quantitative, four qualitative, and one mixed-method study. A model describing the role of task/domain-specific SE beliefs in family carer health-related QoL was constructed. This model was informed by review findings and discussed in the context of existing conceptual models of carer adaptation and empirical research. Review findings offer support for the application of the SE theory to caring and for the two-factor view of carer appraisals and well-being. Findings do not support the independence of the negative and positive pathways. The review was valuable in highlighting methodological challenges confronting this area of research, particularly the conceptualisation and measurement issues surrounding both SE and health-related QoL. Conclusions: The model might have theoretical implications in guiding future research and advancing theoretical models of caring. It might also have clinical implications in facilitating the development of carer support services aimed at improving SE. The review highlights the need for future research, particularly longitudinal research, and further exploration of domain/task-specific SE beliefs, the influence of carer characteristics, and other mediating/moderating variables. PMID:24943873

  6. Accelerated discovery of metallic glasses through iteration of machine learning and high-throughput experiments

    DOE PAGES

    Ren, Fang; Ward, Logan; Williams, Travis; ...

    2018-04-01

    With more than a hundred elements in the periodic table, a large number of potential new materials exist to address the technological and societal challenges we face today; however, without some guidance, searching through this vast combinatorial space is frustratingly slow and expensive, especially for materials strongly influenced by processing. We train a machine learning (ML) model on previously reported observations, parameters from physiochemical theories, and make it synthesis method–dependent to guide high-throughput (HiTp) experiments to find a new system of metallic glasses in the Co-V-Zr ternary. Experimental observations are in good agreement with the predictions of the model, butmore » there are quantitative discrepancies in the precise compositions predicted. We use these discrepancies to retrain the ML model. The refined model has significantly improved accuracy not only for the Co-V-Zr system but also across all other available validation data. We then use the refined model to guide the discovery of metallic glasses in two additional previously unreported ternaries. Although our approach of iterative use of ML and HiTp experiments has guided us to rapid discovery of three new glass-forming systems, it has also provided us with a quantitatively accurate, synthesis method–sensitive predictor for metallic glasses that improves performance with use and thus promises to greatly accelerate discovery of many new metallic glasses. We believe that this discovery paradigm is applicable to a wider range of materials and should prove equally powerful for other materials and properties that are synthesis path–dependent and that current physiochemical theories find challenging to predict.« less

  7. "Don't Be a Whore, that's Not Ladylike": Discursive Discipline and Sorority Women's Gendered Subjectivity

    ERIC Educational Resources Information Center

    Berbary, Lisbeth A.

    2012-01-01

    While multiple and competing understandings of sororities exist in popular culture, academic research on sororities tends to homogenize the experience of sorority women, simplifying their existence to a quantitative understanding of specific behaviors such as those associated with binge drinking, eating disorders, and heterosexuality.…

  8. Continuous time Boolean modeling for biological signaling: application of Gillespie algorithm.

    PubMed

    Stoll, Gautier; Viara, Eric; Barillot, Emmanuel; Calzone, Laurence

    2012-08-29

    Mathematical modeling is used as a Systems Biology tool to answer biological questions, and more precisely, to validate a network that describes biological observations and predict the effect of perturbations. This article presents an algorithm for modeling biological networks in a discrete framework with continuous time. There exist two major types of mathematical modeling approaches: (1) quantitative modeling, representing various chemical species concentrations by real numbers, mainly based on differential equations and chemical kinetics formalism; and (2) qualitative modeling, representing chemical species concentrations or activities by a finite set of discrete values. Both approaches answer particular (and often different) biological questions. The qualitative modeling approach permits a simple and less detailed description of biological systems: it efficiently identifies stable states but remains inconvenient for describing the transient kinetics leading to these states. In this context, time is represented by discrete steps. Quantitative modeling, on the other hand, can describe more accurately the dynamical behavior of biological processes as it follows the evolution of concentrations or activities of chemical species as a function of time, but requires a large amount of parameter information that is difficult to find in the literature. Here, we propose a modeling framework based on a qualitative approach that is intrinsically continuous in time. The algorithm presented in this article fills the gap between qualitative and quantitative modeling. It is based on a continuous-time Markov process applied on a Boolean state space. In order to describe the temporal evolution of the biological process we wish to model, we explicitly specify the transition rates for each node. For that purpose, we built a language that can be seen as a generalization of Boolean equations. Mathematically, this approach can be translated into a set of ordinary differential equations on probability distributions. We developed C++ software, MaBoSS, that is able to simulate such a system by applying Kinetic Monte-Carlo (or the Gillespie algorithm) on the Boolean state space. This software, parallelized and optimized, computes the temporal evolution of probability distributions and estimates stationary distributions. Applications of the Boolean Kinetic Monte-Carlo are demonstrated for three qualitative models: a toy model, a published model of p53/Mdm2 interaction and a published model of the mammalian cell cycle. Our approach makes it possible to describe kinetic phenomena that were difficult to handle in the original models. In particular, transient effects are represented by time-dependent probability distributions, interpretable in terms of cell populations.
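    A minimal sketch of the core idea, kinetic Monte-Carlo (Gillespie) simulation on a Boolean state space, for a hypothetical two-node network. The nodes, transition rates, and update rules are invented for illustration; this is not the MaBoSS implementation.

    ```python
    import random

    def flip_rates(state: tuple) -> dict:
        """Rate at which each Boolean node flips, given the current state:
        A turns on slowly on its own and is switched off by B;
        B is switched on by A and decays otherwise (all rates invented)."""
        a, b = state
        rate_a = 0.5 if a == 0 else (2.0 if b == 1 else 0.1)
        rate_b = (1.5 if a == 1 else 0.05) if b == 0 else 0.7
        return {"A": rate_a, "B": rate_b}

    def gillespie_boolean(t_max: float, state=(0, 0), seed: int = 1) -> list:
        """Continuous-time trajectory over Boolean states via the Gillespie algorithm."""
        rng = random.Random(seed)
        t, trajectory = 0.0, [(0.0, state)]
        while t < t_max:
            rates = flip_rates(state)
            total = sum(rates.values())
            t += rng.expovariate(total)              # waiting time to the next flip
            r, cum = rng.uniform(0.0, total), 0.0
            for node, rate in rates.items():         # choose which node flips
                cum += rate
                if r <= cum:
                    a, b = state
                    state = (1 - a, b) if node == "A" else (a, 1 - b)
                    break
            trajectory.append((t, state))
        return trajectory

    for time, st in gillespie_boolean(t_max=5.0)[:6]:
        print(f"t = {time:5.2f}, state (A, B) = {st}")
    ```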

  9. The mathematics of cancer: integrating quantitative models.

    PubMed

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  10. Generalized PSF modeling for optimized quantitation in PET imaging.

    PubMed

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modeled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF modeling does not offer optimized PET quantitation, and that PSF overestimation may provide enhanced SUV quantitation. Furthermore, generalized PSF modeling may provide a valuable approach for quantitative tasks such as treatment-response assessment and prognostication.

  11. A comparative study of the constitutive models for silicon carbide

    NASA Astrophysics Data System (ADS)

    Ding, Jow-Lian; Dwivedi, Sunil; Gupta, Yogendra

    2001-06-01

    Most of the constitutive models for polycrystalline silicon carbide were developed and evaluated using data from either normal plate impact or Hopkinson bar experiments. At ISP, extensive efforts have been made to gain detailed insight into the shocked state of silicon carbide (SiC) using innovative experimental methods, viz., lateral stress measurements, in-material unloading measurements, and combined compression-shear experiments. The data obtained from these experiments provide some unique information for both developing and evaluating material models. In this study, these data for SiC were first used to evaluate some of the existing models to identify their strengths and possible deficiencies. Motivated by both the results of this comparative study and the experimental observations, an improved phenomenological model was developed. The model incorporates pressure dependence of strength, rate sensitivity, damage evolution under both tension and compression, pressure confinement effect on damage evolution, stiffness degradation due to damage, and pressure dependence of stiffness. The developed model is able to capture most of the material features observed experimentally, but more work is needed to better match the experimental data quantitatively.

  12. On the Formal-Logical Analysis of the Foundations of Mathematics Applied to Problems in Physics

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2016-03-01

    Analysis of the foundations of mathematics applied to problems in physics was proposed. The unity of formal logic and of rational dialectics is methodological basis of the analysis. It is shown that critical analysis of the concept of mathematical quantity - central concept of mathematics - leads to the following conclusion: (1) The concept of ``mathematical quantity'' is the result of the following mental operations: (a) abstraction of the ``quantitative determinacy of physical quantity'' from the ``physical quantity'' at that the ``quantitative determinacy of physical quantity'' is an independent object of thought; (b) abstraction of the ``amount (i.e., abstract number)'' from the ``quantitative determinacy of physical quantity'' at that the ``amount (i.e., abstract number)'' is an independent object of thought. In this case, unnamed, abstract numbers are the only sign of the ``mathematical quantity''. This sign is not an essential sign of the material objects. (2) The concept of mathematical quantity is meaningless, erroneous, and inadmissible concept in science because it represents the following formal-logical and dialectical-materialistic error: negation of the existence of the essential sign of the concept (i.e., negation of the existence of the essence of the concept) and negation of the existence of measure of material object.

  13. Quantitative ESD Guidelines for Charged Spacecraft Derived from the Physics of Discharges

    NASA Technical Reports Server (NTRS)

    Frederickson, A. R.

    1992-01-01

    Quantitative guidelines are proposed for Electrostatic Discharge (ESD) pulse shape on charged spacecraft. The guidelines are based on existing ground test data, and on a physical description of the pulsed discharge process. The guidelines are designed to predict pulse shape for surface charging and internal charging on a wide variety of spacecraft structures. The pulses depend on the area of the sample, its capacitance to ground, and the strength of the electric field in the vacuum adjacent to the charged surface. By knowing the pulse shape, current vs. time, one can determine if nearby circuits are threatened by the pulse. The quantitative guidelines might be used to estimate the level of threat to an existing spacecraft, or to redesign a spacecraft to reduce its pulses to a known safe level. The experiments which provide the data and the physics that allow one to interpret the data will be discussed, culminating in examples of how to predict pulse shape/size. This method has been used, but not confirmed, on several spacecraft.

  14. Lentivirus-mediated platelet gene therapy of murine hemophilia A with pre-existing anti-FVIII immunity

    PubMed Central

    Kuether, E. L.; Schroeder, J. A.; Fahs, S. A.; Cooley, B. C.; Chen, Y.; Montgomery, R. R.; Wilcox, D. A.; Shi, Q.

    2012-01-01

    Background: The development of inhibitory antibodies, referred to as inhibitors, against exogenous FVIII in a significant subset of patients with hemophilia A remains a persistent challenge to the efficacy of protein replacement therapy. Our previous studies using the transgenic approach provided proof-of-principle that platelet-specific expression could be successful for treating hemophilia A in the presence of inhibitory antibodies. Objective: To investigate a clinically translatable approach for platelet gene therapy of hemophilia A with pre-existing inhibitors. Methods: Platelet-FVIII expression in pre-immunized FVIII-null mice was introduced by transplantation of lentivirus-transduced bone marrow or enriched hematopoietic stem cells. FVIII expression was determined by a chromogenic assay. The transgene copy number per cell was quantitated by real-time PCR. Inhibitor titer was measured by Bethesda assay. Phenotypic correction was assessed by the tail clipping assay and an electrolytic-induced venous injury model. Integration sites were analyzed by LAM-PCR. Results: Therapeutic levels of platelet-FVIII expression were sustained long-term without evoking an anti-FVIII memory response in the transduced pre-immunized recipients. The tail clip survival test and the electrolytic injury model confirmed that hemostasis was improved in the treated animals. Sequential bone marrow transplants showed sustained platelet-FVIII expression resulting in phenotypic correction in pre-immunized secondary and tertiary recipients. Conclusions: Lentivirus-mediated platelet-specific gene transfer improves hemostasis in hemophilia A mice with pre-existing inhibitors, indicating that this approach may be a promising strategy for gene therapy of hemophilia A even in the high-risk setting of pre-existing inhibitory antibodies. PMID:22632092

  15. Application of scenario analysis and multiagent technique in land-use planning: a case study on Sanjiang wetlands.

    PubMed

    Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values, in which two key questions will be faced: one is how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning work and to find out a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework of land-use planning based on scenario analysis (SA) method and multiagent system (MAS) simulation integration and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing a quantitative process, effectively compensated for the SA method, emphasizing a qualitative process, which realized the organic combination of qualitative and quantitative land-use planning work and provided a new idea and method for land-use planning and sustainable management of land resources.

  16. Application of Scenario Analysis and Multiagent Technique in Land-Use Planning: A Case Study on Sanjiang Wetlands

    PubMed Central

    Ni, Shi-Jun; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong

    2013-01-01

    Land-use planning has triggered debates on social and environmental values, in which two key questions will be faced: one is how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning work and to find out a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework of land-use planning based on scenario analysis (SA) method and multiagent system (MAS) simulation integration and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing a quantitative process, effectively compensated for the SA method, emphasizing a qualitative process, which realized the organic combination of qualitative and quantitative land-use planning work and provided a new idea and method for land-use planning and sustainable management of land resources. PMID:23818816

  17. Cochrane Qualitative and Implementation Methods Group guidance series-paper 4: methods for assessing evidence on intervention implementation.

    PubMed

    Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane

    2018-05-01

    This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Interrelation of structure and operational states in cascading failure of overloading lines in power grids

    NASA Astrophysics Data System (ADS)

    Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying

    2017-09-01

    As the modern power system evolves toward a more intelligent and efficient form, i.e. the smart grid, or toward serving as the central backbone of an energy internet for free energy interactions, security concerns related to cascading failures have been raised because of their potentially catastrophic consequences. Research on topological analysis based on complex networks has made great contributions to revealing structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, built on inappropriate modeling assumptions, still cannot distinguish the effects of structure from those of the operational state, and so gives little meaningful guidance for system operation. This paper reveals the interrelation between network structure and operational states in cascading failure and gives a quantitative evaluation by integrating both perspectives. For structure analysis, cascading paths are identified by extended betweenness and quantitatively described by cascading drop and cascading gradient. The operational state along cascading paths is described by loading level. The risk of cascading failure along a specific cascading path can then be quantitatively evaluated considering these two factors. The maximum cascading gradient over all possible cascading paths can be used as an overall metric to evaluate the entire power grid for its features related to cascading failure. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; the simulation evidence presented in this paper suggests that the proposed model can identify the structural causes of cascading failure and is promising for guiding the protection of system operation in the future.

  19. Quantitative identification of nitrate pollution sources and uncertainty analysis based on dual isotope approach in an agricultural watershed.

    PubMed

    Ji, Xiaoliang; Xie, Runting; Hao, Yun; Lu, Jun

    2017-10-01

    Quantitative identification of nitrate (NO₃⁻-N) sources is critical to the control of nonpoint source nitrogen pollution in an agricultural watershed. Combined with water quality monitoring, we adopted environmental isotope (δD-H₂O, δ¹⁸O-H₂O, δ¹⁵N-NO₃⁻, and δ¹⁸O-NO₃⁻) analysis and a Markov Chain Monte Carlo (MCMC) mixing model to determine the proportions of riverine NO₃⁻-N inputs from four potential NO₃⁻-N sources, namely, atmospheric deposition (AD), chemical nitrogen fertilizer (NF), soil nitrogen (SN), and manure and sewage (M&S), in the ChangLe River watershed of eastern China. Results showed that NO₃⁻-N was the main form of nitrogen in this watershed, accounting for approximately 74% of the total nitrogen concentration. A strong hydraulic interaction existed between the surface water and groundwater for NO₃⁻-N pollution. The variations of the isotopic composition in NO₃⁻-N suggested that microbial nitrification was the dominant nitrogen transformation process in surface water, whereas significant denitrification was observed in groundwater. MCMC mixing model outputs revealed that M&S was the predominant contributor to riverine NO₃⁻-N pollution (contributing 41.8% on average), followed by SN (34.0%), NF (21.9%), and AD (2.3%) sources. Finally, we constructed an uncertainty index, UI₉₀, to quantitatively characterize the uncertainties inherent in NO₃⁻-N source apportionment and discussed the reasons behind the uncertainties. Copyright © 2017 Elsevier Ltd. All rights reserved.
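    A minimal sketch of a dual-isotope mixing calculation in the same spirit, using crude rejection sampling over source proportions as a stand-in for the full MCMC mixing model. The isotopic signatures, tolerances, and river value are invented placeholders, not data from the study.

    ```python
    import random

    # Hypothetical (d15N, d18O) signatures (per mil) of the four sources and of
    # river nitrate; the values below are placeholders for illustration only.
    sources = {
        "atmospheric deposition": (2.0, 60.0),
        "nitrogen fertilizer":    (0.0, 22.0),
        "soil nitrogen":          (5.0,  5.0),
        "manure & sewage":        (12.0, 8.0),
    }
    mixture = (7.5, 10.0)
    names = list(sources)

    def random_proportions(rng: random.Random) -> list:
        """Draw a random point on the 4-source simplex (flat Dirichlet)."""
        w = [rng.expovariate(1.0) for _ in names]
        total = sum(w)
        return [x / total for x in w]

    def predicted(props: list) -> tuple:
        """Isotopic signature of the mixture implied by given source proportions."""
        return tuple(sum(p * sources[n][i] for p, n in zip(props, names)) for i in (0, 1))

    # Rejection sampling: keep proportion vectors whose mixed signature falls
    # close to the observed river signature (a crude stand-in for full MCMC).
    rng, kept = random.Random(42), []
    while len(kept) < 500:
        props = random_proportions(rng)
        d15n, d18o = predicted(props)
        if abs(d15n - mixture[0]) < 0.5 and abs(d18o - mixture[1]) < 2.0:
            kept.append(props)

    for i, name in enumerate(names):
        mean = sum(p[i] for p in kept) / len(kept)
        print(f"{name}: {mean:.1%}")
    ```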

  20. Quantitative Fluorescence Studies in Living Cells: Extending Fluorescence Fluctuation Spectroscopy to Peripheral Membrane Proteins

    NASA Astrophysics Data System (ADS)

    Smith, Elizabeth Myhra

    The interactions of peripheral membrane proteins with both membrane lipids and proteins are vital for many cellular processes including membrane trafficking, cellular signaling, and cell growth/regulation. Building accurate biophysical models of these processes requires quantitative characterization of the behavior of peripheral membrane proteins, yet methods to quantify their interactions inside living cells are very limited. Because peripheral membrane proteins usually exist both in membrane-bound and cytoplasmic forms, the separation of these two populations is a key challenge. This thesis aims at addressing this challenge by extending fluorescence fluctuation spectroscopy (FFS) to simultaneously measure the oligomeric state of peripheral membrane proteins in the cytoplasm and at the plasma membrane. We developed a new method based on z-scan FFS that accounts for the fluorescence contributions from cytoplasmic and membrane layers by incorporating a fluorescence intensity z-scan through the cell. H-Ras-EGFP served as a model system to demonstrate the feasibility of the technique. The resolvability and stability of z-scanning was determined as well as the oligomeric state of H-Ras-EGFP at the plasma membrane and in the cytoplasm. Further, we successfully characterized the binding affinity of a variety of proteins to the plasma membrane by quantitative analysis of the z-scan fluorescence intensity profile. This analysis method, which we refer to as z-scan fluorescence profile deconvolution, was further used in combination with dual-color competition studies to determine the lipid specificity of protein binding. Finally, we applied z-scan FFS to provide insight into the early assembly steps of the HTLV-1 retrovirus.

  1. A Second-Generation Device for Automated Training and Quantitative Behavior Analyses of Molecularly-Tractable Model Organisms

    PubMed Central

    Blackiston, Douglas; Shomrat, Tal; Nicolas, Cindy L.; Granata, Christopher; Levin, Michael

    2010-01-01

    A deep understanding of cognitive processes requires functional, quantitative analyses of the steps leading from genetics and the development of nervous system structure to behavior. Molecularly-tractable model systems such as Xenopus laevis and planaria offer an unprecedented opportunity to dissect the mechanisms determining the complex structure of the brain and CNS. A standardized platform that facilitated quantitative analysis of behavior would make a significant impact on evolutionary ethology, neuropharmacology, and cognitive science. While some animal tracking systems exist, the available systems do not allow automated training (feedback to individual subjects in real time, which is necessary for operant conditioning assays). The lack of standardization in the field, and the numerous technical challenges that face the development of a versatile system with the necessary capabilities, comprise a significant barrier keeping molecular developmental biology labs from integrating behavior analysis endpoints into their pharmacological and genetic perturbations. Here we report the development of a second-generation system that is a highly flexible, powerful machine vision and environmental control platform. In order to enable multidisciplinary studies aimed at understanding the roles of genes in brain function and behavior, and aid other laboratories that do not have the facilities to undergo complex engineering development, we describe the device and the problems that it overcomes. We also present sample data using frog tadpoles and flatworms to illustrate its use. Having solved significant engineering challenges in its construction, the resulting design is a relatively inexpensive instrument of wide relevance for several fields, and will accelerate interdisciplinary discovery in pharmacology, neurobiology, regenerative medicine, and cognitive science. PMID:21179424

  2. Quantitative structure-property relationship (correlation analysis) of phosphonic acid-based chelates in design of MRI contrast agent.

    PubMed

    Tiwari, Anjani K; Ojha, Himanshu; Kaul, Ankur; Dutta, Anupama; Srivastava, Pooja; Shukla, Gauri; Srivastava, Rakesh; Mishra, Anil K

    2009-07-01

    Nuclear magnetic resonance imaging is a very useful tool in modern medical diagnostics, especially when gadolinium(III)-based contrast agents are administered to the patient with the aim of increasing the image contrast between normal and diseased tissues. Using soft modelling techniques such as quantitative structure-activity relationship/quantitative structure-property relationship analysis, after a suitable description of their molecular structure, we have studied a series of phosphonic acids for designing a new MRI contrast agent. Quantitative structure-property relationship studies with multiple linear regression analysis were applied to find correlations between different calculated molecular descriptors of the phosphonic acid-based chelating agents and their stability constants. The final quantitative structure-property relationship models for the phosphonic acid series were: Model 1, log K(ML) = 5.00243(±0.7102) − 0.0263(±0.540)·MR, with n = 12, |r| = 0.942, s = 0.183, F = 99.165; and Model 2, log K(ML) = 5.06280(±0.3418) − 0.0252(±0.198)·MR, with n = 12, |r| = 0.956, s = 0.186, F = 99.256.
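
    A minimal sketch of the regression underlying such a single-descriptor QSPR model follows: fitting log K(ML) against molar refractivity (MR) by ordinary least squares and reporting n, |r|, s, and F. The MR and log K values below are synthetic placeholders, not the phosphonic acid data set from the paper.

```python
# Sketch of a one-descriptor QSPR fit: log K(ML) vs molar refractivity (MR).
import numpy as np
from scipy import stats

MR = np.array([10.2, 15.4, 20.1, 25.7, 30.3, 35.8, 40.2, 45.9, 50.5, 55.1, 60.7, 65.3])
logK = 5.0 - 0.026 * MR + np.random.default_rng(2).normal(0, 0.15, MR.size)  # synthetic data

res = stats.linregress(MR, logK)
n = MR.size
pred = res.intercept + res.slope * MR
s = np.sqrt(np.sum((logK - pred) ** 2) / (n - 2))        # standard error of estimate
F = res.rvalue ** 2 * (n - 2) / (1 - res.rvalue ** 2)    # F statistic for one predictor

print(f"log K(ML) = {res.intercept:.4f} + ({res.slope:.4f})*MR")
print(f"n = {n}, |r| = {abs(res.rvalue):.3f}, s = {s:.3f}, F = {F:.2f}")
```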

  3. Coupling between geochemical reactions and multicomponent gas and solute transport in unsaturated media: A reactive transport modeling study

    USGS Publications Warehouse

    Molins, S.; Mayer, K.U.

    2007-01-01

    The two-way coupling that exists between biogeochemical reactions and vadose zone transport processes, in particular gas phase transport, determines the composition of soil gas. To explore these feedback processes quantitatively, multicomponent gas diffusion and advection are implemented into an existing reactive transport model that includes a full suite of geochemical reactions. Multicomponent gas diffusion is described on the basis of the dusty gas model, which accounts for all relevant gas diffusion mechanisms. Simulations of gas attenuation in partially saturated landfill soil covers, of methane production and oxidation in aquifers contaminated by organic compounds (e.g., an oil spill site), and of pyrite oxidation in mine tailings demonstrate that both diffusive and advective gas transport can be affected by geochemical reactions. Methane oxidation in landfill covers reduces the existing upward pressure gradient, thereby decreasing the contribution of advective methane emissions to the atmosphere and enhancing the net flux of atmospheric oxygen into the soil column. At an oil spill site, methane oxidation causes a reversal in the direction of gas advection, which results in advective transport toward the zone of oxidation both from the ground surface and from the deeper zone of methane production. Both diffusion and advection contribute to supplying atmospheric oxygen to the subsurface, and methane emissions to the atmosphere are averted. During pyrite oxidation in mine tailings, pressure reduction in the reaction zone drives advective gas flow into the sediment column, enhancing the oxidation process. In carbonate-rich mine tailings, calcite dissolution releases carbon dioxide, which partly offsets the pressure reduction caused by O2 consumption.
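
    To illustrate one ingredient of gas diffusion in unsaturated media, the sketch below combines molecular and Knudsen diffusivities (a Bosanquet-style approximation) with a Millington-Quirk tortuosity factor and evaluates a simple Fickian flux. The full dusty gas model used in the study couples the fluxes of all gas species; the parameter values here are illustrative only.

```python
# Sketch: effective gas diffusivity in partially saturated porous media.
# Simplified Bosanquet combination, not the full dusty gas model; values illustrative.
import numpy as np

R = 8.314            # gas constant, J mol-1 K-1
T = 283.0            # temperature, K

def knudsen_diffusivity(pore_radius, molar_mass):
    """Knudsen diffusivity D_K = (2/3) r sqrt(8RT / (pi M)), in m2 s-1."""
    return (2.0 / 3.0) * pore_radius * np.sqrt(8 * R * T / (np.pi * molar_mass))

def effective_diffusivity(D_molecular, pore_radius, molar_mass, porosity, gas_saturation):
    """Bosanquet combination scaled by a Millington-Quirk tortuosity factor."""
    D_k = knudsen_diffusivity(pore_radius, molar_mass)
    D_comb = 1.0 / (1.0 / D_molecular + 1.0 / D_k)
    theta_g = porosity * gas_saturation                      # air-filled porosity
    tau = theta_g ** (10.0 / 3.0) / porosity ** 2            # Millington-Quirk factor
    return tau * D_comb

# Example: O2 diffusing through a 1 m soil cover (illustrative numbers).
D_eff = effective_diffusivity(D_molecular=2.0e-5, pore_radius=1e-6,
                              molar_mass=0.032, porosity=0.4, gas_saturation=0.5)
dC_dz = (0.0 - 8.6) / 1.0     # O2 drops from ~8.6 mol m-3 at the surface to 0 at 1 m depth
flux = -D_eff * dC_dz         # Fick's law: positive flux directed downward into the soil
print(f"D_eff = {D_eff:.2e} m2/s, O2 flux = {flux:.2e} mol m-2 s-1")
```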

  4. Redd Site Selection and Spawning Habitat Use by Fall Chinook Salmon: The Importance of Geomorphic Features in Large Rivers

    PubMed

    Geist; Dauble

    1998-09-01

    Knowledge of the three-dimensional connectivity between rivers and groundwater within the hyporheic zone can be used to improve the definition of fall chinook salmon (Oncorhynchus tshawytscha) spawning habitat. Information exists on the microhabitat characteristics that define suitable salmon spawning habitat. However, traditional spawning habitat models that use these characteristics to predict available spawning habitat are restricted because they cannot account for the heterogeneous nature of rivers. We present a conceptual spawning habitat model for fall chinook salmon that describes how geomorphic features of river channels create hydraulic processes, including hyporheic flows, that influence where salmon spawn in unconstrained reaches of large mainstem alluvial rivers. Two case studies based on empirical data from fall chinook salmon spawning areas in the Hanford Reach of the Columbia River are presented to illustrate important aspects of our conceptual model. We suggest that traditional habitat models and our conceptual model be combined to predict the limits of suitable fall chinook salmon spawning habitat. This approach can incorporate quantitative measures of river channel morphology, including general descriptors of geomorphic features at different spatial scales, in order to understand the processes influencing redd site selection and spawning habitat use. This information is needed in order to protect existing salmon spawning habitat in large rivers, as well as to recover habitat already lost. KEY WORDS: Hyporheic zone; Geomorphology; Spawning habitat; Large rivers; Fall chinook salmon; Habitat management

  5. Assessing competencies: an evaluation of ASTD's Certified Professional in Learning and Performance (CPLP) designation.

    PubMed

    Kwon, Seolim; Wadholm, Robert R; Carmody, Laurie E

    2014-06-01

    The American Society for Training and Development's (ASTD) Certified Professional in Learning and Performance (CPLP) program is purported to be based on the ASTD's competency model, a model which outlines foundational competencies, roles, and areas of expertise in the field of training and performance improvement. This study seeks to uncover the relationship between the competency model and the CPLP knowledge exam questions and work product submissions (two of the major instruments used to test for competency of CPLP applicants). A mixed qualitative-quantitative approach is used to identify themes, quantify relationships, and assess questions and guidelines. Multiple raters independently analyzed the data and identified key themes, and Fleiss' kappa coefficient was used to measure inter-rater agreement. The study concludes that several discrepancies exist between the competency model and the knowledge exam and work product submission guidelines. Recommendations are given for possible improvement of the CPLP program. Copyright © 2014 Elsevier Ltd. All rights reserved.
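
    A minimal sketch of the Fleiss' kappa statistic used for inter-rater agreement is given below; the rating matrix is hypothetical.

```python
# Sketch: Fleiss' kappa for agreement among multiple raters on categorical ratings.
import numpy as np

def fleiss_kappa(counts):
    """counts[i, j] = number of raters assigning item i to category j."""
    counts = np.asarray(counts, dtype=float)
    n_items, _ = counts.shape
    n_raters = counts.sum(axis=1)[0]                         # assumes equal raters per item
    p_j = counts.sum(axis=0) / (n_items * n_raters)          # category proportions
    P_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar = P_i.mean()                                       # observed agreement
    P_e = np.sum(p_j ** 2)                                   # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# 5 items rated by 4 raters into 3 thematic categories (hypothetical data).
ratings = [[4, 0, 0],
           [2, 2, 0],
           [0, 3, 1],
           [1, 1, 2],
           [0, 0, 4]]
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.3f}")
```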

  6. Merging spatially variant physical process models under an optimized systems dynamics framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cain, William O.; Lowry, Thomas Stephen; Pierce, Suzanne A.

    The complexity of water resource issues, their interconnectedness with other systems, and the involvement of competing stakeholders often overwhelm decision-makers and inhibit the creation of clear management strategies. While a range of modeling tools and procedures exist to address these problems, they tend to be case specific and generally emphasize either a quantitative, overly analytic approach or a qualitative, dialogue-based approach that lacks the ability to fully explore the consequences of different policy decisions. The integration of these two approaches is needed to drive toward final decisions and engender effective outcomes. Given these limitations, the Computer Assisted Dispute Resolution system (CADRe) was developed to aid in stakeholder-inclusive resource planning. This modeling and negotiation system uniquely addresses resource concerns by developing a spatially varying system dynamics model as well as innovative global optimization search techniques to maximize outcomes from participatory dialogues. Ultimately, the core system architecture of CADRe also serves as the cornerstone upon which key scientific innovations and challenges can be addressed.

  7. Measures of model performance based on the log accuracy ratio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven Karl; Brito, Thiago Vasconcelos; Welling, Daniel T.

    Quantitative assessment of modeling and forecasting of continuous quantities uses a variety of approaches. We review existing literature describing metrics for forecast accuracy and bias, concentrating on those based on relative errors and percentage errors. Of these accuracy metrics, the mean absolute percentage error (MAPE) is one of the most common across many fields and has been widely applied in recent space science literature; we highlight the benefits and drawbacks of MAPE and of proposed alternatives. We then introduce the log accuracy ratio and derive from it two metrics: the median symmetric accuracy and the symmetric signed percentage bias. Robust methods for estimating the spread of a multiplicative linear model using the log accuracy ratio are also presented. The developed metrics are shown to be easy to interpret, robust, and to mitigate the key drawbacks of their more widely used counterparts based on relative errors and percentage errors. Their use is illustrated with radiation belt electron flux modeling examples.
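
    The sketch below implements the metrics discussed in this abstract, using their standard published definitions: MAPE, the median symmetric accuracy, and the symmetric signed percentage bias derived from the log accuracy ratio. The observation and prediction arrays are illustrative.

```python
# Sketch: accuracy and bias metrics based on percentage errors and the log accuracy ratio.
import numpy as np

def mape(obs, pred):
    """Mean absolute percentage error (%)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 100.0 * np.mean(np.abs((pred - obs) / obs))

def median_symmetric_accuracy(obs, pred):
    """100 * (exp(median(|ln(pred/obs)|)) - 1), in %."""
    Q = np.log(np.asarray(pred, float) / np.asarray(obs, float))
    return 100.0 * (np.exp(np.median(np.abs(Q))) - 1.0)

def symmetric_signed_percentage_bias(obs, pred):
    """100 * sign(median(ln(pred/obs))) * (exp(|median(ln(pred/obs))|) - 1), in %."""
    Q = np.log(np.asarray(pred, float) / np.asarray(obs, float))
    m = np.median(Q)
    return 100.0 * np.sign(m) * (np.exp(np.abs(m)) - 1.0)

obs = np.array([1.0, 2.0, 5.0, 10.0, 50.0])        # e.g. observed electron fluxes
pred = np.array([1.2, 1.5, 6.0, 12.0, 30.0])       # model predictions
print(f"MAPE                 = {mape(obs, pred):6.1f} %")
print(f"Median sym. accuracy = {median_symmetric_accuracy(obs, pred):6.1f} %")
print(f"Sym. signed % bias   = {symmetric_signed_percentage_bias(obs, pred):+6.1f} %")
```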

  8. Characterizing Space Environments with Long-Term Space Plasma Archive Resources

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Miller, J. Scott; Diekmann, Anne M.; Parker, Linda N.

    2009-01-01

    A significant scientific benefit of establishing and maintaining long-term space plasma data archives is the ready access the archives afford to resources required for characterizing spacecraft design environments. Space systems must be capable of operating in the mean environments driven by climatology as well as the extremes that occur during individual space weather events. Long-term time series are necessary to obtain quantitative information on environment variability and extremes that characterize the mean and worst-case environments that may be encountered during a mission. In addition, analysis of large data sets is important to scientific studies of flux-limiting processes that provide a basis for establishing upper limits to environment specifications used in radiation or charging analyses. We present applications using data from existing archives and highlight their contributions to space environment models developed at Marshall Space Flight Center, including the Chandra Radiation Model, ionospheric plasma variability models, and plasma models of the L2 space environment.

  9. Functional mapping of reaction norms to multiple environmental signals through nonparametric covariance estimation

    PubMed Central

    2011-01-01

    Background The identification of genes or quantitative trait loci that are expressed in response to different environmental factors such as temperature and light, through functional mapping, critically relies on precise modeling of the covariance structure. Previous work used separable parametric covariance structures, such as a Kronecker product of autoregressive one [AR(1)] matrices, that do not account for interaction effects of different environmental factors. Results We implement a more robust nonparametric covariance estimator to model these interactions within the framework of functional mapping of reaction norms to two signals. Our results from Monte Carlo simulations show that this estimator can be useful in modeling interactions that exist between two environmental signals. The interactions are simulated using nonseparable covariance models with spatio-temporal structural forms that mimic interaction effects. Conclusions The nonparametric covariance estimator has an advantage over separable parametric covariance estimators in the detection of QTL location, thus extending the breadth of use of functional mapping in practical settings. PMID:21269481
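
    As a rough illustration of the covariance-modelling issue described above, the sketch below builds a separable Kronecker product of AR(1) correlation matrices, simulates data from a nonseparable covariance, and compares both against a plain sample-covariance (nonparametric) estimate. The paper's nonparametric estimator is more sophisticated; dimensions and parameters here are arbitrary.

```python
# Sketch: separable Kronecker-of-AR(1) covariance vs a nonparametric (sample) estimate.
import numpy as np

def ar1_corr(n, rho):
    """n x n AR(1) correlation matrix: corr(i, j) = rho^|i-j|."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

n1, n2 = 5, 4                                                 # levels of the two signals
Sigma_sep = np.kron(ar1_corr(n1, 0.7), ar1_corr(n2, 0.5))     # separable model (no interaction)

# Simulate phenotypes from a NON-separable covariance and compare estimators.
rng = np.random.default_rng(3)
A = rng.normal(size=(n1 * n2, n1 * n2))
Sigma_true = A @ A.T / (n1 * n2) + np.eye(n1 * n2)            # generic nonseparable covariance
y = rng.multivariate_normal(np.zeros(n1 * n2), Sigma_true, size=500)
Sigma_hat = np.cov(y, rowvar=False)                           # nonparametric estimate

print(f"Frobenius error, separable AR(1)xAR(1): {np.linalg.norm(Sigma_true - Sigma_sep):.2f}")
print(f"Frobenius error, sample covariance:     {np.linalg.norm(Sigma_true - Sigma_hat):.2f}")
```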

  10. The effects of nutrition labeling on consumer food choice: a psychological experiment and computational model.

    PubMed

    Helfer, Peter; Shultz, Thomas R

    2014-12-01

    The widespread availability of calorie-dense food is believed to be a contributing cause of an epidemic of obesity and associated diseases throughout the world. One possible countermeasure is to empower consumers to make healthier food choices with useful nutrition labeling. An important part of this endeavor is to determine the usability of existing and proposed labeling schemes. Here, we report an experiment on how four different labeling schemes affect the speed and nutritional value of food choices. We then apply decision field theory, a leading computational model of human decision making, to simulate the experimental results. The psychology experiment shows that quantitative, single-attribute labeling schemes have greater usability than multiattribute and binary ones, and that they remain effective under moderate time pressure. The computational model simulates these psychological results and provides explanatory insights into them. This work shows how experimental psychology and computational modeling can contribute to the evaluation and improvement of nutrition-labeling schemes. © 2014 New York Academy of Sciences.
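
    A minimal sketch of a decision-field-theory style simulation for a two-option food choice is shown below; it is not the authors' implementation, and the attribute values, attention probabilities, and thresholds are illustrative assumptions. A label is assumed to act by shifting attention toward the health attribute.

```python
# Sketch: decision field theory simulation of a two-option food choice.
import numpy as np

rng = np.random.default_rng(4)

M = np.array([[0.9, 0.2],        # option A: tasty, not healthy (attributes: taste, health)
              [0.4, 0.8]])       # option B: less tasty, healthy
C = np.array([[1.0, -1.0],       # contrast matrix: compare each option against the other
              [-1.0, 1.0]])
S = 0.95 * np.eye(2)             # feedback matrix (preference memory/decay)
threshold = 2.0

def simulate_choice(p_attend_health, max_steps=2000, noise=0.05):
    """Return (chosen option index, number of deliberation steps)."""
    P = np.zeros(2)
    for t in range(1, max_steps + 1):
        attend_health = rng.random() < p_attend_health
        W = np.array([0.0, 1.0]) if attend_health else np.array([1.0, 0.0])
        P = S @ P + C @ M @ W + rng.normal(0, noise, 2)
        if P.max() >= threshold:
            return int(np.argmax(P)), t
    return int(np.argmax(P)), max_steps

for label, p_health in [("no label", 0.3), ("quantitative label", 0.6)]:
    results = [simulate_choice(p_health) for _ in range(2000)]
    healthy_rate = np.mean([c == 1 for c, _ in results])
    mean_steps = np.mean([t for _, t in results])
    print(f"{label:20s}: healthy choice rate = {healthy_rate:.2f}, mean steps = {mean_steps:.0f}")
```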

  11. Measures of model performance based on the log accuracy ratio

    DOE PAGES

    Morley, Steven Karl; Brito, Thiago Vasconcelos; Welling, Daniel T.

    2018-01-03

    Quantitative assessment of modeling and forecasting of continuous quantities uses a variety of approaches. We review existing literature describing metrics for forecast accuracy and bias, concentrating on those based on relative errors and percentage errors. Of these accuracy metrics, the mean absolute percentage error (MAPE) is one of the most common across many fields and has been widely applied in recent space science literature; we highlight the benefits and drawbacks of MAPE and of proposed alternatives. We then introduce the log accuracy ratio and derive from it two metrics: the median symmetric accuracy and the symmetric signed percentage bias. Robust methods for estimating the spread of a multiplicative linear model using the log accuracy ratio are also presented. The developed metrics are shown to be easy to interpret, robust, and to mitigate the key drawbacks of their more widely used counterparts based on relative errors and percentage errors. Their use is illustrated with radiation belt electron flux modeling examples.

  12. Modelling the interdependence between the stoichiometry of receptor oligomerization and ligand binding for a coexisting dimer/tetramer receptor system.

    PubMed

    Rovira, X; Vivó, M; Serra, J; Roche, D; Strange, P G; Giraldo, J

    2009-01-01

    Many G protein-coupled receptors have been shown to exist as oligomers, but the oligomerization state and the effects of this on receptor function are unclear. For some G protein-coupled receptors, in ligand binding assays, different radioligands provide different maximal binding capacities. Here we have developed mathematical models for co-expressed dimeric and tetrameric species of receptors. We have considered models where the dimers and tetramers are in equilibrium and where they do not interconvert and we have also considered the potential influence of the ligands on the degree of oligomerization. By analogy with agonist efficacy, we have considered ligands that promote, inhibit or have no effect on oligomerization. Cell surface receptor expression and the intrinsic capacity of receptors to oligomerize are quantitative parameters of the equations. The models can account for differences in the maximal binding capacities of radioligands in different preparations of receptors and provide a conceptual framework for simulation and data fitting in complex oligomeric receptor situations.
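
    As a rough numerical illustration of how a dimer/tetramer mixture can produce different maximal binding capacities for different radioligands, the sketch below assigns each ligand a hypothetical number of accessible sites per oligomer and computes simple hyperbolic saturation curves. The stoichiometries and affinities are assumptions, not the model parameters derived in the paper.

```python
# Sketch: maximal binding capacities for two radioligands over a fixed dimer/tetramer mix.
# Site stoichiometries and Kd values are illustrative assumptions.
import numpy as np

R_total = 1000.0                               # total protomers (arbitrary units)
frac_dimer = 0.6                               # fraction of protomers assembled as dimers
n_dimer = frac_dimer * R_total / 2.0           # number of dimers
n_tetra = (1.0 - frac_dimer) * R_total / 4.0   # number of tetramers

# Accessible binding sites per oligomer for each radioligand (assumed for illustration).
sites = {"radioligand 1": {"dimer": 1, "tetramer": 1},
         "radioligand 2": {"dimer": 1, "tetramer": 2}}
Kd = {"radioligand 1": 2.0, "radioligand 2": 5.0}    # nM, illustrative

L = np.logspace(-1, 2, 50)                     # free ligand concentration (nM)
for lig, s in sites.items():
    Bmax = n_dimer * s["dimer"] + n_tetra * s["tetramer"]
    bound = Bmax * L / (Kd[lig] + L)           # simple hyperbolic saturation per ligand
    print(f"{lig}: Bmax = {Bmax:.0f}, bound at 100 nM = {bound[-1]:.0f}")
```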

  13. Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Samuel; Baker, Gavin Matthew; Gamell, Marc

    2015-10-01

    Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of these, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor sufficiently developed for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion, and UINTAH) to examine the feasibility of these models for inserting new programming model elements into an existing code base.

  14. Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.

    PubMed

    Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth

    2017-03-01

    Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated are interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is: is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering, and uses it for re-engineering the system to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model, which gives the rates of biochemical reactions, to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models, built using the results of reverse engineering, opens up the possibility of harnessing the power pack of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house. C++ code templates are available in the Supplementary material. Contact: liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
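
    One concrete step in such a re-engineering pipeline is converting a population-level rate constant from the reverse-engineered model into a per-agent rule. The sketch below uses the standard rate-to-probability conversion p = 1 − exp(−k·Δt) and checks the agent-based half-activation time against the ODE prediction; the rate constant and agent counts are illustrative, not the EGFR-HER2 parameters.

```python
# Sketch: turning a first-order rate constant into a per-agent reaction rule.
import numpy as np

rng = np.random.default_rng(5)

k = 0.1                                   # per-second first-order activation rate (illustrative)
dt = 0.01                                 # agent-based time step (s)
p_react = 1.0 - np.exp(-k * dt)           # probability an idle agent reacts in one step

n_receptors = 10000
active = np.zeros(n_receptors, dtype=bool)

t_half = None
for step in range(1, 200000):
    idle = ~active
    active[idle] = rng.random(idle.sum()) < p_react      # apply the per-agent rule
    if active.mean() >= 0.5:
        t_half = step * dt
        break

print(f"agent-based half-activation time: {t_half:.2f} s "
      f"(ODE prediction ln(2)/k = {np.log(2) / k:.2f} s)")
```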

  15. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    PubMed

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg.
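
    As a schematic illustration of comparing quantitative localization signatures, the sketch below represents each protein as a vector of compartment-similarity scores and correlates these vectors between proteins. The feature construction and similarity measures in PLAST are considerably richer; the signatures here are made-up values.

```python
# Sketch: pairwise similarity of quantitative protein localization signatures.
import numpy as np

compartments = ["nucleus", "ER", "mitochondrion", "plasma membrane", "cytosol"]
signatures = {                                   # hypothetical per-compartment scores
    "protein A": np.array([0.10, 0.70, 0.10, 0.05, 0.05]),
    "protein B": np.array([0.15, 0.60, 0.10, 0.10, 0.05]),
    "protein C": np.array([0.80, 0.05, 0.05, 0.05, 0.05]),
}

def signature_similarity(a, b):
    """Pearson correlation between two localization signatures."""
    return np.corrcoef(a, b)[0, 1]

names = list(signatures)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = signature_similarity(signatures[names[i]], signatures[names[j]])
        print(f"{names[i]} vs {names[j]}: r = {r:.2f}")
```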

  16. Quantitative Protein Localization Signatures Reveal an Association between Spatial and Functional Divergences of Proteins

    PubMed Central

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-01-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg. PMID:24603469

  17. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  18. Synthesis of quantitative and qualitative research: an example using Critical Interpretive Synthesis.

    PubMed

    Flemming, Kate

    2010-01-01

    This paper is a report of a Critical Interpretive Synthesis to synthesize quantitative research, in the form of an effectiveness review and a guideline, with qualitative research to examine the use of morphine to treat cancer-related pain. Critical Interpretive Synthesis is a new method of reviewing, developed from meta-ethnography, which integrates systematic review methodology with a qualitative tradition of enquiry. It has not previously been used specifically to synthesize effectiveness and qualitative literature. Data sources. An existing systematic review of quantitative research and a guideline examining the effectiveness of oral morphine to treat cancer pain were identified. Electronic searches of Medline, CINAHL, Embase, PsychINFO, Health Management Information Consortium database and the Social Science Citation Index to identify qualitative research were carried out in May 2008. Qualitative research papers reporting on the use of morphine to treat cancer pain were identified. The findings of the effectiveness research were used as a framework to guide the translation of findings from qualitative research using an integrative grid. A secondary translation of findings from the qualitative research, not specifically mapped to the effectiveness literature, was guided by the framework. Nineteen qualitative papers were synthesized with the quantitative effectiveness literature, producing 14 synthetic constructs. These were developed into four synthesizing arguments which drew on patients', carers' and healthcare professionals' interpretations of the meaning and context of the use of morphine to treat cancer pain. Critical Interpretive Synthesis can be adapted to synthesize reviews of quantitative research into effectiveness with qualitative research and fits into an existing typology of approaches to synthesizing qualitative and quantitative research.

  19. Assessing Quantitative Literacy in Higher Education: An Overview of Existing Research and Assessments with Recommendations for Next-Generation Assessment. Research Report. ETS RR-14-22

    ERIC Educational Resources Information Center

    Roohr, Katrina Crotts; Graf, Edith Aurora; Liu, Ou Lydia

    2014-01-01

    Quantitative literacy has been recognized as an important skill in the higher education and workforce communities, focusing on problem solving, reasoning, and real-world application. As a result, there is a need by various stakeholders in higher education and workforce communities to evaluate whether college students receive sufficient training on…

  20. Public and Private Further Education and Training in South Africa: A Comparative Analysis of the Quantitative Evidence

    ERIC Educational Resources Information Center

    Akoojee, Salim; McGrath, Simon

    2007-01-01

    Public and private provision of vocational education and training (or Further Education and Training in the South African usage) exist in a relationship with each other but are rarely considered together. An analysis is provided of recent quantitative evidence on both sectors in South Africa in order to advance the case for further policy and…
