Sample records for the query "models quantitatively describe":

  1. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma. The model provides a more detailed view of the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.

  2. 6 Principles for Quantitative Reasoning and Modeling

    ERIC Educational Resources Information Center

    Weber, Eric; Ellis, Amy; Kulow, Torrey; Ozgur, Zekiye

    2014-01-01

    Encouraging students to reason with quantitative relationships can help them develop, understand, and explore mathematical models of real-world phenomena. Through two examples--modeling the motion of a speeding car and the growth of a Jactus plant--this article describes how teachers can use six practical tips to help students develop quantitative…

  3. A Quantitative Cost Effectiveness Model for Web-Supported Academic Instruction

    ERIC Educational Resources Information Center

    Cohen, Anat; Nachmias, Rafi

    2006-01-01

    This paper describes a quantitative cost effectiveness model for Web-supported academic instruction. The model was designed for Web-supported instruction (rather than distance learning only) characterizing most of the traditional higher education institutions. It is based on empirical data (Web logs) of students' and instructors' usage…

  4. SYSTEMS BIOLOGY MODEL DEVELOPMENT AND APPLICATION

    EPA Science Inventory

    Systems biology models holistically describe, in a quantitative fashion, the relationships between different levels of a biologic system. Relationships between individual components of a system are delineated. Systems biology models describe how the components of the system inter...

  5. Quantitative Adverse Outcome Pathways and Their Application to Predictive Toxicology

    EPA Science Inventory

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course p...

  6. Ionocovalency and Applications 1. Ionocovalency Model and Orbital Hybrid Scales

    PubMed Central

    Zhang, Yonghe

    2010-01-01

    Ionocovalency (IC), a quantitative dual nature of the atom, is defined and correlated with quantum-mechanical potential to describe quantitatively the dual properties of the bond. An orbital hybrid IC model scale, IC, and an IC electronegativity scale, XIC, are proposed, wherein the ionicity and the covalent radius are determined by spectroscopy. Being composed of the ionic function I and the covalent function C, the model describes quantitatively the dual properties of bond strength, charge density and ionic potential. Based on the atomic electron configuration and the various quantum-mechanical built-up dual parameters, the model forms a Dual Method of multiple-functional prediction, which has far more versatile applications than traditional electronegativity scales and molecular properties. Hydrogen has unconventional values of IC and XIC, lower than those of boron. The IC model agrees fairly well with the data of bond properties and satisfactorily explains chemical observations of elements throughout the Periodic Table. PMID:21151444

  7. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies in the order of 20% due to uncertainty effects. This step is important for determining, in complex problems, the real applicability of air-coupled probes and for simulating the whole inspection procedure, even while the component is still being designed, so as to virtually verify its inspectability.

  8. A Bayesian network approach to knowledge integration and representation of farm irrigation: 1. Model development

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Haines, C. L.

    2009-02-01

    Irrigation is important to many agricultural businesses but also has implications for catchment health. A considerable body of knowledge exists on how irrigation management affects farm business and catchment health. However, this knowledge is fragmentary; is available in many forms such as qualitative and quantitative; is dispersed in scientific literature, technical reports, and the minds of individuals; and is of varying degrees of certainty. Bayesian networks allow the integration of dispersed knowledge into quantitative systems models. This study describes the development, validation, and application of a Bayesian network model of farm irrigation in the Shepparton Irrigation Region of northern Victoria, Australia. In this first paper we describe the process used to integrate a range of sources of knowledge to develop a model of farm irrigation. We describe the principal model components and summarize the reaction to the model and its development process by local stakeholders. Subsequent papers in this series describe model validation and the application of the model to assess the regional impact of historical and future management intervention.

  9. Final Report for Dynamic Models for Causal Analysis of Panel Data. Models for Change in Quantitative Variables, Part II Scholastic Models. Part II, Chapter 4.

    ERIC Educational Resources Information Center

    Hannan, Michael T.

    This document is part of a series of chapters described in SO 011 759. Stochastic models for the sociological analysis of change and the change process in quantitative variables are presented. The author lays the groundwork for the statistical treatment of simple stochastic differential equations (SDEs) and discusses some of the continuities of…

  10. Civic Engagement Measures for Latina/o College Students

    ERIC Educational Resources Information Center

    Alcantar, Cynthia M.

    2014-01-01

    This chapter uses a critical quantitative approach to study models and measures of civic engagement for Latina/o college students. The chapter describes the importance of a critical quantitative approach to study civic engagement of Latina/o college students, then uses Hurtado et al.'s (Hurtado, S., 2012) model to examine the civic engagement…

  11. A Method for Label-Free, Differential Top-Down Proteomics.

    PubMed

    Ntai, Ioanna; Toby, Timothy K; LeDuc, Richard D; Kelleher, Neil L

    2016-01-01

    Biomarker discovery in translational research has relied heavily on labeled and label-free quantitative bottom-up proteomics. Here, we describe a new approach to biomarker studies that utilizes high-throughput top-down proteomics and is the first to offer whole-protein characterization and relative quantitation within the same experiment. Using yeast as a model, we report procedures for a label-free approach to quantify the relative abundance of intact proteins ranging from 0 to 30 kDa in two different states. In this chapter, we describe the integrated methodology for the large-scale profiling and quantitation of the intact proteome by liquid chromatography-mass spectrometry (LC-MS) without the need for metabolic or chemical labeling. This recent advance in quantitative top-down proteomics is best implemented with a robust and highly controlled sample-preparation workflow before data acquisition on a high-resolution mass spectrometer, and with the application of a hierarchical linear statistical model to account for the multiple levels of variance contained in quantitative proteomic comparisons of samples for basic and clinical research.

  12. A Laparoscopic Swine Model of Noncompressible Torso Hemorrhage

    DTIC Science & Technology

    2014-01-01

    Various porcine models of hemorrhage have been developed for civilian and military trauma research. However, the predominant contemporary models lack...significant predictors of mortality. CONCLUSION: This study describes a model of NCTH that reflects clinically relevant physiology in trauma and...uncontrolled hemorrhage. In addition, it quantitatively assesses the role of the swine contractile spleen in the described model. (J Trauma Acute Care Surg

  13. Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This peculiarity is related to the inhibition of extracellular enzymes that occurs through the complexation of peptides by tannins. Neither the nature of these interactions nor, more fundamentally, the structure of these heterogeneous polyphenolic molecules is completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins on the basis of tannin model compounds, employing in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. ³¹P NMR analysis of the ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundation for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.

  14. Getting quantitative about consequences of cross-ecosystem resource subsidies on recipient consumers

    USGS Publications Warehouse

    Richardson, John S.; Wipfli, Mark S.

    2016-01-01

    Most studies of cross-ecosystem resource subsidies have demonstrated positive effects on recipient consumer populations, often with very large effect sizes. However, it is important to move beyond these initial addition–exclusion experiments to consider the quantitative consequences for populations across gradients in the rates and quality of resource inputs. In our introduction to this special issue, we describe at least four potential models for the functional relationships between subsidy input rates and consumer responses, most of them asymptotic. Here we aim to advance our quantitative understanding of how subsidy inputs influence recipient consumers and their communities. In the papers that follow, fish were either the recipient consumers or the subsidy as carcasses of anadromous species. Advancing general, predictive models will enable us to further consider what other factors are potentially co-limiting (e.g., nutrients, other population interactions, physical habitat, etc.) and better integrate resource subsidies into consumer–resource, biophysical dynamics models.

  15. Quantitative model for the blood pressure-lowering interaction of valsartan and amlodipine.

    PubMed

    Heo, Young-A; Holford, Nick; Kim, Yukyung; Son, Mijeong; Park, Kyungsoo

    2016-12-01

    The objective of this study was to develop a population pharmacokinetic (PK) and pharmacodynamic (PD) model to quantitatively describe the antihypertensive effect of combined therapy with amlodipine and valsartan. PK modelling was used with data collected from 48 healthy volunteers receiving a single dose of a combined formulation of 10 mg amlodipine and 160 mg valsartan. Systolic (SBP) and diastolic blood pressure (DBP) were recorded during combined administration. SBP and DBP data for each drug alone were gathered from the literature. PKPD models of each drug and for combined administration were built with NONMEM 7.3. A two-compartment model with zero-order absorption best described the PK data of both drugs. Amlodipine and valsartan monotherapy effects on SBP and DBP were best described by an Imax model with an effect compartment delay. Combined therapy was described using a proportional interaction term as follows: (D1 + D2) + ALPHA×(D1 × D2). D1 and D2 are the predicted drug effects of amlodipine and valsartan monotherapy, respectively. ALPHA is the interaction term for combined therapy. Quantitative estimates of ALPHA were -0.171 (95% CI: -0.218, -0.143) for SBP and -0.0312 (95% CI: -0.07739, -0.00283) for DBP. These infra-additive interaction terms for both SBP and DBP were consistent with literature results for combined administration of drugs in these classes. PKPD models for SBP and DBP successfully described the time course of the antihypertensive effects of amlodipine and valsartan. An infra-additive interaction between amlodipine and valsartan when used in combined administration was confirmed and quantified. © 2016 The British Pharmacological Society.
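
The proportional interaction term quoted above lends itself to a numerical sketch. The Imax/IC50 parameter values below are illustrative placeholders, not the paper's estimates; only the ALPHA value for SBP (-0.171) is taken from the abstract:

```python
# Sketch of the combined-effect model described above: each monotherapy
# effect follows an Imax model, and combination therapy adds a
# proportional interaction term ALPHA * (D1 * D2).
# Imax/IC50 values below are illustrative placeholders, not the paper's estimates.

def imax_effect(conc, imax, ic50):
    """Fractional blood-pressure reduction for one drug (Imax model)."""
    return imax * conc / (ic50 + conc)

def combined_effect(c_amlo, c_val, alpha):
    d1 = imax_effect(c_amlo, imax=0.25, ic50=5.0)   # amlodipine (placeholder params)
    d2 = imax_effect(c_val, imax=0.20, ic50=80.0)   # valsartan (placeholder params)
    return (d1 + d2) + alpha * (d1 * d2)

# ALPHA = -0.171 (SBP) from the abstract: negative => infra-additive
additive = combined_effect(10.0, 160.0, alpha=0.0)
infra = combined_effect(10.0, 160.0, alpha=-0.171)
assert infra < additive  # the interaction reduces the combined effect
```

A negative ALPHA makes the combined effect smaller than the sum of the monotherapy effects, which is what "infra-additive" means in the abstract.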

  16. Quantitative model for the blood pressure‐lowering interaction of valsartan and amlodipine

    PubMed Central

    Heo, Young‐A; Holford, Nick; Kim, Yukyung; Son, Mijeong

    2016-01-01

    Aims: The objective of this study was to develop a population pharmacokinetic (PK) and pharmacodynamic (PD) model to quantitatively describe the antihypertensive effect of combined therapy with amlodipine and valsartan. Methods: PK modelling was used with data collected from 48 healthy volunteers receiving a single dose of a combined formulation of 10 mg amlodipine and 160 mg valsartan. Systolic (SBP) and diastolic blood pressure (DBP) were recorded during combined administration. SBP and DBP data for each drug alone were gathered from the literature. PKPD models of each drug and for combined administration were built with NONMEM 7.3. Results: A two‐compartment model with zero‐order absorption best described the PK data of both drugs. Amlodipine and valsartan monotherapy effects on SBP and DBP were best described by an Imax model with an effect compartment delay. Combined therapy was described using a proportional interaction term as follows: (D1 + D2) + ALPHA×(D1 × D2). D1 and D2 are the predicted drug effects of amlodipine and valsartan monotherapy respectively. ALPHA is the interaction term for combined therapy. Quantitative estimates of ALPHA were −0.171 (95% CI: −0.218, −0.143) for SBP and −0.0312 (95% CI: −0.07739, −0.00283) for DBP. These infra‐additive interaction terms for both SBP and DBP were consistent with literature results for combined administration of drugs in these classes. Conclusion: PKPD models for SBP and DBP successfully described the time course of the antihypertensive effects of amlodipine and valsartan. An infra‐additive interaction between amlodipine and valsartan when used in combined administration was confirmed and quantified. PMID:27504853

  17. Final Report for Dynamic Models for Causal Analysis of Panel Data. Models for Change in Quantitative Variables, Part III: Estimation from Panel Data. Part II, Chapter 5.

    ERIC Educational Resources Information Center

    Hannan, Michael T.

    This document is part of a series of chapters described in SO 011 759. Addressing the problems of studying change and the change process, the report argues that sociologists should study coupled changes in qualitative and quantitative outcomes (e.g., marital status and earnings). The author presents a model for sociological studies of change in…

  18. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  19. Development of a Model for Some Aspects of University Policy. Technical Report.

    ERIC Educational Resources Information Center

    Goossens, J. L. M.; And Others

    A method to calculate the need for academic staff per faculty, based on educational programs and numbers of students, is described; it rests on quantitative relations between programs, student enrollment, and total budget. The model is described schematically and presented in a mathematical form adapted to computer processing. Its application…

  20. Modeling biological gradient formation: combining partial differential equations and Petri nets.

    PubMed

    Bertens, Laura M F; Kleijn, Jetty; Hille, Sander C; Heiner, Monika; Koutny, Maciej; Verbeek, Fons J

    2016-01-01

    Both Petri nets and differential equations are important modeling tools for biological processes. In this paper we demonstrate how these two modeling techniques can be combined to describe biological gradient formation. Parameters derived from a partial differential equation describing the process of gradient formation are incorporated into an abstract Petri net model. The quantitative aspects of the resulting model are validated through a case study of gradient formation in the fruit fly.
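
As a rough illustration of the kind of PDE whose parameters feed the Petri net model, here is a minimal 1-D diffusion-decay gradient solved by explicit finite differences; all parameter values are invented for the sketch, not taken from the study:

```python
# Minimal 1-D sketch of gradient formation by diffusion with decay,
#   du/dt = D * d2u/dx2 - k * u,  with a constant source at x = 0.
# All parameter values here are illustrative, not from the paper.

D, k = 1.0, 0.5          # diffusion coefficient, decay rate (arbitrary units)
dx, dt, nx, steps = 0.1, 0.002, 100, 5000   # dt*D/dx^2 = 0.2 < 0.5: stable
u = [0.0] * nx

for _ in range(steps):
    new = u[:]
    for i in range(1, nx - 1):
        lap = (u[i + 1] - 2 * u[i] + u[i - 1]) / dx**2   # discrete Laplacian
        new[i] = u[i] + dt * (D * lap - k * u[i])
    new[0] = 1.0          # constant source at the left boundary
    new[-1] = new[-2]     # zero-flux right boundary
    u = new

# The steady state approaches u(x) = exp(-sqrt(k/D) * x): a decaying gradient
assert u[0] > u[20] > u[50]
```

The concentration profile decays monotonically away from the source, which is the qualitative shape of the biological gradients the paper models.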

  1. The Matching Relation and Situation-Specific Bias Modulation in Professional Football Play Selection

    PubMed Central

    Stilling, Stephanie T; Critchfield, Thomas S

    2010-01-01

    The utility of a quantitative model depends on the extent to which its fitted parameters vary systematically with environmental events of interest. Professional football statistics were analyzed to determine whether play selection (passing versus rushing plays) could be accounted for with the generalized matching equation, and in particular whether variations in play selection across game situations would manifest as changes in the equation's fitted parameters. Statistically significant changes in bias were found for each of five types of game situations; no systematic changes in sensitivity were observed. Further analyses suggested relationships between play selection bias and both turnover probability (which can be described in terms of punishment) and yards-gained variance (which can be described in terms of variable-magnitude reinforcement schedules). The present investigation provides a useful demonstration of association between face-valid, situation-specific effects in a domain of everyday interest, and a theoretically important term of a quantitative model of behavior. Such associations, we argue, are an essential focus in translational extensions of quantitative models. PMID:21119855
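
For readers unfamiliar with the generalized matching equation used above, the fitting step can be sketched as ordinary least squares on log ratios, log(B1/B2) = a·log(R1/R2) + log(b), where a is sensitivity and b is bias. The data below are fabricated for illustration and are not the football statistics analyzed in the study:

```python
# Sketch of fitting the generalized matching equation:
#   log(B1/B2) = a * log(R1/R2) + log(b)
# B = behavior allocation (e.g. pass vs. rush plays), R = reinforcement.
# Data are invented for illustration.

def fit_matching(log_ratio_r, log_ratio_b):
    """Ordinary least squares: returns (sensitivity a, log bias)."""
    n = len(log_ratio_r)
    mx = sum(log_ratio_r) / n
    my = sum(log_ratio_b) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log_ratio_r, log_ratio_b))
    sxx = sum((x - mx) ** 2 for x in log_ratio_r)
    a = sxy / sxx
    log_b = my - a * mx
    return a, log_b

# Fabricated example: perfect matching (a = 1) with a bias of log b = 0.3
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
ys = [x + 0.3 for x in xs]
a, log_b = fit_matching(xs, ys)
assert abs(a - 1.0) < 1e-9 and abs(log_b - 0.3) < 1e-9
```

In the study's terms, the situation-specific effects showed up as systematic changes in the fitted bias term log(b), not in the sensitivity a.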

  2. Incorporating temporal and clinical reasoning in a new measure of continuity of care.

    PubMed Central

    Spooner, S. A.

    1994-01-01

    Previously described quantitative methods for measuring continuity of care have assumed that perfect continuity exists when a patient sees only one provider, regardless of the temporal pattern and clinical context of the visits. This paper describes an implementation of a new operational model of continuity--the Temporal Continuity Index (TCI)--that takes into account time intervals between well visits in a pediatric residency continuity clinic. Ideal continuity in this model is achieved when intervals between visits are appropriate to the age of the patient and the clinical context of the encounters. The fundamental concept in this model is the expectation interval, which contains the maximum ideal follow-up interval for a visit and the maximum follow-up interval. This paper describes an initial implementation of the TCI model, compares TCI calculations to previous quantitative methods, and proposes its use as part of the assessment of resident education in outpatient settings. PMID:7950019

  3. Highlights from High Energy Neutrino Experiments at CERN

    NASA Astrophysics Data System (ADS)

    Schlatter, W.-D.

    2015-07-01

    Experiments with high energy neutrino beams at CERN provided early quantitative tests of the Standard Model. This article describes results from studies of the nucleon quark structure and of the weak current, together with the precise measurement of the weak mixing angle. These results set a new standard of quality for tests of the electroweak model. In addition, the measurements of the nucleon structure functions in deep inelastic neutrino scattering allowed the first quantitative tests of QCD.

  4. PREDICTING SUBSURFACE CONTAMINANT TRANSPORT AND TRANSFORMATION: CONSIDERATIONS FOR MODEL SELECTION AND FIELD VALIDATION

    EPA Science Inventory

    Predicting subsurface contaminant transport and transformation requires mathematical models based on a variety of physical, chemical, and biological processes. The mathematical model is an attempt to quantitatively describe observed processes in order to permit systematic forecas...

  5. Strengthening Student Engagement with Quantitative Subjects in a Business Faculty

    ERIC Educational Resources Information Center

    Warwick, Jon; Howard, Anna

    2014-01-01

    This paper reflects on the results of research undertaken at a large UK university relating to the teaching of quantitative subjects within a Business Faculty. It builds on a simple model of student engagement and, through the description of three case studies, describes research undertaken and developments implemented to strengthen aspects of the…

  6. From Nominal to Quantitative Codification of Content-Neutral Variables in Graphics Research: The Beginnings of a Manifest Content Model.

    ERIC Educational Resources Information Center

    Crow, Wendell C.

    This paper suggests ways in which manifest, physical attributes of graphic elements can be described and measured. It also proposes a preliminary conceptual model that accounts for the readily apparent, measurable variables in a visual message. The graphic elements that are described include format, typeface, and photographs/artwork. The…

  7. Querying quantitative logic models (Q2LM) to study intracellular signaling networks and cell-cytokine interactions.

    PubMed

    Morris, Melody K; Shriver, Zachary; Sasisekharan, Ram; Lauffenburger, Douglas A

    2012-03-01

    Mathematical models have substantially improved our ability to predict the response of a complex biological system to perturbation, but their use is typically limited by difficulties in specifying model topology and parameter values. Additionally, incorporating entities across different biological scales ranging from molecular to organismal in the same model is not trivial. Here, we present a framework called "querying quantitative logic models" (Q2LM) for building and asking questions of constrained fuzzy logic (cFL) models. cFL is a recently developed modeling formalism that uses logic gates to describe influences among entities, with transfer functions to describe quantitative dependencies. Q2LM does not rely on dedicated data to train the parameters of the transfer functions, and it permits straightforward incorporation of entities at multiple biological scales. The Q2LM framework can be employed to ask questions such as: Which therapeutic perturbations accomplish a designated goal, and under what environmental conditions will these perturbations be effective? We demonstrate the utility of this framework for generating testable hypotheses in two examples: (i) an intracellular signaling network model and (ii) a model for pharmacokinetics and pharmacodynamics of cell-cytokine interactions; in the latter, we validate hypotheses concerning molecular design of granulocyte colony stimulating factor. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    PubMed Central

    Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan

    2004-01-01

    Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335
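
The paper's extended, quantitative cellular automaton is far richer than can be shown here, but the underlying excitable-medium idea can be sketched with a classic Greenberg-Hastings-style automaton (a simplified stand-in, not the authors' model):

```python
# Minimal 1-D excitable-medium cellular automaton (Greenberg-Hastings
# style) illustrating CA-based cardiac wave propagation. States:
# 0 = resting, 1 = excited, 2..R = refractory. Parameters are illustrative.
R = 4            # number of refractory states
n = 30
cells = [0] * n
cells[0] = 1     # stimulate the left end

def step(cells):
    new = []
    for i, s in enumerate(cells):
        if s == 0:
            # a resting cell fires if a neighbor is excited (wave propagation)
            left = cells[i - 1] if i > 0 else 0
            right = cells[i + 1] if i < len(cells) - 1 else 0
            new.append(1 if 1 in (left, right) else 0)
        elif s < R:
            new.append(s + 1)   # advance through refractory states
        else:
            new.append(0)       # recover to resting
    return new

for _ in range(10):
    cells = step(cells)

# After 10 steps the excitation wavefront has travelled 10 cells rightward,
# trailed by refractory cells that block re-excitation from behind.
assert cells[10] == 1 and cells[9] == 2
```

The refractory tail behind the wavefront is what prevents back-propagation, the same mechanism that makes CA models useful for studying re-entrant arrhythmias.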

  9. Quantitative weaknesses of the Marcus-Hush theory of electrode kinetics revealed by Reverse Scan Square Wave Voltammetry: The reduction of 2-methyl-2-nitropropane at mercury microelectrodes

    NASA Astrophysics Data System (ADS)

    Laborda, Eduardo; Wang, Yijun; Henstridge, Martin C.; Martínez-Ortiz, Francisco; Molina, Angela; Compton, Richard G.

    2011-08-01

    The Marcus-Hush and Butler-Volmer kinetic electrode models are compared experimentally by studying the reduction of 2-methyl-2-nitropropane in acetonitrile at mercury microelectrodes using Reverse Scan Square Wave Voltammetry. This technique is found to be very sensitive to the electrode kinetics and to permit critical comparison of the two models. The Butler-Volmer model satisfactorily fits the experimental data whereas Marcus-Hush does not quantitatively describe this redox system.

  10. Quantitative Prediction of Drug–Drug Interactions Involving Inhibitory Metabolites in Drug Development: How Can Physiologically Based Pharmacokinetic Modeling Help?

    PubMed Central

    Chen, Y; Mao, J; Lin, J; Yu, H; Peters, S; Shebley, M

    2016-01-01

    This subteam under the Drug Metabolism Leadership Group (Innovation and Quality Consortium) investigated the quantitative role of circulating inhibitory metabolites in drug–drug interactions using physiologically based pharmacokinetic (PBPK) modeling. Three drugs with major circulating inhibitory metabolites (amiodarone, gemfibrozil, and sertraline) were systematically evaluated in addition to the literature review of recent examples. The application of PBPK modeling in drug interactions by inhibitory parent–metabolite pairs is described and guidance on strategic application is provided. PMID:27642087

  11. A Way Forward Commentary

    EPA Science Inventory

    Models for predicting adverse outcomes can help reduce and focus animal testing with new and existing chemicals. This short "thought starter" describes how quantitative structure-activity relationship and systems biology models can be used to help define toxicity pathways and li...

  12. A Constitutive Relationship for Gravelly Soil Considering Fine Particle Suffusion

    PubMed Central

    Zhang, Yuning; Chen, Yulong

    2017-01-01

    Suffusion erosion may occur in sandy gravel dam foundations that use suspended cutoff walls. This erosion causes a loss of fine particles, degrades the soil strength and deformation moduli, and adversely impacts the cutoff walls of the dam foundation, as well as the overlying dam body. A comprehensive evaluation of these effects requires models that quantitatively describe the effects of fine particle losses on the stress-strain relationships of sandy gravels. In this work, we propose an experimental scheme for studying these types of models, and then perform triaxial and confined compression tests to determine the effects of particle losses on the stress-strain relationships. Considering the Duncan-Chang E-B model, quantitative expressions describing the relationship between the parameters of the model and the particle losses were derived. The results show that particle losses did not alter the qualitative stress-strain characteristics of the soils; however, the soil strength and deformation moduli were degraded. By establishing the relationship between the parameters of the model and the losses, the same model can then be used to describe the relationship between sandy gravels and erosion levels that vary in both time and space. PMID:29065532

  13. Simulation Modeling of a Facility Layout in Operations Management Classes

    ERIC Educational Resources Information Center

    Yazici, Hulya Julie

    2006-01-01

    Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…

  14. Fractional calculus phenomenology in two-dimensional plasma models

    NASA Astrophysics Data System (ADS)

    Gustafson, Kyle; Del Castillo Negrete, Diego; Dorland, Bill

    2006-10-01

    Transport processes in confined plasmas for fusion experiments, such as ITER, are not well understood at the basic level of fully nonlinear, three-dimensional kinetic physics. Turbulent transport is invoked to describe the observed levels in tokamaks, which are orders of magnitude greater than the theoretical predictions. Recent results show the ability of a non-diffusive transport model to describe numerical observations of turbulent transport. For example, resistive MHD modeling of tracer particle transport in pressure-gradient-driven turbulence for a three-dimensional plasma reveals that the superdiffusive radial transport in this system (variance growing as t^α with α > 1) is described quantitatively by a fractional diffusion equation. Fractional calculus is a generalization involving integro-differential operators, which naturally describe non-local behaviors. Our previous work showed the quantitative agreement of special fractional diffusion equation solutions with numerical tracer particle flows in time-dependent linearized dynamics of the Hasegawa-Mima equation (for poloidal transport in a two-dimensional cold-ion plasma). In pursuit of a fractional diffusion model for transport in a gyrokinetic plasma, we now present numerical results from tracer particle transport in the nonlinear Hasegawa-Mima equation and a planar gyrokinetic model. Finite Larmor radius effects will be discussed. D. del Castillo Negrete, et al., Phys. Rev. Lett. 94, 065003 (2005).
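
The superdiffusive scaling referred to above, variance ~ t^α with α > 1, is typically diagnosed by a log-log fit of mean squared displacement against time. The sketch below uses a synthetic signal with a known exponent rather than the simulation data:

```python
# Sketch of estimating the anomalous-diffusion exponent alpha in
# <x^2>(t) ~ t^alpha from a log-log least-squares fit. A synthetic
# superdiffusive signal with alpha = 1.5 stands in for real tracer data.
import math

times = [2 ** k for k in range(1, 9)]           # t = 2 .. 256
alpha_true = 1.5
msd = [t ** alpha_true for t in times]          # idealized <x^2>(t)

# Least-squares slope of log(msd) vs. log(t) gives alpha
lx = [math.log(t) for t in times]
ly = [math.log(m) for m in msd]
mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
alpha_hat = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
            sum((x - mx) ** 2 for x in lx)

assert abs(alpha_hat - 1.5) < 1e-6   # alpha > 1 indicates superdiffusion
```

Ordinary diffusion gives α = 1; the turbulence simulations discussed above recover α > 1, which motivates the fractional-derivative generalization of the diffusion equation.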

  15. [Transmission dynamic model for Echinococcus granulosus: establishment and application].

    PubMed

    Yang, Shi-Jie; Wu, Wei-Ping

    2009-06-01

    A dynamic model of disease can be used to quantitatively describe the pattern and characteristics of disease transmission, predict disease status and evaluate the efficacy of control strategies. This review summarizes the basic transmission dynamics models of Echinococcus granulosus and their applications.

  16. SENSITIVE PARAMETER EVALUATION FOR A VADOSE ZONE FATE AND TRANSPORT MODEL

    EPA Science Inventory

    This report presents information pertaining to the quantitative evaluation of the potential impact of selected parameters on the output of vadose zone transport and fate models used to describe the behavior of hazardous chemicals in soil. The Vadose Zone Interactive Processes (VIP) model...

  17. [Modeling continuous scaling of NDVI based on fractal theory].

    PubMed

    Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng

    2013-07-01

    Scale effect is one of the important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe how retrievals change across an entire series of scales; meanwhile, they face serious parameter-correction issues (geometric correction, spectral correction, etc.) because imaging parameters vary between sensors. Using a single-sensor image, fractal methodology was applied to address these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) a scale effect exists for NDVI, and it can be described by a continuous-scaling fractal model; (2) the fractal method is suitable for the validation of NDVI. These results show that fractal analysis is an effective methodology for studying the scaling behavior of quantitative remote sensing.

  18. Continuous time Boolean modeling for biological signaling: application of Gillespie algorithm.

    PubMed

    Stoll, Gautier; Viara, Eric; Barillot, Emmanuel; Calzone, Laurence

    2012-08-29

    Mathematical modeling is used as a Systems Biology tool to answer biological questions, and more precisely, to validate a network that describes biological observations and to predict the effect of perturbations. This article presents an algorithm for modeling biological networks in a discrete framework with continuous time. There exist two major types of mathematical modeling approaches: (1) quantitative modeling, representing various chemical species concentrations by real numbers, mainly based on differential equations and chemical kinetics formalism; (2) qualitative modeling, representing chemical species concentrations or activities by a finite set of discrete values. Both approaches answer particular (and often different) biological questions. The qualitative approach permits a simple and less detailed description of the biological system and efficiently describes stable-state identification, but it remains inconvenient for describing the transient kinetics leading to these states; in this context, time is represented by discrete steps. Quantitative modeling, on the other hand, can describe more accurately the dynamical behavior of biological processes, as it follows the evolution of concentrations or activities of chemical species as a function of time, but it requires a large amount of parameter information that is difficult to find in the literature. Here, we propose a modeling framework based on a qualitative approach that is intrinsically continuous in time. The algorithm presented in this article fills the gap between qualitative and quantitative modeling. It is based on a continuous-time Markov process applied to a Boolean state space. In order to describe the temporal evolution of the biological process we wish to model, we explicitly specify the transition rates for each node. For that purpose, we built a language that can be seen as a generalization of Boolean equations.
Mathematically, this approach can be translated into a set of ordinary differential equations on probability distributions. We developed a C++ software tool, MaBoSS, that is able to simulate such a system by applying kinetic Monte Carlo (the Gillespie algorithm) on the Boolean state space. This software, parallelized and optimized, computes the temporal evolution of probability distributions and estimates stationary distributions. Applications of Boolean kinetic Monte Carlo are demonstrated for three qualitative models: a toy model, a published model of the p53/Mdm2 interaction and a published model of the mammalian cell cycle. Our approach allows us to describe kinetic phenomena which were difficult to handle in the original models. In particular, transient effects are represented by time-dependent probability distributions, interpretable in terms of cell populations.
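    As a rough sketch of the idea (not the MaBoSS implementation; the two-node network, its node names and its rates are invented for illustration), a continuous-time Markov process on a Boolean state space can be simulated with a standard Gillespie step:

```python
import random

# Hypothetical two-node Boolean network (nodes and rates invented for
# illustration): each node flips 0 <-> 1 at a rate that depends on the
# current Boolean state of the network.
def transition_rates(state):
    return {
        "A": 1.0 if state["B"] else 0.1,   # A flips faster when B is on
        "B": 0.5 if state["A"] else 0.05,  # B flips faster when A is on
    }

def gillespie_boolean(state, t_max, seed=0):
    """One continuous-time trajectory on the Boolean state space."""
    rng = random.Random(seed)
    state = dict(state)
    t, traj = 0.0, [(0.0, dict(state))]
    while True:
        rates = transition_rates(state)
        total = sum(rates.values())
        t += rng.expovariate(total)         # waiting time ~ Exp(total rate)
        if t > t_max:
            break
        r, acc = rng.random() * total, 0.0  # pick the flipping node
        for node, rate in rates.items():
            acc += rate
            if r <= acc:
                state[node] = 1 - state[node]
                break
        traj.append((t, dict(state)))
    return traj

traj = gillespie_boolean({"A": 0, "B": 0}, t_max=50.0)
```

    Each node's flip rate here depends on the current Boolean state, which is the transition-rate generalization of Boolean equations that the abstract describes; averaging many such trajectories yields the time-dependent probability distributions.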

  19. Planner-Based Control of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott

    2005-01-01

    The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This allows for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but more complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.

  20. Quantitative self-assembly prediction yields targeted nanomedicines

    NASA Astrophysics Data System (ADS)

    Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.

    2018-02-01

    Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.

  1. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  3. A Combined MIS/DS Course Uses Lecture Capture Technology to "Level the Playing Field" in Student Numeracy

    ERIC Educational Resources Information Center

    Popovich, Karen

    2012-01-01

    This paper describes the process taken to develop a quantitative-based and Excel™-driven course that combines "BOTH" Management Information Systems (MIS) and Decision Science (DS) modeling outcomes and lays the foundation for upper level quantitative courses such as operations management, finance and strategic management. In addition,…

  4. Quantifying the Adaptive Cycle

    EPA Science Inventory

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative...

  5. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the Boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  6. Training Signaling Pathway Maps to Biochemical Data with Constrained Fuzzy Logic: Quantitative Analysis of Liver Cell Responses to Inflammatory Stimuli

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.

    2011-01-01

    Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212

  7. Quantification of Microbial Phenotypes

    PubMed Central

    Martínez, Verónica S.; Krömer, Jens O.

    2016-01-01

    Metabolite profiling technologies have improved to generate close-to-quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and outline the current challenges in generating fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. We explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
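    The core of estimating Gibbs energy under physiological conditions is the textbook relation ΔG = ΔG°′ + RT ln Q. A minimal sketch (the reaction, its ΔG°′ and the concentrations below are illustrative numbers, not data from the review):

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol·K)

def gibbs_energy(dg0_prime, product_conc, substrate_conc, T=310.15):
    """ΔG = ΔG°' + R·T·ln(Q), with the reaction quotient Q built from
    (illustrative) molar concentrations of products over substrates."""
    q = 1.0
    for c in product_conc:
        q *= c
    for c in substrate_conc:
        q /= c
    return dg0_prime + R * T * math.log(q)

# Illustrative numbers: a step with a slightly unfavorable
# ΔG°' = +5 kJ/mol still runs forward in vivo if the product is kept
# 100-fold more dilute than the substrate.
dg = gibbs_energy(5.0, [1e-5], [1e-3])  # Q = 1e-2, so ΔG < 0
```

    This is why measured concentrations (and their errors, which propagate into ΔG) matter for constraining reaction directionality in network analysis.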

  8. High-Content Screening for Quantitative Cell Biology.

    PubMed

    Mattiazzi Usaj, Mojca; Styles, Erin B; Verster, Adrian J; Friesen, Helena; Boone, Charles; Andrews, Brenda J

    2016-08-01

    High-content screening (HCS), which combines automated fluorescence microscopy with quantitative image analysis, allows the acquisition of unbiased multiparametric data at the single cell level. This approach has been used to address diverse biological questions and identify a plethora of quantitative phenotypes of varying complexity in numerous different model systems. Here, we describe some recent applications of HCS, ranging from the identification of genes required for specific biological processes to the characterization of genetic interactions. We review the steps involved in the design of useful biological assays and automated image analysis, and describe major challenges associated with each. Additionally, we highlight emerging technologies and future challenges, and discuss how the field of HCS might be enhanced in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. [Quantitative relationship between gas chromatographic retention time and structural parameters of alkylphenols].

    PubMed

    Ruan, Xiaofang; Zhang, Ruisheng; Yao, Xiaojun; Liu, Mancang; Fan, Botao

    2007-03-01

    Alkylphenols are a group of persistent pollutants in the environment and can adversely disturb the human endocrine system. It is therefore important to effectively separate and measure alkylphenols. To guide the chromatographic analysis of these compounds in practice, developing a quantitative relationship between the molecular structure and the retention time of alkylphenols becomes necessary. In this study, topological, constitutional, geometrical, electrostatic and quantum-chemical descriptors of 44 alkylphenols were calculated using the CODESSA software, and these descriptors were pre-selected using the heuristic method. As a result, a three-descriptor linear model (LM) was developed to describe the relationship between the molecular structure and the retention time of alkylphenols. Meanwhile, a non-linear regression model was also developed based on support vector machine (SVM) using the same three descriptors. The correlation coefficient (R²) for the LM and SVM was 0.98 and 0.92, and the corresponding root-mean-square error was 0.99 and 2.77, respectively. By comparing the stability and prediction ability of the two models, it was found that the linear model was a better method for describing the quantitative relationship between the retention time of alkylphenols and the molecular structure. The results obtained suggest that the linear model could be applied for the chromatographic analysis of alkylphenols with known molecular structural parameters.
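    The fitting-and-scoring step of such a model can be sketched with a one-descriptor toy version (synthetic numbers, not the paper's 44-phenol dataset or its three CODESSA descriptors): ordinary least squares plus the R² used to compare models.

```python
# Toy one-descriptor QSPR fit (synthetic values, not the paper's data):
# retention time modeled as t = a + b * descriptor, scored by R^2.
def fit_linear(x, y):
    """Ordinary least squares for a single descriptor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def r_squared(x, y, a, b):
    """Coefficient of determination of the fitted line."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical "size-like" descriptor values vs. retention times (min).
desc = [1.0, 2.0, 3.0, 4.0, 5.0]
rt = [2.1, 4.0, 6.2, 7.9, 10.1]
a, b = fit_linear(desc, rt)
r2 = r_squared(desc, rt, a, b)
```

    The multi-descriptor case replaces the scalar slope with a coefficient vector, but the R²/RMSE comparison between a linear model and an SVM proceeds the same way.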

  10. Modeling of Receptor Tyrosine Kinase Signaling: Computational and Experimental Protocols.

    PubMed

    Fey, Dirk; Aksamitiene, Edita; Kiyatkin, Anatoly; Kholodenko, Boris N

    2017-01-01

    The advent of systems biology has convincingly demonstrated that the integration of experiments and dynamic modelling is a powerful approach to understanding cellular network biology. Here we present experimental and computational protocols that are necessary for applying this integrative approach to the quantitative studies of receptor tyrosine kinase (RTK) signaling networks. Signaling by RTKs controls multiple cellular processes, including the regulation of cell survival, motility, proliferation, differentiation, glucose metabolism, and apoptosis. We describe methods of model building and training on experimentally obtained quantitative datasets, as well as experimental methods of obtaining quantitative dose-response and temporal dependencies of protein phosphorylation and activities. The presented methods make possible (1) fine-grained modeling of complex signaling dynamics together with identification of salient, coarse-grained network structures (such as feedback loops) that bring about intricate dynamics, and (2) experimental validation of dynamic models.

  11. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  12. Complexity-aware simple modeling.

    PubMed

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models as well as their assumption of modularity and insulation make them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach-complexity-aware simple modeling-that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. A Network Neuroscience of Human Learning: Potential To Inform Quantitative Theories of Brain and Behavior

    PubMed Central

    Bassett, Danielle S.; Mattar, Marcelo G.

    2017-01-01

    Humans adapt their behavior to their external environment in a process often facilitated by learning. Efforts to describe learning empirically can be complemented by quantitative theories that map changes in neurophysiology to changes in behavior. In this review we highlight recent advances in network science that offer a set of tools and a general perspective that may be particularly useful in understanding types of learning that are supported by distributed neural circuits. We describe recent applications of these tools to neuroimaging data that provide unique insights into adaptive neural processes, the attainment of knowledge, and the acquisition of new skills, forming a network neuroscience of human learning. While promising, the tools have yet to be linked to the well-formulated models of behavior that are commonly utilized in cognitive psychology. We argue that continued progress will require the explicit marriage of network approaches to neuroimaging data and quantitative models of behavior. PMID:28259554

  14. A Network Neuroscience of Human Learning: Potential to Inform Quantitative Theories of Brain and Behavior.

    PubMed

    Bassett, Danielle S; Mattar, Marcelo G

    2017-04-01

    Humans adapt their behavior to their external environment in a process often facilitated by learning. Efforts to describe learning empirically can be complemented by quantitative theories that map changes in neurophysiology to changes in behavior. In this review we highlight recent advances in network science that offer a set of tools and a general perspective that may be particularly useful in understanding types of learning that are supported by distributed neural circuits. We describe recent applications of these tools to neuroimaging data that provide unique insights into adaptive neural processes, the attainment of knowledge, and the acquisition of new skills, forming a network neuroscience of human learning. While promising, the tools have yet to be linked to the well-formulated models of behavior that are commonly utilized in cognitive psychology. We argue that continued progress will require the explicit marriage of network approaches to neuroimaging data and quantitative models of behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Project Simu-School Component Washington State University

    ERIC Educational Resources Information Center

    Glass, Thomas E.

    1976-01-01

    This component of the project attempts to facilitate planning by furnishing models that manage cumbersome and complex data, supply an objectivity that identifies all relationships between elements of the model, and provide a quantitative model allowing for various forecasting techniques that describe the long-range impact of decisions. (Author/IRT)

  16. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    NASA Astrophysics Data System (ADS)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations, and recommendations on SSP refinement at local levels.

  17. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    NASA Astrophysics Data System (ADS)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations, and recommendations on SSP refinement at local levels.

  18. A Model Describing Stable Coherent Synchrotron Radiation in Storage Rings

    NASA Astrophysics Data System (ADS)

    Sannibale, F.; Byrd, J. M.; Loftsdóttir, Á.; Venturini, M.; Abo-Bakr, M.; Feikes, J.; Holldack, K.; Kuske, P.; Wüstefeld, G.; Hübers, H.-W.; Warnock, R.

    2004-08-01

    We present a model describing high power stable broadband coherent synchrotron radiation (CSR) in the terahertz frequency region in an electron storage ring. The model includes distortion of bunch shape from the synchrotron radiation (SR), which enhances higher frequency coherent emission, and limits to stable emission due to an instability excited by the SR wakefield. It gives a quantitative explanation of several features of the recent observations of CSR at the BESSYII storage ring. We also use this model to optimize the performance of a source for stable CSR emission.

  19. Specificity and non-specificity in RNA–protein interactions

    PubMed Central

    Jankowsky, Eckhard; Harris, Michael E.

    2016-01-01

    Gene expression is regulated by complex networks of interactions between RNAs and proteins. Proteins that interact with RNA have been traditionally viewed as either specific or non-specific; specific proteins interact preferentially with defined RNA sequence or structure motifs, whereas non-specific proteins interact with RNA sites devoid of such characteristics. Recent studies indicate that the binary “specific vs. non-specific” classification is insufficient to describe the full spectrum of RNA–protein interactions. Here, we review new methods that enable quantitative measurements of protein binding to large numbers of RNA variants, and the concepts aimed at describing the resulting binding spectra: affinity distributions, comprehensive binding models and free energy landscapes. We discuss how these new methodologies and associated concepts enable work towards inclusive, quantitative models for specific and non-specific RNA–protein interactions. PMID:26285679

  20. A model for the characterization of the spatial properties in vestibular neurons

    NASA Technical Reports Server (NTRS)

    Angelaki, D. E.; Bush, G. A.; Perachio, A. A.

    1992-01-01

    Quantitative study of the static and dynamic response properties of some otolith-sensitive neurons has been difficult in the past partly because their responses to different linear acceleration vectors exhibited no "null" plane and a dependence of phase on stimulus orientation. The theoretical formulation of the response ellipse provides a quantitative way to estimate the spatio-temporal properties of such neurons. Its semi-major axis gives the direction of the polarization vector (i.e., direction of maximal sensitivity) and it estimates the neuronal response for stimulation along that direction. In addition, the semi-minor axis of the ellipse provides an estimate of the neuron's maximal sensitivity in the "null" plane. In this paper, extracellular recordings from otolith-sensitive vestibular nuclei neurons in decerebrate rats were used to demonstrate the practical application of the method. The experimentally observed gain and phase dependence on the orientation angle of the acceleration vector in a head-horizontal plane was described and satisfactorily fit by the response ellipse model. In addition, the model satisfactorily fits neuronal responses in three-dimensions and unequivocally demonstrates that the response ellipse formulation is the general approach to describe quantitatively the spatial properties of vestibular neurons.
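    The gain-and-phase-versus-orientation behavior that the response ellipse captures can be sketched in a toy form (the axis lengths below are invented; this is the generic quadrature-component picture, not the paper's fitted model):

```python
import math

# Toy response-ellipse sketch (axis lengths invented): the response to a
# sinusoidal linear acceleration along unit direction u is modeled as an
# in-phase component along the semi-major axis plus a quadrature
# component along the semi-minor axis.
P_MAJOR = (1.0, 0.0)   # polarization vector: direction of max sensitivity
P_MINOR = (0.0, 0.2)   # semi-minor axis: residual sensitivity, no true null

def gain_phase(theta):
    """Gain and phase (degrees) for a stimulus at orientation theta."""
    u = (math.cos(theta), math.sin(theta))
    c = P_MAJOR[0] * u[0] + P_MAJOR[1] * u[1]   # in-phase projection
    s = P_MINOR[0] * u[0] + P_MINOR[1] * u[1]   # quadrature projection
    return math.hypot(c, s), math.degrees(math.atan2(s, c))

g_max, _ = gain_phase(0.0)           # along the polarization vector
g_min, ph = gain_phase(math.pi / 2)  # orthogonal: gain nonzero, phase shifted
```

    The nonzero semi-minor gain and the orientation-dependent phase are exactly the features that rule out a "null" plane for such neurons.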

  21. HOW CAN BIOLOGICALLY-BASED MODELING OF ARSENIC KINETICS AND DYNAMICS INFORM THE RISK ASSESSMENT PROCESS?

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic met...

  1. Physiologically-Based Pharmacokinetic (PBPK) Model for the Thyroid Hormones in the Pregnant Rat and Fetus.

    EPA Science Inventory

    A developmental PBPK model is constructed to quantitatively describe the tissue economy of the thyroid hormones (THs), thyroxine (T4) and triiodothyronine (T3), in the rat. The model is also used to link maternal THs to rat fetal tissues via placental transfer. THs are importan...

  2. The Integration of Evaluation Paradigms Through Metaphor.

    ERIC Educational Resources Information Center

    Felker, Roberta M.

    The point of view is presented that evaluation projects can be enriched by not using either an exclusively quantitative model or an exclusively qualitative model but by combining both models in one project. The concept of metaphor is used to clarify the usefulness of the combination. Iconic or holistic metaphors describe an object or event as…

  3. Computational modeling of the amphibian thyroid axis supported by targeted in vivo testing to advance quantitative adverse outcome pathway development

    EPA Science Inventory

    In vitro screening of chemicals for bioactivity together with computational modeling are beginning to replace animal toxicity testing in support of chemical risk assessment. To facilitate this transition, an amphibian thyroid axis model has been developed to describe thyroid home...

  4. Safe uses of Hill's model: an exact comparison with the Adair-Klotz model

    PubMed Central

    2011-01-01

    Background The Hill function and the related Hill model are used frequently to study processes in the living cell. There are very few studies investigating the situations in which the model can be safely used. For example, it has been shown, at the mean field level, that the dose response curve obtained from a Hill model agrees well with the dose response curves obtained from a more complicated Adair-Klotz model, provided that the parameters of the Adair-Klotz model describe strongly cooperative binding. However, it has not been established whether such findings can be extended to other properties and non-mean field (stochastic) versions of the same, or other, models. Results In this work a rather generic quantitative framework for approaching such a problem is suggested. The main idea is to focus on comparing the particle number distribution functions for Hill's and Adair-Klotz's models instead of investigating a particular property (e.g. the dose response curve). The approach is valid for any model that can be mathematically related to the Hill model. The Adair-Klotz model is used to illustrate the technique. One main and two auxiliary similarity measures were introduced to compare the distributions in a quantitative way. Both time dependent and the equilibrium properties of the similarity measures were studied. Conclusions A strongly cooperative Adair-Klotz model can be replaced by a suitable Hill model in such a way that any property computed from the two models, even the one describing stochastic features, is approximately the same. The quantitative analysis showed that boundaries of the regions in the parameter space where the models behave in the same way exhibit a rather rich structure. PMID:21521501
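The mean-field observation that motivates the paper can be reproduced in a few lines. This is a sketch of the dose-response comparison only, not of the paper's stochastic, distribution-based framework; the parameter values are chosen for illustration.

```python
import numpy as np

def hill(x, K, n):
    """Fraction of bound receptor under the Hill equation."""
    return x**n / (K**n + x**n)

def adair_klotz_2(x, K1, K2):
    """Fractional saturation of a two-site Adair-Klotz receptor."""
    return (K1 * x + 2 * K1 * K2 * x**2) / (2 * (1 + K1 * x + K1 * K2 * x**2))

x = np.linspace(0.01, 10, 500)
# Strongly cooperative regime: the second binding is far easier than the first
ak = adair_klotz_2(x, K1=0.01, K2=100.0)
h = hill(x, K=(0.01 * 100.0) ** -0.5, n=2)   # matched Hill curve, K = (K1*K2)^(-1/2)

max_gap = np.max(np.abs(ak - h))              # crude similarity measure on the curves
print(max_gap)
```

With K2 ≫ K1 the two dose-response curves are nearly indistinguishable (maximum gap well below 1%), which is exactly the regime in which the paper asks whether stochastic properties also agree.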

  5. Modeling of Continuum Manipulators Using Pythagorean Hodograph Curves.

    PubMed

    Singh, Inderjeet; Amara, Yacine; Melingui, Achille; Mani Pathak, Pushparaj; Merzouki, Rochdi

    2018-05-10

    Research on continuum manipulators is developing rapidly in the context of bionic robotics because of their many advantages over conventional rigid manipulators. Their soft structure gives them inherent flexibility, which makes controlling them with high performance a major challenge. Before elaborating a control strategy for such robots, it is essential first to reconstruct the robot's behavior through an approximate behavioral model, which can be kinematic or dynamic depending on the robot's operating conditions. Kinematically, two types of modeling methods exist to describe robot behavior: quantitative (model-based) methods and qualitative (learning-based) methods. In kinematic modeling of continuum manipulators, a constant-curvature assumption is often adopted to simplify the model formulation. In this work, a quantitative modeling method based on Pythagorean hodograph (PH) curves is proposed. The aim is to obtain a three-dimensional reconstruction of the shape of a continuum manipulator with variable curvature, allowing the calculation of its inverse kinematic model (IKM). The PH-based kinematic modeling performs considerably better than other kinematic modeling methods regarding position accuracy, shape reconstruction, and computation time/cost, for two cases: free-load and variable-load manipulation. The method is applied to the compact bionic handling assistant (CBHA) manipulator for validation, and the results are compared with other IKMs developed for the CBHA manipulator.

  6. A cascading failure model for analyzing railway accident causation

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Tao; Li, Ke-Ping

    2018-01-01

    In this paper, a new cascading failure model is proposed for quantitatively analyzing railway accident causation. In the model, the loads of nodes are redistributed according to the strength of the causal relationships between the nodes. By analyzing the actual situation of the existing prevention measures, a critical threshold of the load parameter in the model is obtained. To verify the effectiveness of the proposed cascading model, simulation experiments of a train collision accident are performed. The results show that the cascading failure model can describe the cascading process of the railway accident more accurately than the previous models, and can quantitatively analyze the sensitivity and influence of the causes. In conclusion, this model can help reveal the latent rules of accident causation and thus reduce the occurrence of railway accidents.
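The load-redistribution mechanism described above can be sketched on a toy causal graph. All node loads, capacities, and link strengths below are invented; the paper's calibrated parameters and threshold analysis are not reproduced here.

```python
# Schematic cascading-failure model: when a cause-node fails, its load is
# shared among downstream nodes in proportion to causal-link strength; any
# node pushed over its capacity fails in turn.

def cascade(loads, capacity, edges, start):
    """edges: {node: [(neighbor, strength), ...]} directed causal links."""
    loads = dict(loads)                  # work on a copy
    failed, frontier = set(), [start]
    while frontier:
        node = frontier.pop()
        if node in failed:
            continue
        failed.add(node)
        out = [(nbr, w) for nbr, w in edges.get(node, []) if nbr not in failed]
        total = sum(w for _, w in out)
        for nbr, w in out:
            loads[nbr] += loads[node] * w / total   # share load by causal strength
            if loads[nbr] > capacity[nbr]:          # overload -> this node fails too
                frontier.append(nbr)
    return failed

# Toy causal chain of a collision accident: a -> b (strong), a -> c (weak), b -> d
edges = {"a": [("b", 0.9), ("c", 0.1)], "b": [("d", 1.0)]}
loads = {"a": 1.0, "b": 0.5, "c": 0.5, "d": 0.5}
capacity = {"a": 1.0, "b": 1.0, "c": 1.0, "d": 1.0}
print(sorted(cascade(loads, capacity, edges, "a")))   # → ['a', 'b', 'd']
```

The strong a→b link overloads b, which in turn overloads d, while the weakly coupled c survives, illustrating how link strength shapes the cascade.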

  7. How Can Biologically-Based Modeling of Arsenic Kinetics and Dynamics Inform the Risk Assessment Process? -- ETD

    EPA Science Inventory

    Quantitative biologically-based models describing key events in the continuum from arsenic exposure to the development of adverse health effects provide a framework to integrate information obtained across diverse research areas. For example, genetic polymorphisms in arsenic me...

  8. Quantitative software models for the estimation of cost, size, and defects

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Bright, L.; Decker, B.; Lum, K.; Mikulski, C.; Powell, J.

    2002-01-01

    The presentation will provide a brief overview of the SQI measurement program as well as describe each of these models and how they are currently being used in supporting JPL project, task and software managers to estimate and plan future software systems and subsystems.
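The abstract does not give the functional form of the SQI/JPL models, so as a generic stand-in for this class of estimator, here is the classic COCOMO 81 "organic mode" power law relating size to effort and schedule. The coefficients are the published COCOMO values, not JPL's.

```python
# Generic power-law software cost model in the spirit of COCOMO 81 (organic
# mode). These coefficients are the textbook values and stand in for the
# unspecified JPL/SQI models mentioned in the abstract.

def effort_person_months(ksloc, a=2.4, b=1.05):
    """Estimated effort from size in thousands of source lines of code."""
    return a * ksloc ** b

def schedule_months(effort, c=2.5, d=0.38):
    """Estimated calendar schedule from effort in person-months."""
    return c * effort ** d

e = effort_person_months(32)          # a hypothetical 32 KSLOC subsystem
print(round(e, 1), round(schedule_months(e), 1))
```

For a 32 KSLOC subsystem this yields roughly 91 person-months of effort over about 14 calendar months; real programs like SQI recalibrate such coefficients from historical project data.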

  9. Quantitative Adverse Outcome Pathways and Their ...

    EPA Pesticide Factsheets

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan...

  10. Predictive and mechanistic multivariate linear regression models for reaction development

    PubMed Central

    Santiago, Celine B.; Guo, Jing-Yao

    2018-01-01

    Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711

  11. One Model Fits All: Explaining Many Aspects of Number Comparison within a Single Coherent Model-A Random Walk Account

    ERIC Educational Resources Information Center

    Reike, Dennis; Schwarz, Wolf

    2016-01-01

    The time required to determine the larger of 2 digits decreases with their numerical distance, and, for a given distance, increases with their magnitude (Moyer & Landauer, 1967). One detailed quantitative framework to account for these effects is provided by random walk models. These chronometric models describe how number-related noisy…

  12. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, building on that, how to address the heavy time consumption of simulation. Results Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time steps in simulation that considerably improves efficiency without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed, adaptive time step is a practical solution in a cellular automata environment. PMID:15222901
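The asynchronous adaptive-time-step idea can be sketched in miniature: each "cell" integrates its own ODE with its own step size, and an event queue always advances whichever cell is furthest behind. The decay ODE, step-doubling error control, and tolerances below are illustrative choices, not the paper's scheme.

```python
import heapq, math

# Minimal asynchronous adaptive time stepping: each cell solves dx/dt = -k*x
# (chosen so the exact answer exp(-k*t) is known) with its own step size.
# Step-doubling (one full Euler step vs. two half steps) controls the error.

def simulate(cells, t_end, tol=1e-6):
    """cells: {name: (k, x0)}; returns {name: x(t_end)}."""
    heap = [(0.0, name, x0, 0.05) for name, (k, x0) in cells.items()]
    heapq.heapify(heap)                      # always pop the cell furthest behind
    state = {}
    while heap:
        t, name, x, dt = heapq.heappop(heap)
        if t >= t_end - 1e-12:
            state[name] = x
            continue
        k = cells[name][0]
        dt = min(dt, t_end - t)
        full = x + dt * (-k * x)             # one Euler step
        h = x + (dt / 2) * (-k * x)          # two half steps
        half = h + (dt / 2) * (-k * h)
        err = abs(full - half)
        if err > tol:
            heapq.heappush(heap, (t, name, x, dt / 2))        # reject, shrink step
        else:
            new_dt = dt * 2 if err < tol / 4 else dt          # accept, maybe grow
            heapq.heappush(heap, (t + dt, name, half, new_dt))
    return state

out = simulate({"fast": (5.0, 1.0), "slow": (0.1, 1.0)}, t_end=1.0)
print(out["fast"], math.exp(-5.0), out["slow"], math.exp(-0.1))
```

The fast-decaying cell is forced onto small steps while the slow cell keeps large ones, which is the source of the speedup in asynchronous schemes.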

  13. Quantitative characterization of genetic parts and circuits for plant synthetic biology.

    PubMed

    Schaumberg, Katherine A; Antunes, Mauricio S; Kassaw, Tessema K; Xu, Wenlong; Zalewski, Christopher S; Medford, June I; Prasad, Ashok

    2016-01-01

    Plant synthetic biology promises immense technological benefits, including the potential development of a sustainable bio-based economy through the predictive design of synthetic gene circuits. Such circuits are built from quantitatively characterized genetic parts; however, this characterization is a significant obstacle in work with plants because of the time required for stable transformation. We describe a method for rapid quantitative characterization of genetic plant parts using transient expression in protoplasts and dual luciferase outputs. We observed experimental variability in transient-expression assays and developed a mathematical model to describe, as well as statistical normalization methods to account for, this variability, which allowed us to extract quantitative parameters. We characterized >120 synthetic parts in Arabidopsis and validated our method by comparing transient expression with expression in stably transformed plants. We also tested >100 synthetic parts in sorghum (Sorghum bicolor) protoplasts, and the results showed that our method works in diverse plant groups. Our approach enables the construction of tunable gene circuits in complex eukaryotic organisms.

  14. A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.

    PubMed

    Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R

    2011-10-01

    It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including effective population size (density) at the expansion frontier. While our experimental system is a simplification of natural microbial communities, we argue that it constitutes proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.
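A toy version of the one-dimensional stepping-stone dynamics underlying the colony-frontier model can be simulated directly. This voter-model sketch (ring of demes, neighbor replacement, two neutral alleles) is a simplification for illustration, not the paper's extended model.

```python
import random

# Toy 1D stepping-stone / voter model: a ring of demes, each fixed for one of
# two neutral alleles; each event, a random deme is replaced by a random
# neighbor's offspring. Interfaces between allele "sectors" annihilate over
# time, so local heterozygosity decays - the genetic demixing seen at colony
# frontiers.

random.seed(0)
L = 200
pop = [i % 2 for i in range(L)]              # alternating alleles at time 0

def heterozygosity(pop):
    # fraction of neighboring deme pairs carrying different alleles
    return sum(pop[i] != pop[(i + 1) % len(pop)] for i in range(len(pop))) / len(pop)

h0 = heterozygosity(pop)                     # 1.0: every pair differs initially
for _ in range(20000):                       # ~100 replacement events per deme
    i = random.randrange(L)
    j = (i + random.choice((-1, 1))) % L
    pop[i] = pop[j]                          # deme i replaced by a neighbor
h1 = heterozygosity(pop)
print(h0, h1)
```

Measuring how `h1` decays with time, and how that decay depends on migration, is the kind of quantitative prediction the stepping-stone model makes for sector boundaries at a colony edge.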

  15. Transforming Boolean models to continuous models: methodology and application to T-cell receptor signaling

    PubMed Central

    Wittmann, Dominik M; Krumsiek, Jan; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Klamt, Steffen; Theis, Fabian J

    2009-01-01

    Background The understanding of regulatory and signaling networks has long been a core objective in Systems Biology. Knowledge about these networks is mainly of qualitative nature, which allows the construction of Boolean models, where the state of a component is either 'off' or 'on'. While often able to capture the essential behavior of a network, these models can never reproduce detailed time courses of concentration levels. Nowadays, however, experiments yield more and more quantitative data. An obvious question, therefore, is how qualitative models can be used to explain and predict the outcome of these experiments. Results In this contribution we present a canonical way of transforming Boolean into continuous models, where the use of multivariate polynomial interpolation allows transformation of logic operations into a system of ordinary differential equations (ODE). The method is standardized and can readily be applied to large networks. Other, more limited approaches to this task are briefly reviewed and compared. Moreover, we discuss and generalize existing theoretical results on the relation between Boolean and continuous models. As a test case a logical model is transformed into an extensive continuous ODE model describing the activation of T-cells. We discuss how parameters for this model can be determined such that quantitative experimental results are explained and predicted, including time-courses for multiple ligand concentrations and binding affinities of different ligands. This shows that from the continuous model we may obtain biological insights not evident from the discrete one. Conclusion The presented approach will facilitate the interaction between modeling and experiments. Moreover, it provides a straightforward way to apply quantitative analysis methods to qualitatively described systems. PMID:19785753
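The core construction can be shown on a two-input rule. A Boolean update function is extended to the unit cube by multilinear interpolation over its truth-table corners, and the continuous dynamics take the form dx/dt = f(inputs) − x. The rule below (C activated by A unless inhibited by B) is a toy example, not the T-cell model itself.

```python
import itertools

# Turn a Boolean update rule into a continuous right-hand side by multilinear
# interpolation over the corners of the unit hypercube (the truth table),
# then integrate dC/dt = f(A, B) - C.

def continuous(rule, n_inputs):
    def f(xs):
        total = 0.0
        for corner in itertools.product((0, 1), repeat=n_inputs):
            weight = 1.0
            for x, c in zip(xs, corner):
                weight *= x if c else (1.0 - x)   # interpolation weight of this corner
            total += weight * rule(*corner)
        return total
    return f

rule = lambda a, b: float(a and not b)            # Boolean: C* = A AND NOT B
f = continuous(rule, 2)

print(f([1.0, 0.0]))    # reproduces the Boolean table at the corners
print(f([0.8, 0.3]))    # graded activation in between

# Simple Euler integration of dC/dt = f(A, B) - C with constant inputs
c, dt = 0.0, 0.01
for _ in range(2000):
    c += dt * (f([0.8, 0.3]) - c)
print(round(c, 3))      # steady state approaches f(0.8, 0.3)
```

At the cube's corners the continuous function agrees exactly with the Boolean rule, and between corners it provides the graded kinetics that let such models fit quantitative time-course data.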

  16. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

    Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once such models have been developed, the response from the community has not always been enthusiastic, especially after it discovered how complex these models consequently are. They require a large number of parameters, not all of which can be easily (or at all) measured or identified, and many of which carry large uncertainties; they also demand from their users deep knowledge of most of the implemented physical, mechanical, chemical, and biological processes. The real, or perceived, complexity of these models then discourages users from applying them even to relatively simple problems, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code intercomparison, likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models with similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of these challenges using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  17. Grass Grows, the Cow Eats: A Simple Grazing Systems Model with Emergent Properties

    ERIC Educational Resources Information Center

    Ungar, Eugene David; Seligman, Noam G.; Noy-Meir, Imanuel

    2004-01-01

    We describe a simple, yet intellectually challenging model of grazing systems that introduces basic concepts in ecology and systems analysis. The practical is suitable for high-school and university curricula with a quantitative orientation, and requires only basic skills in mathematics and spreadsheet use. The model is based on Noy-Meir's (1975)…

  18. A Didactic Experiment and Model of a Flat-Plate Solar Collector

    ERIC Educational Resources Information Center

    Gallitto, Aurelio Agliolo; Fiordilino, Emilio

    2011-01-01

    We report on an experiment performed with a home-made flat-plate solar collector, carried out together with high-school students. To explain the experimental results, we propose a model that describes the heating process of the solar collector. The model accounts quantitatively for the experimental data. We suggest that solar-energy topics should…

  19. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology

    PubMed Central

    Zhang, Wen; Cao, Jieer

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in the well-validated multi-rigid body (MB) model to compute the value of weighted injury criterion to quantitatively assess drivers’ overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct from that at low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference of WIC value among three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles. PMID:29240789

  20. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology.

    PubMed

    Zhang, Wen; Cao, Jieer; Xu, Jun

    2017-01-01

    How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in the well-validated multi-rigid body (MB) model to compute the value of weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. It is revealed that the WIC (Weighted Injury Criteria) value variation law under high impact velocities is quite distinct from that at low impact velocities. In addition, the coupling effect can be elucidated by the fact that the difference of WIC value among three impact velocities under smaller impact angles tends to be distinctly higher than that under larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC under different collision positions and impact angles. Results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles.
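The weighted-injury aggregation idea above can be sketched generically. The body-region measures, reference limits, and weights below are illustrative assumptions only; the paper's actual WIC definition is not given in the abstract.

```python
# Schematic weighted injury score: normalize each body-region injury measure
# by a reference threshold and combine with weights. Weights and limits here
# are invented for illustration, not the paper's WIC calibration.

WEIGHTS = {"head_HIC": 0.4, "chest_accel_g": 0.35, "femur_force_kN": 0.25}
LIMITS = {"head_HIC": 1000.0, "chest_accel_g": 60.0, "femur_force_kN": 10.0}

def weighted_injury(measures):
    """Weighted sum of threshold-normalized injury measures (dimensionless)."""
    return sum(WEIGHTS[k] * measures[k] / LIMITS[k] for k in WEIGHTS)

mild = {"head_HIC": 250.0, "chest_accel_g": 20.0, "femur_force_kN": 2.0}
severe = {"head_HIC": 900.0, "chest_accel_g": 55.0, "femur_force_kN": 8.0}
print(weighted_injury(mild), weighted_injury(severe))
```

Sweeping such a score over a grid of impact velocities, angles, and contact positions is what produces the parametric sensitivity maps described in the abstract.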

  1. Development of a Quantitative Model Incorporating Key Events in a Hepatoxic Mode of Action to Predict Tumor Incidence

    EPA Science Inventory

    Biologically-Based Dose Response (BBDR) modeling of environmental pollutants can be utilized to inform the mode of action (MOA) by which compounds elicit adverse health effects. Chemicals that produce tumors are typically described as either genotoxic or non-genotoxic. One common...

  2. PREDICTING THE RISKS OF NEUROTOXIC VOLATILE ORGANIC COMPOUNDS BASED ON TARGET TISSUE DOSE.

    EPA Science Inventory

    Quantitative exposure-dose-response models relate the external exposure of a substance to the dose in the target tissue, and then relate the target tissue dose to production of adverse outcomes. We developed exposure-dose-response models to describe the affects of acute exposure...

  3. A Statistical Decision Model for Periodical Selection for a Specialized Information Center

    ERIC Educational Resources Information Center

    Dym, Eleanor D.; Shirey, Donald L.

    1973-01-01

    An experiment is described which attempts to define a quantitative methodology for the identification and evaluation of all possibly relevant periodical titles containing toxicological-biological information. A statistical decision model was designed and employed, along with yes/no criteria questions, a training technique and a quality control…

  4. Internet-based system for simulation-based medical planning for cardiovascular disease.

    PubMed

    Steele, Brooke N; Draney, Mary T; Ku, Joy P; Taylor, Charles A

    2003-06-01

    Current practice in vascular surgery utilizes only diagnostic and empirical data to plan treatments, which does not enable quantitative a priori prediction of the outcomes of interventions. We have previously described simulation-based medical planning methods to model blood flow in arteries and plan medical treatments based on physiologic models. An important consideration for the design of these patient-specific modeling systems is the accessibility to physicians with modest computational resources. We describe a simulation-based medical planning environment developed for the World Wide Web (WWW) using the Virtual Reality Modeling Language (VRML) and the Java programming language.

  5. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps of the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
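The point that storage times should emerge from the logistic mechanism rather than be drawn independently can be shown with a minimal event-driven sketch. All rates (delivery schedule, demand, growth rate) are hypothetical, and the growth model is a bare exponential, not the article's QMRA.

```python
import random

# Schematic discrete-event sketch: products arrive on a shelf in periodic
# delivery batches and leave one by one as customers buy them (FIFO). Shelf
# residence time emerges from the ordering/queueing mechanism and feeds a
# simple exponential-growth model for L. monocytogenes counts.

random.seed(1)
GROWTH_RATE = 0.02        # assumed log10 CFU increase per hour on the shelf
DELIVERY_EVERY = 24.0     # hours between deliveries
BATCH = 40                # units per delivery

shelf, residence_times = [], []
t, next_delivery = 0.0, 0.0
while t < 24.0 * 60:                       # simulate 60 days
    if t >= next_delivery:
        shelf.extend([t] * BATCH)          # record arrival time of each unit
        next_delivery += DELIVERY_EVERY
    t += random.expovariate(2.0)           # next purchase (~2 sales per hour)
    if shelf:
        residence_times.append(t - shelf.pop(0))   # FIFO: oldest unit sold first

log_increase = [GROWTH_RATE * rt for rt in residence_times]
tail = sorted(log_increase)[int(0.95 * len(log_increase))]   # 95th percentile
avg = sum(residence_times) / len(residence_times)
print(round(avg, 1), round(tail, 2))
```

Because units delivered in the same batch queue behind one another, residence times within a batch are strongly correlated, which is precisely the tail-shaping dependence that independent storage-time draws would miss.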

  6. Emergent Societal Effects of Crimino-Social Forces in an Animat Agent Model

    NASA Astrophysics Data System (ADS)

    Scogings, Chris J.; Hawick, Ken A.

    Societal behaviour can be studied at a causal level by perturbing a stable multi-agent model with new microscopic behaviours and observing the statistical response over an ensemble of simulated model systems. We report on the effects of introducing criminal and law-enforcing behaviours into a large scale animat agent model and describe the complex spatial agent patterns and population changes that result. Our well-established predator-prey substrate model provides a background framework against which these new microscopic behaviours can be trialled and investigated. We describe some quantitative results and some surprising conclusions concerning the overall societal health when individually anti-social behaviour is introduced.

  7. DigitalHuman (DH): An Integrative Mathematical Model of Human Physiology

    NASA Technical Reports Server (NTRS)

    Hester, Robert L.; Summers, Richard L.; Iliescu, Radu; Esters, Joyee; Coleman, Thomas G.

    2010-01-01

    Mathematical models and simulation are important tools in discovering the key causal relationships governing physiological processes and improving medical intervention when physiological complexity is a central issue. We have developed a model of integrative human physiology called DigitalHuman (DH) consisting of ~5000 variables modeling human physiology describing cardiovascular, renal, respiratory, endocrine, neural and metabolic physiology. Users can view time-dependent solutions and interactively introduce perturbations by altering numerical parameters to investigate new hypotheses. The variables, parameters and quantitative relationships as well as all other model details are described in XML text files. All aspects of the model, including the mathematical equations describing the physiological processes are written in XML open source, text-readable files. Model structure is based upon empirical data of physiological responses documented within the peer-reviewed literature. The model can be used to understand proposed physiological mechanisms and physiological interactions that may not be otherwise intuitively evident. Some of the current uses of this model include the analyses of renal control of blood pressure, the central role of the liver in creating and maintaining insulin resistance, and the mechanisms causing orthostatic hypotension in astronauts. Additionally the open source aspect of the modeling environment allows any investigator to add detailed descriptions of human physiology to test new concepts. The model accurately predicts both qualitative and more importantly quantitative changes in clinically and experimentally observed responses. DigitalHuman provides scientists a modeling environment to understand the complex interactions of integrative physiology. This research was supported by NIH HL 51971, NSF EPSCoR, and NASA.

  8. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes.

    PubMed

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-04-01

    Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a three-dimensional functional-structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed.
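The QTL-to-parameter link described above can be sketched in a few lines: a model parameter is predicted from a line's genotype as a baseline plus additive main effects plus pairwise epistatic terms. The locus names, effect sizes, and the "elongation rate" parameter below are invented for illustration; the actual values come from QTLNetwork in the paper's pipeline.

```python
# Hypothetical sketch of feeding QTL predictions into an FSPM parameter:
# parameter = baseline + additive main effects + pairwise epistatic effects,
# with alleles coded as -1/+1. All loci and effect sizes are invented.

MAIN_EFFECTS = {"q1": 0.12, "q2": -0.05, "q3": 0.08}   # per-locus additive effects
EPISTASIS = {("q1", "q3"): 0.04}                        # pairwise interaction effect
BASELINE = 1.0                                          # population-mean parameter value

def elongation_rate(genotype):
    """genotype: {locus: -1 or +1}, coded parental alleles."""
    rate = BASELINE
    rate += sum(eff * genotype[q] for q, eff in MAIN_EFFECTS.items())
    rate += sum(eff * genotype[a] * genotype[b] for (a, b), eff in EPISTASIS.items())
    return rate

line_a = {"q1": 1, "q2": 1, "q3": 1}      # two lines from a mapping population
line_b = {"q1": -1, "q2": 1, "q3": -1}
print(elongation_rate(line_a), elongation_rate(line_b))
```

In the full model, each such genotype-dependent parameter then drives a growth-grammar rule, so the whole simulated phenotype (not just the mapped trait) responds to the genotype.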

  9. A functional–structural model of rice linking quantitative genetic information with morphological development and physiological processes

    PubMed Central

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-01-01

    Background and Aims Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype–phenotype model, we present here a three-dimensional functional–structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. Methods The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Key Results Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. Conclusions We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed. PMID:21247905

  10. An experimental approach to identify dynamical models of transcriptional regulation in living cells

    NASA Astrophysics Data System (ADS)

    Fiore, G.; Menolascina, F.; di Bernardo, M.; di Bernardo, D.

    2013-06-01

We describe an innovative experimental approach, and a proof of principle investigation, for the application of System Identification techniques to derive quantitative dynamical models of transcriptional regulation in living cells. Specifically, we constructed an experimental platform for System Identification based on a microfluidic device, a time-lapse microscope, and a set of automated syringes, all controlled by a computer. The platform allows delivering a time-varying concentration of any molecule of interest to the cells trapped in the microfluidic device (input) and real-time monitoring of a fluorescent reporter protein (output) at a high sampling rate. We tested this platform on the GAL1 promoter in the yeast Saccharomyces cerevisiae driving expression of a green fluorescent protein (Gfp) fused to the GAL1 gene. We demonstrated that the System Identification platform enables accurate measurements of the input (sugar concentrations in the medium) and output (Gfp fluorescence intensity) signals, thus making it possible to apply System Identification techniques to obtain a quantitative dynamical model of the promoter. We explored and compared linear and nonlinear model structures in order to select the most appropriate to derive a quantitative model of the promoter dynamics. Our platform can be used to quickly obtain quantitative models of eukaryotic promoters, currently a complex and time-consuming process.
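As a minimal illustration of the System Identification step described above (not the authors' actual model or data), a first-order linear ARX model can be fitted to input/output records by least squares; here the input stands in for the delivered sugar concentration and the output for the measured Gfp intensity:

```python
import numpy as np

# Illustrative sketch: fit a first-order linear ARX model
#   y[k] = a*y[k-1] + b*u[k-1]
# to input/output data by least squares, as in classical System Identification.
def fit_arx1(u, y):
    """Estimate (a, b) for y[k] = a*y[k-1] + b*u[k-1]."""
    X = np.column_stack([y[:-1], u[:-1]])          # regressor matrix
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return theta                                   # (a_hat, b_hat)

# Synthetic data from a known system (a=0.8, b=0.5)
rng = np.random.default_rng(0)
u = rng.uniform(0, 1, 200)       # time-varying input (e.g. sugar pulses)
y = np.zeros(200)                # reporter output (e.g. Gfp intensity)
for k in range(1, 200):
    y[k] = 0.8 * y[k-1] + 0.5 * u[k-1]

a_hat, b_hat = fit_arx1(u, y)
print(round(a_hat, 3), round(b_hat, 3))  # close to 0.8, 0.5
```

With noiseless data the least-squares estimate recovers the true coefficients; on real fluorescence data one would compare several such model structures, as the abstract describes.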

  11. Development of Novel Repellents Using Structure - Activity Modeling of Compounds in the USDA Archival Database

    DTIC Science & Technology

    2011-01-01

used in efforts to develop QSAR models. Measurement of Repellent Efficacy Screening for Repellency of Compounds with Unknown Toxicology In screening...CPT) were used to develop Quantitative Structure Activity Relationship (QSAR) models to predict repellency. Successful prediction of novel...acylpiperidine QSAR models employed 4 descriptors to describe the relationship between structure and repellent duration. The ANN model of the carboxamides did not

  12. Mathematical modeling of reflectance and intrinsic fluorescence for cancer detection in human pancreatic tissue

    NASA Astrophysics Data System (ADS)

    Wilson, Robert H.; Chandra, Malavika; Scheiman, James; Simeone, Diane; McKenna, Barbara; Purdy, Julianne; Mycek, Mary-Ann

    2009-02-01

    Pancreatic adenocarcinoma has a five-year survival rate of only 4%, largely because an effective procedure for early detection has not been developed. In this study, mathematical modeling of reflectance and fluorescence spectra was utilized to quantitatively characterize differences between normal pancreatic tissue, pancreatitis, and pancreatic adenocarcinoma. Initial attempts at separating the spectra of different tissue types involved dividing fluorescence by reflectance, and removing absorption artifacts by applying a "reverse Beer-Lambert factor" when the absorption coefficient was modeled as a linear combination of the extinction coefficients of oxy- and deoxy-hemoglobin. These procedures demonstrated the need for a more complete mathematical model to quantitatively describe fluorescence and reflectance for minimally-invasive fiber-based optical diagnostics in the pancreas.
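The "reverse Beer-Lambert factor" idea can be sketched as follows; the extinction coefficients, concentrations, and path length below are hypothetical placeholders, not values from the study:

```python
import math

# Hedged sketch of the "reverse Beer-Lambert factor": model the absorption
# coefficient as a linear combination of oxy-/deoxy-hemoglobin extinction
# coefficients and undo the attenuation of a measured fluorescence signal.
# All numerical values are illustrative, not the study's.
def mu_a(c_hbo2, c_hb, eps_hbo2, eps_hb):
    """Absorption coefficient as a linear combination of chromophores."""
    return c_hbo2 * eps_hbo2 + c_hb * eps_hb

def correct_fluorescence(f_measured, c_hbo2, c_hb, eps_hbo2, eps_hb, path_cm):
    """Multiply by exp(+mu_a * L) to reverse the Beer-Lambert attenuation."""
    return f_measured * math.exp(mu_a(c_hbo2, c_hb, eps_hbo2, eps_hb) * path_cm)

# Round trip: attenuate a unit signal, then recover it
mu = mu_a(0.7, 0.3, 2.0, 1.5)          # hypothetical units
f_meas = 1.0 * math.exp(-mu * 0.1)     # measured (attenuated) signal
f_corr = correct_fluorescence(f_meas, 0.7, 0.3, 2.0, 1.5, 0.1)
print(round(f_corr, 6))  # 1.0
```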

  13. Consumers' behavior in quantitative microbial risk assessment for pathogens in raw milk: Incorporation of the likelihood of consumption as a function of storage time and temperature.

    PubMed

    Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier

    2016-02-01

Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and the second study, respectively. Our results confirm that overlooking the time-temperature dependency may lead to an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens.
Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
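A hedged sketch of the consumer-phase idea above: a logistic model maps each time-temperature combination to the probability that the serving is actually consumed, and Monte Carlo scenarios are thinned accordingly. The coefficients are invented for illustration, not the fitted values from the sensorial analysis:

```python
import math
import random

# Hypothetical logistic consumer-phase model: probability that a raw-milk
# serving stored for time_h hours at temp_c deg C is actually consumed.
def p_consumed(time_h, temp_c, b0=6.0, b_time=-0.05, b_temp=-0.15):
    z = b0 + b_time * time_h + b_temp * temp_c
    return 1.0 / (1.0 + math.exp(-z))

# In a Monte Carlo QMRA, extreme time-temperature draws are then rarely
# "consumed", removing the scenarios that would otherwise inflate the risk.
random.seed(1)
n = 10_000
kept = 0
for _ in range(n):
    t = random.uniform(0, 120)       # storage time, h
    temp = random.uniform(2, 30)     # storage temperature, deg C
    if random.random() < p_consumed(t, temp):
        kept += 1
print(kept / n)   # fraction of simulated servings actually consumed
```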

  14. Statistical Model to Analyze Quantitative Proteomics Data Obtained by 18O/16O Labeling and Linear Ion Trap Mass Spectrometry

    PubMed Central

    Jorge, Inmaculada; Navarro, Pedro; Martínez-Acedo, Pablo; Núñez, Estefanía; Serrano, Horacio; Alfranca, Arántzazu; Redondo, Juan Miguel; Vázquez, Jesús

    2009-01-01

Statistical models for the analysis of protein expression changes by stable isotope labeling are still poorly developed, particularly for data obtained by 16O/18O labeling. Besides, large-scale test experiments to validate the null hypothesis are lacking. Although the study of mechanisms underlying biological actions promoted by vascular endothelial growth factor (VEGF) on endothelial cells is of considerable interest, quantitative proteomics studies on this subject are scarce and have been performed after exposing cells to the factor for long periods of time. In this work we present the largest quantitative proteomics study to date on the short-term effects of VEGF on human umbilical vein endothelial cells by 18O/16O labeling. Current statistical models based on normality and variance homogeneity were found unsuitable to describe the null hypothesis in a large-scale test experiment performed on these cells, producing false expression changes. A random effects model was developed including four different sources of variance at the spectrum-fitting, scan, peptide, and protein levels. With the new model the number of outliers at scan and peptide levels was negligible in three large-scale experiments, and only one false protein expression change was observed in the test experiment among more than 1000 proteins. The new model allowed the detection of significant protein expression changes upon VEGF stimulation for 4 and 8 h. The consistency of the changes observed at 4 h was confirmed by a replica at a smaller scale and further validated by Western blot analysis of some proteins. Most of the observed changes have not been described previously and are consistent with a pattern of protein expression that dynamically changes over time following the evolution of the angiogenic response.
With this statistical model the 18O labeling approach emerges as a very promising and robust alternative to perform quantitative proteomics studies at a depth of several thousand proteins. PMID:19181660

  15. Monoclinic deformation of calcite crystals at ambient conditions

    NASA Astrophysics Data System (ADS)

    Przeniosło, R.; Fabrykiewicz, P.; Sosnowska, I.

    2016-09-01

High resolution synchrotron radiation powder diffraction shows that the average crystal structure of calcite at ambient conditions is described by the trigonal space group R-3c, but there is a systematic hkl-dependent Bragg peak broadening. A modelling of this anisotropic peak broadening with the microstrain model of Stephens (1999) [15] is presented. The observed correlations of the lattice parameters can be described by assuming a monoclinic-type deformation of the calcite crystallites. A quantitative model of this monoclinic deformation observed at ambient conditions is described with the space group C2/c. The monoclinic unit cell suggested at ambient conditions is related to the monoclinic unit cell reported for calcite at high pressure (Merrill and Bassett (1975) [10]).

  16. Overcoming pain thresholds with multilevel models-an example using quantitative sensory testing (QST) data.

    PubMed

    Hirschfeld, Gerrit; Blankenburg, Markus R; Süß, Moritz; Zernikow, Boris

    2015-01-01

The assessment of somatosensory function is a cornerstone of research and clinical practice in neurology. Recent initiatives have developed novel protocols for quantitative sensory testing (QST). Application of these methods led to intriguing findings, such as the presence of lower pain thresholds in healthy children compared to healthy adolescents. In this article, we (re-)introduce the basic concepts of signal detection theory (SDT) as a method to investigate such differences in somatosensory function in detail. SDT describes participants' responses according to two parameters, sensitivity and response bias. Sensitivity refers to individuals' ability to discriminate between painful and non-painful stimulations. Response bias refers to individuals' criterion for giving a "painful" response. We describe how multilevel models can be used to estimate these parameters and to overcome central critiques of these methods. To provide an example we apply these methods to data from the mechanical pain sensitivity test of the QST protocol. The results show that adolescents are more sensitive to mechanical pain and contradict the idea that younger children simply use more lenient criteria to report pain. Overall, we hope that the wider use of multilevel modeling to describe somatosensory functioning may advance neurology research and practice.
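The two SDT parameters can be computed directly from hit and false-alarm rates under the standard equal-variance Gaussian model; this is the textbook sketch, not the article's full multilevel implementation:

```python
from statistics import NormalDist

# Equal-variance Gaussian SDT: separate sensitivity (d') from response
# bias (criterion c) given hit and false-alarm rates.
def sdt_params(hit_rate, fa_rate):
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)             # discriminability
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # positive = conservative
    return d_prime, criterion

# Two hypothetical participants: same sensitivity, different response bias
d1, c1 = sdt_params(0.84, 0.16)   # roughly neutral criterion
d2, c2 = sdt_params(0.93, 0.31)   # more liberal ("painful"-prone) responder
print(round(d1, 2), round(c1, 2))
print(round(d2, 2), round(c2, 2))
```

Equal d' with different c is exactly the pattern that distinguishes "children feel more pain" from "children report pain more readily"; the multilevel extension estimates both parameters per participant while pooling across the sample.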

  17. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  18. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation/assessment is nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to environmental research centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.

  19. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models.

    PubMed

    Allen, R J; Rieger, T R; Musante, C J

    2016-03-01

Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models are rarely identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed "virtual patients." In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations.
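The selection idea can be sketched, in a deliberately toy form that is not the authors' algorithm, as rejection sampling: candidate virtual patients are kept with probability proportional to the observed density of their simulated baseline output, so no per-patient weights are needed afterwards:

```python
import math
import random
import statistics

# Toy rejection-sampling sketch: sample candidate virtual patients from a
# plausible parameter range, then accept each with probability proportional
# to the observed (Gaussian) density of its simulated baseline output.
random.seed(42)

def simulate_output(param):
    return 2.0 * param           # toy model: output is linear in the parameter

observed_mean, observed_sd = 10.0, 1.0
candidates = [random.uniform(2.0, 8.0) for _ in range(5000)]  # virtual patients

def accept(output):
    z = (output - observed_mean) / observed_sd
    return random.random() < math.exp(-0.5 * z * z)   # unnormalized Gaussian

population = [p for p in candidates if accept(simulate_output(p))]
mean_out = statistics.mean(simulate_output(p) for p in population)
print(len(population), round(mean_out, 1))  # subset matches the observed mean
```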

  20. Modeling of breath methane concentration profiles during exercise on an ergometer*

    PubMed Central

    Szabó, Anna; Unterkofler, Karl; Mochalski, Pawel; Jandacka, Martin; Ruzsanyi, Vera; Szabó, Gábor; Mohácsi, Árpád; Teschl, Susanne; Teschl, Gerald; King, Julian

    2016-01-01

    We develop a simple three compartment model based on mass balance equations which quantitatively describes the dynamics of breath methane concentration profiles during exercise on an ergometer. With the help of this model it is possible to estimate the endogenous production rate of methane in the large intestine by measuring breath gas concentrations of methane. PMID:26828421
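A one-compartment mass-balance sketch conveys the flavor of such a model (the paper's model has three compartments and fitted physiological parameters; the numbers here are illustrative):

```python
# One-compartment mass-balance sketch: breath methane concentration C obeys
#   dC/dt = (production - ventilation * C) / volume
def simulate_breath_ch4(production, ventilation, volume, c0=0.0,
                        dt=0.1, t_end=60.0):
    """Forward-Euler integration of the mass balance."""
    c, t, trace = c0, 0.0, []
    while t < t_end:
        c += dt * (production - ventilation * c) / volume
        t += dt
        trace.append(c)
    return trace

# Exercise raises ventilation, which washes methane out faster and lowers
# the steady-state breath concentration production/ventilation.
rest = simulate_breath_ch4(production=2.0, ventilation=8.0, volume=40.0)
exercise = simulate_breath_ch4(production=2.0, ventilation=40.0, volume=40.0)
print(round(rest[-1], 3), round(exercise[-1], 3))  # steady states 0.25 and 0.05
```

Inverting this relationship, a measured breath concentration at known ventilation yields the endogenous production rate, which is the estimation the abstract describes.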

  1. A COST-EFFECTIVENESS MODEL FOR THE ANALYSIS OF TITLE I ESEA PROJECT PROPOSALS, PART I-VII.

    ERIC Educational Resources Information Center

Abt, Clark C.

Seven separate reports describe an overview of a cost-effectiveness model and five submodels for evaluating the effectiveness of Elementary and Secondary Education Act Title I proposals. The design for the model attempts a quantitative description of education systems which may be programmed as a computer simulation to indicate the impact of a Title I…

  2. A mathematical function for the description of nutrient-response curve

    PubMed Central

    Ahmadi, Hamed

    2017-01-01

Several mathematical equations have been proposed to model nutrient-response curves for animals and humans, justified by goodness of fit and/or the underlying biological mechanism. In this paper, a functional form of a generalized quantitative model based on the Rayleigh distribution is derived for the description of nutrient-response phenomena. The three parameters governing the curve a) have biological interpretations, b) may be used to calculate reliable estimates of nutrient-response relationships, and c) provide the basis for deriving relationships between nutrients and physiological responses. The new function was successfully applied to fit nutritional data from 6 experiments covering a wide range of nutrients and responses. An evaluation and comparison were also carried out on simulated data sets to check the suitability of the new model and of a four-parameter logistic model for describing nutrient responses. This study indicates the usefulness and wide applicability of the newly introduced, simple and flexible model when applied as a quantitative approach to characterizing nutrient-response curves. This new mathematical way to describe nutrient-response data, with some useful biological interpretations, has the potential to be used as an alternative approach in modeling nutrient-response curves to estimate nutrient efficiency and requirements. PMID:29161271

  3. Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.

    PubMed

    Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir

    2013-10-31

    Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Modeling Synergistic Drug Inhibition of Mycobacterium tuberculosis Growth in Murine Macrophages

    DTIC Science & Technology

    2011-01-01

important application of metabolic network modeling is the ability to quantitatively model metabolic enzyme inhibition and predict bacterial growth...describe the extensions of this framework to model drug-induced growth inhibition of M. tuberculosis in macrophages. Mathematical framework: Fig. 1 shows...starting point, we used the previously developed iNJ661v model to represent the metabolic Fig. 1 Mathematical framework: a set of coupled models used to

  5. Improving Climate and Achievement in a Troubled Urban High School through the Talent Development Model.

    ERIC Educational Resources Information Center

    McPartland, James; Balfanz, Robert; Jordan, Will; Legters, Nettie

    1998-01-01

    A case study of a large nonselective urban high school in Baltimore (Maryland) describes the design and implementation of a comprehensive package of school reforms, the Talent Development Model with Career Academies. Qualitative and quantitative evidence is provided on significant improvements in school climate, student attendance, promotion…

  6. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  7. Enhancing the Quantitative Representation of Socioeconomic Conditions in the Shared Socio-economic Pathways (SSPs) using the International Futures Model

    NASA Astrophysics Data System (ADS)

    Rothman, D. S.; Siraj, A.; Hughes, B.

    2013-12-01

The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios is the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly from the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.

  8. "Could I return to my life?" Integrated Narrative Nursing Model in Education (INNE).

    PubMed

    Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sulla, Francesco; Sollami, Alfonso; Taffurelli, Chiara

    2018-03-28

The Integrated Narrative Nursing Model (INNM) is an approach that integrates the qualitative methodology typical of the human sciences with the quantitative methodology more often associated with the natural sciences. This complex model, which combines a focus on narrative with quantitative measures, has recently been effectively applied to the assessment of chronic patients. In this study, the model is applied to the planning phase of education (Integrated Narrative Nursing Education, INNE), and proves to be a valid instrument for the promotion of the current educational paradigm, which is centered on the engagement of both the patient and the caregiver in their own path of care. The aim of this study is therefore to describe the nurse's strategy in planning an educational intervention using the INNE model. The case of a 70-year-old woman with pulmonary neoplasm is described at her first admission to hospice. Each step conducted by the reference nurse, who uses INNE to record the nurse-patient narrative and collect subsequent questionnaires in order to create a shared educational plan, is also described. The information collected was then subjected, following a grounded methodology, to four levels of analysis: I. Needs Assessment; II. Narrative Diagnosis; III. Quantitative Outcome; IV. Integrated Outcome. Step IV, which derives from the integration of all levels of analysis, allows the nurse to define, even graphically, a conceptual map of the patient's needs, resources and perspectives, in a completely tailored manner. The INNE model offers valid methodological support for the professional who intends to educate the patient through an inter-subjective and engaged pathway between the professional, the patient and the socio-relational context.
It is a matter of adopting a complex vision that combines processes and methods requiring a solid scientific basis and advanced methodological expertise with active listening and empathy, skills which require emotional intelligence.

  9. Modeling Electronegative Impurity Concentrations in Liquid Argon Detectors

    NASA Astrophysics Data System (ADS)

    Tang, Wei; Li, Yichen; Thorn, Craig; Qian, Xin

    2017-01-01

Achieving a long electron lifetime is crucial to reaching the high performance of the large Liquid Argon Time Projection Chambers (LArTPCs) envisioned for next-generation neutrino experiments. We have built a quantitative model to describe the distribution and transport of impurities in a cryostat. Henry's constants for oxygen and water, which describe the partition of impurities between gaseous argon and liquid argon, have been deduced through this model from measurements in the BNL 20-L LAr test stand. These results indicate the importance of the gas purification system; prospects for large LArTPC detectors will also be discussed.
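Henry's-law partitioning itself is simple to state: at equilibrium the impurity mole fraction in the liquid is x = p/H. A toy calculation with a hypothetical Henry's constant (not the BNL-deduced values):

```python
# Henry's-law partition sketch: at equilibrium the liquid-phase mole
# fraction of an impurity is x = p_impurity / H. The constant used here
# is hypothetical; the actual O2/H2O constants in argon are what the
# abstract's model deduces from the test-stand data.
def liquid_mole_fraction(partial_pressure_atm, henry_const_atm):
    return partial_pressure_atm / henry_const_atm

p_o2 = 1e-9                                   # trace O2 in the gas, atm
x_liquid = liquid_mole_fraction(p_o2, henry_const_atm=1.0e3)
print(f"{x_liquid:.1e}")  # a large H keeps most of the impurity in the gas phase
```

This is why the gas purification system matters: a large Henry's constant concentrates the impurity in the gas volume, where it can be removed before it partitions into the liquid.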

  10. Final Report for Dynamic Models for Causal Analysis of Panel Data. Models for Change in Quantitative Variables, Part I Deterministic Models. Part II, Chapter 3.

    ERIC Educational Resources Information Center

    Hannan, Michael T.

    This document is part of a series of chapters described in SO 011 759. Addressing the question of effective models to measure change and the change process, the author suggests that linear structural equation systems may be viewed as steady state outcomes of continuous-change models and have rich sociological grounding. Two interpretations of the…

  11. Learning physical descriptors for materials science by compressed sensing

    NASA Astrophysics Data System (ADS)

    Ghiringhelli, Luca M.; Vybiral, Jan; Ahmetcik, Emre; Ouyang, Runhai; Levchenko, Sergey V.; Draxl, Claudia; Scheffler, Matthias

    2017-02-01

    The availability of big data in materials science offers new routes for analyzing materials properties and functions and achieving scientific understanding. Finding structure in these data that is not directly visible by standard tools and exploitation of the scientific information requires new and dedicated methodology based on approaches from statistical learning, compressed sensing, and other recent methods from applied mathematics, computer science, statistics, signal processing, and information science. In this paper, we explain and demonstrate a compressed-sensing based methodology for feature selection, specifically for discovering physical descriptors, i.e., physical parameters that describe the material and its properties of interest, and associated equations that explicitly and quantitatively describe those relevant properties. As showcase application and proof of concept, we describe how to build a physical model for the quantitative prediction of the crystal structure of binary compound semiconductors.
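The descriptor-discovery step can be illustrated with an l1-penalized sparse fit on synthetic data (LASSO solved by iterative soft-thresholding); this is a generic compressed-sensing sketch, not the authors' actual method or data:

```python
import numpy as np

# Generic compressed-sensing sketch: with 20 candidate descriptors of which
# only 2 truly matter, an l1-penalized least-squares fit (LASSO, solved by
# iterative soft-thresholding, ISTA) recovers a sparse descriptor set.
rng = np.random.default_rng(3)
n, p = 50, 20
X = rng.standard_normal((n, p))                # candidate descriptors
w_true = np.zeros(p)
w_true[[2, 7]] = [1.5, -2.0]                   # only these descriptors are relevant
y = X @ w_true                                 # target property

def lasso_ista(X, y, lam=0.01, lr=0.1, steps=2000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)                       # LS gradient
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft threshold
    return w

w = lasso_ista(X, y)
selected = np.flatnonzero(np.abs(w) > 0.1)
print(selected.tolist())   # indices of the descriptors the sparse fit keeps
```

The sparsity penalty is what turns a fit into feature *selection*: irrelevant descriptors are driven exactly to zero rather than merely made small.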

  12. xTract: software for characterizing conformational changes of protein complexes by quantitative cross-linking mass spectrometry.

    PubMed

    Walzthoeni, Thomas; Joachimiak, Lukasz A; Rosenberger, George; Röst, Hannes L; Malmström, Lars; Leitner, Alexander; Frydman, Judith; Aebersold, Ruedi

    2015-12-01

    Chemical cross-linking in combination with mass spectrometry generates distance restraints of amino acid pairs in close proximity on the surface of native proteins and protein complexes. In this study we used quantitative mass spectrometry and chemical cross-linking to quantify differences in cross-linked peptides obtained from complexes in spatially discrete states. We describe a generic computational pipeline for quantitative cross-linking mass spectrometry consisting of modules for quantitative data extraction and statistical assessment of the obtained results. We used the method to detect conformational changes in two model systems: firefly luciferase and the bovine TRiC complex. Our method discovers and explains the structural heterogeneity of protein complexes using only sparse structural information.

13. Structure–property reduced order model for viscosity prediction in single-component CO2-binding organic liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cantu, David C.; Malhotra, Deepika; Koech, Phillip K.

    2016-01-01

CO2 capture from power generation with aqueous solvents remains energy intensive due to the high water content of the current technology, or the high viscosity of non-aqueous alternatives. Quantitative reduced models, connecting molecular structure to bulk properties, are key for developing structure-property relationships that enable molecular design. In this work, we describe such a model that quantitatively predicts viscosities of CO2-binding organic liquids (CO2BOLs) based solely on molecular structure and the amount of bound CO2. The functional form of the model correlates the viscosity with the CO2 loading and an electrostatic term describing the charge distribution between the CO2-bearing functional group and the proton-receiving amine. Molecular simulations identify the proton shuttle between these groups within the same molecule to be the critical indicator of low viscosity. The model, developed to allow for quick screening of solvent libraries, paves the way towards the rational design of low-viscosity non-aqueous solvent systems for post-combustion CO2 capture. Following these theoretical recommendations, synthesis of promising candidates and viscosity measurements provide experimental validation and verification.

  14. A Comparison between Elementary School Students' Mental Models and Visualizations in Textbooks for the Concept of Atom

    ERIC Educational Resources Information Center

    Polat-Yaseen, Zeynep

    2012-01-01

    This study was designed for two major goals, which are to describe students' mental models about atom concept from 6th to 8th grade and to compare students' mental models with visual representations of atom in textbooks. Qualitative and quantitative data were collected with 4 open-ended questions including drawings which were quantified using the…

  15. A model describing vestibular detection of body sway motion.

    NASA Technical Reports Server (NTRS)

    Nashner, L. M.

    1971-01-01

    An experimental technique was developed which facilitated the formulation of a quantitative model describing vestibular detection of body sway motion in a postural response mode. All cues, except vestibular ones, which gave a subject an indication that he was beginning to sway, were eliminated using a specially designed two-degree-of-freedom platform; body sway was then induced and resulting compensatory responses at the ankle joints measured. Hybrid simulation compared the experimental results with models of the semicircular canals and utricular otolith receptors. Dynamic characteristics of the resulting canal model compared closely with characteristics of models which describe eye movement and subjective responses to body rotational motions. The average threshold level, in the postural response mode, however, was considerably lower. Analysis indicated that the otoliths probably play no role in the initial detection of body sway motion.

  16. Quantitative measures for redox signaling.

    PubMed

    Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M

    2016-07-01

Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes including the antioxidant response, phosphokinase signal transduction and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we have outlined some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration-dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges with implementing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Quantitative 3D analysis of shape dynamics of the left ventricle

    NASA Astrophysics Data System (ADS)

    Scowen, Barry C.; Smith, Stephen L.; Vannan, Mani A.; Arsenault, Marie

    1998-07-01

There is an established link between Left Ventricular (LV) geometry and its performance. As a consequence of ischemic heart disease and the attempt to relieve myocardial tissue stress, ventricle shape begins to distort from a conical to a spherical geometry, with a reduction in the pumping efficiency of the chamber. If untreated, premature heart failure will result. To increase the chances of successful treatment, it is important for the benefit of the patient to detect these abnormalities as soon as possible. The development of a technique to characterize and quantify the shape of the left ventricle is described here. The system described in this paper uses a novel helix model which combines the advantages of current two-dimensional (2D) quantitative measures, which provide limited information, with 3D qualitative methods, which provide accurate reconstructions of the LV using computationally expensive rendering schemes. A phantom object and a dog ventricle (normal/abnormal) were imaged and helical models constructed. The results are encouraging, with differences between normal and abnormal ventricles determinable in both diastole and systole. Further work entails building a library of subjects in order to determine the relationship between ventricle geometry and quantitative measurements.

  18. A quantitative framework for the forward design of synthetic miRNA circuits.

    PubMed

    Bloom, Ryan J; Winkler, Sally M; Smolke, Christina D

    2014-11-01

    Synthetic genetic circuits incorporating regulatory components based on RNA interference (RNAi) have been used in a variety of systems. A comprehensive understanding of the parameters that determine the relationship between microRNA (miRNA) and target expression levels is lacking. We describe a quantitative framework supporting the forward engineering of gene circuits that incorporate RNAi-based regulatory components in mammalian cells. We developed a model that captures the quantitative relationship between miRNA and target gene expression levels as a function of parameters, including mRNA half-life and miRNA target-site number. We extended the model to synthetic circuits that incorporate protein-responsive miRNA switches and designed an optimized miRNA-based protein concentration detector circuit that noninvasively measures small changes in the nuclear concentration of β-catenin owing to induction of the Wnt signaling pathway. Our results highlight the importance of methods for guiding the quantitative design of genetic circuits to achieve robust, reliable and predictable behaviors in mammalian cells.
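The relationship the framework captures, target expression as a function of miRNA level, mRNA half-life, and target-site number, can be sketched with a minimal steady-state balance. The functional form and the repression constant `k_rep` are assumptions for illustration, not the published parameterization:

```python
import math

def target_mrna_steady_state(transcription_rate, mrna_half_life_h,
                             mirna_level, target_site_number, k_rep=1.0):
    """Minimal stand-in for a miRNA knockdown model: steady-state target mRNA
    equals synthesis over total decay, where miRNA contributes an extra
    degradation term that scales with miRNA level and target-site number."""
    basal_decay = math.log(2) / mrna_half_life_h        # per hour, from half-life
    mirna_decay = k_rep * mirna_level * target_site_number
    return transcription_rate / (basal_decay + mirna_decay)

# More target sites -> stronger knockdown at the same miRNA level
weak = target_mrna_steady_state(10.0, 2.0, 0.5, 1)
strong = target_mrna_steady_state(10.0, 2.0, 0.5, 4)
```

Even this toy form shows why target-site number is a useful design knob: it tunes the sensitivity of the output to the miRNA input.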

  19. The Dynamics of Mobile Learning Utilization in Vocational Education: Frame Model Perspective Review

    ERIC Educational Resources Information Center

    Mahande, Ridwan Daud; Susanto, Adhi; Surjono, Herman Dwi

    2017-01-01

    This study aimed to describe the dynamics of content aspects, user aspects and social aspects of mobile learning utilization (m-learning) in vocational education from the FRAME Model perspective review. This study was quantitative descriptive research. The population in this study was teachers and students of state vocational school and private…

  20. Effectiveness of Facebook Based Learning to Enhance Creativity among Islamic Studies Students by Employing Isman Instructional Design Model

    ERIC Educational Resources Information Center

    Alias, Norlidah; Siraj, Saedah; Daud, Mohd Khairul Azman Md; Hussin, Zaharah

    2013-01-01

    The study examines the effectiveness of Facebook based learning to enhance creativity among Islamic Studies students in the secondary educational setting in Malaysia. It describes the design process by employing the Isman Instructional Design Model. A quantitative study was carried out using experimental method and background survey. The…

  1. An Experimental Model for Analyzing Strategies for Financing Higher Education in New York State.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Office of Postsecondary Research, Information Systems, and Institutional Aid.

    Described is an experimental, quantitative model developed by the New York State Education Department to evaluate state-level financing strategies for higher education. It can be used to address a variety of questions and takes into account a host of direct and indirect relationships. It uses computer software and optimization algorithms developed…

  2. A quantitative model for transforming reflectance spectra into the Munsell color space using cone sensitivity functions and opponent process weights.

    PubMed

    D'Andrade, Roy G; Romney, A Kimball

    2003-05-13

    This article presents a computational model of the process through which the human visual system transforms reflectance spectra into perceptions of color. Using physical reflectance spectra data and standard human cone sensitivity functions we describe the transformations necessary for predicting the location of colors in the Munsell color space. These transformations include quantitative estimates of the opponent process weights needed to transform cone activations into Munsell color space coordinates. Using these opponent process weights, the Munsell position of specific colors can be predicted from their physical spectra with a mean correlation of 0.989.
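The transformation pipeline, reflectance spectrum to cone activations to opponent coordinates, is two linear maps. The sketch below uses random placeholder matrices in place of the real cone fundamentals and the fitted opponent-process weights, so only the structure (not the numbers) reflects the article's model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: 31 wavelength samples (400-700 nm), 3 cone classes (L, M, S).
# Real cone sensitivity functions and fitted opponent weights would replace these.
cone_sensitivities = rng.random((31, 3))        # placeholder sensitivity curves
opponent_weights = rng.standard_normal((3, 3))  # placeholder opponent-process weights

def reflectance_to_opponent(reflectance):
    """Map a reflectance spectrum to three opponent coordinates:
    cone activations first, then a linear opponent transform."""
    cones = cone_sensitivities.T @ reflectance   # (3,) cone activations
    return opponent_weights @ cones              # (3,) opponent coordinates

spectrum = rng.random(31)
coords = reflectance_to_opponent(spectrum)
```

With real inputs, `coords` would be compared against Munsell hue/value/chroma positions to estimate the opponent weights by regression.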

  3. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models

    PubMed Central

    Rieger, TR; Musante, CJ

    2016-01-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed “virtual patients.” In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations. PMID:27069777
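Selecting (rather than weighting) a virtual population can be illustrated with simple acceptance-rejection against the clinical target distribution. The single-biomarker "virtual patient" and the Gaussian target below are stand-ins for a mechanistic model output and real cohort statistics:

```python
import math
import random
import statistics

random.seed(1)

# Virtual patients: alternative parameterizations summarized here by one
# model output (a biomarker), drawn from a broad prior in place of a real
# mechanistic simulation.
virtual_patients = [random.uniform(0.0, 10.0) for _ in range(20000)]

def acceptance(x, mu, sigma):
    """Unnormalized target density of the observed clinical population."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Select, rather than weight: keep each virtual patient with probability
# proportional to the clinical target density (acceptance-rejection).
target_mu, target_sigma = 5.0, 1.0
virtual_population = [p for p in virtual_patients
                      if random.random() < acceptance(p, target_mu, target_sigma)]

pop_mean = statistics.mean(virtual_population)
```

Because every retained patient enters with weight one, no single parameterization can dominate the population statistics, which is the overrepresentation risk the selection approach mitigates.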

  4. Quantitative NDE applied to composites and metals

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.; Parker, F. Raymond; Heath, D. Michele; Welch, Christopher S.

    1989-01-01

    Research at the NASA/Langley Research Center concerning quantitative NDE of composites and metals is reviewed. The relationship between ultrasonics and polymer cure is outlined. NDE models are presented, which can be used to develop measurement technologies for characterizing the curing of a polymer system for composite materials. The models can be used to determine the glass transition temperature, the degree of cure, and the cure rate. The application of the model to control autoclave processing of composite materials is noted. Consideration is given to the use of thermal diffusion models combined with controlled thermal input measurements to determine the thermal diffusivity of materials. Also, a two-dimensional physical model is described that permits delaminations in samples of Space Shuttle Solid Rocket Motors to be detected in thermograms in the presence of cooling effects and uneven heating.

  5. Quantitative colorimetry of atherosclerotic plaque using the L*a*b* color space during angioscopy for the detection of lipid cores underneath thin fibrous caps.

    PubMed

    Ishibashi, Fumiyuki; Yokoyama, Shinya; Miyahara, Kengo; Dabreo, Alexandra; Weiss, Eric R; Iafrati, Mark; Takano, Masamichi; Okamatsu, Kentaro; Mizuno, Kyoichi; Waxman, Sergio

    2007-12-01

Yellow plaques seen during angioscopy are thought to represent lipid cores underneath thin fibrous caps (LCTCs) and may be indicative of vulnerable sites. However, plaque color assessment during angioscopy has been criticized because of its qualitative nature. The purpose of the present study was to test the ability of a quantitative colorimetric system to measure the yellow color intensity of atherosclerotic plaques during angioscopy and to characterize the color of LCTCs. Using angioscopy and a quantitative colorimetry system based on the L*a*b* color space [L* describes brightness (-100 to +100), b* describes blue to yellow (-100 to +100)], the optimal conditions for measuring plaque color were determined in three flat standard color samples and five artificial plaque models in cylindrical porcine carotid arteries. In 88 human tissue samples, the colorimetric characteristics of LCTCs were then evaluated. In in-vitro samples and ex-vivo plaque models, brightness L* between 40 and 80 was determined to be optimal for acquiring b* values, and the variables unique to angioscopy in color perception did not impact b* values after adjusting for brightness L* by manipulating light or distance. In ex-vivo human tissue samples, a b* value ≥ 23 (35.91 ± 8.13) with L* between 40 and 80 was associated with LCTCs (fibrous caps <100 μm). Atherosclerotic plaque color can be consistently measured during angioscopy with quantitative colorimetry. High yellow color intensity, determined by this system, was associated with LCTCs. Quantitative colorimetry during angioscopy may be used for detection of LCTCs, which may be markers of vulnerability.
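The reported cutoffs reduce to a small decision rule: interpret b* only inside the validated brightness window, then threshold it. A sketch using the study's reported values (the return strings are illustrative):

```python
def classify_plaque(L_star, b_star, b_cutoff=23.0):
    """Apply the reported colorimetric cutoffs: the yellow b* value is
    interpreted only when brightness L* lies in the validated 40-80 window;
    within it, b* >= 23 flags a possible lipid core under a thin fibrous cap."""
    if not 40.0 <= L_star <= 80.0:
        return "adjust light or distance"  # outside the reliable L* range
    return "possible LCTC" if b_star >= b_cutoff else "low yellow intensity"

result = classify_plaque(60.0, 35.9)
```

Gating on L* first matters because the study found b* readings are only brightness-independent once L* is brought into range.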

  6. Analysis of Radio Frequency Surveillance Systems for Air Traffic Control : Volume 1. Text.

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  7. [Improved device and method for determination of protein digestibility in vitro].

    PubMed

    Lipatov, N N; Iudina, S B; Lisitsyn, A B

    1994-01-01

A ten-cell device for modelling the enzymatic hydrolysis of food proteins by the acidic and basic proteases of the human alimentary canal is described. A new procedure for calculating a quantitative characteristic of protein digestion in vitro is presented.

  8. Toward a descriptive model of galactic cosmic rays in the heliosphere

    NASA Technical Reports Server (NTRS)

    Mewaldt, R. A.; Cummings, A. C.; Adams, James H., Jr.; Evenson, Paul; Fillius, W.; Jokipii, J. R.; Mckibben, R. B.; Robinson, Paul A., Jr.

    1988-01-01

    Researchers review the elements that enter into phenomenological models of the composition, energy spectra, and the spatial and temporal variations of galactic cosmic rays, including the so-called anomalous cosmic ray component. Starting from an existing model, designed to describe the behavior of cosmic rays in the near-Earth environment, researchers suggest possible updates and improvements to this model, and then propose a quantitative approach for extending such a model into other regions of the heliosphere.

  9. Building Quantitative Hydrologic Storylines from Process-based Models for Managing Water Resources in the U.S. Under Climate-changed Futures

    NASA Astrophysics Data System (ADS)

    Arnold, J.; Gutmann, E. D.; Clark, M. P.; Nijssen, B.; Vano, J. A.; Addor, N.; Wood, A.; Newman, A. J.; Mizukami, N.; Brekke, L. D.; Rasmussen, R.; Mendoza, P. A.

    2016-12-01

    Climate change narratives for water-resource applications must represent the change signals contextualized by hydroclimatic process variability and uncertainty at multiple scales. Building narratives of plausible change includes assessing uncertainties across GCM structure, internal climate variability, climate downscaling methods, and hydrologic models. Work with this linked modeling chain has dealt mostly with GCM sampling directed separately to either model fidelity (does the model correctly reproduce the physical processes in the world?) or sensitivity (of different model responses to CO2 forcings) or diversity (of model type, structure, and complexity). This leaves unaddressed any interactions among those measures and with other components in the modeling chain used to identify water-resource vulnerabilities to specific climate threats. However, time-sensitive, real-world vulnerability studies typically cannot accommodate a full uncertainty ensemble across the whole modeling chain, so a gap has opened between current scientific knowledge and most routine applications for climate-changed hydrology. To close that gap, the US Army Corps of Engineers, the Bureau of Reclamation, and the National Center for Atmospheric Research are working on techniques to subsample uncertainties objectively across modeling chain components and to integrate results into quantitative hydrologic storylines of climate-changed futures. Importantly, these quantitative storylines are not drawn from a small sample of models or components. Rather, they stem from the more comprehensive characterization of the full uncertainty space for each component. Equally important from the perspective of water-resource practitioners, these quantitative hydrologic storylines are anchored in actual design and operations decisions potentially affected by climate change. 
This talk will describe part of our work characterizing variability and uncertainty across modeling chain components and their interactions using newly developed observational data, models and model outputs, and post-processing tools for making the resulting quantitative storylines most useful in practical hydrology applications.

  10. Quantitative Radiomics System Decoding the Tumor Phenotype | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Our goal is to construct a publicly available computational radiomics system for the objective and automated extraction of quantitative imaging features that we believe will yield biomarkers of greater prognostic value compared with routinely extracted descriptors of tumor size. We will create a generalized, open, portable, and extensible radiomics platform that is widely applicable across cancer types and imaging modalities and describe how we will use lung and head and neck cancers as models to validate our developments.

  11. The SAM framework: modeling the effects of management factors on human behavior in risk analysis.

    PubMed

    Murphy, D M; Paté-Cornell, M E

    1996-08-01

    Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.

  12. Quantitative Systems Pharmacology Modeling of Acid Sphingomyelinase Deficiency and the Enzyme Replacement Therapy Olipudase Alfa Is an Innovative Tool for Linking Pathophysiology and Pharmacology.

    PubMed

    Kaddi, Chanchala D; Niesner, Bradley; Baek, Rena; Jasper, Paul; Pappas, John; Tolsma, John; Li, Jing; van Rijn, Zachary; Tao, Mengdi; Ortemann-Renon, Catherine; Easton, Rachael; Tan, Sharon; Puga, Ana Cristina; Schuchman, Edward H; Barrett, Jeffrey S; Azer, Karim

    2018-06-19

    Acid sphingomyelinase deficiency (ASMD) is a rare lysosomal storage disorder with heterogeneous clinical manifestations, including hepatosplenomegaly and infiltrative pulmonary disease, and is associated with significant morbidity and mortality. Olipudase alfa (recombinant human acid sphingomyelinase) is an enzyme replacement therapy under development for the non-neurological manifestations of ASMD. We present a quantitative systems pharmacology (QSP) model supporting the clinical development of olipudase alfa. The model is multiscale and mechanistic, linking the enzymatic deficiency driving the disease to molecular-level, cellular-level, and organ-level effects. Model development was informed by natural history, and preclinical and clinical studies. By considering patient-specific pharmacokinetic (PK) profiles and indicators of disease severity, the model describes pharmacodynamic (PD) and clinical end points for individual patients. The ASMD QSP model provides a platform for quantitatively assessing systemic pharmacological effects in adult and pediatric patients, and explaining variability within and across these patient populations, thereby supporting the extrapolation of treatment response from adults to pediatrics. © 2018 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  13. Wind tunnel model surface gauge for measuring roughness

    NASA Technical Reports Server (NTRS)

    Vorburger, T. V.; Gilsinn, D. E.; Teague, E. C.; Giauque, C. H. W.; Scire, F. E.; Cao, L. X.

    1987-01-01

Research on the optical inspection of surface roughness has proceeded along two lines: first, toward a quantitative understanding of light scattering from metal surfaces and the appropriate models to describe the surfaces themselves; second, toward the development of a practical instrument for measuring the rms roughness of high-performance wind tunnel models with smooth finishes. The research is summarized, with emphasis on the second line.

  14. Quantitative model of super-Arrhenian behavior in glass forming materials

    NASA Astrophysics Data System (ADS)

    Caruthers, J. M.; Medvedev, G. A.

    2018-05-01

The key feature of glass forming liquids is the super-Arrhenian temperature dependence of the mobility, where the mobility can increase by ten orders of magnitude or more as the temperature is decreased if crystallization does not intervene. A fundamental description of the super-Arrhenian behavior has been developed; specifically, the logarithm of the relaxation time is a linear function of 1/Ūx, where Ūx is the independently determined excess molar internal energy and the slope B is a material constant. This one-parameter mobility model quantitatively describes data for 21 glass forming materials, which are all the materials for which there are sufficient experimental data for analysis. The effect of pressure on the logarithm of the mobility is also described using the same Ūx(T, p) function determined from the difference between the liquid and crystalline internal energies. It is also shown that B is well correlated with the heat of fusion. The prediction of the B/Ūx model is compared to the Adam and Gibbs 1/TS̄x model, where the B/Ūx model is significantly better in unifying the full complement of mobility data. The implications of the B/Ūx model for the development of a fundamental description of glass are discussed.
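The one-parameter form can be exercised numerically: given log10(τ) = log10(τ0) + B/Ūx, the constant B is the least-squares slope of log10(τ) against 1/Ūx. Both constants below are made up for the check, not values from the article:

```python
# Sanity check of the one-parameter form log10(tau) = log10(tau0) + B/Ux:
# synthesize mobility data from assumed constants, then recover B as the
# least-squares slope of log10(tau) against 1/Ux.
B_true, log_tau0 = 120.0, -14.0
Ux_values = [8.0, 10.0, 12.0, 16.0, 24.0]  # excess molar internal energy (arb. units)
log_tau = [log_tau0 + B_true / u for u in Ux_values]

x = [1.0 / u for u in Ux_values]
n = len(x)
x_mean = sum(x) / n
y_mean = sum(log_tau) / n
B_fit = (sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, log_tau))
         / sum((xi - x_mean) ** 2 for xi in x))
```

In practice Ūx(T, p) comes from independent thermodynamic data, so the fit has genuinely one free material constant.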

  15. Non-thermal plasma destruction of allyl alcohol in waste gas: kinetics and modelling

    NASA Astrophysics Data System (ADS)

    DeVisscher, A.; Dewulf, J.; Van Durme, J.; Leys, C.; Morent, R.; Van Langenhove, H.

    2008-02-01

Non-thermal plasma treatment is a promising technique for the destruction of volatile organic compounds in waste gas. A relatively unexplored technique is the atmospheric negative dc multi-pin-to-plate glow discharge. This paper reports experimental results of allyl alcohol degradation and ozone production in this type of plasma. A new model was developed to describe these processes quantitatively. The model contains a detailed chemical degradation scheme, and describes the physics of the plasma by assuming that the fraction of electrons that takes part in chemical reactions is an exponential function of the reduced field. The model captured the experimental kinetic data with a standard deviation of less than 2 ppm.
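The closure assumption, a reactive electron fraction that depends exponentially on the reduced field, can be sketched as below. Both the saturating form and the scale constant are guesses at one plausible reading of the abstract, not the paper's fitted expression:

```python
import math

def reactive_electron_fraction(reduced_field_td, field_scale_td=150.0):
    """One plausible reading of the model's closure: the fraction of electrons
    participating in chemistry as an exponential function of the reduced
    field E/N (in Townsend). Form and scale constant are illustrative."""
    return math.exp(-field_scale_td / reduced_field_td)

f_low = reactive_electron_fraction(100.0)
f_high = reactive_electron_fraction(200.0)
```

A closure like this collapses the unknown electron energy distribution into a single field-dependent scalar, which is what lets a detailed chemistry scheme run without a full plasma kinetics solver.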

  16. Quantitative trait locus gene mapping: a new method for locating alcohol response genes.

    PubMed

    Crabbe, J C

    1996-01-01

    Alcoholism is a multigenic trait with important non-genetic determinants. Studies with genetic animal models of susceptibility to several of alcohol's effects suggest that several genes contributing modest effects on susceptibility (Quantitative Trait Loci, or QTLs) are important. A new technique of QTL gene mapping has allowed the identification of the location in mouse genome of several such QTLs. The method is described, and the locations of QTLs affecting the acute alcohol withdrawal reaction are described as an example of the method. Verification of these QTLs in ancillary studies is described and the strengths, limitations, and future directions to be pursued are discussed. QTL mapping is a promising method for identifying genes in rodents with the hope of directly extrapolating the results to the human genome. This review is based on a paper presented at the First International Congress of the Latin American Society for Biomedical Research on Alcoholism, Santiago, Chile, November 1994.

  17. Cultural consensus modeling to measure transactional sex in Swaziland: Scale building and validation.

    PubMed

    Fielding-Miller, Rebecca; Dunkle, Kristin L; Cooper, Hannah L F; Windle, Michael; Hadley, Craig

    2016-01-01

Transactional sex is associated with increased risk of HIV and gender-based violence in southern Africa and around the world. However, the typical quantitative operationalization, "the exchange of gifts or money for sex," can be at odds with the wide array of relationship types and motivations described in qualitative explorations. To build on the strengths of both qualitative and quantitative research streams, we used cultural consensus models to identify distinct models of transactional sex in Swaziland. The process allowed us to build and validate emic scales of transactional sex, while identifying key informants for qualitative interviews within each model to contextualize women's experiences and risk perceptions. We used logistic and multinomial logistic regression models to measure associations with condom use and social status outcomes. Fieldwork was conducted between November 2013 and December 2014 in the Hhohho and Manzini regions. We identified three distinct models of transactional sex in Swaziland based on 124 Swazi women's emic valuation of what they hoped to receive in exchange for sex with their partners. In a clinic-based survey (n = 406), consensus model scales were more sensitive to condom use than the etic definition. Model consonance had distinct effects on social status for the three different models. Transactional sex is better measured as an emic spectrum of expectations within a relationship, rather than an etic binary relationship type. Cultural consensus models allowed us to blend qualitative and quantitative approaches to create an emically valid quantitative scale grounded in qualitative context. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Doing Research on Education for Sustainable Development

    ERIC Educational Resources Information Center

    Reunamo, Jyrki; Pipere, Anita

    2011-01-01

    Purpose: The purpose of this paper is to describe the research preferences and differences of education for sustainable development (ESD) researchers. A model with the continuums assimilation-accommodation and adaptation-agency was applied resulting in quantitative, qualitative, theoretic and participative research orientations.…

  19. An analysis of radio frequency surveillance systems for air traffic control volume II: appendixes

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  20. Learning Factors Transfer Analysis: Using Learning Curve Analysis to Automatically Generate Domain Models

    ERIC Educational Resources Information Center

    Pavlik, Philip I. Jr.; Cen, Hao; Koedinger, Kenneth R.

    2009-01-01

    This paper describes a novel method to create a quantitative model of an educational content domain of related practice item-types using learning curves. By using a pairwise test to search for the relationships between learning curves for these item-types, we show how the test results in a set of pairwise transfer relationships that can be…

  1. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
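A doubling-time metric of this kind can be estimated from the exponential phase of an amplification curve. The sketch below fits log2(copies) against time and inverts the slope; it is a hypothetical illustration of the idea, and the study's exact IDT definition may differ in detail:

```python
import math

def isothermal_doubling_time(times_min, copies):
    """Estimate a doubling time from the exponential phase of an isothermal
    amplification curve: least-squares slope of log2(copies) vs time,
    inverted to minutes per doubling (analogous to qPCR efficiency)."""
    y = [math.log2(c) for c in copies]
    n = len(times_min)
    t_mean = sum(times_min) / n
    y_mean = sum(y) / n
    slope = (sum((t - t_mean) * (yi - y_mean) for t, yi in zip(times_min, y))
             / sum((t - t_mean) ** 2 for t in times_min))
    return 1.0 / slope

# Synthetic curve that doubles every 0.5 min
times_min = [0.0, 1.0, 2.0, 3.0, 4.0]
copies = [100.0 * 2.0 ** (t / 0.5) for t in times_min]
idt = isothermal_doubling_time(times_min, copies)
```

Matrix interference would show up in such a metric as a lengthened doubling time relative to a clean-buffer control.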

  2. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, the PE company found a linear relation between the initial template number and the cycling time at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too great to satisfy the needs of biotechnology development and clinical research; a better quantitative PCR technique is needed. The mathematical model submitted here draws on related work and is based on the PCR principle and a careful analysis of the molecular relationships among the main species in the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this relation: the accumulated PCR product quantity can be obtained from the initial template number, and vice versa. With this model, the result error is related only to the accuracy of the fluorescence intensity measurement, i.e. to the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the accuracy of the quantitative result will exceed 99%. Under the same conditions and on the same instrument, the error differs markedly between analysis methods; processing the data with this model yields results about 80 times more accurate than the CT method.
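The baseline relation any such model refines is the idealized amplification law N = N0 (1 + E)^n, which can be inverted directly for the initial template number. This sketch shows only that idealized inversion, not the article's fuller model:

```python
def initial_template(product_quantity, cycles, efficiency=1.0):
    """Invert the idealized amplification relation N = N0 * (1 + E)^n to
    recover the initial template number N0 from the accumulated product.
    Real reactions deviate from constant efficiency at late cycles, which
    is the regime a fuller kinetic model must address."""
    return product_quantity / (1.0 + efficiency) ** cycles

# 1000 starting copies amplified for 10 cycles at perfect efficiency (E = 1)
product = 1000.0 * 2.0 ** 10
n0 = initial_template(product, 10)
```

The constant-efficiency assumption is exactly what breaks down as reagents deplete, which is why a model of the full reaction system can outperform threshold-cycle (CT) quantification.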

  3. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    PubMed Central

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-01-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented. PMID:15238544

  4. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    PubMed

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-06-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented.
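The LOD computation described in these two records can be sketched numerically. Below is a simplified, hypothetical version (not the authors' method): the QTL genotype prior `p1` is held fixed rather than derived from flanking markers, and a short EM loop fits the two-component mixture; the data are simulated.

```python
import numpy as np

def normal_pdf(y, mu, sig):
    return np.exp(-0.5 * ((y - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

def lod_score(y, p1, n_iter=50):
    """LOD at a putative locus: log10 likelihood ratio of a
    two-component normal mixture (genotype prior p1, 1-p1) versus
    a single normal (no QTL), fitted with a small EM loop."""
    # null model: single normal at its MLE
    l0 = np.sum(np.log(normal_pdf(y, y.mean(), y.std())))
    # mixture model: EM for the component means and common sigma
    mu = np.array([y.mean() - y.std(), y.mean() + y.std()])
    sig = y.std()
    for _ in range(n_iter):
        w1 = p1 * normal_pdf(y, mu[0], sig)          # E-step
        w2 = (1 - p1) * normal_pdf(y, mu[1], sig)
        r = w1 / (w1 + w2)
        mu[0] = np.sum(r * y) / np.sum(r)            # M-step
        mu[1] = np.sum((1 - r) * y) / np.sum(1 - r)
        sig = np.sqrt(np.mean(r * (y - mu[0]) ** 2 + (1 - r) * (y - mu[1]) ** 2))
    l1 = np.sum(np.log(p1 * normal_pdf(y, mu[0], sig)
                       + (1 - p1) * normal_pdf(y, mu[1], sig)))
    return (l1 - l0) / np.log(10)

rng = np.random.default_rng(0)
g = rng.random(200) < 0.5                 # two simulated genotype classes
y = np.where(g, 1.0, -1.0) + rng.normal(0, 1, 200)
lod = lod_score(y, np.full(200, 0.5))
print(lod > 3)                            # clear signal for a 1-SD effect
```

Because the mixture nests the single normal, `l1 >= l0` and the LOD is non-negative even without a QTL, which is exactly the mechanism behind the spurious peaks the papers address.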

  5. Toxicity Evaluation of Engineered Nanomaterials: Risk Evaluation Tools (Phase 3 Studies)

    DTIC Science & Technology

    2012-01-01

    report. The second modeling approach was on quantitative structure-activity relationships (QSARs). A manuscript entitled “Connecting the dots: Towards...expands rapidly. We proposed two types of mechanisms of toxic action supported by the nano-QSAR model, which collectively govern the toxicity of the...interpretative nano-QSAR model describing toxicity of 18 nano-metal oxides to a HaCaT cell line as a model for dermal exposure. In result, by the comparison of

  6. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  7. Simulation of metastatic progression using a computer model including chemotherapy and radiation therapy.

    PubMed

    Bethge, Anja; Schumacher, Udo; Wedemann, Gero

    2015-10-01

    Despite considerable research efforts, the process of metastasis formation is still a subject of intense discussion, and even established models differ considerably in basic details and in the conclusions drawn from them. Mathematical and computational models add a new perspective to the research as they can quantitatively investigate the processes of metastasis and the effects of treatment. However, existing models look at only one treatment option at a time. We enhanced a previously developed computer model (called CaTSiT) that enables quantitative comparison of different metastasis formation models with clinical and experimental data, to include the effects of chemotherapy, external beam radiation, radioimmunotherapy and radioembolization. CaTSiT is based on a discrete event simulation procedure. The growth of the primary tumor and its metastases is modeled by a piecewise-defined growth function that describes the growth behavior of the primary tumor and metastases during various time intervals. The piecewise-defined growth function is composed of analytical functions describing the growth behavior of the tumor based on characteristics of the tumor, such as dormancy, or the effects of various therapies. The spreading of malignant cells into the blood is modeled by intravasation events, which are generated according to a rate function. Further events in the model describe the behavior of the released malignant cells until the formation of a new metastasis. The model is published under the GNU General Public License version 3. To demonstrate the application of the computer model, a case of a patient with a hepatocellular carcinoma and multiple metastases in the liver was simulated. Besides the untreated case, different treatments were simulated at two time points: one directly after diagnosis of the primary tumor and the other several months later. Except for early applied radioimmunotherapy, no treatment strategy was able to eliminate all metastases. 
These results emphasize the importance of early diagnosis and of proceeding with treatment even if no clinically detectable metastases are present at the time of diagnosis of the primary tumor. CaTSiT could be a valuable tool for quantitative investigation of the process of tumor growth and metastasis formation, including the effects of various treatment options. Copyright © 2015 Elsevier Inc. All rights reserved.
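CaTSiT itself is a full discrete event simulation; its core loop can be caricatured in a few lines. The sketch below is not CaTSiT: it uses Gompertzian growth for the primary tumor and a size-proportional intravasation rate, and every parameter value is invented purely for illustration.

```python
import math, random

def gompertz(t, n0=1.0, k_max=1e11, a=0.006):
    """Gompertzian primary tumor size (cells) at time t in days."""
    return k_max * math.exp(math.log(n0 / k_max) * math.exp(-a * t))

def _poisson(lam, rng):
    """Knuth's algorithm; adequate for the small rates used here."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def simulate(t_end=1500, dt=1.0, intrav_coeff=1e-9, seed_prob=1e-3, rng=None):
    """Toy event-based sketch of metastasis seeding: each day,
    intravasation events are drawn from a Poisson process whose rate
    is proportional to primary tumor size, and each released cell
    founds a metastasis with a small probability."""
    rng = rng or random.Random(0)
    seeding_times, t = [], 0.0
    while t < t_end:
        rate = intrav_coeff * gompertz(t)      # expected events per day
        for _ in range(_poisson(rate * dt, rng)):
            if rng.random() < seed_prob:
                seeding_times.append(t)
        t += dt
    return seeding_times

print(len(simulate()), "metastases seeded")
```

Treatment effects enter such a model by switching the growth function piecewise, which is how CaTSiT represents chemotherapy and radiation intervals.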

  8. Four-point bending as a method for quantitatively evaluating spinal arthrodesis in a rat model.

    PubMed

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-02-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague-Dawley rat spines after single-level posterolateral fusion procedures at L4-L5. Segments were classified as 'not fused,' 'restricted motion,' or 'fused' by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4-L5 motion segment, and stiffness was measured as the slope of the moment-displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery.
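The stiffness measure used above (slope of the moment-displacement curve) is straightforward to compute. The sketch below is a generic illustration, not the authors' analysis code: in 4-point bending the moment between the inner loading points is constant, M = (F/2)·a, where a is the support-to-load-point distance; the synthetic data are invented.

```python
import numpy as np

def bending_stiffness(force_N, displacement_mm, span_a_mm):
    """Flexural stiffness from a 4-point bending test: slope of the
    moment-displacement curve in its linear region (N*mm per mm)."""
    moment = 0.5 * np.asarray(force_N) * span_a_mm   # M = (F/2) * a
    slope, _ = np.polyfit(displacement_mm, moment, 1)
    return slope

# synthetic linear specimen: every mm of displacement adds 4 N of load
disp = np.linspace(0, 2, 21)
force = 4.0 * disp
print(round(bending_stiffness(force, disp, span_a_mm=10.0), 3))  # → 20.0
```

Real test data would first be cropped to the linear region before fitting, since fused and non-fused segments yield before failure at different loads.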

  9. Bayesian parameter estimation in spectral quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Pulkkinen, Aki; Cox, Ben T.; Arridge, Simon R.; Kaipio, Jari P.; Tarvainen, Tanja

    2016-03-01

    Photoacoustic tomography (PAT) is an imaging technique that combines the strong contrast of optical imaging with the high spatial resolution of ultrasound imaging. These strengths are achieved via the photoacoustic effect, in which the spatial absorption of a light pulse is converted into a measurable propagating ultrasound wave. The method is seen as a potential tool for small-animal imaging, pre-clinical investigations, the study of blood vessels and vasculature, and cancer imaging. The goal in PAT is to form an image of the absorbed optical energy density field from the measured ultrasound data via acoustic inverse problem approaches. Quantitative PAT (QPAT) proceeds from these images and forms quantitative estimates of the optical properties of the target. This optical inverse problem of QPAT is ill-posed. To alleviate the issue, spectral QPAT (SQPAT) utilizes PAT data formed at multiple optical wavelengths simultaneously with optical parameter models of tissue to form quantitative estimates of the parameters of interest. In this work, the inverse problem of SQPAT is investigated. Light propagation is modelled using the diffusion equation. Optical absorption is described by a chromophore-concentration-weighted sum of known chromophore absorption spectra. Scattering is described by Mie scattering theory with an exponential power law. In the inverse problem, the spatially varying unknown parameters of interest are the chromophore concentrations, the Mie scattering parameters (the power-law factor and exponent), and the Grüneisen parameter. The inverse problem is approached with a Bayesian method. It is numerically demonstrated that estimation of all parameters of interest is possible with this approach.
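The spectral parameterization described above (absorption as a concentration-weighted sum of chromophore spectra, scattering as a power law) is the forward model that SQPAT inverts. A minimal sketch, with made-up spectra and concentrations standing in for real chromophore data:

```python
import numpy as np

def absorption(c, eps):
    """mu_a(lambda) as a chromophore-concentration-weighted sum of
    known chromophore absorption spectra (columns of eps)."""
    return eps @ c

def reduced_scattering(wavelength_nm, a, b, ref_nm=500.0):
    """Mie-type power law mu_s'(lambda) = a * (lambda/ref)^(-b)."""
    return a * (wavelength_nm / ref_nm) ** (-b)

wl = np.array([700.0, 800.0, 900.0])
eps = np.array([[0.30, 1.10],    # made-up spectra, two chromophores
                [0.45, 0.80],
                [0.65, 0.60]])
c = np.array([0.7, 0.3])         # concentrations to be recovered in QPAT
mu_a = absorption(c, eps)
mu_s = reduced_scattering(wl, a=1.2, b=1.3)
print(mu_a.round(3), mu_s.round(3))
```

The Bayesian inverse problem then estimates `c`, `a`, `b`, and the Grüneisen parameter from multi-wavelength PAT images generated by this kind of forward model.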

  10. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    NASA Astrophysics Data System (ADS)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth’s climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves--even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m^2 / MPa * s, whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant’s vegetative organs in jeopardy during drought events. 
Analysis of the thickness-to-span ratio of Asteroxylon’s tracheids suggests that environmental conditions of reduced relative humidity (<20%) combined with elevated temperatures (>25°C) could cause sufficient cavitation to reduce hydraulic conductivity by 50%. This suggests that the Early Devonian environments that supported the earliest vascular plants were not subject to prolonged midseason droughts, or, alternatively, that the growing season was short. This places minimum constraints on water availability (e.g., groundwater hydration, relative humidity) in locations where Asteroxylon fossils are found; these environments must have had high relative humidities, comparable to tropical riparian environments. Given these constraints, biome-scale paleovegetation models that place early vascular plants distal to water sources can be revised to account for reduced drought tolerance. Paleoclimate proxies that treat early terrestrial plants as functionally interchangeable can incorporate physiological differences in a quantitatively meaningful way. Application of hydraulic models to fossil plants provides an additional perspective on the 475 million-year history of terrestrial photosynthetic environments and has potential to corroborate other plant-based paleoclimate proxies.
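The conductivity threshold quoted above (0.015 m²/MPa·s) is consistent with a standard Hagen-Poiseuille treatment of the tracheid lumen. A quick back-of-envelope check, under the assumption (mine, not necessarily the authors' exact formulation) that the metric is lumen-specific conductivity k = r²/(8η):

```python
def lumen_specific_conductivity(diameter_um, viscosity_Pa_s=1.002e-3):
    """Hagen-Poiseuille lumen-specific conductivity k = r^2 / (8*eta):
    water flux per unit pressure gradient, per unit lumen area,
    returned in m^2 MPa^-1 s^-1 (viscosity of water at ~20 C)."""
    r = diameter_um * 1e-6 / 2.0
    k = r ** 2 / (8.0 * viscosity_Pa_s)   # m^2 Pa^-1 s^-1
    return k * 1e6                        # m^2 MPa^-1 s^-1

print(round(lumen_specific_conductivity(22.0), 4))  # ~0.015 at ~22 um
```

Under this reading, a tracheid lumen of roughly 22 µm diameter reaches the 0.015 m²/MPa·s threshold, a plausible size for Asteroxylon's water-conducting cells.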

  11. A basis for a visual language for describing, archiving and analyzing functional models of complex biological systems

    PubMed Central

    Cook, Daniel L; Farley, Joel F; Tapscott, Stephen J

    2001-01-01

    Background: We propose that a computerized, internet-based graphical description language for systems biology will be essential for describing, archiving and analyzing complex problems of biological function in health and disease. Results: We outline here a conceptual basis for designing such a language and describe BioD, a prototype language that we have used to explore the utility and feasibility of this approach to functional biology. Using example models, we demonstrate that a rather limited lexicon of icons and arrows suffices to describe complex cell-biological systems as discrete models that can be posted and linked on the internet. Conclusions: Given available computer and internet technology, BioD may be implemented as an extensible, multidisciplinary language that can be used to archive functional systems knowledge and be extended to support both qualitative and quantitative functional analysis. PMID:11305940

  12. EVALUATING QUANTITATIVE FORMULAS FOR DOSE-RESPONSE ASSESSMENT OF CHEMICAL MIXTURES

    EPA Science Inventory

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment d...

  13. Field Scale Monitoring and Modeling of Water and Chemical Transfer in the Vadose Zone

    USDA-ARS?s Scientific Manuscript database

    Natural resource systems involve highly complex interactions of soil-plant-atmosphere-management components that are extremely difficult to quantitatively describe. Computer simulations for prediction and management of watersheds, water supply areas, and agricultural fields and farms have become inc...

  14. An orientation sensitive approach in biomolecule interaction quantitative structure-activity relationship modeling and its application in ion-exchange chromatography.

    PubMed

    Kittelmann, Jörg; Lang, Katharina M H; Ottens, Marcel; Hubbuch, Jürgen

    2017-01-27

    Quantitative structure-activity relationship (QSAR) modeling for the prediction of biomolecule parameters has become an established technique in chromatographic purification process design. Unfortunately, available descriptor sets fail to describe the orientation of biomolecules and the effects of ionic strength in the mobile phase on the interaction with the stationary phase. The literature describes several special descriptors used for chromatographic retention modeling, but none of them describes the screening of electrostatic potential by the mobile phase in use. In this work we introduce two new approaches to descriptor calculation, namely surface patches and plane projection, which capture oriented binding to charged surfaces and steric hindrance of the interaction with chromatographic ligands, with regard to electrostatic-potential screening by mobile-phase ions. We present the use of the developed descriptor sets for predictive modeling of Langmuir isotherms for proteins at different pH values between pH 5 and 10 and varying ionic strength in the range of 10-100 mM. The resulting model shows a high correlation of calculated descriptors with experimental results, with a coefficient of determination of 0.82 and a predictive coefficient of determination of 0.92 for unknown molecular structures and conditions. The agreement of calculated molecular interaction orientations with both experimental results and molecular dynamics simulations from the literature is shown. The developed descriptors provide the means for improved QSAR models of chromatographic processes, as they reflect the complex interactions of biomolecules with chromatographic phases. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Technical Advance: Live-imaging analysis of human dendritic cell migrating behavior under the influence of immune-stimulating reagents in an organotypic model of lung

    PubMed Central

    Nguyen Hoang, Anh Thu; Chen, Puran; Björnfot, Sofia; Högstrand, Kari; Lock, John G.; Grandien, Alf; Coles, Mark; Svensson, Mattias

    2014-01-01

    This manuscript describes technical advances allowing manipulation and quantitative analyses of human DC migratory behavior in lung epithelial tissue. DCs are hematopoietic cells essential for the maintenance of tissue homeostasis and the induction of tissue-specific immune responses. Important functions include cytokine production and migration in response to infection for the induction of proper immune responses. To design appropriate strategies to exploit human DC functional properties in lung tissue for the purpose of clinical evaluation, e.g., candidate vaccination and immunotherapy strategies, we have developed a live-imaging assay based on our previously described organotypic model of the human lung. This assay allows provocations and subsequent quantitative investigations of DC functional properties under conditions mimicking morphological and functional features of the in vivo parental tissue. We present protocols to set up and prepare tissue models for 4D (x, y, z, time) fluorescence-imaging analysis that allow spatial and temporal studies of human DCs in live epithelial tissue, followed by flow cytometry analysis of DCs retrieved from digested tissue models. This model system can be useful for elucidating incompletely defined pathways controlling DC functional responses to infection and inflammation in lung epithelial tissue, as well as the efficacy of locally administered candidate interventions. PMID:24899587

  16. Choices and changes: Eccles' Expectancy-Value model and upper-secondary school students' longitudinal reflections about their choice of a STEM education

    NASA Astrophysics Data System (ADS)

    Lykkegaard, Eva; Ulriksen, Lars

    2016-03-01

    During the past 30 years, Eccles' comprehensive social-psychological Expectancy-Value Model of Motivated Behavioural Choices (EV-MBC model) has been proven suitable for studying educational choices related to Science, Technology, Engineering and/or Mathematics (STEM). The reflections of 15 students in their last year in upper-secondary school concerning their choice of tertiary education were examined using quantitative EV-MBC surveys and repeated qualitative interviews. This article presents the analyses of three cases in detail. The analytical focus was whether the factors indicated in the EV-MBC model could be used to detect significant changes in the students' educational choice processes. An important finding was that the quantitative EV-MBC surveys and the qualitative interviews gave quite different results concerning the students' considerations about the choice of tertiary education, and that significant changes in the students' reflections were not captured by the factors of the EV-MBC model. This questions the validity of the EV-MBC surveys. Moreover, the quantitative factors from the EV-MBC model did not sufficiently explain students' dynamical educational choice processes where students in parallel considered several different potential educational trajectories. We therefore call for further studies of the EV-MBC model's use in describing longitudinal choice processes and especially in investigating significant changes.

  17. Nursing students' evaluation of quality indicators during learning in clinical practice.

    PubMed

    Jansson, Inger; Ene, Kerstin W

    2016-09-01

    A supportive clinical learning environment is important for nursing students' learning. In this study, a contract between a county and a university involving a preceptor model of clinical education for nursing students is described. The aim of this study was to describe nursing students' clinical education based on quality indicators and to describe the students' experiences of what facilitated or hindered the learning process during their clinical practice. During autumn 2012 and spring 2013, 269 student evaluations with quantitative and qualitative answers were filled out anonymously. Quantitative data from the questionnaires concerning the quality indicators: Administration/information, Assessments/examinations and Reflection were processed to generate descriptive statistics that revealed gaps in what the preceptor model demands and what the students reported. The answers from the qualitative questions concerning the quality indicator Learning were analysed using content analysis. Four categories emerged: Independence and responsibility, continuity of learning, time, and the competence and attitudes of the staff. The study underlines that reflection, continuity, communication and feedback were important for the students' learning process, whereas heavy workload among staff and being supervised by many different preceptors were experienced as stressful and hindering by students. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Solitary Wave in One-dimensional Buckyball System at Nanoscale

    PubMed Central

    Xu, Jun; Zheng, Bowen; Liu, Yilun

    2016-01-01

    We have studied stress wave propagation in a one-dimensional (1-D) nanoscopic buckyball (C60) system by molecular dynamics (MD) simulation and quantitative modeling. Simulation results show that solitary waves are generated and propagate in the buckyball system when one buckyball at the end of the chain is impacted. We find that the solitary wave behavior depends closely on the initial temperature and the impacting speed of the buckyball chain. There is almost no dispersion or dissipation of the solitary waves (stationary solitary waves) for relatively low temperature and high impacting speed, whereas for relatively high temperature and low impacting speed the profile of the solitary waves is strongly distorted and dissipated after propagating several tens of buckyballs. A phase diagram is proposed to describe the effect of temperature and impacting speed on the solitary wave behavior in the buckyball system. To quantitatively describe the wave behavior, a simple nonlinear-spring model is established, which describes the MD simulation results at low temperature very well. The results presented in this work may lay a solid foundation for the further understanding and manipulation of stress wave propagation and impact energy mitigation at the nanoscale. PMID:26891624
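A nonlinear-spring chain of the kind this abstract describes can be integrated in a few dozen lines. The sketch below is generic, not the authors' model: it uses a one-sided Hertzian-type contact force (F = k·overlap^1.5, compression only) with dimensionless parameters, integrated by velocity Verlet, and launches the first bead to excite a compression pulse.

```python
import numpy as np

def simulate_chain(n=40, k=1.0, m=1.0, v0=1.0, dt=1e-3, steps=8000):
    """Chain of beads coupled by one-sided nonlinear springs; returns
    the index of the fastest bead at regular intervals, tracking the
    pulse front as it travels down the chain."""
    x = np.zeros(n)              # displacements from rest positions
    v = np.zeros(n)
    v[0] = v0                    # impact the first bead

    def accel(x):
        overlap = np.maximum(x[:-1] - x[1:], 0.0)   # compression only
        f = k * overlap ** 1.5
        a = np.zeros(n)
        a[:-1] -= f / m          # reaction on the left bead
        a[1:] += f / m           # push on the right bead
        return a

    a = accel(x)
    peak_bead = []
    for s in range(steps):
        x += v * dt + 0.5 * a * dt * dt             # velocity Verlet
        a_new = accel(x)
        v += 0.5 * (a + a_new) * dt
        a = a_new
        if s % 1000 == 0:
            peak_bead.append(int(np.argmax(np.abs(v))))
    return peak_bead

print(simulate_chain())  # pulse front moves down the chain
```

In such "sonic vacuum" chains the pulse stays localized over a few beads, the discrete analogue of the stationary solitary waves reported at low temperature.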

  19. A method for evaluating the murine pulmonary vasculature using micro-computed tomography.

    PubMed

    Phillips, Michael R; Moore, Scott M; Shah, Mansi; Lee, Clara; Lee, Yueh Z; Faber, James E; McLean, Sean E

    2017-01-01

    Significant mortality and morbidity are associated with alterations in the pulmonary vasculature. While techniques have been described for quantitative morphometry of whole-lung arterial trees in larger animals, no methods have been described in mice. We report a method for the quantitative assessment of murine pulmonary arterial vasculature using high-resolution computed tomography scanning. Mice were harvested at 2 weeks, 4 weeks, and 3 months of age. The pulmonary artery vascular tree was pressure perfused to maximal dilation with a radio-opaque casting material with viscosity and pressure set to prevent capillary transit and venous filling. The lungs were fixed and scanned on a specimen computed tomography scanner at 8-μm resolution, and the vessels were segmented. Vessels were grouped into categories based on lumen diameter and branch generation. Robust high-resolution segmentation was achieved, permitting detailed quantitation of pulmonary vascular morphometrics. As expected, postnatal lung development was associated with progressive increase in small-vessel number and arterial branching complexity. These methods for quantitative analysis of the pulmonary vasculature in postnatal and adult mice provide a useful tool for the evaluation of mouse models of disease that affect the pulmonary vasculature. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. QSAR, QSPR and QSRR in Terms of 3-D-MoRSE Descriptors for In Silico Screening of Clofibric Acid Analogues.

    PubMed

    Di Tullio, Maurizio; Maccallini, Cristina; Ammazzalorso, Alessandra; Giampietro, Letizia; Amoroso, Rosa; De Filippis, Barbara; Fantacuzzi, Marialuigia; Wiczling, Paweł; Kaliszan, Roman

    2012-07-01

    A series of 27 analogues of clofibric acid, mostly heteroarylalkanoic derivatives, have been analyzed by a novel high-throughput reversed-phase HPLC method employing a combined gradient of eluent pH and organic modifier content. The hydrophobicity (lipophilicity) parameters, log kw, and acidity constants, pKa, determined in this way were subjected to multiple regression analysis to obtain a QSRR (Quantitative Structure-Retention Relationships) and a QSPR (Quantitative Structure-Property Relationships) equation, respectively, describing these pharmacokinetics-determining physicochemical parameters in terms of structural descriptors derived from computational chemistry. The previously determined in vitro log EC50 values - transactivation activity towards PPARα (human Peroxisome Proliferator-Activated Receptor α) - have also been described in a QSAR (Quantitative Structure-Activity Relationships) equation in terms of the 3-D-MoRSE descriptors (3D-Molecule Representation of Structures based on Electron diffraction descriptors). The QSAR model derived can serve for an a priori prediction of the in vitro bioactivity of any designed analogue, whereas the QSRR and QSPR models can be used to evaluate the lipophilicity and acidity, respectively, of the compounds, and hence to rationally guide the selection of structures with proper pharmacokinetics. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Light absorption by coated nano-sized carbonaceous particles

    NASA Astrophysics Data System (ADS)

    Gangl, Martin; Kocifaj, Miroslav; Videen, Gorden; Horvath, Helmuth

    The optical properties of strongly absorbing soot particles coated with transparent material are investigated experimentally and described by several modeling approaches. Soot is produced by spark discharge and passed through a Sinclair-La Mer generator, where non-absorbing carnauba wax is condensed onto it to obtain internal soot-wax mixtures in a controlled way. Measurements of the extinction and volume scattering coefficient show an amplification of absorption by a factor of approximately 1.8. This behavior was described by different models of internally mixed materials evaluated at the modal diameters of the measured size distributions: a concentric-sphere model, effective medium approximations, and heterogeneous ellipsoids. The concentric-sphere model describes the absorption increase quantitatively and hence was chosen to be applied to the entire particle population in the size distribution. The growth of the soot particles by condensing wax is described by a simplified growth model to estimate the contributions of different soot particle diameters to the overall absorption cross-section.

  2. The Implementation of Cooperative Learning Model "Number Heads Together" ("NHT") in Improving the Students' Ability in Reading Comprehension

    ERIC Educational Resources Information Center

    Maman, Mayong; Rajab, Andi Aryani

    2016-01-01

    The study aimed at describing the implementation of cooperative learning model of (NHT) at student of SMPN 2 Maros. The method used was a classroom action research in two cycles. Data were collected using the test for the quantitative and non-test for the qualitative by employing observation, field note, student's workbook, student's reflection…

  3. Implications of Privacy Needs and Interpersonal Distancing Mechanisms for Space Station Design

    NASA Technical Reports Server (NTRS)

    Harrison, A. A.; Sommer, R.; Struthers, N.; Hoyt, K.

    1986-01-01

    The literature on privacy needs, personal space, interpersonal distancing, and crowding is reviewed with special reference to spaceflight and spaceflight-analogous conditions. A quantitative model is proposed for understanding privacy, interpersonal distancing, and performance. The implications for space station design are described.

  4. Introductory Life Science Mathematics and Quantitative Neuroscience Courses

    ERIC Educational Resources Information Center

    Duffus, Dwight; Olifer, Andrei

    2010-01-01

    We describe two sets of courses designed to enhance the mathematical, statistical, and computational training of life science undergraduates at Emory College. The first course is an introductory sequence in differential and integral calculus, modeling with differential equations, probability, and inferential statistics. The second is an…

  5. Developing sensor activity relationships for the JPL electronic nose sensors using molecular modeling and QSAR techniques

    NASA Technical Reports Server (NTRS)

    Shevade, A. V.; Ryan, M. A.; Homer, M. L.; Jewell, A. D.; Zhou, H.; Manatt, K.; Kisor, A. K.

    2005-01-01

    We report a Quantitative Structure-Activity Relationships (QSAR) study using Genetic Function Approximations (GFA) to describe the polymer-carbon composite sensor activities in the JPL Electronic Nose, when exposed to chemical vapors at parts-per-million concentration levels.

  6. SYN-JEM: A Quantitative Job-Exposure Matrix for Five Lung Carcinogens.

    PubMed

    Peters, Susan; Vermeulen, Roel; Portengen, Lützen; Olsson, Ann; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2016-08-01

    The use of measurement data in occupational exposure assessment allows more quantitative analyses of possible exposure-response relations. We describe a quantitative exposure assessment approach for five lung carcinogens (i.e. asbestos, chromium-VI, nickel, polycyclic aromatic hydrocarbons (by its proxy benzo(a)pyrene (BaP)) and respirable crystalline silica). A quantitative job-exposure matrix (JEM) was developed based on statistical modeling of large quantities of personal measurements. Empirical linear models were developed using personal occupational exposure measurements (n = 102306) from Europe and Canada, as well as auxiliary information like job (industry), year of sampling, region, an a priori exposure rating of each job (none, low, and high exposed), sampling and analytical methods, and sampling duration. The model outcomes were used to create a JEM with a quantitative estimate of the level of exposure by job, year, and region. Decreasing time trends were observed for all agents between the 1970s and 2009, ranging from -1.2% per year for personal BaP and nickel exposures to -10.7% for asbestos (in the time period before an asbestos ban was implemented). Regional differences in exposure concentrations (adjusted for measured jobs, years of measurement, and sampling method and duration) varied by agent, ranging from a factor 3.3 for chromium-VI up to a factor 10.5 for asbestos. We estimated time-, job-, and region-specific exposure levels for four (asbestos, chromium-VI, nickel, and RCS) out of five considered lung carcinogens. Through statistical modeling of large amounts of personal occupational exposure measurement data we were able to derive a quantitative JEM to be used in community-based studies. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
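The time trends reported above imply a simple extrapolation rule for JEM cells. The sketch below is a hypothetical illustration of applying a constant relative annual change, not the published SYN-JEM model (which is a full empirical linear model over job, year, region, and sampling method); the baseline value is invented.

```python
def jem_estimate(baseline, year, ref_year, annual_change):
    """Exposure level for a job/region cell of a quantitative JEM,
    extrapolated from a reference year with a constant relative time
    trend (e.g. -0.107/year, the asbestos trend reported above)."""
    return baseline * (1.0 + annual_change) ** (year - ref_year)

# an illustrative baseline of 1.0 f/ml in 1980, declining 10.7%/year
print(round(jem_estimate(1.0, 1990, 1980, -0.107), 4))
```

Compounding a -10.7% annual change over a decade reduces the estimate to roughly a third of its baseline, which is why the modeled decade-scale declines for asbestos are so steep.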

  7. PLS-based quantitative structure-activity relationship for substituted benzamides of clebopride type. Application of experimental design in drug design.

    PubMed

    Norinder, U; Högberg, T

    1992-04-01

    The advantageous approach of using an experimentally designed training set as the basis for establishing a quantitative structure-activity relationship with good predictive capability is described. The training set was selected from a fractional factorial design scheme based on a principal component description of physico-chemical parameters of aromatic substituents. The derived model successfully predicts the activities of additional substituted benzamides of 6-methoxy-N-(4-piperidyl)salicylamide type. The major influence on activity of the 3-substituent is demonstrated.

  8. The Dopamine Prediction Error: Contributions to Associative Models of Reward Learning

    PubMed Central

    Nasser, Helen M.; Calu, Donna J.; Schoenbaum, Geoffrey; Sharpe, Melissa J.

    2017-01-01

    Phasic activity of midbrain dopamine neurons is currently thought to encapsulate the prediction-error signal described in Sutton and Barto’s (1981) model-free reinforcement learning algorithm. This phasic signal is thought to contain information about the quantitative value of reward, which transfers to the reward-predictive cue after learning. This is argued to endow the reward-predictive cue with the value inherent in the reward, motivating behavior toward cues signaling the presence of reward. Yet theoretical and empirical research has implicated prediction-error signaling in learning that extends far beyond a transfer of quantitative value to a reward-predictive cue. Here, we review the research which demonstrates the complexity of how dopaminergic prediction errors facilitate learning. After briefly discussing the literature demonstrating that phasic dopaminergic signals can act in the manner described by Sutton and Barto (1981), we consider how these signals may also influence attentional processing across multiple attentional systems in distinct brain circuits. Then, we discuss how prediction errors encode and promote the development of context-specific associations between cues and rewards. Finally, we consider recent evidence that shows dopaminergic activity contains information about causal relationships between cues and rewards that reflect information garnered from rich associative models of the world that can be adapted in the absence of direct experience. In discussing this research we hope to support the expansion of how dopaminergic prediction errors are thought to contribute to the learning process beyond the traditional concept of transferring quantitative value. PMID:28275359
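The Sutton and Barto (1981)-style transfer of value to the reward-predictive cue discussed above can be illustrated with a minimal Rescorla-Wagner/temporal-difference update (an illustrative sketch, not the authors' code; the learning rate and reward magnitude are arbitrary):

```python
# The prediction error delta = reward - V(cue) is the quantity the phasic
# dopamine signal is argued to encode; over trials it transfers the
# quantitative value of the reward to the reward-predictive cue.
alpha = 0.1      # learning rate (arbitrary for illustration)
reward = 1.0     # quantitative value of the reward
V = 0.0          # learned value of the reward-predictive cue
for trial in range(100):
    delta = reward - V    # prediction error shrinks as learning proceeds
    V += alpha * delta
print(round(V, 3))        # → 1.0 (cue value approaches the reward value)
```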

  9. Quantitative structure-toxicity relationship (QSTR) studies on the organophosphate insecticides.

    PubMed

    Can, Alper

    2014-11-04

Organophosphate insecticides are the most commonly used pesticides in the world. In this study, quantitative structure-toxicity relationship (QSTR) models were derived for estimating the acute oral toxicity of organophosphate insecticides to male rats. The 20 chemicals of the training set and the seven compounds of the external testing set were described using molecular descriptors. Descriptors for lipophilicity, polarity, and molecular geometry, as well as quantum chemical descriptors for energy, were calculated. Model development to predict toxicity of organophosphate insecticides in different matrices was carried out using multiple linear regression. The model was validated internally and externally. In the present study, a QSTR model was used for the first time to understand the inherent relationships between the organophosphate insecticide molecules and their toxicity behavior. Such studies provide mechanistic insight about structure-toxicity relationships and help in the design of less toxic insecticides. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Comparison of 3D quantitative structure-activity relationship methods: Analysis of the in vitro antimalarial activity of 154 artemisinin analogues by hypothetical active-site lattice and comparative molecular field analysis

    NASA Astrophysics Data System (ADS)

    Woolfrey, John R.; Avery, Mitchell A.; Doweyko, Arthur M.

    1998-03-01

    Two three-dimensional quantitative structure-activity relationship (3D-QSAR) methods, comparative molecular field analysis (CoMFA) and hypothetical active site lattice (HASL), were compared with respect to the analysis of a training set of 154 artemisinin analogues. Five models were created, including a complete HASL and two trimmed versions, as well as two CoMFA models (leave-one-out standard CoMFA and the guided-region selection protocol). Similar r2 and q2 values were obtained by each method, although some striking differences existed between CoMFA contour maps and the HASL output. Each of the four predictive models exhibited a similar ability to predict the activity of a test set of 23 artemisinin analogues, although some differences were noted as to which compounds were described well by either model.

  11. Parametric studies with an atmospheric diffusion model that assesses toxic fuel hazards due to the ground clouds generated by rocket launches

    NASA Technical Reports Server (NTRS)

    Stewart, R. B.; Grose, W. L.

    1975-01-01

Parametric studies were made with a multilayer atmospheric diffusion model to place quantitative limits on the uncertainty of predicting ground-level toxic rocket-fuel concentrations. Exhaust distributions in the ground cloud, cloud stabilized geometry, atmospheric coefficients, the effects of exhaust plume afterburning of carbon monoxide (CO), assumed surface mixing-layer division in the model, and model sensitivity to different meteorological regimes were studied. Large-scale differences in ground-level predictions are quantitatively described. Cloud alongwind growth for several meteorological conditions is shown to be in error because of incorrect application of previous diffusion theory. In addition, rocket-plume calculations indicate that almost all of the rocket-motor carbon monoxide is afterburned to carbon dioxide (CO2), thus reducing toxic hazards due to CO. The afterburning is also shown to have a significant effect on cloud stabilization height and on ground-level concentrations of exhaust products.

  12. Mechanochemical models of processive molecular motors

    NASA Astrophysics Data System (ADS)

    Lan, Ganhui; Sun, Sean X.

    2012-05-01

Motor proteins are the molecular engines powering the living cell. These nanometre-sized molecules convert chemical energy, both enthalpic and entropic, into useful mechanical work. High-resolution single molecule experiments can now observe motor protein movement with increasing precision. The emerging data must be combined with structural and kinetic measurements to develop a quantitative mechanism. This article describes a modelling framework where quantitative understanding of motor behaviour can be developed based on the protein structure. The framework is applied to myosin motors, with emphasis on how synchrony between motor domains gives rise to processive unidirectional movement. The modelling approach shows that the elasticity of protein domains is important in regulating motor function. Simple models of protein domain elasticity are presented. The framework can be generalized to other motor systems, or an ensemble of motors such as muscle contraction. Indeed, for hundreds of myosins, our framework can be reduced to the Huxley-Simmons description of muscle movement in the mean-field limit.

  13. Effect of quantum nuclear motion on hydrogen bonding

    NASA Astrophysics Data System (ADS)

    McKenzie, Ross H.; Bekker, Christiaan; Athokpam, Bijyalaxmi; Ramesh, Sai G.

    2014-05-01

    This work considers how the properties of hydrogen bonded complexes, X-H⋯Y, are modified by the quantum motion of the shared proton. Using a simple two-diabatic state model Hamiltonian, the analysis of the symmetric case, where the donor (X) and acceptor (Y) have the same proton affinity, is carried out. For quantitative comparisons, a parametrization specific to the O-H⋯O complexes is used. The vibrational energy levels of the one-dimensional ground state adiabatic potential of the model are used to make quantitative comparisons with a vast body of condensed phase data, spanning a donor-acceptor separation (R) range of about 2.4 - 3.0 Å, i.e., from strong to weak hydrogen bonds. The position of the proton (which determines the X-H bond length) and its longitudinal vibrational frequency, along with the isotope effects in both are described quantitatively. An analysis of the secondary geometric isotope effect, using a simple extension of the two-state model, yields an improved agreement of the predicted variation with R of frequency isotope effects. The role of bending modes is also considered: their quantum effects compete with those of the stretching mode for weak to moderate H-bond strengths. In spite of the economy in the parametrization of the model used, it offers key insights into the defining features of H-bonds, and semi-quantitatively captures several trends.

  14. A quantitative structure-activity relationship to predict efficacy of granular activated carbon adsorption to control emerging contaminants.

    PubMed

    Kennicutt, A R; Morkowchuk, L; Krein, M; Breneman, C M; Kilduff, J E

    2016-08-01

A quantitative structure-activity relationship was developed to predict the efficacy of carbon adsorption as a control technology for endocrine-disrupting compounds, pharmaceuticals, and components of personal care products, as a tool for water quality professionals to protect public health. Here, we expand previous work to investigate a broad spectrum of molecular descriptors including subdivided surface areas, adjacency and distance matrix descriptors, electrostatic partial charges, potential energy descriptors, conformation-dependent charge descriptors, and Transferable Atom Equivalent (TAE) descriptors that characterize the regional electronic properties of molecules. We compare the efficacy of linear (Partial Least Squares) and non-linear (Support Vector Machine) machine learning methods to describe a broad chemical space and produce a user-friendly model. We employ cross-validation, y-scrambling, and external validation for quality control. The recommended Support Vector Machine model trained on 95 compounds having 23 descriptors offered a good balance between good performance statistics, low error, and low probability of over-fitting while describing a wide range of chemical features. The cross-validated model using a log-uptake (qe) response calculated at an aqueous equilibrium concentration (Ce) of 1 μM described the training dataset with an r2 of 0.932, had a cross-validated r2 of 0.833, and an average residual of 0.14 log units.
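The y-scrambling check mentioned in this abstract can be sketched with a simple least-squares model on synthetic descriptor data. This is illustrative only: the descriptor matrix and coefficients are invented, and the paper's actual models were PLS and SVM.

```python
import numpy as np

# A model fit to the true responses retains explanatory power, while the
# same fit to permuted responses should collapse toward zero, guarding
# against chance correlation and over-fitting.
rng = np.random.default_rng(1)
X = rng.normal(size=(95, 5))            # 95 compounds, 5 invented descriptors
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.2]) + rng.normal(0.0, 0.2, 95)

def r_squared(X, y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 1.0 - resid.var() / y.var()

r2_real = r_squared(X, y)                       # high: real structure present
r2_scrambled = r_squared(X, rng.permutation(y)) # near zero: structure destroyed
print(round(r2_real, 2), round(r2_scrambled, 2))
```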

  15. Resolving dust emission responses to land cover change using an ecological land classification

    USDA-ARS?s Scientific Manuscript database

    Despite efforts to quantify the impacts of land cover change on wind erosion, assessment uncertainty remains large. We address this uncertainty by evaluating the application of ecological site concepts and state-and-transition models (STMs) for detecting and quantitatively describing the impacts of ...

  16. Coulombic Models in Chemical Bonding.

    ERIC Educational Resources Information Center

    Sacks, Lawrence J.

    1986-01-01

    Describes a bonding theory which provides a framework for the description of a wide range of substances and provides quantitative information of remarkable accuracy with far less computational effort than that required of other approaches. Includes applications, such as calculation of bond energies of two binary hydrides (methane and diborane).…

  17. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  18. The Ether Wind and the Global Positioning System.

    ERIC Educational Resources Information Center

    Muller, Rainer

    2000-01-01

    Explains how students can perform a refutation of the ether theory using information from the Global Positioning System (GPS). Discusses the functioning of the GPS, qualitatively describes how position determination would be affected by an ether wind, and illustrates the pertinent ideas with a simple quantitative model. (WRM)

  19. Modelling Mathematical Reasoning in Physics Education

    ERIC Educational Resources Information Center

    Uhden, Olaf; Karam, Ricardo; Pietrocola, Mauricio; Pospiech, Gesche

    2012-01-01

    Many findings from research as well as reports from teachers describe students' problem solving strategies as manipulation of formulas by rote. The resulting dissatisfaction with quantitative physical textbook problems seems to influence the attitude towards the role of mathematics in physics education in general. Mathematics is often seen as a…

  20. Quantitative Models Describing Past and Current Nutrient Fluxes and Associated Ecosystem Level Responses in the Narragansett Bay Ecosystem

    EPA Science Inventory

    Multiple drivers, including nutrient loading and climate change, affect the Narragansett Bay ecosystem in Rhode Island/Massachusetts, USA. Managers are interested in understanding the timing and magnitude of these effects, and ecosystem responses to restoration actions. To provid...

  1. Future Cities Engineering: Early Engineering Interventions in the Middle Grades

    ERIC Educational Resources Information Center

    McCue, Camille; James, David

    2008-01-01

    This paper describes qualitative and quantitative research conducted with middle school students participating in a Future Cities Engineering course. Insights were sought regarding both affective and cognitive changes which transpired during the one-semester schedule of activities focused on modeling the infrastructure of a city built 150 years in…

  2. Dissociation of the Ethyl Radical: An Exercise in Computational Chemistry

    ERIC Educational Resources Information Center

    Nassabeh, Nahal; Tran, Mark; Fleming, Patrick E.

    2014-01-01

    A set of exercises for use in a typical physical chemistry laboratory course are described, modeling the unimolecular dissociation of the ethyl radical to form ethylene and atomic hydrogen. Students analyze the computational results both qualitatively and quantitatively. Qualitative structural changes are compared to approximate predicted values…

  3. MODIA: Vol. 4. The Resource Utilization Model. A Project AIR FORCE Report.

    ERIC Educational Resources Information Center

    Gallegos, Margaret

    MODIA (Method of Designing Instructional Alternatives) was developed to help the Air Force manage resources for formal training by systematically and explicitly relating quantitative requirements for training resources to the details of course design and course operation during the planning stage. This report describes the Resource Utilization…

  4. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  5. Health impact assessment – A survey on quantifying tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org

Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) and the inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.

  6. An electricity consumption model for electric vehicular flow

    NASA Astrophysics Data System (ADS)

    Xiao, Hong; Huang, Hai-Jun; Tang, Tie-Qiao

    2016-09-01

In this paper, we apply the relationships between the macro and micro variables of traffic flow to develop an electricity consumption model for electric vehicular flow. We use the proposed model to study the quantitative relationships between the electricity consumption/total power and speed/density under uniform flow, and the electricity consumptions during the evolution processes of shock, rarefaction wave and small perturbation. The numerical results indicate that the proposed model accurately describes the electricity consumption of electric vehicular flow, supporting its validity.

  7. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    PubMed

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
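The c-index reported above can be computed, for the simple uncensored case, as the fraction of comparable patient pairs in which the higher predicted risk corresponds to the shorter survival. This is a minimal sketch with invented data; Harrell's c-index for censored data additionally restricts which pairs are comparable.

```python
# Concordance index for uncensored survival data: ties in predicted risk
# count as half-concordant; tied survival times are skipped.
def c_index(risk, time):
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(i + 1, n):
            if time[i] == time[j]:
                continue                  # tied times are not comparable here
            comparable += 1
            shorter, longer = (i, j) if time[i] < time[j] else (j, i)
            if risk[shorter] > risk[longer]:
                concordant += 1.0
            elif risk[shorter] == risk[longer]:
                concordant += 0.5
    return concordant / comparable

# Invented data: predicted risk perfectly anti-ordered with survival time.
print(c_index([2.0, 1.5, 1.0, 0.5], [1, 2, 3, 4]))   # → 1.0
```

A completely uninformative predictor (all risks equal) scores 0.5, which is why c-indices of 0.69 and 0.74 represent meaningful discrimination.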

  8. Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1994-01-01

Quantitative Feedback Theory is a frequency-domain technique for the design of multi-input, multi-output control systems that must meet time- or frequency-domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.

  9. Igneous intrusion models for floor fracturing in lunar craters

    NASA Technical Reports Server (NTRS)

    Wichman, R. W.; Schultz, P. H.

    1991-01-01

    Lunar floor-fractured craters are primarily located near the maria and frequently contain ponded mare units and dark mantling deposits. Fracturing is confined to the crater interior, often producing a moat-like feature near the floor edge, and crater depth is commonly reduced by uplift of the crater floor. Although viscous relaxation of crater topography can produce such uplift, the close association of modification with surface volcanism supports a model linking floor fracture to crater-centered igneous intrusions. The consequences of two intrusion models for the lunar interior are quantitatively explored. The first model is based on terrestrial laccoliths and describes a shallow intrusion beneath the crater. The second model is based on cone sheet complexes where surface deformation results from a deeper magma chamber. Both models, their fit to observed crater modifications and possible implications for local volcanism are described.

  10. Computational Medicine: Translating Models to Clinical Care

    PubMed Central

    Winslow, Raimond L.; Trayanova, Natalia; Geman, Donald; Miller, Michael I.

    2013-01-01

    Because of the inherent complexity of coupled nonlinear biological systems, the development of computational models is necessary for achieving a quantitative understanding of their structure and function in health and disease. Statistical learning is applied to high-dimensional biomolecular data to create models that describe relationships between molecules and networks. Multiscale modeling links networks to cells, organs, and organ systems. Computational approaches are used to characterize anatomic shape and its variations in health and disease. In each case, the purposes of modeling are to capture all that we know about disease and to develop improved therapies tailored to the needs of individuals. We discuss advances in computational medicine, with specific examples in the fields of cancer, diabetes, cardiology, and neurology. Advances in translating these computational methods to the clinic are described, as well as challenges in applying models for improving patient health. PMID:23115356

  11. Sensitivity analyses of exposure estimates from a quantitative job-exposure matrix (SYN-JEM) for use in community-based studies.

    PubMed

    Peters, Susan; Kromhout, Hans; Portengen, Lützen; Olsson, Ann; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Vermeulen, Roel

    2013-01-01

We describe the elaboration and sensitivity analyses of a quantitative job-exposure matrix (SYN-JEM) for respirable crystalline silica (RCS). The aim was to gain insight into the robustness of the SYN-JEM RCS estimates based on critical decisions taken in the elaboration process. SYN-JEM for RCS exposure consists of three axes (job, region, and year) based on estimates derived from a previously developed statistical model. To elaborate SYN-JEM, several decisions were taken: i.e. the application of (i) a single time trend; (ii) region-specific adjustments in RCS exposure; and (iii) a prior job-specific exposure level (by the semi-quantitative DOM-JEM), with an override of 0 mg/m3 for jobs a priori defined as non-exposed. Furthermore, we assumed that exposure levels reached a ceiling in 1960 and remained constant prior to this date. We applied SYN-JEM to the occupational histories of subjects from a large international pooled community-based case-control study. Cumulative exposure levels derived with SYN-JEM were compared with those from alternative models, described by Pearson correlation (Rp) and differences in unit of exposure (mg/m3-year). Alternative models concerned changes in application of job- and region-specific estimates and exposure ceiling, and omitting the a priori exposure ranking. Cumulative exposure levels for the study subjects ranged from 0.01 to 60 mg/m3-years, with a median of 1.76 mg/m3-years. Exposure levels derived from SYN-JEM and alternative models were overall highly correlated (Rp > 0.90), although somewhat lower when omitting the region estimate (Rp = 0.80) or not taking into account the assigned semi-quantitative exposure level (Rp = 0.65). Modification of the time trend (i.e. exposure ceiling at 1950 or 1970, or assuming a decline before 1960) caused the largest changes in absolute exposure levels (26-33% difference), but without changing the relative ranking (Rp = 0.99). 
Exposure estimates derived from SYN-JEM appeared to be plausible compared with (historical) levels described in the literature. Decisions taken in the development of SYN-JEM did not critically change the cumulative exposure levels. The influence of region-specific estimates needs to be explored in future risk analyses.

  12. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.

  13. Distribution of Hydroxyl Groups in Kukersite Shale Oil: Quantitative Determination Using Fourier Transform Infrared (FT-IR) Spectroscopy.

    PubMed

    Baird, Zachariah Steven; Oja, Vahur; Järvik, Oliver

    2015-05-01

    This article describes the use of Fourier transform infrared (FT-IR) spectroscopy to quantitatively measure the hydroxyl concentrations among narrow boiling shale oil cuts. Shale oil samples were from an industrial solid heat carrier retort. Reference values were measured by titration and were used to create a partial least squares regression model from FT-IR data. The model had a root mean squared error (RMSE) of 0.44 wt% OH. This method was then used to study the distribution of hydroxyl groups among more than 100 shale oil cuts, which showed that hydroxyl content increased with the average boiling point of the cut up to about 350 °C and then leveled off and decreased.
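A partial least squares calibration of the kind described can be sketched with a bare-bones PLS1 (NIPALS) implementation. This is an illustration on synthetic data, not the authors' FT-IR model; real spectral calibration would use many wavelengths, several components, and proper validation.

```python
import numpy as np

# PLS1 via NIPALS: extract latent components that maximize covariance with
# the response, then form regression coefficients B = W (P'W)^-1 q.
def pls1_fit(X, y, n_components):
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xd, yd = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xd.T @ yd                     # weight: covariance direction
        norm = np.linalg.norm(w)
        if norm < 1e-12:
            break                         # response already fully explained
        w /= norm
        t = Xd @ w                        # scores
        p = Xd.T @ t / (t @ t)            # X loadings
        qk = yd @ t / (t @ t)             # y loading
        Xd = Xd - np.outer(t, p)          # deflate X and y
        yd = yd - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    return B, x_mean, y_mean

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 3))              # 30 samples, 3 "spectral" features
y = X @ np.array([1.0, -2.0, 0.5])        # noiseless linear response
B, xm, ym = pls1_fit(X, y, n_components=3)
y_hat = (X - xm) @ B + ym                 # predictions reproduce y exactly
```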

  14. Spatiotemporal Characterization of a Fibrin Clot Using Quantitative Phase Imaging

    PubMed Central

    Gannavarpu, Rajshekhar; Bhaduri, Basanta; Tangella, Krishnarao; Popescu, Gabriel

    2014-01-01

    Studying the dynamics of fibrin clot formation and its morphology is an important problem in biology and has significant impact for several scientific and clinical applications. We present a label-free technique based on quantitative phase imaging to address this problem. Using quantitative phase information, we characterized fibrin polymerization in real-time and present a mathematical model describing the transition from liquid to gel state. By exploiting the inherent optical sectioning capability of our instrument, we measured the three-dimensional structure of the fibrin clot. From this data, we evaluated the fractal nature of the fibrin network and extracted the fractal dimension. Our non-invasive and speckle-free approach analyzes the clotting process without the need for external contrast agents. PMID:25386701
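The fractal-dimension extraction mentioned above is commonly done by box counting; the following is a minimal 2D sketch (the paper analyzed the 3D clot structure, and here a filled square, whose box-counting dimension is exactly 2, is used purely to show the method):

```python
import numpy as np

# Count occupied boxes at several box sizes; the fractal dimension is the
# slope of log(count) versus log(1/size).
def box_count_dimension(img, sizes=(2, 4, 8, 16)):
    counts = []
    for s in sizes:
        occupied = 0
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():
                    occupied += 1
        counts.append(occupied)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

filled = np.ones((64, 64), dtype=bool)         # a filled square has dimension 2
print(round(box_count_dimension(filled), 2))   # → 2.0
```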

  15. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  16. Optimising the combination dosing strategy of abemaciclib and vemurafenib in BRAF-mutated melanoma xenograft tumours.

    PubMed

    Tate, Sonya C; Burke, Teresa F; Hartman, Daisy; Kulanthaivel, Palaniappan; Beckmann, Richard P; Cronier, Damien M

    2016-03-15

    Resistance to BRAF inhibition is a major cause of treatment failure for BRAF-mutated metastatic melanoma patients. Abemaciclib, a cyclin-dependent kinase 4 and 6 inhibitor, overcomes this resistance in xenograft tumours and offers a promising drug combination. The present work aims to characterise the quantitative pharmacology of the abemaciclib/vemurafenib combination using a semimechanistic pharmacokinetic/pharmacodynamic modelling approach and to identify an optimum dosing regimen for potential clinical evaluation. A PK/biomarker model was developed to connect abemaciclib/vemurafenib concentrations to changes in MAPK and cell cycle pathway biomarkers in A375 BRAF-mutated melanoma xenografts. Resultant tumour growth inhibition was described by relating (i) MAPK pathway inhibition to apoptosis, (ii) mitotic cell density to tumour growth and, under resistant conditions, (iii) retinoblastoma protein inhibition to cell survival. The model successfully described vemurafenib/abemaciclib-mediated changes in MAPK pathway and cell cycle biomarkers. Initial tumour shrinkage by vemurafenib, acquisition of resistance and subsequent abemaciclib-mediated efficacy were successfully captured and externally validated. Model simulations illustrate the benefit of intermittent vemurafenib therapy over continuous treatment, and indicate that continuous abemaciclib in combination with intermittent vemurafenib offers the potential for considerable tumour regression. The quantitative pharmacology of the abemaciclib/vemurafenib combination was successfully characterised and an optimised, clinically-relevant dosing strategy was identified.

  17. Modeling Drug- and Chemical-Induced Hepatotoxicity with Systems Biology Approaches

    PubMed Central

    Bhattacharya, Sudin; Shoda, Lisl K.M.; Zhang, Qiang; Woods, Courtney G.; Howell, Brett A.; Siler, Scott Q.; Woodhead, Jeffrey L.; Yang, Yuching; McMullen, Patrick; Watkins, Paul B.; Andersen, Melvin E.

    2012-01-01

    We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of “toxicity pathways” is described in the context of the 2007 US National Academies of Science report, “Toxicity testing in the 21st Century: A Vision and A Strategy.” Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular “virtual tissue” model of the liver lobule that combines molecular circuits in individual hepatocytes with cell–cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the aryl hydrocarbon receptor toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.
PMID:23248599

  18. A quantitative systems pharmacology model of blood coagulation network describes in vivo biomarker changes in non-bleeding subjects.

    PubMed

    Lee, D; Nayak, S; Martin, S W; Heatherington, A C; Vicini, P; Hua, F

    2016-12-01

    Essentials: Baseline coagulation activity can be detected in the non-bleeding state by in vivo biomarker levels. A detailed mathematical model of coagulation was developed to describe the non-bleeding state. The optimized model described in vivo biomarkers under recombinant activated factor VII treatment. Sensitivity analysis predicted that prothrombin fragment 1+2 and D-dimer are regulated differently. Background: Prothrombin fragment 1+2 (F1+2), thrombin-antithrombin III complex (TAT) and D-dimer can be detected in plasma from non-bleeding, hemostatically normal subjects or hemophilic patients. They are often used as safety or pharmacodynamic biomarkers for hemostasis-modulating therapies in the clinic, and provide insights into in vivo coagulation activity. Objectives: To develop a quantitative systems pharmacology (QSP) model of the blood coagulation network to describe in vivo biomarkers, including F1+2, TAT, and D-dimer, under non-bleeding conditions. Methods: The QSP model included intrinsic and extrinsic coagulation pathways, platelet activation state-dependent kinetics, and a two-compartment pharmacokinetic model for recombinant activated factor VII (rFVIIa). Literature data on F1+2 and D-dimer at baseline and their changes with rFVIIa treatment were used for parameter optimization. Multiparametric sensitivity analysis (MPSA) was used to identify the key proteins that regulate F1+2, TAT and D-dimer levels. Results: The model was able to describe tissue factor (TF)-dependent baseline levels of F1+2, TAT and D-dimer in a non-bleeding state, and their increases in hemostatically normal subjects and hemophilic patients treated with different doses of rFVIIa. The amount of TF required is predicted to be very low in a non-bleeding state. The model also predicts that these biomarker levels will be similar in hemostatically normal subjects and hemophilic patients. MPSA revealed that F1+2 and TAT levels are highly correlated, and that D-dimer is more sensitive to perturbation of coagulation protein concentrations. Conclusions: A QSP model of non-bleeding baseline coagulation activity was established with data from clinically relevant in vivo biomarkers at baseline and their changes in response to rFVIIa treatment. This model will provide future mechanistic insights into this system. © 2016 International Society on Thrombosis and Haemostasis.

  19. Analysis of airborne MAIS imaging spectrometric data for mineral exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Jinnian; Zheng Lanfen; Tong Qingxi

    1996-11-01

    The high spectral resolution imaging spectrometric system made quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analysis of surface parameters from imaging spectrometer data. This paper describes the methods and stages of quantitative analysis: (1) extracting surface reflectance from the imaging spectrometer image; laboratory and in-flight field measurements are conducted to calibrate the imaging spectrometer data, and atmospheric correction is used to obtain ground reflectance by the empirical line method and radiative transfer modeling; (2) determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals; (3) spectral comparison between spectra from a spectral library and spectra derived from the imagery. A wavelet analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.

  20. Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luria, Paolo; Aspinall, Peter A

    2003-08-01

    The aim of this paper is to describe a new approach to major industrial hazard assessment, recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), only provided a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach--the Analytic Hierarchy Process--which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, of the present situation and of three future scenarios, and the use of this information as indirect quantitative measures that could be aggregated to obtain a global risk rate. This approach is in line with the main concepts proposed by the latest European directive on major hazard accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).

  1. On the feasibility of quantitative ultrasonic determination of fracture toughness: A literature review

    NASA Technical Reports Server (NTRS)

    Fu, L. S.

    1980-01-01

    The three main topics covered are: (1) fracture toughness and microstructure; (2) quantitative ultrasonics and microstructure; and (3) scattering and related mathematical methods. Literature in these areas is reviewed to give insight into the search for a theoretical foundation for quantitative ultrasonic measurement of fracture toughness. The literature review shows that fracture toughness is inherently related to the microstructure; in particular, it depends upon the spacing of inclusions or second-phase particles and the aspect ratio of second-phase particles. There are indications that ultrasonic velocity and attenuation measurements can be used to determine fracture toughness. This leads to a review of the mathematical models available for solving boundary value problems related to the microstructural factors that govern fracture toughness and wave motion. A framework for the theoretical study of the quantitative determination of fracture toughness is described and suggestions for future research are proposed.

  2. Wavelet modeling and prediction of the stability of states: the Roman Empire and the European Union

    NASA Astrophysics Data System (ADS)

    Yaroshenko, Tatyana Y.; Krysko, Dmitri V.; Dobriyan, Vitalii; Zhigalov, Maksim V.; Vos, Hendrik; Vandenabeele, Peter; Krysko, Vadim A.

    2015-09-01

    How can the stability of a state be quantitatively determined and its future stability predicted? The rise and collapse of empires and states is very complex, and it is exceedingly difficult to understand and predict. Existing theories are usually formulated as verbal models and, consequently, do not yield sharply defined, quantitative predictions that can be unambiguously validated with data. Here we describe a model that determines whether a state is in a stable or chaotic condition and predicts its future condition. The central premise, which we test, is that the growth and collapse of states are reflected in the changes of their territories, populations and budgets. The model was applied to the historical record of the Roman Empire (400 BC to 400 AD) and the European Union (1957-2007) by using wavelets and analysis of the sign change of the spectrum of Lyapunov exponents. The model matches the historical events well. During wars and crises the state becomes unstable; this is reflected in the wavelet analysis by a significant increase in the frequency ω(t) and the wavelet coefficients W(ω, t), and the sign of the largest Lyapunov exponent becomes positive, indicating chaos. We successfully reconstructed and forecasted time series for the Roman Empire and the European Union by applying an artificial neural network. The proposed model helps to quantitatively determine and forecast the stability of a state.
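    The chaos criterion used above (a positive largest Lyapunov exponent) can be demonstrated on a toy dynamical system. The logistic map below is a hypothetical stand-in chosen because its exponent is easy to compute from log|f'(x)| along an orbit; the paper's analysis instead uses historical time series of territory, population and budget.

```python
import math

def lyapunov_logistic(r, x0=0.4, n=100000, burn=1000):
    """Average log|f'(x)| along the orbit of the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))  # log of the local derivative
    return total / n

print(lyapunov_logistic(3.5))  # periodic regime: negative exponent (stable)
print(lyapunov_logistic(4.0))  # chaotic regime: positive exponent
```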

  3. Physics of lumen growth.

    PubMed

    Dasgupta, Sabyasachi; Gupta, Kapish; Zhang, Yue; Viasnoff, Virgile; Prost, Jacques

    2018-05-22

    We model the dynamics of formation of intercellular secretory lumens. Using conservation laws, we quantitatively study the balance between paracellular leaks and the build-up of osmotic pressure in the lumen. Our model predicts a critical pumping threshold to expand stable lumens. Consistently with experimental observations in bile canaliculi, the model also describes a transition between a monotonous and oscillatory regime during luminogenesis as a function of ion and water transport parameters. We finally discuss the possible importance of regulation of paracellular leaks in intercellular tubulogenesis.

  4. Engaging Students In Modeling Instruction for Introductory Physics

    NASA Astrophysics Data System (ADS)

    Brewe, Eric

    2016-05-01

    Teaching introductory physics is arguably one of the most important things that a physics department does. It is the primary way that students from other science disciplines engage with physics and it is the introduction to physics for majors. Modeling instruction is an active learning strategy for introductory physics built on the premise that science proceeds through the iterative process of model construction, development, deployment, and revision. We describe the role that participating in authentic modeling has in learning and then explore how students engage in this process in the classroom. In this presentation, we provide a theoretical background on models and modeling and describe how these theoretical elements are enacted in the introductory university physics classroom. We provide both quantitative and video data to link the development of a conceptual model to the design of the learning environment and to student outcomes. This work is supported in part by DUE #1140706.

  5. Quantitative weight of evidence assessment of risk to honeybee colonies from use of imidacloprid, clothianidin, and thiamethoxam as seed treatments: a postscript.

    PubMed

    Solomon, Keith R; Stephenson, Gladys L

    2017-01-01

    This paper is a postscript to the four companion papers in this issue of the Journal (Solomon and Stephenson 2017a , 2017b ; Stephenson and Solomon 2017a , 2017b ). The first paper in the series described the conceptual model and the methods of the QWoE process. The other three papers described the application of the QWoE process to studies on imidacloprid (IMI), clothianidin (CTD), and thiamethoxam (TMX). This postscript was written to summarize the utility of the methods used in the quantitative weight of evidence (QWoE), the overall relevance of the results, and the environmental implications of the findings. Hopefully, this will be helpful to others who wish to conduct QWoEs and use these methods in assessment of risks.

  6. From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses

    PubMed Central

    Zenker, Sven; Rubin, Jonathan; Clermont, Gilles

    2007-01-01

    The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). 
We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses. We outline possible steps toward translating this computational approach to the bedside, to supplement today's evidence-based medicine with a quantitatively founded model-based medicine that integrates mechanistic knowledge with patient-specific information. PMID:17997590
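    The multimodal posteriors discussed above can be reproduced in miniature with grid-based Bayesian inference on a toy observation model (y = x² plus noise, a hypothetical stand-in for the cardiovascular model): a unimodal prior combined with this likelihood yields a bimodal posterior, analogous to two competing diagnoses consistent with the same observation.

```python
import numpy as np

x_grid = np.linspace(-3, 3, 1201)           # candidate parameter values
prior = np.exp(-0.5 * (x_grid / 2.0) ** 2)  # unimodal, weakly informative prior
y_obs, sigma = 2.25, 0.2                    # one noisy observation of x**2

likelihood = np.exp(-0.5 * ((y_obs - x_grid**2) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()                # normalize on the grid

# Local maxima of the posterior: two modes near x = +/-1.5 despite a unimodal prior.
peaks = x_grid[(posterior > np.roll(posterior, 1)) & (posterior > np.roll(posterior, -1))]
print(peaks)
```

    Each mode corresponds to a distinct parameter regime explaining the data; as in the paper, an additional observation that breaks the symmetry would collapse the posterior to a single mode.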

  7. String model for the dynamics of glass-forming liquids

    PubMed Central

    Pazmiño Betancourt, Beatriz A.; Douglas, Jack F.; Starr, Francis W.

    2014-01-01

    We test the applicability of a living polymerization theory to describe cooperative string-like particle rearrangement clusters (strings) observed in simulations of a coarse-grained polymer melt. The theory quantitatively describes the interrelation between the average string length L, configurational entropy Sconf, and the order parameter for string assembly Φ without free parameters. Combining this theory with the Adam-Gibbs model allows us to predict the relaxation time τ in a lower temperature T range than accessible by current simulations. In particular, the combined theories suggest a return to Arrhenius behavior near Tg and a low T residual entropy, thus avoiding a Kauzmann “entropy crisis.” PMID:24880303

  8. String model for the dynamics of glass-forming liquids.

    PubMed

    Pazmiño Betancourt, Beatriz A; Douglas, Jack F; Starr, Francis W

    2014-05-28

    We test the applicability of a living polymerization theory to describe cooperative string-like particle rearrangement clusters (strings) observed in simulations of a coarse-grained polymer melt. The theory quantitatively describes the interrelation between the average string length L, configurational entropy Sconf, and the order parameter for string assembly Φ without free parameters. Combining this theory with the Adam-Gibbs model allows us to predict the relaxation time τ in a lower temperature T range than accessible by current simulations. In particular, the combined theories suggest a return to Arrhenius behavior near Tg and a low T residual entropy, thus avoiding a Kauzmann "entropy crisis."

  9. [Quantitative risk model for verocytotoxigenic Escherichia coli cross-contamination during homemade hamburger preparation].

    PubMed

    Signorini, M L; Frizzo, L S

    2009-01-01

    The objective of this study was to develop a quantitative risk model for verocytotoxigenic Escherichia coli (VTEC) cross-contamination during hamburger preparation at home. Published scientific information about the disease was considered in the elaboration of the model, which included a number of routines performed during food preparation in kitchens. The associated probabilities of bacterial transfer between food items and kitchen utensils that best described each stage of the process were incorporated into the model using @Risk software. Handling raw meat before preparing ready-to-eat foods (odds ratio, OR, 6.57), as well as hand (OR = 12.02) and cutting board (OR = 5.02) washing habits, were the major risk factors for VTEC cross-contamination from meat to vegetables. The information provided by this model should be considered when designing public information campaigns on hemolytic uremic syndrome risk directed to food handlers, in order to stress the importance of the above-mentioned factors in disease transmission.
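    The stochastic structure of such a model can be sketched with a plain Monte Carlo simulation of the handling routines. All probabilities below (meat contamination prevalence, transfer and washing probabilities) are assumed illustrative values, not the published @Risk parameters.

```python
import random

def simulate(n=100000, p_meat_contaminated=0.05, p_hand_transfer=0.3,
             p_wash_hands=0.7, p_wash_board=0.5, p_board_transfer=0.2):
    """Fraction of preparations in which VTEC reaches ready-to-eat food."""
    random.seed(0)
    contaminated = 0
    for _ in range(n):
        if random.random() > p_meat_contaminated:
            continue  # raw meat not carrying VTEC
        # Hands route: bacteria picked up and hands not washed afterwards
        hands = random.random() < p_hand_transfer and random.random() > p_wash_hands
        # Cutting-board route: transfer to board and board reused unwashed
        board = random.random() < p_board_transfer and random.random() > p_wash_board
        if hands or board:
            contaminated += 1
    return contaminated / n

print(f"estimated cross-contamination risk: {simulate():.4f}")
```

    Varying the washing probabilities shows directly how hand and board hygiene dominate the simulated risk, mirroring the odds ratios reported above.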

  10. Technical advance: live-imaging analysis of human dendritic cell migrating behavior under the influence of immune-stimulating reagents in an organotypic model of lung.

    PubMed

    Nguyen Hoang, Anh Thu; Chen, Puran; Björnfot, Sofia; Högstrand, Kari; Lock, John G; Grandien, Alf; Coles, Mark; Svensson, Mattias

    2014-09-01

    This manuscript describes technical advances allowing manipulation and quantitative analyses of human DC migratory behavior in lung epithelial tissue. DCs are hematopoietic cells essential for the maintenance of tissue homeostasis and the induction of tissue-specific immune responses. Important functions include cytokine production and migration in response to infection for the induction of proper immune responses. To design appropriate strategies to exploit human DC functional properties in lung tissue for the purpose of clinical evaluation, e.g., candidate vaccination and immunotherapy strategies, we have developed a live-imaging assay based on our previously described organotypic model of the human lung. This assay allows provocations and subsequent quantitative investigations of DC functional properties under conditions mimicking morphological and functional features of the in vivo parental tissue. We present protocols to set up and prepare tissue models for 4D (x, y, z, time) fluorescence-imaging analysis that allow spatial and temporal studies of human DCs in live epithelial tissue, followed by flow cytometry analysis of DCs retrieved from digested tissue models. This model system can be useful for elucidating incompletely defined pathways controlling DC functional responses to infection and inflammation in lung epithelial tissue, as well as the efficacy of locally administered candidate interventions. © 2014 Society for Leukocyte Biology.

  11. Electronic field emission models beyond the Fowler-Nordheim one

    NASA Astrophysics Data System (ADS)

    Lepetit, Bruno

    2017-12-01

    We propose several quantum mechanical models to describe electronic field emission from first principles. These models allow us to correlate quantitatively the electronic emission current with the details of the electrode surface at the atomic scale. They all rely on electronic potential energy surfaces obtained from three-dimensional density functional theory calculations. They differ in the quantum mechanical methods (exact or perturbative, time dependent or time independent) used to describe tunneling through the electronic potential energy barrier. Comparing these models with one another and with the standard Fowler-Nordheim model in the context of one-dimensional tunneling allows us to assess how the approximations made in each model affect the accuracy of the computed current. Among these methods, the time dependent perturbative one provides a well-balanced trade-off between accuracy and computational cost.

  12. Dissipative-particle-dynamics model of biofilm growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Meakin, Paul; Tartakovsky, Alexandre M.

    2011-06-13

    A dissipative particle dynamics (DPD) model for the quantitative simulation of biofilm growth controlled by substrate (nutrient) consumption, advective and diffusive substrate transport, and hydrodynamic interactions with fluid flow (including fragmentation and reattachment) is described. The model was used to simulate biomass growth, decay, and spreading. It predicts how the biofilm morphology depends on flow conditions, biofilm growth kinetics, the rheomechanical properties of the biofilm and adhesion to solid surfaces. The morphology of the model biofilm depends strongly on its rigidity and the magnitude of the body force that drives the fluid over the biofilm.

  13. Using ‘particle in a box’ models to calculate energy levels in semiconductor quantum well structures

    NASA Astrophysics Data System (ADS)

    Ebbens, A. T.

    2018-07-01

    Although infinite potential ‘particle in a box’ models are widely used to introduce quantised energy levels, their predictions cannot be quantitatively compared with atomic emission spectra. Here, this problem is overcome by describing how both infinite and finite potential well models can be used to calculate the confined energy levels of semiconductor quantum wells. This is done using physics and mathematics concepts that are accessible to pre-university students. The results of the models are compared with experimental data and their accuracy is discussed.
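    The infinite-well part of such a calculation follows directly from E_n = n²π²ħ²/(2mL²). The well width and effective mass below are assumed example values for a GaAs quantum well, not figures taken from the article.

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
m_e = 9.1093837015e-31   # electron rest mass, kg
m_eff = 0.067 * m_e      # GaAs conduction-band effective mass (assumed)
L = 10e-9                # 10 nm well width (assumed)

def infinite_well_energy(n):
    """E_n = n^2 pi^2 hbar^2 / (2 m L^2), returned in meV."""
    E = (n * math.pi * hbar) ** 2 / (2 * m_eff * L ** 2)
    return E / 1.602176634e-19 * 1000  # convert J -> meV

for n in (1, 2, 3):
    print(f"E_{n} = {infinite_well_energy(n):.1f} meV")
```

    The n² scaling of the levels is exact for the infinite well; a finite well, as the article discusses, lowers and compresses the levels and supports only finitely many bound states.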

  14. Punishment in human choice: direct or competitive suppression?

    PubMed Central

    Critchfield, Thomas S; Paletz, Elliott M; MacAleese, Kenneth R; Newland, M Christopher

    2003-01-01

    This investigation compared the predictions of two models describing the integration of reinforcement and punishment effects in operant choice. Deluty's (1976) competitive-suppression model (conceptually related to two-factor punishment theories) and de Villiers' (1980) direct-suppression model (conceptually related to one-factor punishment theories) have been tested previously in nonhumans but not at the individual level in humans. Mouse clicking by college students was maintained in a two-alternative concurrent schedule of variable-interval money reinforcement. Punishment consisted of variable-interval money losses. Experiment 1 verified that money loss was an effective punisher in this context. Experiment 2 consisted of qualitative model comparisons similar to those used in previous studies involving nonhumans. Following a no-punishment baseline, punishment was superimposed upon both response alternatives. Under schedule values for which the direct-suppression model, but not the competitive-suppression model, predicted distinct shifts from baseline performance, or vice versa, 12 of 14 individual-subject functions, generated by 7 subjects, supported the direct-suppression model. When the punishment models were converted to the form of the generalized matching law, least-squares linear regression fits for a direct-suppression model were superior to those of a competitive-suppression model for 6 of 7 subjects. In Experiment 3, a more thorough quantitative test of the modified models, fits for a direct-suppression model were superior in 11 of 13 cases. These results correspond well to those of investigations conducted with nonhumans and provide the first individual-subject evidence that a direct-suppression model, evaluated both qualitatively and quantitatively, describes human punishment better than a competitive-suppression model. We discuss implications for developing better punishment models and future investigations of punishment in human choice. 
PMID:13677606
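    The generalized-matching-law fits described above reduce to least-squares regression in log ratios, log(B1/B2) = a log(r1/r2) + log b, where a is sensitivity and b is bias. The rates below are synthetic (the sensitivity and bias values are assumed); the study's actual response data are not reproduced.

```python
import numpy as np

# Hypothetical reinforcement-rate ratios and matched behavior ratios for one subject
r_ratio = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
b_ratio = 1.2 * r_ratio ** 0.85          # generated with sensitivity a=0.85, bias b=1.2

# Fit the generalized matching law as a line in log-log coordinates
x, y = np.log(r_ratio), np.log(b_ratio)
a, log_b = np.polyfit(x, y, 1)
print(f"sensitivity a = {a:.2f}, bias b = {np.exp(log_b):.2f}")
```

    Model comparisons like those in Experiments 2 and 3 then amount to asking which punishment term, subtracted from the reinforcement rates before taking ratios, yields the better linear fit.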

  15. On the mechanochemical theory of biological pattern formation with application to vasculogenesis.

    PubMed

    Murray, James D

    2003-02-01

    We first describe the Murray-Oster mechanical theory of pattern formation, the biological basis of which is experimentally well documented. The model quantifies the interaction of cells and the extracellular matrix via the cell-generated forces. The model framework is described in quantitative detail. Vascular endothelial cells, when cultured on gelled basement membrane matrix, rapidly aggregate into clusters while deforming the matrix into a network of cord-like structures tessellating the planar culture. We apply the mechanical theory of pattern formation to this culture system and show that neither strain-biased anisotropic cell traction nor cell migration are necessary for pattern formation: isotropic, strain-stimulated cell traction is sufficient to form the observed patterns. Predictions from the model were confirmed experimentally.

  16. Antineutrino Charged-Current Reactions on Hydrocarbon with Low Momentum Transfer

    NASA Astrophysics Data System (ADS)

    Gran, R.; Betancourt, M.; Elkins, M.; Rodrigues, P. A.; Akbar, F.; Aliaga, L.; Andrade, D. A.; Bashyal, A.; Bellantoni, L.; Bercellie, A.; Bodek, A.; Bravar, A.; Budd, H.; Vera, G. F. R. Caceres; Cai, T.; Carneiro, M. F.; Coplowe, D.; da Motta, H.; Dytman, S. A.; Díaz, G. A.; Felix, J.; Fields, L.; Fine, R.; Gallagher, H.; Ghosh, A.; Haider, H.; Han, J. Y.; Harris, D. A.; Henry, S.; Jena, D.; Kleykamp, J.; Kordosky, M.; Le, T.; Leistico, J. R.; Lovlein, A.; Lu, X.-G.; Maher, E.; Manly, S.; Mann, W. A.; Marshall, C. M.; McFarland, K. S.; McGowan, A. M.; Messerly, B.; Miller, J.; Mislivec, A.; Morfín, J. G.; Mousseau, J.; Naples, D.; Nelson, J. K.; Nguyen, C.; Norrick, A.; Nuruzzaman, Olivier, A.; Paolone, V.; Patrick, C. E.; Perdue, G. N.; Ramírez, M. A.; Ransome, R. D.; Ray, H.; Ren, L.; Rimal, D.; Ruterbories, D.; Schellman, H.; Salinas, C. J. Solano; Su, H.; Sultana, M.; Falero, S. Sánchez; Valencia, E.; Wolcott, J.; Wospakrik, M.; Yaeggy, B.; Minerva Collaboration

    2018-06-01

    We report on multinucleon effects in low momentum transfer (<0.8 GeV /c ) antineutrino interactions on plastic (CH) scintillator. These data are from the 2010-2011 antineutrino phase of the MINERvA experiment at Fermilab. The hadronic energy spectrum of this inclusive sample is well described when a screening effect at a low energy transfer and a two-nucleon knockout process are added to a relativistic Fermi gas model of quasielastic, Δ resonance, and higher resonance processes. In this analysis, model elements introduced to describe previously published neutrino results have quantitatively similar benefits for this antineutrino sample. We present the results as a double-differential cross section to accelerate the investigation of alternate models for antineutrino scattering off nuclei.

  17. Antineutrino Charged-Current Reactions on Hydrocarbon with Low Momentum Transfer.

    PubMed

    Gran, R; Betancourt, M; Elkins, M; Rodrigues, P A; Akbar, F; Aliaga, L; Andrade, D A; Bashyal, A; Bellantoni, L; Bercellie, A; Bodek, A; Bravar, A; Budd, H; Vera, G F R Caceres; Cai, T; Carneiro, M F; Coplowe, D; da Motta, H; Dytman, S A; Díaz, G A; Felix, J; Fields, L; Fine, R; Gallagher, H; Ghosh, A; Haider, H; Han, J Y; Harris, D A; Henry, S; Jena, D; Kleykamp, J; Kordosky, M; Le, T; Leistico, J R; Lovlein, A; Lu, X-G; Maher, E; Manly, S; Mann, W A; Marshall, C M; McFarland, K S; McGowan, A M; Messerly, B; Miller, J; Mislivec, A; Morfín, J G; Mousseau, J; Naples, D; Nelson, J K; Nguyen, C; Norrick, A; Nuruzzaman; Olivier, A; Paolone, V; Patrick, C E; Perdue, G N; Ramírez, M A; Ransome, R D; Ray, H; Ren, L; Rimal, D; Ruterbories, D; Schellman, H; Salinas, C J Solano; Su, H; Sultana, M; Falero, S Sánchez; Valencia, E; Wolcott, J; Wospakrik, M; Yaeggy, B

    2018-06-01

    We report on multinucleon effects in low momentum transfer (<0.8 GeV/c) antineutrino interactions on plastic (CH) scintillator. These data are from the 2010-2011 antineutrino phase of the MINERvA experiment at Fermilab. The hadronic energy spectrum of this inclusive sample is well described when a screening effect at low energy transfer and a two-nucleon knockout process are added to a relativistic Fermi gas model of quasielastic, Δ resonance, and higher resonance processes. In this analysis, model elements introduced to describe previously published neutrino results have quantitatively similar benefits for this antineutrino sample. We present the results as a double-differential cross section to accelerate the investigation of alternate models for antineutrino scattering off nuclei.

  18. Probing the Interaction of Ionic Liquids with CO2: A Raman Spectroscopy and Ab Initio Study

    DTIC Science & Technology

    2008-05-05

    called physisorption. Regardless of the sorption mechanism, a few quantitative parameters can be used to describe a gas-liquid interaction at a...Controllers (I and II) and tube fittings, 0.062 inch perfluoroalkoxy (PFA) tubing, Tylan General Model FC-128 Flow Controllers

  19. Energy Cascade in Fermi-Pasta-Ulam Models

    NASA Astrophysics Data System (ADS)

    Ponno, A.; Bambusi, D.

    We show that, for long-wavelength initial conditions, the FPU dynamics is described, up to a certain time, by two KdV-like equations, which represent the resonant Hamiltonian normal form of the system. The energy cascade taking place in the system is then quantitatively characterized by arguments of dimensional analysis based on such equations.

  20. Photopolarimetry of scattering surfaces and their interpretation by computer model

    NASA Technical Reports Server (NTRS)

    Wolff, M.

    1979-01-01

    Wolff's computer model of a rough planetary surface was simplified and revised. Close adherence to the actual geometry of a pitted surface and the inclusion of a function for diffuse light resulted in a quantitative model comparable to observations of planetary satellites and asteroids. A function is also derived to describe diffuse light emitted from a particulate surface. The function is in terms of the indices of refraction of the surface material, particle size, and viewing angles. Computer-generated plots describe the observable and theoretical light components for the Moon, Mercury, Mars and a spectrum of asteroids. Other plots describe the effects of changing surface material properties. Mathematical results are generated to relate the parameters of the negative polarization branch to the properties of surface pitting. An explanation is offered for the polarization of the rings of Saturn, and the average diameter of ring objects is found to be 30 to 40 centimeters.

  1. Quantitative and Qualitative Differences in Morphological Traits Revealed between Diploid Fragaria Species

    PubMed Central

    SARGENT, DANIEL J.; GEIBEL, M.; HAWKINS, J. A.; WILKINSON, M. J.; BATTEY, N. H.; SIMPSON, D. W.

    2004-01-01

    • Background and Aims The aims of this investigation were to highlight the qualitative and quantitative diversity apparent between nine diploid Fragaria species and produce interspecific populations segregating for a large number of morphological characters suitable for quantitative trait loci analysis. • Methods A qualitative comparison of eight described diploid Fragaria species was performed and measurements were taken of 23 morphological traits from 19 accessions including eight described species and one previously undescribed species. A principal components analysis was performed on 14 mathematically unrelated traits from these accessions, which partitioned the species accessions into distinct morphological groups. Interspecific crosses were performed with accessions of species that displayed significant quantitative divergence and, from these, populations that should segregate for a range of quantitative traits were raised. • Key Results Significant differences between species were observed for all 23 morphological traits quantified and three distinct groups of species accessions were observed after the principal components analysis. Interspecific crosses were performed between these groups, and F2 and backcross populations were raised that should segregate for a range of morphological characters. In addition, the study highlighted a number of distinctive morphological characters in many of the species studied. • Conclusions Diploid Fragaria species are morphologically diverse, yet remain highly interfertile, making the group an ideal model for the study of the genetic basis of phenotypic differences between species through map-based investigation using quantitative trait loci. The segregating interspecific populations raised will be ideal for such investigations and could also provide insights into the nature and extent of genome evolution within this group. PMID:15469944
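
    The principal components step described above can be sketched with standard linear algebra. The trait matrix below is synthetic random data standing in for the 19 accessions x 14 mathematically unrelated traits; it is not the study's data.

```python
import numpy as np

# Synthetic stand-in for the measured trait matrix:
# rows = 19 accessions, columns = 14 unrelated morphological traits.
rng = np.random.default_rng(0)
traits = rng.normal(size=(19, 14))

# Standardize each trait, then obtain principal components via SVD
# of the centered/scaled matrix.
z = (traits - traits.mean(axis=0)) / traits.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
scores = u * s                      # accession scores on each PC
explained = s**2 / np.sum(s**2)     # fraction of variance per PC
```

    Plotting the accession scores on the first two or three PCs is what partitions the accessions into distinct morphological groups.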

  2. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
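
    The FOSM step the abstract describes can be sketched in a few lines: output covariance is the sensitivity (Jacobian) matrix sandwiched around the input covariance. The matrices below are illustrative numbers, not values from the study.

```python
import numpy as np

# J: sensitivity of 3 head outputs to 2 hydraulic-conductivity inputs
# (illustrative values only).
J = np.array([[0.8, 0.1],
              [0.2, 0.5],
              [0.3, 0.3]])
# C_in: covariance of the input information (illustrative).
C_in = np.array([[0.04, 0.01],
                 [0.01, 0.09]])

# FOSM: output covariance = J @ C_in @ J.T
C_out = J @ C_in @ J.T
head_variance = np.diag(C_out)          # per-location head variance

# A QDE-style decision: sample next where the variance is largest.
next_sample = int(np.argmax(head_variance))
```

    This is the sense in which input covariance and model sensitivity combine to point at the next sampling location.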

  3. Quantitating Antibody Uptake In Vivo: Conditional Dependence on Antigen Expression Levels

    PubMed Central

    Thurber, Greg M.; Weissleder, Ralph

    2010-01-01

    Purpose Antibodies form an important class of cancer therapeutics, and there is intense interest in using them for imaging applications in diagnosis and monitoring of cancer treatment. Despite the expanding body of knowledge describing pharmacokinetic and pharmacodynamic interactions of antibodies in vivo, discrepancies remain over the effect of antigen expression level on tumoral uptake with some reports indicating a relationship between uptake and expression and others showing no correlation. Procedures Using a cell line with high EpCAM expression and moderate EGFR expression, fluorescent antibodies with similar plasma clearance were imaged in vivo. A mathematical model and mouse xenograft experiments were used to describe the effect of antigen expression on uptake of these high affinity antibodies. Results As predicted by the theoretical model, under subsaturating conditions, uptake of the antibodies in such tumors is similar because localization of both probes is limited by delivery from the vasculature. In a separate experiment, when the tumor is saturated, the uptake becomes dependent on the number of available binding sites. In addition, targeting of small micrometastases is shown to be higher than larger vascularized tumors. Conclusions These results are consistent with the prediction that high affinity antibody uptake is dependent on antigen expression levels for saturating doses and delivery for subsaturating doses. It is imperative for any probe to understand whether quantitative uptake is a measure of biomarker expression or transport to the region of interest. The data provide support for a predictive theoretical model of antibody uptake, enabling it to be used as a starting point for the design of more efficacious therapies and timely quantitative imaging probes. PMID:20809210
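
    The qualitative conclusion above can be caricatured as two limiting regimes: below saturation, uptake is capped by vascular delivery; at saturating doses, it is capped by available binding sites. The toy function and all numbers below are invented for illustration, not the paper's model.

```python
# Toy two-limit model of tumoral antibody uptake (illustrative only).

def uptake(dose, delivery_capacity, binding_sites):
    """Antibody retained in the tumor (arbitrary units)."""
    delivered = min(dose, delivery_capacity)   # vascular delivery limit
    return min(delivered, binding_sites)       # antigen binding-site limit

# Subsaturating dose: uptake identical despite a 10x difference in sites.
low_dose_high_antigen = uptake(dose=1.0, delivery_capacity=2.0, binding_sites=50.0)
low_dose_low_antigen = uptake(dose=1.0, delivery_capacity=2.0, binding_sites=5.0)

# Saturating dose: uptake now tracks the number of binding sites.
high_dose_high_antigen = uptake(dose=100.0, delivery_capacity=100.0, binding_sites=50.0)
high_dose_low_antigen = uptake(dose=100.0, delivery_capacity=100.0, binding_sites=5.0)
```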

  4. Quantitating antibody uptake in vivo: conditional dependence on antigen expression levels.

    PubMed

    Thurber, Greg M; Weissleder, Ralph

    2011-08-01

    Antibodies form an important class of cancer therapeutics, and there is intense interest in using them for imaging applications in diagnosis and monitoring of cancer treatment. Despite the expanding body of knowledge describing pharmacokinetic and pharmacodynamic interactions of antibodies in vivo, discrepancies remain over the effect of antigen expression level on tumoral uptake with some reports indicating a relationship between uptake and expression and others showing no correlation. Using a cell line with high epithelial cell adhesion molecule expression and moderate epidermal growth factor receptor expression, fluorescent antibodies with similar plasma clearance were imaged in vivo. A mathematical model and mouse xenograft experiments were used to describe the effect of antigen expression on uptake of these high-affinity antibodies. As predicted by the theoretical model, under subsaturating conditions, uptake of the antibodies in such tumors is similar because localization of both probes is limited by delivery from the vasculature. In a separate experiment, when the tumor is saturated, the uptake becomes dependent on the number of available binding sites. In addition, targeting of small micrometastases is shown to be higher than larger vascularized tumors. These results are consistent with the prediction that high affinity antibody uptake is dependent on antigen expression levels for saturating doses and delivery for subsaturating doses. It is imperative for any probe to understand whether quantitative uptake is a measure of biomarker expression or transport to the region of interest. The data provide support for a predictive theoretical model of antibody uptake, enabling it to be used as a starting point for the design of more efficacious therapies and timely quantitative imaging probes.

  5. Proposal for a quantitative index of flood disasters.

    PubMed

    Feng, Lihua; Luo, Gaoyuan

    2010-07-01

    Drawing on calculations of wind scale and earthquake magnitude, this paper develops a new quantitative method for measuring flood magnitude and disaster intensity. Flood magnitude is the quantitative index that describes the scale of a flood; the flood's disaster intensity is the quantitative index describing the losses caused. Both indices have numerous theoretical and practical advantages with definable concepts and simple applications, which lend them key practical significance.
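
    The abstract does not give the index's formula; as a loudly hypothetical illustration of a wind-scale or earthquake-magnitude style measure, one could take the base-10 logarithm of peak discharge, so each whole magnitude step represents a tenfold larger flood.

```python
import math

# Hypothetical log-scaled flood magnitude (NOT the paper's definition):
# each unit of magnitude corresponds to a tenfold increase in peak
# discharge (m^3/s).

def flood_magnitude(peak_discharge_m3s):
    return math.log10(peak_discharge_m3s)

m_small = flood_magnitude(1_000.0)     # modest river flood
m_large = flood_magnitude(100_000.0)   # extreme flood
```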

  6. Measuring and modeling salience with the theory of visual attention.

    PubMed

    Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid

    2017-08-01

    For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, salience adds up additively.
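
    The two modeling results stated above, power-law growth of salience with feature contrast and additive combination across dimensions, can be sketched directly. The exponents and scale factors below are made-up placeholders, not fitted TVA parameters.

```python
# Sketch of the salience model: power-law growth, additive combination.

def salience(contrast, a, b):
    return a * contrast ** b          # power function of feature contrast

def combined_salience(orientation_contrast, luminance_contrast):
    # Illustrative parameters; the fitted values are in the paper.
    return (salience(orientation_contrast, a=1.0, b=0.5)
            + salience(luminance_contrast, a=0.8, b=0.7))

s = combined_salience(orientation_contrast=0.25, luminance_contrast=0.0)
```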

  7. A physical model describing the interaction of nuclear transport receptors with FG nucleoporin domain assemblies.

    PubMed

    Zahn, Raphael; Osmanović, Dino; Ehret, Severin; Araya Callis, Carolina; Frey, Steffen; Stewart, Murray; You, Changjiang; Görlich, Dirk; Hoogenboom, Bart W; Richter, Ralf P

    2016-04-08

    The permeability barrier of nuclear pore complexes (NPCs) controls bulk nucleocytoplasmic exchange. It consists of nucleoporin domains rich in phenylalanine-glycine motifs (FG domains). As a bottom-up nanoscale model for the permeability barrier, we have used planar films produced with three different end-grafted FG domains, and quantitatively analyzed the binding of two different nuclear transport receptors (NTRs), NTF2 and Importin β, together with the concomitant film thickness changes. NTR binding caused only moderate changes in film thickness; the binding isotherms showed negative cooperativity and could all be mapped onto a single master curve. This universal NTR binding behavior - a key element for the transport selectivity of the NPC - was quantitatively reproduced by a physical model that treats FG domains as regular, flexible polymers, and NTRs as spherical colloids with a homogeneous surface, ignoring the detailed arrangement of interaction sites along FG domains and on the NTR surface.
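
    One standard way to express a binding isotherm with negative cooperativity, offered here only as an illustrative sketch rather than the paper's physical model, is a Hill-type equation with exponent n < 1; normalizing concentration by the apparent affinity K then collapses curves with different K onto a single master curve.

```python
# Illustrative negative-cooperativity isotherm (Hill exponent n < 1).

def bound_fraction(c, K, n=0.5):
    """Fraction of binding capacity occupied at NTR concentration c."""
    return c**n / (K**n + c**n)

# Same c/K ratio -> same bound fraction: the master-curve collapse.
f1 = bound_fraction(c=2.0, K=1.0)
f2 = bound_fraction(c=20.0, K=10.0)
```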

  8. A Quantitative Model of Motility Reveals Low-Dimensional Variation in Exploratory Behavior Across Multiple Nematode Species

    NASA Astrophysics Data System (ADS)

    Helms, Stephen; Avery, Leon; Stephens, Greg; Shimizu, Tom

    2014-03-01

    Animal behavior emerges from many layers of biological organization--from molecular signaling pathways and neuronal networks to mechanical outputs of muscles. In principle, the large number of interconnected variables at each of these layers could imply dynamics that are complex and hard to control or even tinker with. Yet, for organisms to survive in a competitive, ever-changing environment, behavior must readily adapt. We applied quantitative modeling to identify important aspects of behavior in chromadorean nematodes ranging from the lab strain C. elegans N2 to wild strains and distant species. We revealed subtle yet important features such as speed control and heavy-tailed directional changes. We found that the parameters describing this behavioral model varied among individuals and across species in a correlated way that is consistent with a trade-off between exploratory and exploitative behavior.

  9. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  10. Experimental methods and transport models for drug delivery across the blood-brain barrier.

    PubMed

    Fu, Bingmei M

    2012-06-01

    The blood-brain barrier (BBB) is a dynamic barrier essential for maintaining the micro-environment of the brain. Although the special anatomical features of the BBB determine its protective role for the central nervous system (CNS) against blood-borne neurotoxins, the BBB severely limits the therapeutic efficacy of drugs delivered to the CNS, which greatly hinders the treatment of major brain diseases. This review summarized the unique structures of the BBB, described a variety of in vivo and in vitro experimental methods for determining the transport properties of the BBB, e.g., the permeability of the BBB to water, ions, and solutes including nutrients, therapeutic agents and drug carriers, and presented newly developed mathematical models which quantitatively correlate the anatomical structures of the BBB with its barrier functions. Finally, on the basis of the experimental observations and the quantitative models, several strategies for drug delivery through the BBB were proposed.
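
    As background for the transport-property measurements mentioned above, the standard operational definition of solute permeability (not a formula quoted from this review) is the measured flux normalized by barrier surface area and the concentration difference across the barrier. The numbers below are illustrative.

```python
# Standard permeability definition: P = J / (S * dC).

def permeability(flux, area, delta_c):
    """P in cm/s when flux is mol/s, area cm^2, delta_c mol/cm^3."""
    return flux / (area * delta_c)

# Illustrative values for a small solute crossing a microvessel wall.
P = permeability(flux=2.0e-9, area=1.0e-2, delta_c=1.0e-4)
```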

  11. Experimental Methods and Transport Models for Drug Delivery across the Blood-Brain Barrier

    PubMed Central

    Fu, Bingmei M

    2017-01-01

    The blood-brain barrier (BBB) is a dynamic barrier essential for maintaining the micro-environment of the brain. Although the special anatomical features of the BBB determine its protective role for the central nervous system (CNS) from blood-born neurotoxins, however, the BBB extremely limits the therapeutic efficacy of drugs into the CNS, which greatly hinders the treatment of major brain diseases. This review summarized the unique structures of the BBB, described a variety of in vivo and in vitro experimental methods for determining the transport properties of the BBB, e.g., the permeability of the BBB to water, ions, and solutes including nutrients, therapeutic agents and drug carriers, and presented newly developed mathematical models which quantitatively correlate the anatomical structures of the BBB with its barrier functions. Finally, on the basis of the experimental observations and the quantitative models, several strategies for drug delivery through the BBB were proposed. PMID:22201587

  12. Quantitative 3D determination of self-assembled structures on nanoparticles using small angle neutron scattering.

    PubMed

    Luo, Zhi; Marson, Domenico; Ong, Quy K; Loiudice, Anna; Kohlbrecher, Joachim; Radulescu, Aurel; Krause-Heuer, Anwen; Darwish, Tamim; Balog, Sandor; Buonsanti, Raffaella; Svergun, Dmitri I; Posocco, Paola; Stellacci, Francesco

    2018-04-09

    The ligand shell (LS) determines a number of nanoparticles' properties. Nanoparticles' cores can be accurately characterized; yet the structure of the LS, when composed of a mixture of molecules, can be described only qualitatively (e.g., patchy, Janus, and random). Here we show that quantitative description of the LS' morphology of monodisperse nanoparticles can be obtained using small-angle neutron scattering (SANS), measured at multiple contrasts, achieved by either ligand or solvent deuteration. Three-dimensional models of the nanoparticles' core and LS are generated using an ab initio reconstruction method. Characteristic length scales extracted from the models are compared with simulations. We also characterize the evolution of the LS upon thermal annealing, and investigate the LS morphology of mixed-ligand copper and silver nanoparticles as well as gold nanoparticles coated with ternary mixtures. Our results suggest that SANS combined with multiphase modeling is a versatile approach for the characterization of nanoparticles' LS.

  13. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.
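
    The design-based estimate referred to above can be sketched with the textbook simple-random-sampling formulas (standard survey-sampling results, not taken from this paper): the mean density over sampled quadrats and its standard error. The counts below are made up.

```python
import statistics

# Illustrative quadrat counts (mussels per quadrat) from a simple
# random sample of n units.
counts = [0, 0, 1, 0, 3, 0, 0, 2, 0, 0]
n = len(counts)

mean_density = statistics.mean(counts)
# Design-based SE of the mean under SRS (finite population correction
# ignored for simplicity).
se = (statistics.variance(counts) / n) ** 0.5
```

    For sparse, clumped populations like this one, the large SE relative to the mean is exactly why the paper recommends adaptive cluster sampling or model-based inference.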

  14. Agent-based modeling as a tool for program design and evaluation.

    PubMed

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
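
    A deliberately tiny agent-based model of the kind the paper suggests evaluators could build: agents adopt a program behavior with a probability that grows with the current number of adopters. The rule and every parameter below are invented for illustration.

```python
import random

random.seed(42)  # reproducible illustrative run

N, STEPS, INFLUENCE = 100, 20, 0.05
adopted = [False] * N
adopted[0] = True                     # seed a single adopter

for _ in range(STEPS):
    current = sum(adopted)
    for i in range(N):
        # Adoption probability rises with the fraction of adopters seen.
        if not adopted[i] and random.random() < INFLUENCE * current / N:
            adopted[i] = True

final_adopters = sum(adopted)
```

    Sweeping INFLUENCE (or the seeding rule) and re-running is the kind of what-if exercise that supports intervention selection and target setting.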

  15. The Structure of Psychopathology: Toward an Expanded Quantitative Empirical Model

    PubMed Central

    Wright, Aidan G.C.; Krueger, Robert F.; Hobbs, Megan J.; Markon, Kristian E.; Eaton, Nicholas R.; Slade, Tim

    2013-01-01

    There has been substantial recent interest in the development of a quantitative, empirically based model of psychopathology. However, the majority of pertinent research has focused on analyses of diagnoses, as described in current official nosologies. This is a significant limitation because existing diagnostic categories are often heterogeneous. In the current research, we aimed to redress this limitation of the existing literature, and to directly compare the fit of categorical, continuous, and hybrid (i.e., combined categorical and continuous) models of syndromes derived from indicators more fine-grained than diagnoses. We analyzed data from a large representative epidemiologic sample (the 2007 Australian National Survey of Mental Health and Wellbeing; N = 8,841). Continuous models provided the best fit for each syndrome we observed (Distress, Obsessive Compulsivity, Fear, Alcohol Problems, Drug Problems, and Psychotic Experiences). In addition, the best fitting higher-order model of these syndromes grouped them into three broad spectra: Internalizing, Externalizing, and Psychotic Experiences. We discuss these results in terms of future efforts to refine the emerging empirically based, dimensional-spectrum model of psychopathology, and to use the model to frame psychopathology research more broadly. PMID:23067258

  16. The effects of intra-particle concentration gradient on consecutive adsorption-desorption of oryzanol from rice bran oil in packed-column

    NASA Astrophysics Data System (ADS)

    Susanti, Ari Diana; Sediawan, Wahyudi Budi; Wirawan, Sang Kompiang; Budhijanto

    2017-05-01

    Utilization of valuable trace components in agricultural by-products such as rice bran oil is interesting to explore. Among these valuable components, oryzanol, a nutrient beneficial for cardiovascular disease prevention, is the most promising. Literature studies suggest that adsorption-desorption is a prospective method for oryzanol isolation. Design of a commercial-scale adsorption-desorption system for oryzanol needs a quantitative description of the phenomena involved. In this study, a quantitative model of consecutive adsorption-desorption in a packed column has been proposed and verified against experimental data. The proposed model takes into account the intra-particle concentration gradient in the adsorbent particle. In this model, the rate of mass transfer from the bulk of the liquid to the surface of the adsorbent particle, or vice versa, is expressed by film theory. The mass transfer of oryzanol from the liquid in the pores of the particle to the adjacent pore surface is assumed to be instantaneous, so solid-liquid equilibrium on the pore surfaces is always attained. For simplicity, the adsorption equilibrium was modeled with a distribution-coefficient approach. The values of the parameters in the model were obtained by curve fitting to the experimental data. The fitted model works well to quantitatively describe the consecutive adsorption-desorption of oryzanol from rice bran oil in a packed column.
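
    The two ingredients the paragraph names, film-theory mass transfer and instantaneous pore-surface equilibrium via a distribution coefficient K (so the equilibrium loading is q* = K c), can be sketched for a single well-mixed cell. All parameter values are arbitrary assumptions, and a real column model would also track axial convection.

```python
# Illustrative film-theory + distribution-coefficient sketch (not the
# paper's fitted model). Explicit Euler integration of a batch cell.

kLa = 0.5          # film mass-transfer coefficient x area, 1/s (assumed)
K = 2.0            # distribution coefficient q* = K * c (assumed)
c, q = 1.0, 0.0    # bulk oryzanol concentration and adsorbed loading
dt, t_end = 0.01, 20.0

for _ in range(int(t_end / dt)):
    # Driving force: bulk concentration vs. concentration in equilibrium
    # with the current loading (q / K).
    flux = kLa * (c - q / K)
    c -= flux * dt
    q += flux * dt
```

    At long times the cell relaxes to q = K c with total mass conserved, which is the equilibrium the curve-fitting in the paper is anchored to.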

  17. Using concept maps to describe undergraduate students’ mental model in microbiology course

    NASA Astrophysics Data System (ADS)

    Hamdiyati, Y.; Sudargo, F.; Redjeki, S.; Fitriani, A.

    2018-05-01

    The purpose of this research was to describe students' mental models in a mental model based-microbiology course, using concept maps as the assessment tool. Respondents were 5th-semester undergraduate students in Biology Education at Universitas Pendidikan Indonesia. Data were taken on the Bacteria sub-subject. A concept map rubric was developed with a maximum score of 4, and quantitative scores were converted into qualitative mental model levels: emergent = score 1, transitional = score 2, close to extended = score 3, and extended = score 4. The results showed that the mental model level on the Bacteria sub-subject before the implementation of the mental model based-microbiology course was at the transitional level. After implementation, mental models were at the transitional, close to extended, and extended levels. This indicates an increase in the level of students' mental models after the implementation of the course using concept maps as the assessment tool.
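
    The score-to-level conversion described above is a direct lookup; the mapping below transcribes the rubric stated in the abstract.

```python
# Rubric scores 1-4 mapped onto mental model levels, as described.
LEVELS = {
    1: "emergent",
    2: "transitional",
    3: "close to extended",
    4: "extended",
}

def mental_model_level(score):
    return LEVELS[score]

pre_course = mental_model_level(2)    # pre-course result: transitional
```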

  18. Longitudinal Differences in the Low-latitude Ionosphere and in the Ionospheric Variability

    NASA Astrophysics Data System (ADS)

    Goncharenko, L. P.; Zhang, S.; Liu, H.; Tsugawa, T.; Batista, I. S.; Reinisch, B. W.

    2017-12-01

    Analysis of longitudinal differences in ionospheric parameters can illuminate a variety of mechanisms responsible for ionospheric variability. In this study, we aim to 1) quantitatively describe major features of longitudinal differences in peak electron density in the low-latitude ionosphere; 2) examine differences in ionospheric variability at different longitude sectors; and 3) illustrate longitudinal differences in ionospheric response to a large disturbance event, the sudden stratospheric warming of 2016. We examine NmF2 observations by a network of ionosondes in the American (30-80W) and Asian (110-170E) longitudinal sectors. Selected instruments are located in the vicinity of the EIA trough (Jicamarca, Sao Luis, Guam, Kwajalein), the northern and southern crests of the EIA (Boa Vista, Tucuman, Cachoeira Paulista, Okinawa), and beyond the EIA crests (Ramey, Yamagawa, Kokubunji). To examine the main ionospheric features at each location, we use long-term datasets collected at each site to construct empirical models that describe variations in NmF2 as a function of local time, season, solar flux, and geomagnetic activity. This set of empirical models can be used to accurately describe background ionospheric behavior and serve as a set of observational benchmarks for global circulation models. It reveals, for example, higher NmF2 in the EIA trough in the Asian sector as compared to the American sector. Further, we quantitatively describe variability in NmF2 as the difference between local observations and the local empirical model, and find that the American sector's EIA trough has overall higher variability, which maximizes at all local times during wintertime, while Asian sector trough variability does not change significantly with season. Additionally, local empirical models are used to isolate ionospheric features resulting from dynamical disturbances of different origin (e.g., geomagnetic storms, convective activity, sudden stratospheric warming events, etc.). We illustrate this approach with the case of the sudden stratospheric warming of 2016.

  19. Quantitative study of FORC diagrams in thermally corrected Stoner-Wohlfarth nanoparticles systems

    NASA Astrophysics Data System (ADS)

    De Biasi, E.; Curiale, J.; Zysler, R. D.

    2016-12-01

    The use of FORC diagrams is becoming increasingly popular among researchers devoted to magnetism and magnetic materials. However, a thorough interpretation of this kind of diagram, in order to extract quantitative information, requires an appropriate model of the studied system; for that reason most FORC studies are limited to qualitative analysis. In magnetic systems, thermal fluctuations "blur" the signatures of the anisotropy, volume, and particle-interaction distributions, so thermal effects in nanoparticle systems conspire against a proper interpretation and analysis of these diagrams. Motivated by this fact, we have quantitatively studied the degree of accuracy of the information extracted from FORC diagrams for the special case of single-domain, thermally corrected Stoner-Wohlfarth (easy axes along the external field orientation) nanoparticle systems. In this work, the starting point is an analytical model that describes the behavior of a magnetic nanoparticle system as a function of field, anisotropy, temperature, and measurement time. In order to study the quantitative degree of accuracy of our model, we built FORC diagrams for different archetypal cases of magnetic nanoparticles. Our results show that, from the quantitative information obtained from the diagrams, under the hypotheses of the proposed model, it is possible to recover the features of the original system with accuracy above 95%. This accuracy improves at low temperatures, and it is also possible to access the anisotropy distribution directly from the FORC coercive field profile. Indeed, our simulations predict that the volume distribution plays a secondary role, with only its mean value and deviation being important parameters. It is therefore possible to obtain an accurate result for the inversion and interaction fields regardless of the details of the volume distribution.
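
    As background for readers unfamiliar with how a FORC diagram is built (this is the standard definition, not something specific to this paper): the FORC distribution is the mixed second derivative rho = -(1/2) d^2M/(dHa dHb) of the magnetization M measured on a grid of reversal fields Ha and applied fields Hb. The magnetization surface below is synthetic.

```python
import numpy as np

# Synthetic M(Ha, Hb) surface on a reversal-field x applied-field grid.
Ha = np.linspace(-1.0, 0.0, 50)
Hb = np.linspace(0.0, 1.0, 50)
HA, HB = np.meshgrid(Ha, Hb, indexing="ij")
M = np.tanh(5.0 * (HB + HA)) + 0.1 * HA * HB   # toy magnetization

# FORC distribution: mixed second derivative, finite differences.
dM_dHb = np.gradient(M, Hb, axis=1)
rho = -0.5 * np.gradient(dM_dHb, Ha, axis=0)
```

    Plotting rho against the coercive and interaction field coordinates (Hc = (Hb - Ha)/2, Hu = (Hb + Ha)/2) gives the familiar FORC diagram.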

  20. The Separation and Quantitation of Peptides with and without Oxidation of Methionine and Deamidation of Asparagine Using Hydrophilic Interaction Liquid Chromatography with Mass Spectrometry (HILIC-MS)

    NASA Astrophysics Data System (ADS)

    Badgett, Majors J.; Boyes, Barry; Orlando, Ron

    2017-05-01

    Peptides with deamidated asparagine residues and oxidized methionine residues are often not resolved sufficiently to allow quantitation of their native and modified forms using reversed phase (RP) chromatography. The accurate quantitation of these modifications is vital in protein biotherapeutic analysis because they can affect a protein's function, activity, and stability. We demonstrate here that hydrophilic interaction liquid chromatography (HILIC) adequately and predictably separates peptides with these modifications from their native counterparts. Furthermore, coefficients describing the extent of the hydrophilicity of these modifications have been derived and were incorporated into a previously made peptide retention prediction model that is capable of predicting the retention times of peptides with and without these modifications.

  1. The Separation and Quantitation of Peptides with and without Oxidation of Methionine and Deamidation of Asparagine Using Hydrophilic Interaction Liquid Chromatography with Mass Spectrometry (HILIC-MS).

    PubMed

    Badgett, Majors J; Boyes, Barry; Orlando, Ron

    2017-05-01

    Peptides with deamidated asparagine residues and oxidized methionine residues are often not resolved sufficiently to allow quantitation of their native and modified forms using reversed phase (RP) chromatography. The accurate quantitation of these modifications is vital in protein biotherapeutic analysis because they can affect a protein's function, activity, and stability. We demonstrate here that hydrophilic interaction liquid chromatography (HILIC) adequately and predictably separates peptides with these modifications from their native counterparts. Furthermore, coefficients describing the extent of the hydrophilicity of these modifications have been derived and were incorporated into a previously made peptide retention prediction model that is capable of predicting the retention times of peptides with and without these modifications.

  2. Evaluation of a quantitative structure-property relationship (QSPR) for predicting mid-visible refractive index of secondary organic aerosol (SOA).

    PubMed

    Redmond, Haley; Thompson, Jonathan E

    2011-04-21

    In this work we describe and evaluate a simple scheme by which the refractive index (λ = 589 nm) of non-absorbing components common to secondary organic aerosols (SOA) may be predicted from molecular formula and density (g cm⁻³). The QSPR approach described is based on three parameters linked to refractive index: molecular polarizability, the ratio of mass density to molecular weight, and degree of unsaturation. After computing these quantities for a training set of 111 compounds common to atmospheric aerosols, multi-linear regression analysis was conducted to establish a quantitative relationship between the parameters and the accepted value of refractive index. The resulting relationship can often estimate refractive index to ±0.01 when averaged across a variety of compound classes; a notable exception is alcohols, for which the model consistently underestimates refractive index. Homogeneous internal mixtures can conceivably be addressed through either the volume- or mole-fraction mixing rules commonly used in the aerosol community. Predicted refractive indices reconstructed from chemical composition data in the literature generally agree with previous reports of SOA refractive index. Additionally, the predicted refractive indices lie near values we measured at λ = 532 nm for SOA generated from vapors of α-pinene (R.I. 1.49-1.51) and toluene (R.I. 1.49-1.50). We envision the QSPR method may find use in reconstructing the optical scattering of organic aerosols when mass composition data are known. Alternatively, the method could be incorporated into models of organic aerosol formation/phase partitioning to better constrain organic aerosol optical properties.
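
    The fitting step described above reduces to ordinary least squares on three descriptors. A hedged sketch with synthetic data follows; the coefficients and compounds are stand-ins, not the paper's 111-compound training set.

```python
import numpy as np

# Three illustrative descriptors per compound: a molecular-polarizability
# proxy, the density/molecular-weight ratio, and degree of unsaturation.
rng = np.random.default_rng(0)
X = rng.uniform(size=(30, 3))
true_beta = np.array([0.20, 0.35, 0.05])   # made-up "QSPR" coefficients
n_obs = 1.33 + X @ true_beta               # synthetic refractive indices

A = np.column_stack([np.ones(len(X)), X])         # design matrix with intercept
beta, *_ = np.linalg.lstsq(A, n_obs, rcond=None)  # multi-linear regression
n_pred = A @ beta
```

    On noise-free synthetic data the fit recovers the generating coefficients exactly; with real training data one would instead examine residuals per compound class, as the abstract does for alcohols.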

  3. Ranking and validation of the spallation models for description of intermediate mass fragment emission from p + Ag collisions at 480 MeV incident proton beam energy

    NASA Astrophysics Data System (ADS)

    Sharma, Sushil K.; Kamys, Bogusław; Goldenbaum, Frank; Filges, Detlef

    2016-06-01

    Double-differential cross-sections d²σ/dΩdE for isotopically identified intermediate mass fragments (⁶Li up to ²⁷Mg) from nuclear reactions induced by 480 MeV protons impinging on a silver target were analyzed in the framework of a two-step model. The first step of the reaction was described by the intranuclear cascade model INCL4.6 and the second by four different models (ABLA07, GEM2, GEMINI++, and SMM). The experimental spectra reveal both low-energy, isotropic and high-energy, forward-peaked contributions. The INCL4.6 model can describe the latter contribution for light intermediate mass fragments through coalescence of the emitted nucleons; qualitative agreement of the model predictions with the data was observed, but the high-energy tails of the spectra were significantly overestimated. The shape of the isotropic part of the spectra was reproduced by all four models. The GEM2 model strongly underestimated the cross-sections for heavier IMFs, whereas the SMM and ABLA07 models generally overestimated the data. The best quantitative description of the data was offered by GEMINI++; however, a discrepancy between the data and the model cross-sections remained for almost all reaction products, especially at forward angles. This indicates the presence of non-equilibrium processes that cannot be reproduced by the applied models. The goodness of the description was judged quantitatively using two statistical deviation factors, the H-factor and the M-factor, as tools for ranking and validating the theoretical models.

  4. An Inside View: The Utility of Quantitative Observation in Understanding College Educational Experiences

    ERIC Educational Resources Information Center

    Campbell, Corbin M.

    2017-01-01

    This article describes quantitative observation as a method for understanding college educational experiences. Quantitative observation has been used widely in several fields and in K-12 education, but has had limited application to research in higher education and student affairs to date. The article describes the central tenets of quantitative…

  5. A General Model for Estimating Macroevolutionary Landscapes.

    PubMed

    Boucher, Florian C; Démery, Vincent; Conti, Elena; Harmon, Luke J; Uyeda, Josef

    2018-03-01

    The evolution of quantitative characters over long timescales is often studied using stochastic diffusion models. The current toolbox available to students of macroevolution is however limited to two main models: Brownian motion and the Ornstein-Uhlenbeck process, plus some of their extensions. Here, we present a very general model for inferring the dynamics of quantitative characters evolving under both random diffusion and deterministic forces of any possible shape and strength, which can accommodate interesting evolutionary scenarios like directional trends, disruptive selection, or macroevolutionary landscapes with multiple peaks. This model is based on a general partial differential equation widely used in statistical mechanics: the Fokker-Planck equation, also known in population genetics as the Kolmogorov forward equation. We thus call the model FPK, for Fokker-Planck-Kolmogorov. We first explain how this model can be used to describe macroevolutionary landscapes over which quantitative traits evolve and, more importantly, we detail how it can be fitted to empirical data. Using simulations, we show that the model has good behavior both in terms of discrimination from alternative models and in terms of parameter inference. We provide R code to fit the model to empirical data using either maximum-likelihood or Bayesian estimation, and illustrate the use of this code with two empirical examples of body mass evolution in mammals. FPK should greatly expand the set of macroevolutionary scenarios that can be studied since it opens the way to estimating macroevolutionary landscapes of any conceivable shape. [Adaptation; bounds; diffusion; FPK model; macroevolution; maximum-likelihood estimation; MCMC methods; phylogenetic comparative data; selection.].
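
    The key Fokker-Planck fact behind FPK is that a diffusion dX = -U'(X)dt + σdW has stationary density proportional to exp(-2U(x)/σ²). A minimal numerical sketch with an illustrative two-peak landscape (not a fitted macroevolutionary model):

```python
import numpy as np

# Stationary density of dX = -U'(X) dt + sigma dW on a trait grid:
# p(x) is proportional to exp(-2 U(x) / sigma^2).
x = np.linspace(-3.0, 3.0, 601)
U = (x ** 2 - 1.0) ** 2        # illustrative landscape: peaks at x = +/- 1
sigma = 0.8
p = np.exp(-2.0 * U / sigma ** 2)
dx = x[1] - x[0]
p /= p.sum() * dx              # normalize to a probability density
```

    The resulting bimodal p(x) is the kind of multi-peak macroevolutionary landscape the abstract refers to; fitting FPK to comparative data amounts to estimating the shape of U.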

  6. Rheological properties of aging thermosensitive suspensions.

    PubMed

    Purnomo, Eko H; van den Ende, Dirk; Mellema, Jorrit; Mugele, Frieder

    2007-08-01

    Aging observed in soft glassy materials inherently affects the rheological properties of these systems and has been described by the soft glassy rheology (SGR) model [S. M. Fielding, J. Rheol. 44, 323 (2000)]. In this paper, we report the measured linear rheological behavior of thermosensitive microgel suspensions and compare it quantitatively with the predictions of the SGR model. The dynamic moduli [G'(ω,t) and G''(ω,t)] obtained from oscillatory measurements are in good agreement with the model. The model also quantitatively predicts the creep compliance J(t - t(w),t(w)), obtained from step stress experiments, for the short-time regime [(t - t(w)) < t(w)]. The relative effective temperature X/X(g) obtained from both the oscillatory and the step stress experiments is indeed less than 1 (X/X(g) < 1), in agreement with the definition of aging. Moreover, the elasticity of the compressed particles (G(p)) increases with increasing compression, i.e., with the degree of hindrance, and consequently the bulk elasticity (G' and 1/J) also increases with the degree of compression.

  7. Rheological properties of aging thermosensitive suspensions

    NASA Astrophysics Data System (ADS)

    Purnomo, Eko H.; van den Ende, Dirk; Mellema, Jorrit; Mugele, Frieder

    2007-08-01

    Aging observed in soft glassy materials inherently affects the rheological properties of these systems and has been described by the soft glassy rheology (SGR) model [S. M. Fielding, J. Rheol. 44, 323 (2000)]. In this paper, we report the measured linear rheological behavior of thermosensitive microgel suspensions and compare it quantitatively with the predictions of the SGR model. The dynamic moduli [G'(ω,t) and G″(ω,t)] obtained from oscillatory measurements are in good agreement with the model. The model also quantitatively predicts the creep compliance J(t-tw,tw), obtained from step stress experiments, for the short time regime [(t-tw) < tw].

  8. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    NASA Astrophysics Data System (ADS)

    Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars

    2013-08-01

    It has been widely claimed that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. It is therefore necessary to introduce a framework to measure to what extent power system operation will be changed by the various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DER deployment and distribution grid operation. To validate the framework, the authors describe reference models for different categories of DERs with their unique characteristics, comprising distributed generation, active demand, and electric vehicles. Quantitative analysis was then carried out on the basis of current and envisioned DER deployment scenarios proposed for Sweden. Simulations were performed in two typical distribution network models for four seasons. The results show that, in general, DER deployment makes it possible to reduce power losses and voltage drops by compensating power from local generation and optimizing local load profiles.

  9. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    PubMed

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial for deriving mechanistic models and key to identifying and describing mutant phenotypes. Here, protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy, together with methods for tracking and quantifying developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, or tracking of vesicle flow.

  10. Investigating the quality of mental models deployed by undergraduate engineering students in creating explanations: The case of thermally activated phenomena

    NASA Astrophysics Data System (ADS)

    Fazio, Claudio; Battaglia, Onofrio Rosario; Di Paola, Benedetto

    2013-12-01

    This paper describes a method aimed at pointing out the quality of the mental models undergraduate engineering students deploy when asked to create explanations for phenomena or processes and/or use a given model in the same context. Student responses to a specially designed written questionnaire are quantitatively analyzed using researcher-generated categories of reasoning, based on the physics education research literature on student understanding of the relevant physics content. The use of statistical implicative analysis tools allows us to successfully identify clusters of students with respect to the similarity to the reasoning categories, defined as “practical or everyday,” “descriptive,” or “explicative.” Through the use of similarity and implication indexes our method also enables us to study the consistency in students’ deployment of mental models. A qualitative analysis of interviews conducted with students after they had completed the questionnaire is used to clarify some aspects which emerged from the quantitative analysis and validate the results obtained. Some implications of this joint use of quantitative and qualitative analysis for the design of a learning environment focused on the understanding of some aspects of the world at the level of causation and mechanisms of functioning are discussed.

  11. Propagating mass accretion rate fluctuations in black hole X-ray binaries: quantitative tests

    NASA Astrophysics Data System (ADS)

    Rapisarda, S.; Ingram, A.; van der Klis, M.

    2017-10-01

    Over the past 20 years, a consistent phenomenology has been established to describe the variability properties of Black Hole X-ray Binaries (BHBs). However, the physics behind the observational data is still poorly understood. The recently proposed model PROPFLUC assumes a truncated disc/hot inner flow geometry, with mass accretion rate fluctuations propagating through a precessing inner flow. These two processes give rise to broadband variability and the QPO, respectively. Because of propagation, the emission from different regions of the disc/hot flow geometry is correlated. In our study we applied PROPFLUC to different BHBs (including XTE J1550-564 and Cygnus X-1) in different spectral states, jointly fitting the power spectra in two energy bands and the cross-spectrum between the two bands. This is the first study to fit a physical model quantitatively and simultaneously to observed power and cross-spectra. For XTE J1550-564, which displays a strong QPO, we found quantitative and qualitative discrepancies between model predictions and data, whereas we found a good fit for the Cygnus X-1 data, which do not display a QPO. We conclude that the discrepancies are generic to the propagating fluctuations paradigm and may be related to the mechanism that gives rise to the QPO.

  12. A quantitative speciation model for the adsorption of organic pollutants on activated carbon.

    PubMed

    Grivé, M; García, D; Domènech, C; Richard, L; Rojo, I; Martínez, X; Rovira, M

    2013-01-01

    Granular activated carbon (GAC) is commonly used as adsorbent in water treatment plants given its high capacity for retaining organic pollutants in aqueous phase. The current knowledge on GAC behaviour is essentially empirical, and no quantitative description of the chemical relationships between GAC surface groups and pollutants has been proposed. In this paper, we describe a quantitative model for the adsorption of atrazine onto GAC surface. The model is based on results of potentiometric titrations and three types of adsorption experiments which have been carried out in order to determine the nature and distribution of the functional groups on the GAC surface, and evaluate the adsorption characteristics of GAC towards atrazine. Potentiometric titrations have indicated the existence of at least two different families of chemical groups on the GAC surface, including phenolic- and benzoic-type surface groups. Adsorption experiments with atrazine have been satisfactorily modelled with the geochemical code PhreeqC, assuming that atrazine is sorbed onto the GAC surface in equilibrium (log Ks = 5.1 ± 0.5). Independent thermodynamic calculations suggest a possible adsorption of atrazine on a benzoic derivative. The present work opens a new approach for improving the adsorption capabilities of GAC towards organic pollutants by modifying its chemical properties.
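
    The fitted equilibrium (log Ks ≈ 5.1) implies a 1:1 surface-complexation mass balance that can be solved in closed form. The sketch below is a generic speciation calculation under that assumption, not the authors' PhreeqC setup, and the site and atrazine totals are illustrative.

```python
import numpy as np

def sorbed_fraction(A_T, S_T, logK):
    """Fraction of total sorbate A_T bound for the 1:1 equilibrium
    S + A <-> SA with equilibrium constant K = 10**logK.
    Mass balances: S_T = [S] + [SA], A_T = [A] + [SA]."""
    K = 10.0 ** logK
    # K*(S_T - x)*(A_T - x) = x  ->  quadratic in x = [SA]
    a = K
    b = -(K * (S_T + A_T) + 1.0)
    c = K * S_T * A_T
    x = (-b - np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)  # physical (smaller) root
    return x / A_T

# Illustrative totals: 1 mM surface sites, 1 uM atrazine, logK = 5.1
frac = sorbed_fraction(1e-6, 1e-3, 5.1)
```

    With sites in large excess this reduces to the familiar K·S_T/(1 + K·S_T) bound fraction; the full quadratic also covers the site-limited regime.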

  13. The Interrelationship between Promoter Strength, Gene Expression, and Growth Rate

    PubMed Central

    Klesmith, Justin R.; Detwiler, Emily E.; Tomek, Kyle J.; Whitehead, Timothy A.

    2014-01-01

    In exponentially growing bacteria, expression of heterologous protein impedes cellular growth rates. A quantitative understanding of the relationship between expression and growth rate will advance our ability to forward engineer bacteria, which is important for metabolic engineering and synthetic biology applications. A recent study described a scaling model based on optimal allocation of ribosomes for protein translation. This model quantitatively predicts a linear relationship between microbial growth rate and heterologous protein expression with no free parameters. To validate this model, we rigorously quantified the fitness cost of gene expression by using a library of synthetic constitutive promoters to drive expression of two separate proteins (eGFP and amiE) in E. coli in different strains and growth media. In all cases, we demonstrate that the fitness cost is consistent with the previous findings. We expand upon the previous theory by introducing a simple promoter activity model to quantitatively predict how basal promoter strength relates to growth rate and protein expression. We then estimate the amount of protein expression needed to support high flux through a heterologous metabolic pathway and predict the sizable fitness cost associated with enzyme production. This work has broad implications across the applied biological sciences because it allows prediction of the interplay between promoter strength, protein expression, and the resulting cost to microbial growth rates. PMID:25286161

  14. An engineering approach to modelling, decision support and control for sustainable systems.

    PubMed

    Day, W; Audsley, E; Frost, A R

    2008-02-12

    Engineering research and development contributes to the advance of sustainable agriculture both through innovative methods to manage and control processes, and through quantitative understanding of the operation of practical agricultural systems using decision models. This paper describes how an engineering approach, drawing on mathematical models of systems and processes, contributes new methods that support decision making at all levels, from strategy and planning to tactics and real-time control. The ability to describe the system or process by a simple and robust mathematical model is critical, and the outputs range from guidance to policy makers on strategic decisions relating to land use, through intelligent decision support to farmers, to real-time engineering control of specific processes. Precision in decision making leads to decreased use of inputs, fewer environmental emissions, and enhanced profitability, all essential to sustainable systems.

  15. The absorption and first-pass metabolism of [14C]-1,3-dinitrobenzene in the isolated vascularly perfused rat small intestine.

    PubMed

    Adams, P C; Rickert, D E

    1996-11-01

    We tested the hypothesis that the small intestine is capable of first-pass, reductive metabolism of xenobiotics. A simplified version of the isolated vascularly perfused rat small intestine was developed to test this hypothesis with 1,3-dinitrobenzene (1,3-DNB) as a model xenobiotic. Both 3-nitroaniline (3-NA) and 3-nitroacetanilide (3-NAA) were formed and absorbed following intralumenal doses of 1,3-DNB (1.8 or 4.2 μmol) to the isolated vascularly perfused rat small intestine. Dose, fasting, or antibiotic pretreatment had no effect on the absorption and metabolism of 1,3-DNB in this model system. The failure of antibiotic pretreatment to alter the metabolism of 1,3-DNB indicated that 1,3-DNB metabolism was mammalian rather than microfloral in origin. All data from experiments initiated with lumenal 1,3-DNB were fit to a pharmacokinetic model (model A). ANOVA revealed that dose, fasting, or antibiotic pretreatment had no statistically significant effect on the model-dependent parameters. 3-NA (1.5 μmol) was administered to the lumen of the isolated vascularly perfused rat small intestine to evaluate model A predictions for the absorption and metabolism of this metabolite. All data from experiments initiated with 3-NA were fit to a pharmacokinetic model (model B). Comparison of corresponding model-dependent pharmacokinetic parameters (i.e., those parameters which describe the same processes in models A and B) revealed quantitative differences. Evidence for significant quantitative differences in the pharmacokinetics or metabolism of formed versus preformed 3-NA in rat small intestine may require better definition of the rate constants used to describe tissue and lumenal processes, or identification and incorporation of the remaining unidentified metabolites into the models.
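
    As a hedged illustration of the kind of first-order kinetic scheme models A and B formalize (the rate constants below are hypothetical, not the fitted parameters), a parent compound cleared at first order produces a metabolite that follows the classical Bateman solution:

```python
import numpy as np

def bateman(t, A0, k1, k2):
    """Metabolite amount when a parent (initial amount A0) converts at
    rate k1 and the metabolite is eliminated at rate k2 (k1 != k2)."""
    return A0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

t = np.linspace(0.0, 10.0, 201)         # time, arbitrary units
parent = 1.8 * np.exp(-0.5 * t)         # lumenal parent, e.g. a 1.8 umol dose
metab = bateman(t, 1.8, 0.5, 1.2)       # formed metabolite (a 3-NA analogue)
```

    Comparing such a fit for formed versus preformed metabolite is exactly where the quantitative differences between models A and B would appear.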

  16. Density matrix modeling of quantum cascade lasers without an artificially localized basis: A generalized scattering approach

    NASA Astrophysics Data System (ADS)

    Pan, Andrew; Burnett, Benjamin A.; Chui, Chi On; Williams, Benjamin S.

    2017-08-01

    We derive a density matrix (DM) theory for quantum cascade lasers (QCLs) that describes the influence of scattering on coherences through a generalized scattering superoperator. The theory enables quantitative modeling of QCLs, including localization and tunneling effects, using the well-defined energy eigenstates rather than the ad hoc localized basis states required by most previous DM models. Our microscopic approach to scattering also eliminates the need for phenomenological transition or dephasing rates. We discuss the physical interpretation and numerical implementation of the theory, presenting sets of both energy-resolved and thermally averaged equations, which can be used for detailed or compact device modeling. We illustrate the theory's applications by simulating a high performance resonant-phonon terahertz (THz) QCL design, which cannot be easily or accurately modeled using conventional DM methods. We show that the theory's inclusion of coherences is crucial for describing localization and tunneling effects consistent with experiment.

  17. Proactive Encouragement of Interdisciplinary Research Teams in a Business School Environment: Strategy and Results

    ERIC Educational Resources Information Center

    Adams, Susan M.; Carter, Nathan C.; Hadlock, Charles R.; Haughton, Dominique M.; Sirbu, George

    2008-01-01

    This case study describes efforts to promote collaborative research across traditional boundaries in a business-oriented university as part of an institutional transformation. We model this activity within the framework of social network analysis and use quantitative tools from that field to characterize resulting impacts. (Contains 4 tables and 2…

  18. Modeling the Effect of Nail Corrosion on the Lateral Strength of Joints

    Treesearch

    Samuel L. Zelinka; Douglas R. Rammer

    2012-01-01

    This article describes a theoretical method of linking fastener corrosion in wood connections to potential reduction in lateral shear strength. It builds upon published quantitative data of corrosion rates of metals in contact with treated wood for several different wood preservatives. These corrosion rates are then combined with yield theory equations to calculate a...

  19. The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge

    ERIC Educational Resources Information Center

    Rice, Amber H.; Kitchel, Tracy

    2015-01-01

    The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…

  20. Neural Mechanisms of Attention

    DTIC Science & Technology

    1993-05-21

    [Fragmentary indexing excerpt; recoverable table-of-contents entries: The Element Superiority Effect: Attention?; Animal Models of Attention Deficit; Conditioned Attention Theory. Recoverable text fragments: "…fails to obtain the necessary quantitative information about the effects of parametric manipulations on the dissociation…"; "…neuroscience endeavor as described here. If simultaneously psychologists ignore the brain and neuroscientists ignore the mind, no effective translation…"]

  1. Forage resource evaluation system for habitat—deer: an interactive deer habitat model

    Treesearch

    Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris

    2012-01-01

    We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...

  2. A High School Intensive Summer Mandarin Course: Program Model and Learner Outcomes

    ERIC Educational Resources Information Center

    Xu, Xiaoqiu; Padilla, Amado M.; Silva, Duarte; Masuda, Norman

    2012-01-01

    This article describes a STARTALK intensive summer high school Mandarin language and culture program that was conducted for three summers. Participants across the three years included 40 Mandarin Level II and 53 Mandarin Level III high school students. Quantitative and qualitative data are presented to show the effectiveness of the program.…

  3. Trap-assisted tunneling in InGaN/GaN single-quantum-well light-emitting diodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auf der Maur, M., E-mail: auf.der.maur@ing.uniroma2.it; Di Carlo, A.; Galler, B.

    Based on numerical simulation and comparison with measured current characteristics, we show that the current in InGaN/GaN single-quantum-well light-emitting diodes at low forward bias can be accurately described by a standard trap-assisted tunneling model. The qualitative and quantitative differences in the current characteristics of devices with different emission wavelengths are demonstrated to be correlated in a physically consistent way with the tunneling model parameters.

  4. Observing Clonal Dynamics across Spatiotemporal Axes: A Prelude to Quantitative Fitness Models for Cancer.

    PubMed

    McPherson, Andrew W; Chan, Fong Chun; Shah, Sohrab P

    2018-02-01

    The ability to accurately model evolutionary dynamics in cancer would allow for prediction of progression and response to therapy. As a prelude to quantitative understanding of evolutionary dynamics, researchers must gather observations of in vivo tumor evolution. High-throughput genome sequencing now provides the means to profile the mutational content of evolving tumor clones from patient biopsies. Together with the development of models of tumor evolution, reconstructing evolutionary histories of individual tumors generates hypotheses about the dynamics of evolution that produced the observed clones. In this review, we provide a brief overview of the concepts involved in predicting evolutionary histories, and provide a workflow based on bulk and targeted-genome sequencing. We then describe the application of this workflow to time series data obtained for transformed and progressed follicular lymphomas (FL), and contrast the observed evolutionary dynamics between these two subtypes. We next describe results from a spatial sampling study of high-grade serous (HGS) ovarian cancer, propose mechanisms of disease spread based on the observed clonal mixtures, and provide examples of diversification through subclonal acquisition of driver mutations and convergent evolution. Finally, we state implications of the techniques discussed in this review as a necessary but insufficient step on the path to predictive modelling of disease dynamics. Copyright © 2018 Cold Spring Harbor Laboratory Press; all rights reserved.

  5. Optimising the combination dosing strategy of abemaciclib and vemurafenib in BRAF-mutated melanoma xenograft tumours

    PubMed Central

    Tate, Sonya C; Burke, Teresa F; Hartman, Daisy; Kulanthaivel, Palaniappan; Beckmann, Richard P; Cronier, Damien M

    2016-01-01

    Background: Resistance to BRAF inhibition is a major cause of treatment failure for BRAF-mutated metastatic melanoma patients. Abemaciclib, a cyclin-dependent kinase 4 and 6 inhibitor, overcomes this resistance in xenograft tumours and offers a promising drug combination. The present work aims to characterise the quantitative pharmacology of the abemaciclib/vemurafenib combination using a semimechanistic pharmacokinetic/pharmacodynamic modelling approach and to identify an optimum dosing regimen for potential clinical evaluation. Methods: A PK/biomarker model was developed to connect abemaciclib/vemurafenib concentrations to changes in MAPK and cell cycle pathway biomarkers in A375 BRAF-mutated melanoma xenografts. Resultant tumour growth inhibition was described by relating (i) MAPK pathway inhibition to apoptosis, (ii) mitotic cell density to tumour growth and, under resistant conditions, (iii) retinoblastoma protein inhibition to cell survival. Results: The model successfully described vemurafenib/abemaciclib-mediated changes in MAPK pathway and cell cycle biomarkers. Initial tumour shrinkage by vemurafenib, acquisition of resistance and subsequent abemaciclib-mediated efficacy were successfully captured and externally validated. Model simulations illustrate the benefit of intermittent vemurafenib therapy over continuous treatment, and indicate that continuous abemaciclib in combination with intermittent vemurafenib offers the potential for considerable tumour regression. Conclusions: The quantitative pharmacology of the abemaciclib/vemurafenib combination was successfully characterised and an optimised, clinically-relevant dosing strategy was identified. PMID:26978007

  6. Quantitative diffusion and swelling kinetic measurements using large-angle interferometric refractometry.

    PubMed

    Saunders, John E; Chen, Hao; Brauer, Chris; Clayton, McGregor; Chen, Weijian; Barnes, Jack A; Loock, Hans-Peter

    2015-12-07

    The uptake and release of sorbates into films and coatings is typically accompanied by changes in the films' refractive index and thickness. We provide a comprehensive model to calculate the concentration of the sorbate from the average refractive index and the film thickness, and validate the model experimentally. The mass fraction of the analyte partitioned into a film is described quantitatively by the Lorentz-Lorenz equation and the Clausius-Mossotti equation. To validate the model, the uptake kinetics of water and other solvents into SU-8 films (d = 40-45 μm) were explored. Large-angle interferometric refractometry measurements can be used to characterize films between 15 μm and 150 μm thick, and Fourier analysis is used to determine independently the thickness, the average refractive index, and the refractive index at the film-substrate interface at one-second intervals. From these values the mass fraction of water in SU-8 was calculated. The kinetics were best described by two independent uptake processes with different rates. Each process followed one-dimensional Fickian diffusion kinetics, with diffusion coefficients for water into SU-8 photoresist film of 5.67 × 10⁻⁹ cm² s⁻¹ and 61.2 × 10⁻⁹ cm² s⁻¹.
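    The two ingredients of this analysis, inverting the measured average index to a sorbate fraction via Lorentz-Lorenz mixing and fitting one-dimensional Fickian uptake, can be sketched as follows. The Lorentz-Lorenz mixing rule and Crank's plane-sheet series are standard results; the refractive indices in the example call are assumed round numbers, while the diffusion coefficient and film thickness are taken from the abstract.

```python
import math

def lorentz_lorenz(n):
    """Lorentz-Lorenz function (n^2 - 1)/(n^2 + 2)."""
    return (n**2 - 1.0) / (n**2 + 2.0)

def sorbate_volume_fraction(n_avg, n_film, n_sorbate):
    """Volume fraction of sorbate inferred from the measured average index,
    assuming the Lorentz-Lorenz function mixes linearly by volume."""
    return (lorentz_lorenz(n_avg) - lorentz_lorenz(n_film)) / \
           (lorentz_lorenz(n_sorbate) - lorentz_lorenz(n_film))

def fickian_uptake(t, D, thickness, terms=50):
    """Fractional mass uptake M(t)/M_inf for 1D Fickian diffusion into a
    plane sheet (Crank's series solution); t in s, D in cm^2/s, thickness in cm."""
    s = 0.0
    for m in range(terms):
        k = 2 * m + 1
        s += 8.0 / (k**2 * math.pi**2) * math.exp(
            -D * k**2 * math.pi**2 * t / thickness**2)
    return 1.0 - s

# Illustrative indices: dry SU-8 ~1.59, water 1.333, measured average 1.58.
phi = sorbate_volume_fraction(1.58, 1.59, 1.333)
# Slower of the two reported processes, 42.5 um film (4.25e-3 cm).
u = fickian_uptake(1000.0, 5.67e-9, 4.25e-3)
```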

  7. Quantitative modeling of multiscale neural activity

    NASA Astrophysics Data System (ADS)

    Robinson, Peter A.; Rennie, Christopher J.

    2007-01-01

    The electrical activity of the brain has been observed for over a century and is widely used to probe brain function and disorders, chiefly through the electroencephalogram (EEG) recorded by electrodes on the scalp. However, the connections between physiology and EEGs have been chiefly qualitative until recently, and most uses of the EEG have been based on phenomenological correlations. A quantitative mean-field model of brain electrical activity is described that spans the range of physiological and anatomical scales from microscopic synapses to the whole brain. Its parameters measure quantities such as synaptic strengths, signal delays, cellular time constants, and neural ranges, and are all constrained by independent physiological measurements. Application of standard techniques from wave physics allows successful predictions to be made of a wide range of EEG phenomena, including time series and spectra, evoked responses to stimuli, dependence on arousal state, seizure dynamics, and relationships to functional magnetic resonance imaging (fMRI). Fitting to experimental data also enables physiological parameters to be inferred, giving a new noninvasive window into brain function, especially when referenced to a standardized database of subjects. Modifications of the core model to treat mm-scale patchy interconnections in the visual cortex are also described, and it is shown that the resulting waves obey the Schrödinger equation. This opens the possibility of classical cortical analogs of quantum phenomena.

  8. A quantitative dynamic systems model of health-related quality of life among older adults

    PubMed Central

    Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela

    2015-01-01

    Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722

  9. Continuous-time system identification of a smoking cessation intervention

    NASA Astrophysics Data System (ADS)

    Timms, Kevin P.; Rivera, Daniel E.; Collins, Linda M.; Piper, Megan E.

    2014-07-01

    Cigarette smoking is a major global public health issue and the leading cause of preventable death in the United States. Toward a goal of designing better smoking cessation treatments, system identification techniques are applied to intervention data to describe smoking cessation as a process of behaviour change. System identification problems that draw from two modelling paradigms in quantitative psychology (statistical mediation and self-regulation) are considered, consisting of a series of continuous-time estimation problems. A continuous-time dynamic modelling approach is employed to describe the response of craving and smoking rates during a quit attempt, as captured in data from a smoking cessation clinical trial. The use of continuous-time models provides benefits of parsimony, ease of interpretation, and the opportunity to work with uneven or missing data.
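    The simplest continuous-time building block in this kind of identification is a first-order transfer function, whose step response is parsimonious and directly interpretable (a gain and a time constant). The function below is a generic sketch of that building block, not the specific mediation or self-regulation models estimated in the paper.

```python
import math

def first_order_response(t, K, tau, u=1.0):
    """Step response of the first-order continuous-time model
    tau * dy/dt + y = K * u, with y(0) = 0. K is the steady-state gain,
    tau the time constant; at t = tau the response reaches ~63.2% of K*u."""
    return K * u * (1.0 - math.exp(-t / tau))

# Hypothetical craving response to a quit-day step, in arbitrary units.
y_at_tau = first_order_response(2.0, K=1.0, tau=2.0)
```

    Identification then amounts to choosing K and tau (and, in richer models, extra dynamic terms) so that the simulated response matches the unevenly sampled trial data.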

  10. Antineutrino Charged-Current Reactions on Hydrocarbon with Low Momentum Transfer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gran, R.; Betancourt, M.; Elkins, M.

    We report on multi-nucleon effects in low momentum transfer (< 0.8 GeV/c) anti-neutrino interactions on scintillator. These data are from the 2010-11 anti-neutrino phase of the MINERvA experiment at Fermilab. The hadronic energy spectrum of this inclusive sample is well-described when a screening effect at low energy transfer and a two-nucleon knockout process are added to a relativistic Fermi gas model of quasi-elastic, Δ resonance, and higher resonance processes. In this analysis, model elements introduced to describe previously published neutrino results have quantitatively similar benefits for this anti-neutrino sample. We present the results as a double-differential cross section to accelerate investigation of alternate models for anti-neutrino scattering off nuclei.

  14. Quantitative risk stratification in Markov chains with limiting conditional distributions.

    PubMed

    Chan, David C; Pollett, Philip K; Weinstein, Milton C

    2009-01-01

    Many clinical decisions require patient risk stratification. The authors introduce the concept of limiting conditional distributions, which describe the equilibrium proportion of surviving patients occupying each disease state in a Markov chain with death. Such distributions can quantitatively describe risk stratification. The authors first establish conditions for the existence of a positive limiting conditional distribution in a general Markov chain and describe a framework for risk stratification using the limiting conditional distribution. They then apply their framework to a clinical example of a treatment indicated for high-risk patients, first to infer the risk of patients selected for treatment in clinical trials and then to predict the outcomes of expanding treatment to other populations of risk. For the general chain, a positive limiting conditional distribution exists only if patients in the earliest state have the lowest combined risk of progression or death. The authors show that in their general framework, outcomes and population risk are interchangeable. For the clinical example, they estimate that previous clinical trials have selected the upper quintile of patient risk for this treatment, but they also show that expanded treatment would weakly dominate this degree of targeted treatment, and universal treatment may be cost-effective. Limiting conditional distributions exist in most Markov models of progressive diseases and are well suited to represent risk stratification quantitatively. This framework can characterize patient risk in clinical trials and predict outcomes for other populations of risk.
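    The limiting conditional distribution described here (also known as a quasi-stationary distribution) can be computed by iterating the substochastic transition matrix over the surviving (non-death) states and renormalizing at each step. The three-state matrix below is illustrative, not the clinical example from the paper; each row sums to less than one, with the remainder being the per-cycle death probability.

```python
import numpy as np

# Transition probabilities among three surviving disease states
# (mild, moderate, severe); row deficits from 1 are death probabilities.
# Numbers are illustrative, not the paper's clinical example.
Q = np.array([[0.90, 0.07, 0.02],
              [0.05, 0.85, 0.07],
              [0.00, 0.04, 0.86]])

def limiting_conditional(Q, tol=1e-12, max_iter=100_000):
    """Limiting conditional (quasi-stationary) distribution: the equilibrium
    proportion of surviving patients in each state, obtained by propagating
    the chain and conditioning on survival (renormalizing) each cycle."""
    p = np.full(Q.shape[0], 1.0 / Q.shape[0])
    for _ in range(max_iter):
        p_next = p @ Q
        p_next /= p_next.sum()          # condition on survival
        if np.abs(p_next - p).max() < tol:
            return p_next
        p = p_next
    return p

pi = limiting_conditional(Q)            # equilibrium risk stratification
```

    Equivalently, `pi` is the normalized dominant left eigenvector of `Q`; the iteration above is just power iteration with renormalization.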

  15. A Computational Model of the Rainbow Trout Hypothalamus-Pituitary-Ovary-Liver Axis

    PubMed Central

    Gillies, Kendall; Krone, Stephen M.; Nagler, James J.; Schultz, Irvin R.

    2016-01-01

    Reproduction in fishes and other vertebrates represents the timely coordination of many endocrine factors that culminate in the production of mature, viable gametes. In recent years there has been rapid growth in understanding fish reproductive biology, which has been motivated in part by recognition of the potential effects that climate change, habitat destruction and contaminant exposure can have on natural and cultured fish populations. New approaches to understanding the impacts of these stressors are being developed that require a systems biology approach with more biologically accurate and detailed mathematical models. We have developed a multi-scale mathematical model of the female rainbow trout hypothalamus-pituitary-ovary-liver axis to use as a tool to help understand the functioning of the system and for extrapolation of laboratory findings of stressor impacts on specific components of the axis. The model describes the essential endocrine components of the female rainbow trout reproductive axis. The model also describes the stage specific growth of maturing oocytes within the ovary and permits the presence of sub-populations of oocytes at different stages of development. Model formulation and parametrization was largely based on previously published in vivo and in vitro data in rainbow trout and new data on the synthesis of gonadotropins in the pituitary. Model predictions were validated against several previously published data sets for annual changes in gonadotropins and estradiol in rainbow trout. Estimates of select model parameters can be obtained from in vitro assays using either quantitative (direct estimation of rate constants) or qualitative (relative change from control values) approaches. This is an important aspect of mathematical models as in vitro, cell-based assays are expected to provide the bulk of experimental data for future risk assessments and will require quantitative physiological models to extrapolate across biological scales. PMID:27096735

  17. Modelling home televisiting services using systems dynamic theory.

    PubMed

    Valero, M A; Arredondo, M T; del Nogal, F; Gallar, P; Insausti, J; Del Pozo, F

    2001-01-01

    A quantitative model was developed to study the provision of a home televisiting service. Systems dynamic theory was used to describe the relationships between quality of care, accessibility and cost-effectiveness. Input information was gathered from the telemedicine literature, as well as from over 75 sessions of a televisiting service provided by the Severo Ochoa Hospital to 18 housebound patients from three different medical specialties. The model allowed the Severo Ochoa Hospital to estimate the equipment needed to support increased medical contacts for intensive cardiac and other patients.

  18. Quantitative and empirical demonstration of the Matthew effect in a study of career longevity

    PubMed Central

    Petersen, Alexander M.; Jung, Woo-Sung; Yang, Jae-Suk; Stanley, H. Eugene

    2011-01-01

    The Matthew effect refers to the adage written some two thousand years ago in the Gospel of St. Matthew: "For to all those who have, more will be given." Even two millennia later, this idiom is used by sociologists to qualitatively describe the dynamics of individual progress and the interplay between status and reward. Quantitative studies of professional careers are traditionally limited by the difficulty of measuring progress and the lack of data on individual careers. However, in some professions there are well-defined metrics that quantify career longevity, success, and prowess, which together contribute to the overall success rating for an individual employee. Here we demonstrate testable evidence of the age-old Matthew "rich get richer" effect, wherein the longevity and past success of an individual lead to a cumulative advantage in further developing his or her career. We develop an exactly solvable stochastic career progress model that quantitatively incorporates the Matthew effect and validate our model predictions for several competitive professions. We test our model on the careers of 400,000 scientists using data from six high-impact journals and further confirm our findings by testing the model on the careers of more than 20,000 athletes in four sports leagues. Our model highlights the importance of early career development, showing that many careers are stunted by the relative disadvantage associated with inexperience. PMID:21173276
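    The qualitative mechanism, survival breeds further survival while inexperience stunts careers early, can be illustrated with a toy hazard-rate simulation. This is not the authors' exactly solvable model; the hazard 1/(x + 2) is chosen only because it telescopes to the survival probability 1/(x + 1), giving a heavy-tailed longevity distribution in which most careers end almost immediately while a few run very long.

```python
import random

def simulate_career(c=1.0, x0=2, max_len=10**6):
    """Toy rich-get-richer career: at tenure x the career ends with
    probability c/(x + x0), so the hazard falls as experience accumulates.
    With the defaults, P(length >= x) = 1/(x + 1): half of all careers end
    at tenure 0, yet the tail is power-law heavy."""
    x = 0
    while x < max_len:
        if random.random() < c / (x + x0):
            return x
        x += 1
    return x

random.seed(42)
lengths = [simulate_career() for _ in range(2000)]
```

    In an ensemble like `lengths`, the bulk of careers are stunted at the start while the maximum longevity is orders of magnitude above the median, the signature inequality of cumulative advantage.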

  19. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  20. Discrete dynamic modeling of cellular signaling networks.

    PubMed

    Albert, Réka; Wang, Rui-Sheng

    2009-01-01

    Understanding signal transduction in cellular systems is a central issue in systems biology. Numerous experiments from different laboratories generate an abundance of individual components and causal interactions mediating environmental and developmental signals. However, for many signal transduction systems there is insufficient information on the overall structure and the molecular mechanisms involved in the signaling network. Moreover, lack of kinetic and temporal information makes it difficult to construct quantitative models of signal transduction pathways. Discrete dynamic modeling, combined with network analysis, provides an effective way to integrate fragmentary knowledge of regulatory interactions into a predictive mathematical model which is able to describe the time evolution of the system without the requirement for kinetic parameters. This chapter introduces the fundamental concepts of discrete dynamic modeling, particularly focusing on Boolean dynamic models. We describe this method step-by-step in the context of cellular signaling networks. Several variants of Boolean dynamic models including threshold Boolean networks and piecewise linear systems are also covered, followed by two examples of successful application of discrete dynamic modeling in cell biology.

  1. Reducible or irreducible? Mathematical reasoning and the ontological method.

    PubMed

    Fisher, William P

    2010-01-01

    Science is often described as nothing but the practice of measurement. This perspective follows from longstanding respect for the roles mathematics and quantification have played as media through which alternative hypotheses are evaluated and experience becomes better managed. Many figures in the history of science and psychology have contributed to what has been called the "quantitative imperative," the demand that fields of study employ number and mathematics even when they do not constitute the language in which investigators think together. But what makes an area of study scientific is, of course, not the mere use of number, but communities of investigators who share common mathematical languages for exchanging quantitative value. Such languages require rigorous theoretical underpinning, a basis in data sufficient to the task, and instruments traceable to reference standard quantitative metrics. The values shared and exchanged by such communities typically involve the application of mathematical models that specify the sufficient and invariant relationships necessary for rigorous theorizing and instrument equating. The mathematical metaphysics of science are explored with the aim of connecting principles of quantitative measurement with the structures of sufficient reason.

  2. Potential for the dynamics of pedestrians in a socially interacting group

    NASA Astrophysics Data System (ADS)

    Zanlungo, Francesco; Ikeda, Tetsushi; Kanda, Takayuki

    2014-01-01

    We introduce a simple potential to describe the dynamics of the relative motion of two pedestrians socially interacting in a walking group. We show that the proposed potential, based on basic empirical observations and theoretical considerations, can qualitatively describe the statistical properties of pedestrian behavior. In detail, we show that the two-dimensional probability distribution of the relative distance is determined by the proposed potential through a Boltzmann distribution. After calibrating the parameters of the model on the two-pedestrian group data, we apply the model to three-pedestrian groups, showing that it describes their behavior well, both qualitatively and quantitatively. In particular, the model predicts that three-pedestrian groups walk in a V-shaped formation and provides accurate values for the position of the three pedestrians. Furthermore, the model correctly predicts the average walking velocity of three-person groups based on the velocity of two-person ones. Possible extensions to larger groups, along with alternative explanations of the social dynamics that may be implied by our model, are discussed at the end of the paper.
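    The central idea, that a pair potential determines the observed distance distribution through a Boltzmann factor p(r) ∝ exp(-U(r)/T), can be sketched numerically. The potential form a/r + b·r (short-range repulsion plus a cost for drifting apart) and all constants below are assumptions for illustration, not the paper's calibrated model.

```python
import numpy as np

def boltzmann_pdf(r, T=0.3, a=0.5, b=1.2):
    """Distance distribution from an illustrative pair potential
    U(r) = a/r + b*r via a Boltzmann factor p(r) ~ exp(-U(r)/T),
    normalized on the sampling grid. Form and constants are assumed."""
    U = a / r + b * r
    p = np.exp(-U / T)
    dr = r[1] - r[0]
    return p / (p.sum() * dr)

r = np.linspace(0.1, 5.0, 2000)      # relative distance grid (metres, say)
p = boltzmann_pdf(r)
r_mode = r[np.argmax(p)]             # most probable interpersonal distance
```

    Note that the mode sits at the potential minimum, r* = sqrt(a/b), independent of the effective "temperature" T, which only controls the spread of the distribution.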

  3. Principles of quantitation of viral loads using nucleic acid sequence-based amplification in combination with homogeneous detection using molecular beacons.

    PubMed

    Weusten, Jos J A M; Carpay, Wim M; Oosterlaken, Tom A M; van Zuijlen, Martien C A; van de Wiel, Paul A

    2002-03-15

    For quantitative NASBA-based viral load assays using homogeneous detection with molecular beacons, such as the NucliSens EasyQ HIV-1 assay, a quantitation algorithm is required. During the amplification process there is a constant growth in the concentration of amplicons to which the beacon can bind while generating a fluorescence signal. The overall fluorescence curve contains kinetic information on both amplicon formation and beacon binding, but only the former is relevant for quantitation. In the current paper, mathematical modeling of the relevant processes is used to develop an equation describing the fluorescence curve as a function of the amplification time and the relevant kinetic parameters. This equation allows reconstruction of RNA formation, which is characterized by an exponential increase in concentrations as long as the primer concentrations are not rate limiting and by linear growth over time after the primer pool is depleted. During the linear growth phase, the actual quantitation is based on assessing the amplicon formation rate from the viral RNA relative to that from a fixed amount of calibrator RNA. The quantitation procedure has been successfully applied in the NucliSens EasyQ HIV-1 assay.
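    The amplification kinetics described above, exponential amplicon growth while primers remain, then linear growth after the primer pool is depleted, with a saturable beacon-binding readout, can be sketched as follows. The rate constant, depletion time, and beacon-binding parameters are illustrative placeholders, not the assay's actual kinetic parameters.

```python
import math

def amplicon(t, r0=1.0, k=0.12, t_dep=40.0):
    """Amplicon concentration over time: exponential growth while primer
    concentrations are not rate limiting, then linear growth at the rate
    reached when the primer pool is depleted (at t_dep). Units arbitrary;
    all constants are illustrative."""
    if t <= t_dep:
        return r0 * math.exp(k * t)
    level_at_dep = r0 * math.exp(k * t_dep)
    slope = k * level_at_dep                  # growth rate frozen at depletion
    return level_at_dep + slope * (t - t_dep)

def fluorescence(t, beacon_total=100.0, kd=50.0):
    """Beacon-binding readout: hyperbolic (saturable) in amplicon level,
    so the fluorescence curve mixes amplicon and binding kinetics."""
    a = amplicon(t)
    return beacon_total * a / (kd + a)
```

    Quantitation then proceeds, as in the abstract, by deconvolving the binding step from curves like `fluorescence(t)` to recover the underlying amplicon formation rate relative to a calibrator.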

  4. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining their credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interview, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, it is considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of the research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, feasibility studies, as well as studies of attitude and beliefs. Animal models are another area that qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can be researcher's own view, they need to be statistically confirmed, quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary methods has emerged over recent years. By applying both methods of research, scientists can take advantage of interpretative characteristics of qualitative methods as well as experimental dimensions of quantitative methods.

  5. Reliability based fatigue design and maintenance procedures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1977-01-01

    A stochastic model has been developed that describes the probability of the fatigue process by assuming a varying hazard rate. This stochastic model can be used to obtain the probability of a crack of a certain length at a given location after a certain number of cycles or amount of time. Quantitative estimation with the developed model is also discussed. Application of the model to develop a procedure for reliability-based, cost-effective fail-safe structural design is presented. This design procedure includes the reliability improvement due to inspection and repair. Methods of obtaining optimum inspection and maintenance schemes are treated.
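    A varying hazard rate translates into a crack probability through the standard survival relation P(t) = 1 - exp(-∫h(s)ds). The sketch below uses a Weibull hazard as a common concrete choice; the paper's specific hazard function and the parameter values here are assumptions for illustration.

```python
import math

def crack_probability(t, beta=2.0, eta=5000.0):
    """Probability that a crack of the critical length has appeared at a
    given location by t cycles, assuming a Weibull hazard
    h(t) = (beta/eta)*(t/eta)**(beta-1); beta > 1 gives the rising hazard
    typical of fatigue. Parameters are illustrative."""
    return 1.0 - math.exp(-((t / eta) ** beta))
```

    An inspection-and-repair policy, as in the abstract, would reset or truncate this distribution at each inspection where a crack is found and repaired.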

  6. Communications satellite systems operations with the space station, volume 2

    NASA Technical Reports Server (NTRS)

    Price, K.; Dixon, J.; Weyandt, C.

    1987-01-01

    A financial model was developed which described quantitatively the economics of the space segment of communication satellite systems. The model describes the economics of the space system throughout the lifetime of the satellite. The expected state-of-the-art status of communications satellite systems and operations beginning service in 1995 were assessed and described. New or enhanced space-based activities and associated satellite system designs that have the potential to achieve future communications satellite operations in geostationary orbit with improved economic performance were postulated and defined. Three scenarios using combinations of space-based activities were analyzed: a spin stabilized satellite, a three axis satellite, and assembly at the Space Station and GEO servicing. Functional and technical requirements placed on the Space Station by the scenarios were detailed. Requirements on the satellite were also listed.

  7. The power laws of nanoscale forces in ambient conditions

    NASA Astrophysics Data System (ADS)

    Chiesa, Matteo; Santos, Sergio; Lai, Chia-Yun

    Power laws are ubiquitous in the physical sciences and indispensable for describing physical phenomena both qualitatively and quantitatively. However, a nanoscale force law that accurately describes the phenomena observed in ambient conditions at several nm, or fractions of a nm, above a surface is still lacking. Here we report a power law derived from experimental data that describes the interaction between an atomic force microscope (AFM) tip, modelled as a sphere, and a surface in ambient conditions. By employing a graphite surface as a model system, the resulting effective power is found to be a function of the tip radius and the distance. The data suggest a nano-to-mesoscale transition in the power law that results in relative agreement with the distance dependencies predicted by the Hamaker and Lifshitz theories for van der Waals forces for the larger tip radii only.

  8. Understanding the masses of elementary particles: a step towards understanding the massless photon?

    NASA Astrophysics Data System (ADS)

    Greulich, K. O.

    2011-09-01

    A so far unnoticed, simple explanation of elementary particle masses is given by m = N · m_electron/α, where α (≈ 1/137) is the fine-structure constant. On the other hand, photons can be described by two oppositely oscillating clouds of e/√α elementary charges. Such a model describes a number of features of the photon in a quantitatively correct manner. For example, the energy of the oscillating clouds is E = hν, the spin is 1, and the spatial dimension is λ/2π. When the charge e/√α is assigned to the Planck mass m_Pl, the resulting charge-to-mass ratio is e/(m_Pl√α) = 8.62 × 10⁻¹¹ C/kg. This is identical to √(G/k₀), where G is the gravitational constant and k₀ the Coulomb constant. When this very small charge-to-mass ratio is assigned to any matter, gravitation can be completely described as the Coulomb interaction between such charges of the corresponding masses. Thus, there is a tight quantitative connection between the photon, nonzero rest masses, and the gravitational/Coulomb interaction.
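    The numerical identity claimed in the abstract, e/(m_Pl√α) = √(G/k₀), can be checked directly from the CODATA constants. It in fact holds exactly up to rounding, since m_Pl = √(ħc/G) and α = e²k₀/(ħc) imply the relation algebraically.

```python
import math

# Physical constants (SI, CODATA-rounded values)
e = 1.602176634e-19       # elementary charge, C
m_pl = 2.176434e-8        # Planck mass, kg
alpha = 7.2973525693e-3   # fine-structure constant
G = 6.67430e-11           # gravitational constant, m^3 kg^-1 s^-2
k0 = 8.9875517923e9       # Coulomb constant, N m^2 C^-2

lhs = e / (m_pl * math.sqrt(alpha))   # charge-to-mass ratio from the text
rhs = math.sqrt(G / k0)

# With m_pl = sqrt(hbar*c/G) and alpha = e**2 * k0 / (hbar*c), the
# identity lhs == rhs follows algebraically, so the agreement is exact
# up to the rounding of the constants above.
```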

  9. Quantitation of dissolved gas content in emulsions and in blood using mass spectrometric detection

    PubMed Central

    Grimley, Everett; Turner, Nicole; Newell, Clayton; Simpkins, Cuthbert; Rodriguez, Juan

    2011-01-01

    Quantitation of dissolved gases in blood or in other biological media is essential for understanding the dynamics of metabolic processes. Current detection techniques, while enabling rapid and convenient assessment of dissolved gases, provide only direct information on the partial pressure of gases dissolved in the aqueous fraction of the fluid. The more relevant quantity known as gas content, which refers to the total amount of the gas in all fractions of the sample, can be inferred from those partial pressures, but only indirectly through mathematical modeling. Here we describe a simple mass spectrometric technique for rapid and direct quantitation of gas content for a wide range of gases. The technique is based on a mass spectrometer detector that continuously monitors gases that are rapidly extracted from samples injected into a purge vessel. The accuracy and sample processing speed of the system is demonstrated with experiments that reproduce within minutes literature values for the solubility of various gases in water. The capability of the technique is further demonstrated through accurate determination of O2 content in a lipid emulsion and in whole blood, using as little as 20 μL of sample. The approach to gas content quantitation described here should greatly expand the range of animals and conditions that may be used in studies of metabolic gas exchange, and facilitate the development of artificial oxygen carriers and resuscitation fluids. PMID:21497566
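
    A minimal numerical sketch of why partial pressure alone understates gas content in a multi-phase fluid, which is the motivation above. The solubility coefficients are approximate literature-style figures chosen for illustration (the lipid-phase value in particular is hypothetical), not measurements from the study.

```python
# Two-phase (aqueous + lipid) O2 content model under Henry's law.
k_H_water = 1.3e-3   # O2 solubility in water, mol L^-1 atm^-1 (approx., 25 C)
k_H_lipid = 5.0e-3   # assumed O2 solubility in the lipid phase (hypothetical)

def o2_content(p_o2_atm, lipid_fraction):
    """Total O2 content (mol/L) of an emulsion at a given O2 partial pressure."""
    aqueous = (1 - lipid_fraction) * k_H_water * p_o2_atm
    lipid = lipid_fraction * k_H_lipid * p_o2_atm
    return aqueous + lipid

# A pO2 electrode reports only the aqueous partial pressure; the total gas
# content of the sample differs once a second phase is present:
print(o2_content(0.21, 0.0))   # pure water at air saturation
print(o2_content(0.21, 0.2))   # a 20% lipid emulsion holds more O2
```

    The mass spectrometric purge method described in the abstract measures the total content directly, avoiding the need for such a model.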

  10. A stochastic model of solid state thin film deposition: Application to chalcopyrite growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lovelett, Robert J.; Pang, Xueqi; Roberts, Tyler M.

    Developing high fidelity quantitative models of solid state reaction systems can be challenging, especially in deposition systems where, in addition to the multiple competing processes occurring simultaneously, the solid interacts with its atmosphere. In this work, we develop a model for the growth of a thin solid film where species from the atmosphere adsorb, diffuse, and react with the film. The model is mesoscale and describes an entire film with thickness on the order of microns. Because it is stochastic, the model allows us to examine inhomogeneities and agglomerations that would be impossible to characterize with deterministic methods. We demonstrate the modeling approach with the example of chalcopyrite Cu(InGa)(SeS)₂ thin film growth via precursor reaction, which is a common industrial method for fabricating thin film photovoltaic modules. The model is used to understand how and why through-film variation in the composition of Cu(InGa)(SeS)₂ thin films arises and persists. We believe that the model will be valuable as an effective quantitative description of many other materials systems used in semiconductors, energy storage, and other fast-growing industries.
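
    The adsorb/diffuse/react dynamics described above can be illustrated with a minimal stochastic (kinetic Monte Carlo) sketch on a one-dimensional lattice. The lattice size and rate constants are hypothetical, and the real model is far richer; this only shows how stochastic event selection produces site-to-site inhomogeneity.

```python
import random

random.seed(0)

N = 50                      # lattice sites across the film surface
adsorbed = [0] * N          # mobile adsorbed species per site
reacted = [0] * N           # species incorporated (reacted) into the film
k_ads, k_dif, k_rxn = 1.0, 0.5, 0.2   # hypothetical rate constants

def step():
    """One Gillespie-style event: adsorption, diffusion, or reaction."""
    rates = [k_ads * N, k_dif * sum(adsorbed), k_rxn * sum(adsorbed)]
    r = random.random() * sum(rates)
    if r < rates[0]:                       # adsorption at a random site
        adsorbed[random.randrange(N)] += 1
    elif r < rates[0] + rates[1]:          # diffusion to a neighboring site
        i = random.choice([j for j in range(N) if adsorbed[j] > 0])
        adsorbed[i] -= 1
        adsorbed[(i + random.choice([-1, 1])) % N] += 1
    else:                                  # reaction into the film
        i = random.choice([j for j in range(N) if adsorbed[j] > 0])
        adsorbed[i] -= 1
        reacted[i] += 1

for _ in range(2000):
    step()

# Stochasticity leaves through-film, site-to-site composition variation:
print(min(reacted), max(reacted))
```

    Even with spatially uniform rates, the reacted-species profile is uneven, the kind of fluctuation a deterministic rate-equation model would average away.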

  11. A stochastic model of solid state thin film deposition: Application to chalcopyrite growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lovelett, Robert J.; Pang, Xueqi; Roberts, Tyler M.

    Developing high fidelity quantitative models of solid state reaction systems can be challenging, especially in deposition systems where, in addition to the multiple competing processes occurring simultaneously, the solid interacts with its atmosphere. Here, we develop a model for the growth of a thin solid film where species from the atmosphere adsorb, diffuse, and react with the film. The model is mesoscale and describes an entire film with thickness on the order of microns. Because it is stochastic, the model allows us to examine inhomogeneities and agglomerations that would be impossible to characterize with deterministic methods. We also demonstrate the modeling approach with the example of chalcopyrite Cu(InGa)(SeS)₂ thin film growth via precursor reaction, which is a common industrial method for fabricating thin film photovoltaic modules. The model is used to understand how and why through-film variation in the composition of Cu(InGa)(SeS)₂ thin films arises and persists. Finally, we believe that the model will be valuable as an effective quantitative description of many other materials systems used in semiconductors, energy storage, and other fast-growing industries.

  12. A stochastic model of solid state thin film deposition: Application to chalcopyrite growth

    DOE PAGES

    Lovelett, Robert J.; Pang, Xueqi; Roberts, Tyler M.; ...

    2016-04-01

    Developing high fidelity quantitative models of solid state reaction systems can be challenging, especially in deposition systems where, in addition to the multiple competing processes occurring simultaneously, the solid interacts with its atmosphere. Here, we develop a model for the growth of a thin solid film where species from the atmosphere adsorb, diffuse, and react with the film. The model is mesoscale and describes an entire film with thickness on the order of microns. Because it is stochastic, the model allows us to examine inhomogeneities and agglomerations that would be impossible to characterize with deterministic methods. We also demonstrate the modeling approach with the example of chalcopyrite Cu(InGa)(SeS)₂ thin film growth via precursor reaction, which is a common industrial method for fabricating thin film photovoltaic modules. The model is used to understand how and why through-film variation in the composition of Cu(InGa)(SeS)₂ thin films arises and persists. Finally, we believe that the model will be valuable as an effective quantitative description of many other materials systems used in semiconductors, energy storage, and other fast-growing industries.

  13. Quantitative genetics

    USDA-ARS?s Scientific Manuscript database

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  14. A model for methane production in anaerobic digestion of swine wastewater.

    PubMed

    Yang, Hongnan; Deng, Liangwei; Liu, Gangjin; Yang, Di; Liu, Yi; Chen, Ziai

    2016-10-01

    A study was conducted using a laboratory-scale anaerobic sequencing batch digester to investigate the quantitative influence of organic loading rates (OLRs) on the methane production rate during digestion of swine wastewater at temperatures between 15 °C and 35 °C. The volumetric methane production rate (Rp) at different OLRs and temperatures was obtained. The maximum volumetric methane production rates (Rpmax) were 0.136, 0.796, 1.294, 1.527 and 1.952 L CH₄ L⁻¹ d⁻¹ at corresponding organic loading rates of 1.2, 3.6, 5.6, 5.6 and 7.2 g volatile solids L⁻¹ d⁻¹, respectively, which occurred at 15, 20, 25, 30 and 35 °C, respectively. A new model was developed to describe the quantitative relationship between Rp and OLR. In addition to the maximum volumetric methane production rate (Rpmax) and the half-saturation constant (KLR) commonly used in previous models such as the modified Stover-Kincannon model and the Deng model, the new model introduced a new index (KD) denoting the speed at which the volumetric methane production rate approaches the maximum as a function of temperature. The new model described the influence of OLR on the rate of methane production more satisfactorily than other models, as confirmed by higher coefficients of determination (R²) (0.9717-0.9900) and lower bias between the experimental and predicted data in terms of the root mean square error and the Akaike Information Criterion. Data from other published research also validated the applicability and generality of the new kinetic model for different types of wastewater. Copyright © 2016 Elsevier Ltd. All rights reserved.
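
    The abstract does not give the new model's full functional form, but the modified Stover-Kincannon baseline it extends is a standard saturation curve, Rp = Rpmax·OLR/(KLR + OLR). The sketch below uses the reported Rpmax at 35 °C; the half-saturation constant KLR is hypothetical, chosen only to show the saturating behavior.

```python
def stover_kincannon(olr, rp_max, k_lr):
    """Rp = Rpmax * OLR / (K_LR + OLR): Rp rises with OLR and saturates."""
    return rp_max * olr / (k_lr + olr)

rp_max = 1.952   # reported Rpmax at 35 C, L CH4 per L per day
k_lr = 3.0       # hypothetical half-saturation constant, same units as OLR

for olr in (1.2, 3.6, 7.2, 50.0):
    print(olr, round(stover_kincannon(olr, rp_max, k_lr), 3))
# At large OLR the predicted rate approaches Rpmax, as the model intends.
```

    The study's new index KD would additionally modulate how quickly Rp approaches Rpmax as temperature changes, which this baseline form cannot express.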

  15. Quantitative Evaluation of Performance during Robot-assisted Treatment.

    PubMed

    Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G

    2016-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training, exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) underwent training with Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed for a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1. This improvement was functionally validated by a significant increase of the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is reliable with respect to a well-known clinical scale.

  16. Quantitative Analysis of the Trends Exhibited by the Three Interdisciplinary Biological Sciences: Biophysics, Bioinformatics, and Systems Biology.

    PubMed

    Kang, Jonghoon; Park, Seyeon; Venkat, Aarya; Gopinath, Adarsh

    2015-12-01

    New interdisciplinary biological sciences like bioinformatics, biophysics, and systems biology have become increasingly relevant in modern science. Many papers have suggested the importance of adding these subjects, particularly bioinformatics, to an undergraduate curriculum; however, most of their assertions have relied on qualitative arguments. In this paper, we will show our metadata analysis of a scientific literature database (PubMed) that quantitatively describes the importance of the subjects of bioinformatics, systems biology, and biophysics as compared with a well-established interdisciplinary subject, biochemistry. Specifically, we found that the development of each subject assessed by its publication volume was well described by a set of simple nonlinear equations, allowing us to characterize them quantitatively. Bioinformatics, which had the highest ratio of publications produced, was predicted to grow between 77% and 93% by 2025 according to the model. Due to the large number of publications produced in bioinformatics, which nearly matches the number published in biochemistry, it can be inferred that bioinformatics is almost equal in significance to biochemistry. Based on our analysis, we suggest that bioinformatics be added to the standard biology undergraduate curriculum. Adding this course to an undergraduate curriculum will better prepare students for future research in biology.
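
    The abstract models publication volume with "simple nonlinear equations" without naming them; a logistic curve is one common choice for such growth. The sketch below is purely illustrative: the carrying capacity, rate, and midpoint are hypothetical values, not the study's fitted parameters.

```python
import math

def logistic(t, K, r, t0):
    """Cumulative publication count at year t (carrying capacity K)."""
    return K / (1 + math.exp(-r * (t - t0)))

K, r, t0 = 100_000, 0.25, 2015   # hypothetical fit parameters

# A projected growth percentage, of the kind reported in the abstract,
# falls out of the fitted curve directly:
growth = (logistic(2025, K, r, t0) - logistic(2015, K, r, t0)) / logistic(2015, K, r, t0)
print(f"projected 2015->2025 growth: {growth:.0%}")
```

    With real PubMed counts in place of the hypothetical parameters, the same computation yields the 77-93% growth interval quoted for bioinformatics.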

  17. Helicopter Pilot Performance for Discrete-maneuver Flight Tasks

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Bourne, S. M.; Hindson, W. S.

    1984-01-01

    This paper describes a current study of several basic helicopter flight maneuvers. The data base consists of in-flight measurements from instrumented helicopters using experienced pilots. The analysis technique is simple enough to apply without automatic data processing, and the results can be used to build quantitative matah models of the flight task and some aspects of the pilot control strategy. In addition to describing the performance measurement technqiue, some results are presented which define the aggressiveness and amplitude of maneuvering for several lateral maneuvers including turns and sidesteps.

  18. Photogrammetry Applied to Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Liu, Tian-Shu; Cattafesta, L. N., III; Radeztsky, R. H.; Burner, A. W.

    2000-01-01

    In image-based measurements, quantitative image data must be mapped to three-dimensional object space. Analytical photogrammetric methods, which may be used to accomplish this task, are discussed from the viewpoint of experimental fluid dynamicists. The Direct Linear Transformation (DLT) for camera calibration, used in pressure sensitive paint, is summarized. An optimization method for camera calibration is developed that can be used to determine the camera calibration parameters, including those describing lens distortion, from a single image. Combined with the DLT method, this method allows a rapid and comprehensive in-situ camera calibration and therefore is particularly useful for quantitative flow visualization and other measurements such as model attitude and deformation in production wind tunnels. The paper also includes a brief description of typical photogrammetric applications to temperature- and pressure-sensitive paint measurements and model deformation measurements in wind tunnels.
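
    The Direct Linear Transformation mentioned above recovers the 3x4 camera projection matrix P from known 3D-2D point correspondences by solving a homogeneous linear system with an SVD. The synthetic camera and points below are hypothetical, and lens-distortion terms are omitted; this is only the core linear step.

```python
import numpy as np

rng = np.random.default_rng(0)
P_true = np.array([[800.0, 0, 320, 10],      # synthetic ground-truth camera
                   [0, 800.0, 240, 20],
                   [0, 0, 1, 2.0]])

X = rng.uniform(-1, 1, size=(8, 3))          # 3D calibration target points
Xh = np.hstack([X, np.ones((8, 1))])         # homogeneous coordinates
uv = (P_true @ Xh.T).T
uv = uv[:, :2] / uv[:, 2:3]                  # observed 2D image points

# Each correspondence contributes two rows of the DLT system A p = 0.
A = []
for (x, y, z, w), (u, v) in zip(Xh, uv):
    A.append([x, y, z, w, 0, 0, 0, 0, -u*x, -u*y, -u*z, -u*w])
    A.append([0, 0, 0, 0, x, y, z, w, -v*x, -v*y, -v*z, -v*w])
_, _, Vt = np.linalg.svd(np.array(A))
P_est = Vt[-1].reshape(3, 4)                 # null vector = P up to scale
P_est /= P_est[2, 3] / P_true[2, 3]          # fix the overall scale

print(np.allclose(P_est, P_true, atol=1e-6))  # True
```

    At least six correspondences are needed for the eleven degrees of freedom; the single-image optimization method the paper develops goes beyond this by also estimating distortion parameters.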

  19. Quantitative analysis of RNA-protein interactions on a massively parallel array for mapping biophysical and evolutionary landscapes

    PubMed Central

    Buenrostro, Jason D.; Chircus, Lauren M.; Araya, Carlos L.; Layton, Curtis J.; Chang, Howard Y.; Snyder, Michael P.; Greenleaf, William J.

    2015-01-01

    RNA-protein interactions drive fundamental biological processes and are targets for molecular engineering, yet quantitative and comprehensive understanding of the sequence determinants of affinity remains limited. Here we repurpose a high-throughput sequencing instrument to quantitatively measure binding and dissociation of MS2 coat protein to >10⁷ RNA targets generated on a flow-cell surface by in situ transcription and inter-molecular tethering of RNA to DNA. We decompose the binding energy contributions from primary and secondary RNA structure, finding that differences in affinity are often driven by sequence-specific changes in association rates. By analyzing the biophysical constraints and modeling mutational paths describing the molecular evolution of MS2 from low- to high-affinity hairpins, we quantify widespread molecular epistasis and a long-hypothesized structure-dependent preference for G:U base pairs over C:A intermediates in evolutionary trajectories. Our results suggest that quantitative analysis of RNA on a massively parallel array (RNAMaP) can reveal sequence-function relationships across molecular variants. PMID:24727714

  20. Bound Pool Fractions Complement Diffusion Measures to Describe White Matter Micro and Macrostructure

    PubMed Central

    Stikov, Nikola; Perry, Lee M.; Mezer, Aviv; Rykhlevskaia, Elena; Wandell, Brian A.; Pauly, John M.; Dougherty, Robert F.

    2010-01-01

    Diffusion imaging and bound pool fraction (BPF) mapping are two quantitative magnetic resonance imaging techniques that measure microstructural features of the white matter of the brain. Diffusion imaging provides a quantitative measure of the diffusivity of water in tissue. BPF mapping is a quantitative magnetization transfer (qMT) technique that estimates the proportion of exchanging protons bound to macromolecules, such as those found in myelin, and is thus a more direct measure of myelin content than diffusion. In this work, we combine BPF estimates of macromolecular content with measurements of diffusivity within human white matter tracts. Within the white matter, the correlation between BPFs and diffusivity measures such as fractional anisotropy and radial diffusivity was modest, suggesting that diffusion tensor imaging and bound pool fractions are complementary techniques. We found that several major tracts have high BPF, suggesting a higher density of myelin in these tracts. We interpret these results in the context of a quantitative tissue model. PMID:20828622

  1. Computationally Efficient Multiscale Reactive Molecular Dynamics to Describe Amino Acid Deprotonation in Proteins

    PubMed Central

    2016-01-01

    An important challenge in the simulation of biomolecular systems is a quantitative description of the protonation and deprotonation process of amino acid residues. Despite the seeming simplicity of adding or removing a positively charged hydrogen nucleus, simulating the actual protonation/deprotonation process is inherently difficult. It requires both the explicit treatment of the excess proton, including its charge defect delocalization and Grotthuss shuttling through inhomogeneous moieties (water and amino residues), and extensive sampling of coupled condensed phase motions. In a recent paper (J. Chem. Theory Comput. 2014, 10, 2729−2737), a multiscale approach was developed to map high-level quantum mechanics/molecular mechanics (QM/MM) data into a multiscale reactive molecular dynamics (MS-RMD) model in order to describe amino acid deprotonation in bulk water. In this article, we extend the fitting approach (called FitRMD) to create MS-RMD models for ionizable amino acids within proteins. The resulting models are shown to faithfully reproduce the free energy profiles of the reference QM/MM Hamiltonian for PT inside an example protein, the ClC-ec1 H+/Cl– antiporter. Moreover, we show that the resulting MS-RMD models are computationally efficient enough to then characterize more complex 2-dimensional free energy surfaces due to slow degrees of freedom such as water hydration of internal protein cavities that can be inherently coupled to the excess proton charge translocation. The FitRMD method is thus shown to be an effective way to map ab initio level accuracy into a much more computationally efficient reactive MD method in order to explicitly simulate and quantitatively describe amino acid protonation/deprotonation in proteins. PMID:26734942

  2. Computationally Efficient Multiscale Reactive Molecular Dynamics to Describe Amino Acid Deprotonation in Proteins.

    PubMed

    Lee, Sangyun; Liang, Ruibin; Voth, Gregory A; Swanson, Jessica M J

    2016-02-09

    An important challenge in the simulation of biomolecular systems is a quantitative description of the protonation and deprotonation process of amino acid residues. Despite the seeming simplicity of adding or removing a positively charged hydrogen nucleus, simulating the actual protonation/deprotonation process is inherently difficult. It requires both the explicit treatment of the excess proton, including its charge defect delocalization and Grotthuss shuttling through inhomogeneous moieties (water and amino residues), and extensive sampling of coupled condensed phase motions. In a recent paper (J. Chem. Theory Comput. 2014, 10, 2729-2737), a multiscale approach was developed to map high-level quantum mechanics/molecular mechanics (QM/MM) data into a multiscale reactive molecular dynamics (MS-RMD) model in order to describe amino acid deprotonation in bulk water. In this article, we extend the fitting approach (called FitRMD) to create MS-RMD models for ionizable amino acids within proteins. The resulting models are shown to faithfully reproduce the free energy profiles of the reference QM/MM Hamiltonian for PT inside an example protein, the ClC-ec1 H(+)/Cl(-) antiporter. Moreover, we show that the resulting MS-RMD models are computationally efficient enough to then characterize more complex 2-dimensional free energy surfaces due to slow degrees of freedom such as water hydration of internal protein cavities that can be inherently coupled to the excess proton charge translocation. The FitRMD method is thus shown to be an effective way to map ab initio level accuracy into a much more computationally efficient reactive MD method in order to explicitly simulate and quantitatively describe amino acid protonation/deprotonation in proteins.

  3. Speech motor development: Integrating muscles, movements, and linguistic units.

    PubMed

    Smith, Anne

    2006-01-01

    A fundamental problem for those interested in human communication is to determine how ideas and the various units of language structure are communicated through speaking. The physiological concepts involved in the control of muscle contraction and movement are theoretically distant from the processing levels and units postulated to exist in language production models. A review of the literature on adult speakers suggests that they engage complex, parallel processes involving many units, including sentence, phrase, syllable, and phoneme levels. Infants must develop multilayered interactions among language and motor systems. This discussion describes recent studies of speech motor performance relative to varying linguistic goals during the childhood, teenage, and young adult years. Studies of the developing interactions between speech motor and language systems reveal both qualitative and quantitative differences between the developing and the mature systems. These studies provide an experimental basis for a more comprehensive theoretical account of how mappings between units of language and units of action are formed and how they function. Readers will be able to: (1) understand the theoretical differences between models of speech motor control and models of language processing, as well as the nature of the concepts used in the two different kinds of models, (2) explain the concept of coarticulation and state why this phenomenon has confounded attempts to determine the role of linguistic units, such as syllables and phonemes, in speech production, (3) describe the development of speech motor performance skills and specify quantitative and qualitative differences between speech motor performance in children and adults, and (4) describe experimental methods that allow scientists to study speech and limb motor control, as well as compare units of action used to study non-speech and speech movements.

  4. Highly predictive and interpretable models for PAMPA permeability.

    PubMed

    Sun, Hongmao; Nguyen, Kimloan; Kerns, Edward; Yan, Zhengyin; Yu, Kyeong Ri; Shah, Pranav; Jadhav, Ajit; Xu, Xin

    2017-02-01

    Cell membrane permeability is an important determinant for oral absorption and bioavailability of a drug molecule. An in silico model predicting drug permeability is described, which is built based on a large permeability dataset of 7488 compound entries, or 5435 structurally unique molecules, measured by the same lab using the parallel artificial membrane permeability assay (PAMPA). On the basis of customized molecular descriptors, the support vector regression (SVR) model trained with 4071 compounds with quantitative data is able to predict the remaining 1364 compounds with qualitative data with an area under the receiver operating characteristic curve (AUC-ROC) of 0.90. The support vector classification (SVC) model trained with half of the whole dataset, comprising both the quantitative and the qualitative data, produced accurate predictions for the remaining data with an AUC-ROC of 0.88. The results suggest that the developed SVR model is highly predictive and provides medicinal chemists a useful in silico tool to facilitate design and synthesis of novel compounds with optimal drug-like properties, and thus accelerate lead optimization in drug discovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
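
    A notable feature of the evaluation above is scoring a continuous (regression) prediction against binary qualitative labels via AUC-ROC. The ranking definition of AUC makes this natural, as the self-contained sketch below shows; the permeability scores and labels are hypothetical, not data from the study.

```python
def auc_roc(scores, labels):
    """AUC = probability that a random positive outranks a random negative."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted log-permeability scores and qualitative labels
# (1 = permeable, 0 = not permeable):
scores = [-6.1, -4.7, -4.8, -4.5, -5.9, -5.5, -6.5, -4.9]
labels = [0, 0, 1, 1, 0, 1, 0, 1]
print(auc_roc(scores, labels))  # 0.8125
```

    Because AUC depends only on the ranking of scores, the same metric applies equally to the SVR outputs and the SVC decision values reported in the abstract.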

  5. A quantitative study of the benefits of co-regulation using the spoIIA operon as an example

    PubMed Central

    Iber, Dagmar

    2006-01-01

    The distribution of most genes is not random, and functionally linked genes are often found in clusters. Several theories have been put forward to explain the emergence and persistence of operons in bacteria. Careful analysis of genomic data favours the co-regulation model, where gene organization into operons is driven by the benefits of coordinated gene expression and regulation. Direct evidence that coexpression increases the individual's fitness enough to ensure operon formation and maintenance is, however, still lacking. Here, a previously described quantitative model of the network that controls the transcription factor σF during sporulation in Bacillus subtilis is employed to quantify the benefits arising from both organization of the sporulation genes into the spoIIA operon and from translational coupling. The analysis shows that operon organization, together with translational coupling, is important because of the inherent stochastic nature of gene expression, which skews the ratios between protein concentrations in the absence of co-regulation. The predicted impact of different forms of gene regulation on fitness and survival agrees quantitatively with published sporulation efficiencies. PMID:16924264

  6. A quantitative study of the benefits of co-regulation using the spoIIA operon as an example.

    PubMed

    Iber, Dagmar

    2006-01-01

    The distribution of most genes is not random, and functionally linked genes are often found in clusters. Several theories have been put forward to explain the emergence and persistence of operons in bacteria. Careful analysis of genomic data favours the co-regulation model, where gene organization into operons is driven by the benefits of coordinated gene expression and regulation. Direct evidence that coexpression increases the individual's fitness enough to ensure operon formation and maintenance is, however, still lacking. Here, a previously described quantitative model of the network that controls the transcription factor sigma(F) during sporulation in Bacillus subtilis is employed to quantify the benefits arising from both organization of the sporulation genes into the spoIIA operon and from translational coupling. The analysis shows that operon organization, together with translational coupling, is important because of the inherent stochastic nature of gene expression, which skews the ratios between protein concentrations in the absence of co-regulation. The predicted impact of different forms of gene regulation on fitness and survival agrees quantitatively with published sporulation efficiencies.

  7. Quantitative structure-activity relationship modeling on in vitro endocrine effects and metabolic stability involving 26 selected brominated flame retardants.

    PubMed

    Harju, Mikael; Hamers, Timo; Kamstra, Jorke H; Sonneveld, Edwin; Boon, Jan P; Tysklind, Mats; Andersson, Patrik L

    2007-04-01

    In this work, quantitative structure-activity relationships (QSARs) were developed to aid human and environmental risk assessment processes for brominated flame retardants (BFRs). Brominated flame retardants, such as the high-production-volume chemicals polybrominated diphenyl ethers (PBDEs), tetrabromobisphenol A, and hexabromocyclododecane, have been identified as potential endocrine disruptors. Quantitative structure-activity relationship models were built based on the in vitro potencies of 26 selected BFRs. The in vitro assays included interactions with, for example, the androgen, progesterone, estrogen, and dioxin (aryl hydrocarbon) receptors, plus competition with thyroxine for its plasma carrier protein (transthyretin), inhibition of estradiol sulfation via sulfotransferase, and, finally, rate of metabolization. For the QSAR modeling, a number of physicochemical parameters were calculated describing the electronic, lipophilic, and structural characteristics of the molecules. These include frontier molecular orbitals, molecular charges, polarities, the log octanol/water partitioning coefficient, and two- and three-dimensional molecular properties. Experimental properties were also measured and included for the PBDEs, such as their individual ultraviolet spectra (200-320 nm) and retention times on three different high-performance liquid chromatography columns and one nonpolar gas chromatography column. Quantitative structure-activity relationship models based on androgen antagonism and metabolic degradation rates generally gave similar results, suggesting that lower-brominated PBDEs with bromine substitutions in ortho positions and bromine-free meta and para positions had the highest potencies and metabolic degradation rates. Predictions made for the constituents of the technical flame retardant Bromkal 70-5DE found BDE 17 to be a potent androgen antagonist and BDE 66, which is a relevant PBDE in environmental samples, to be only a weak antagonist.

  8. A comparison of operational and LANDSAT-aided snow water content estimation systems. [Feather River Basin, California

    NASA Technical Reports Server (NTRS)

    Sharp, J. M.; Thomas, R. W.

    1975-01-01

    How LANDSAT imagery can be cost-effectively employed to augment an operational hydrologic model is described. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the LANDSAT-aided approach.

  9. Mean-field methods in evolutionary duplication-innovation-loss models for the genome-level repertoire of protein domains.

    PubMed

    Angelini, A; Amato, A; Bianconi, G; Bassetti, B; Cosentino Lagomarsino, M

    2010-02-01

    We present a combined mean-field and simulation approach to different models describing the dynamics of classes formed by elements that can appear, disappear, or copy themselves. These models, related to a paradigm duplication-innovation model known as Chinese restaurant process, are devised to reproduce the scaling behavior observed in the genome-wide repertoire of protein domains of all known species. In view of these data, we discuss the qualitative and quantitative differences of the alternative model formulations, focusing in particular on the roles of element loss and of the specificity of empirical domain classes.
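
    The Chinese restaurant process named above can be sketched in a few lines: each new protein domain either founds a new class (innovation) or duplicates into an existing class with probability proportional to class size. The innovation rate theta is hypothetical, and the loss moves discussed in the abstract are omitted.

```python
import random

random.seed(1)

theta = 2.0          # innovation rate (hypothetical)
classes = [1]        # sizes of domain classes, starting from one founder

for n in range(1, 5000):     # n = current total number of domains
    if random.random() < theta / (n + theta):
        classes.append(1)    # innovation: found a new class
    else:
        # Duplication: pick an existing domain uniformly, i.e. a class
        # with probability proportional to its size ("rich get richer").
        r = random.randrange(n)
        cum = 0
        for i, size in enumerate(classes):
            cum += size
            if r < cum:
                classes[i] += 1
                break

print(len(classes), max(classes))  # few large classes dominate a long tail
```

    The resulting class-size distribution is heavy-tailed, the qualitative scaling behavior the models in the paper are built to reproduce; the paper's mean-field analysis makes this quantitative.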

  10. Mean-field methods in evolutionary duplication-innovation-loss models for the genome-level repertoire of protein domains

    NASA Astrophysics Data System (ADS)

    Angelini, A.; Amato, A.; Bianconi, G.; Bassetti, B.; Cosentino Lagomarsino, M.

    2010-02-01

    We present a combined mean-field and simulation approach to different models describing the dynamics of classes formed by elements that can appear, disappear, or copy themselves. These models, related to a paradigm duplication-innovation model known as Chinese restaurant process, are devised to reproduce the scaling behavior observed in the genome-wide repertoire of protein domains of all known species. In view of these data, we discuss the qualitative and quantitative differences of the alternative model formulations, focusing in particular on the roles of element loss and of the specificity of empirical domain classes.

  11. Ion Channel ElectroPhysiology Ontology (ICEPO) - a case study of text mining assisted ontology development.

    PubMed

    Elayavilli, Ravikumar Komandur; Liu, Hongfang

    2016-01-01

    Computational modeling of biological cascades is of great interest to quantitative biologists, and biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is a significant challenge in the early steps of computational modeling, as it involves substantial manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes normalizing textual extractions to a standard representation, which may render them less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining into a formal representation that may help in constructing an ontology for ion channel events. We developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), with knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We also made an initial attempt to formalize the extracted quantitative data assertions in a representation that offers the potential to integrate text mining into the ontology-development workflow, a novel aspect of this study. This work is a case study in which we created a platform for formal interaction between ontology development and text mining. We achieved partial success in extracting quantitative assertions from biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
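    The overall F-measure reported above is the harmonic mean of extraction precision and recall. As a minimal illustration, with hypothetical true-positive, false-positive, and false-negative counts that are not figures from the paper, it can be computed as:

```python
# Illustrative F-measure calculation. The counts below are hypothetical,
# not the evaluation figures from the ICEPO study.

def f_measure(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall (F1)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts from a blind evaluation set.
score = f_measure(tp=100, fp=45, fn=45)
print(f"F-measure: {score:.2%}")
```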

  12. Predictive Power of Attention and Reading Readiness Variables on Auditory Reasoning and Processing Skills of Six-Year-Old Children

    ERIC Educational Resources Information Center

    Erbay, Filiz

    2013-01-01

    The aim of the present research was to describe the relation between six-year-old children's attention and reading readiness skills (general knowledge, word comprehension, sentences, and matching) and their auditory reasoning and processing skills. This was a quantitative study based on a scanning (survey) model. The research sample consisted of 204 kindergarten…

  13. New Method for Quantitation of Lipid Droplet Volume From Light Microscopic Images With an Application to Determination of PAT Protein Density on the Droplet Surface.

    PubMed

    Dejgaard, Selma Y; Presley, John F

    2018-06-01

    Determination of lipid droplet (LD) volume has depended on direct measurement of the diameter of individual LDs, which is not possible when LDs are small or closely apposed. To overcome this problem, we describe a new method in which a volume-fluorescence relationship is determined from automated analysis of calibration samples containing well-resolved LDs. This relationship is then used to estimate total cellular droplet volume in experimental samples, where the LDs need not be individually resolved, or to determine the volumes of individual LDs. We describe quantitatively the effects of various factors, including image noise, LD crowding, and variation in LD composition, on the accuracy of this method. We then demonstrate the method by applying it to a scientifically interesting question: determining the density of green fluorescent protein (GFP)-tagged Perilipin-Adipocyte-Tail (PAT) proteins on the LD surface. We find that PAT proteins cover only a minority of the LD surface, consistent with models in which they primarily serve as scaffolds for binding of regulatory proteins and enzymes, but inconsistent with models in which their major function is to sterically block access to the droplet surface.
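    The calibration idea can be sketched as follows. This is an illustrative toy version, not the authors' automated image-analysis pipeline: the droplet diameters, the exact proportionality of fluorescence to volume, and all numerical values are assumptions.

```python
import numpy as np

# Calibration droplets: measured diameters (um) of well-resolved LDs.
diameters = np.array([0.8, 1.0, 1.3, 1.6, 2.0])
volumes = np.pi * diameters**3 / 6.0          # sphere volume, um^3

# Hypothetical integrated fluorescence readings (arbitrary units),
# simulated here as exactly proportional to volume for illustration.
fluor = 50.0 * volumes

# Least-squares fit of V = k * F through the origin.
k = np.sum(volumes * fluor) / np.sum(fluor**2)

# Apply the calibration to a sample where droplets are not resolved:
# total fluorescence alone yields an estimate of total droplet volume.
total_fluor = 120.0
estimated_total_volume = k * total_fluor
print(f"k = {k:.4f} um^3 per a.u.; estimated volume = {estimated_total_volume:.2f} um^3")
```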

  14. A novel quantitative model of cell cycle progression based on cyclin-dependent kinases activity and population balances.

    PubMed

    Pisu, Massimo; Concas, Alessandro; Cao, Giacomo

    2015-04-01

    Cell cycle regulates proliferative cell capacity under normal or pathologic conditions, and in general it governs all in vivo/in vitro cell growth and proliferation processes. Mathematical simulation by means of reliable and predictive models represents an important tool to interpret experimental results, to facilitate the definition of optimal operating conditions for in vitro cultivation, or to predict the effect of a specific drug in normal/pathologic mammalian cells. Along these lines, a novel model of cell cycle progression is proposed in this work. Specifically, it is based on a population balance (PB) approach that allows one to quantitatively describe cell cycle progression through the different phases experienced by each cell of the entire population during its own life. The transition between two consecutive cell cycle phases is simulated by taking advantage of the biochemical kinetic model developed by Gérard and Goldbeter (2009), which involves cyclin-dependent kinases (CDKs) whose regulation is achieved through a variety of mechanisms, including association with cyclins and protein inhibitors, phosphorylation-dephosphorylation, and cyclin synthesis or degradation. This biochemical model properly describes the entire cell cycle of mammalian cells while maintaining a level of detail sufficient to identify the checkpoints for phase transitions and to estimate the phase durations required by the PB approach. Specific examples are discussed to illustrate the ability of the proposed model to simulate the effect of drugs in in vitro trials of interest in oncology, regenerative medicine, and tissue engineering. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Modeling Human Dynamics of Face-to-Face Interaction Networks

    NASA Astrophysics Data System (ADS)

    Starnini, Michele; Baronchelli, Andrea; Pastor-Satorras, Romualdo

    2013-04-01

    Face-to-face interaction networks describe social interactions in human gatherings, and are the substrate for processes such as epidemic spreading and gossip propagation. The bursty nature of human behavior characterizes many aspects of empirical data, such as the distribution of conversation lengths, of conversations per person, or of interconversation times. Despite several recent attempts, a general theoretical understanding of the global picture emerging from data is still lacking. Here we present a simple model that reproduces quantitatively most of the relevant features of empirical face-to-face interaction networks. The model describes agents that perform a random walk in a two-dimensional space and are characterized by an attractiveness whose effect is to slow down the motion of people around them. The proposed framework sheds light on the dynamics of human interactions and can improve the modeling of dynamical processes taking place on the ensuing dynamical social networks.
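    The core ingredient of the model, agents performing a two-dimensional random walk while being slowed by the attractiveness of nearby agents, can be sketched in a toy simulation. All parameter values here (box size, interaction radius, step length, agent count) are illustrative assumptions, not those of the paper:

```python
# Toy sketch of an attractiveness-based random walk: each agent carries
# an attractiveness a in [0, 1), and at each step it moves only with
# probability 1 - (max attractiveness among its neighbors), so that
# attractive individuals slow down the agents around them.
import math
import random

random.seed(1)
L, RADIUS, N, STEPS = 10.0, 1.0, 30, 100

agents = [{"x": random.uniform(0, L), "y": random.uniform(0, L),
           "a": random.random()} for _ in range(N)]

def neighbors(i: int) -> list:
    """Indices of agents within RADIUS of agent i."""
    xi, yi = agents[i]["x"], agents[i]["y"]
    return [j for j in range(N) if j != i and
            math.hypot(agents[j]["x"] - xi, agents[j]["y"] - yi) < RADIUS]

for _ in range(STEPS):
    for i, ag in enumerate(agents):
        nbrs = neighbors(i)
        p_move = 1.0 - max((agents[j]["a"] for j in nbrs), default=0.0)
        if random.random() < p_move:          # unhindered -> random step
            ang = random.uniform(0, 2 * math.pi)
            ag["x"] = min(L, max(0.0, ag["x"] + 0.1 * math.cos(ang)))
            ag["y"] = min(L, max(0.0, ag["y"] + 0.1 * math.sin(ang)))

print("mean x position:", sum(a["x"] for a in agents) / N)
```

In the full model such slowing produces the heavy-tailed contact and interconversation time distributions seen in the empirical data.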

  16. Source-term development for a contaminant plume for use by multimedia risk assessment models

    NASA Astrophysics Data System (ADS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    2000-02-01

    Multimedia modelers from the US Environmental Protection Agency (EPA) and the US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise among the four models. Source characteristics and a release mechanism are developed and described, as is the typical process and procedure an analyst would follow in developing a source term for use with this class of analytical tool in a preliminary assessment.

  17. Mechanisms underlying anomalous diffusion in the plasma membrane.

    PubMed

    Krapf, Diego

    2015-01-01

    The plasma membrane is a complex fluid where lipids and proteins undergo diffusive motion critical to biochemical reactions. Through quantitative imaging analyses such as single-particle tracking, it is observed that diffusion in the cell membrane is usually anomalous in the sense that the mean squared displacement is not linear with time. This chapter surveys the different models employed to describe anomalous diffusion, paying special attention to the experimental evidence that supports these models in the plasma membrane. We review models based on anticorrelated displacements, such as fractional Brownian motion and obstructed diffusion, and nonstationary models such as continuous time random walks. We also emphasize evidence for the formation of distinct compartments that transiently form on the cell surface. Finally, we overview heterogeneous diffusion processes in the plasma membrane, which have recently attracted considerable interest. Copyright © 2015. Published by Elsevier Inc.
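    The single-particle-tracking observable referred to above, the time-averaged mean squared displacement and its scaling exponent, can be estimated from a trajectory as in this brief sketch. The trajectory here is synthetic Brownian motion, so the fitted exponent should come out close to 1; subdiffusive membrane trajectories would yield an exponent below 1.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2D trajectory built from plain Brownian steps.
steps = rng.normal(0.0, 1.0, size=(10_000, 2))
traj = np.cumsum(steps, axis=0)

def time_averaged_msd(xy: np.ndarray, max_lag: int) -> np.ndarray:
    """Time-averaged mean squared displacement for lags 1..max_lag."""
    return np.array([np.mean(np.sum((xy[lag:] - xy[:-lag]) ** 2, axis=1))
                     for lag in range(1, max_lag + 1)])

lags = np.arange(1, 101)
msd = time_averaged_msd(traj, max_lag=100)

# Anomalous exponent alpha from MSD(t) ~ t**alpha via a log-log fit.
alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)
print(f"estimated anomalous exponent alpha = {alpha:.2f}")
```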

  18. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    PubMed

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra, a technique from computer science, allows us to describe a system in terms of the stochastic behaviour of individuals. We review the use of process algebra in biological systems and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing-scale problem: from the individual behaviour we can rigorously derive equations describing the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection and how it relates to individual interactions.

  19. Quantitative precipitation forecasts in the Alps - an assessment from the Forecast Demonstration Project MAP D-PHASE

    NASA Astrophysics Data System (ADS)

    Ament, F.; Weusthoff, T.; Arpagaus, M.; Rotach, M.

    2009-04-01

    The main aim of the WWRP Forecast Demonstration Project MAP D-PHASE is to demonstrate the performance of today's models in forecasting heavy precipitation and flood events in the Alpine region. To this end, an end-to-end, real-time forecasting system was installed and operated during the D-PHASE Operations Period from June to November 2007. This system includes 30 numerical weather prediction models (deterministic as well as ensemble systems) operated by weather services and research institutes, which issue alerts if predicted precipitation accumulations exceed critical thresholds. In addition to the real-time alerts, all relevant model fields of these simulations are stored in a central data archive. This comprehensive data set allows a detailed assessment of today's quantitative precipitation forecast (QPF) performance in the Alpine region. We will present results of QPF verification against Swiss radar and rain gauge data, both from a qualitative point of view, in terms of alerts, and from a quantitative perspective, in terms of precipitation rate. Various influencing factors, such as lead time, accumulation time, selection of warning thresholds, and bias corrections, will be discussed. In addition to traditional verification of area-average precipitation amounts, the ability of the models to predict the correct precipitation statistics, without requiring a point-to-point match, will be assessed using modern fuzzy verification techniques. Both analyses reveal significant advantages of deep-convection-resolving models over coarser models with parameterized convection. An intercomparison of the model forecasts themselves reveals remarkably high variability between different models, making it worthwhile to evaluate the potential of a multi-model ensemble. Various multi-model ensemble strategies will be tested by combining D-PHASE models into virtual ensemble systems.

  20. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive, and comparatively simple relative to titration-based experiments. (CS)

  1. Understanding the effects of diffusion and relaxation in magnetic resonance imaging using computational modeling

    NASA Astrophysics Data System (ADS)

    Russell, Greg

    The work described in this dissertation was motivated by a desire to better understand the cellular pathology of ischemic stroke. Two of the three bodies of research presented herein address an issue directly related to the investigation of ischemic stroke through the use of diffusion weighted magnetic resonance imaging (DWMRI) methods. The first topic concerns the development of a computationally efficient finite difference method, designed to evaluate the impact of microscopic tissue properties on the formation of DWMRI signal. For the second body of work, the effect of changing the intrinsic diffusion coefficient of a restricted sample on clinical DWMRI experiments is explored. The final body of work, while motivated by the desire to understand stroke, addresses the issue of acquiring large amounts of MRI data well suited for quantitative analysis in reduced scan time. In theory, the method could be used to generate quantitative parametric maps, including those depicting information gleaned through the use of DWMRI methods. Chapter 1 provides an introduction to several topics. The use of DWMRI methods in the study of ischemic stroke is described, and an introduction to the fundamental physical principles at work in MRI is provided, covering how magnetization is created in MRI experiments, how the MRI signal is induced, and the influence of spin-spin and spin-lattice relaxation. Attention is also given to describing how MRI measurements can be sensitized to diffusion through the use of qualitative and quantitative descriptions of the process. Finally, the reader is given a brief introduction to the use of numerical methods for solving partial differential equations. In Chapters 2, 3 and 4, three related bodies of research are presented in terms of research papers. In Chapter 2, a novel computational method is described. The method reduces the computational resources required to simulate DWMRI experiments. 
In Chapter 3, a detailed study on how changes in the intrinsic intracellular diffusion coefficient may influence clinical DWMRI experiments is described. In Chapter 4, a novel, non-steady state quantitative MRI method is described.

  2. Matrix viscoplasticity and its shielding by active mechanics in microtissue models: experiments and mathematical modeling

    NASA Astrophysics Data System (ADS)

    Liu, Alan S.; Wang, Hailong; Copeland, Craig R.; Chen, Christopher S.; Shenoy, Vivek B.; Reich, Daniel H.

    2016-09-01

    The biomechanical behavior of tissues under mechanical stimulation is critically important to physiological function. We report a combined experimental and modeling study of bioengineered 3D smooth muscle microtissues that reveals a previously unappreciated interaction between active cell mechanics and the viscoplastic properties of the extracellular matrix. The microtissues’ response to stretch/unstretch actuations, as probed by microcantilever force sensors, was dominated by cellular actomyosin dynamics. However, cell lysis revealed a viscoplastic response of the underlying model collagen/fibrin matrix. A model coupling Hill-type actomyosin dynamics with a plastic perfectly viscoplastic description of the matrix quantitatively accounts for the microtissue dynamics, including notably the cells’ shielding of the matrix plasticity. Stretch measurements of single cells confirmed the active cell dynamics, and were well described by a single-cell version of our model. These results reveal the need for new focus on matrix plasticity and its interactions with active cell mechanics in describing tissue dynamics.

  3. Simulation of Nonisothermal Consolidation of Saturated Soils Based on a Thermodynamic Model

    PubMed Central

    Cheng, Xiaohui

    2013-01-01

    Based on nonequilibrium thermodynamics, a thermo-hydro-mechanical coupling model for saturated soils is established, including a constitutive model without such concepts as a yield surface or flow rule. An elastic potential energy density function is defined to derive a hyperelastic relation among the effective stress, the elastic strain, and the dry density. The classical linear non-equilibrium thermodynamic theory is employed to quantitatively describe unrecoverable energy processes, such as the development of nonelastic deformation in materials, through the concepts of dissipative force and dissipative flow. In particular, the granular fluctuation, which represents the kinetic energy fluctuation and elastic potential energy fluctuation at the particulate scale caused by the irregular mutual movement between particles, is introduced into the model and described by the concept of granular entropy. Using this model, the nonisothermal consolidation of saturated clays under cyclic thermal loadings is simulated in this paper to validate the model. The results show that the nonisothermal consolidation is heavily OCR-dependent and unrecoverable. PMID:23983623

  4. Matrix viscoplasticity and its shielding by active mechanics in microtissue models: experiments and mathematical modeling

    PubMed Central

    Liu, Alan S.; Wang, Hailong; Copeland, Craig R.; Chen, Christopher S.; Shenoy, Vivek B.; Reich, Daniel H.

    2016-01-01

    The biomechanical behavior of tissues under mechanical stimulation is critically important to physiological function. We report a combined experimental and modeling study of bioengineered 3D smooth muscle microtissues that reveals a previously unappreciated interaction between active cell mechanics and the viscoplastic properties of the extracellular matrix. The microtissues’ response to stretch/unstretch actuations, as probed by microcantilever force sensors, was dominated by cellular actomyosin dynamics. However, cell lysis revealed a viscoplastic response of the underlying model collagen/fibrin matrix. A model coupling Hill-type actomyosin dynamics with a plastic perfectly viscoplastic description of the matrix quantitatively accounts for the microtissue dynamics, including notably the cells’ shielding of the matrix plasticity. Stretch measurements of single cells confirmed the active cell dynamics, and were well described by a single-cell version of our model. These results reveal the need for new focus on matrix plasticity and its interactions with active cell mechanics in describing tissue dynamics. PMID:27671239

  5. Simulation of nonisothermal consolidation of saturated soils based on a thermodynamic model.

    PubMed

    Zhang, Zhichao; Cheng, Xiaohui

    2013-01-01

    Based on nonequilibrium thermodynamics, a thermo-hydro-mechanical coupling model for saturated soils is established, including a constitutive model without such concepts as a yield surface or flow rule. An elastic potential energy density function is defined to derive a hyperelastic relation among the effective stress, the elastic strain, and the dry density. The classical linear non-equilibrium thermodynamic theory is employed to quantitatively describe unrecoverable energy processes, such as the development of nonelastic deformation in materials, through the concepts of dissipative force and dissipative flow. In particular, the granular fluctuation, which represents the kinetic energy fluctuation and elastic potential energy fluctuation at the particulate scale caused by the irregular mutual movement between particles, is introduced into the model and described by the concept of granular entropy. Using this model, the nonisothermal consolidation of saturated clays under cyclic thermal loadings is simulated in this paper to validate the model. The results show that the nonisothermal consolidation is heavily OCR-dependent and unrecoverable.

  6. Intrinsically disordered proteins--relation to general model expressing the active role of the water environment.

    PubMed

    Kalinowska, Barbara; Banach, Mateusz; Konieczny, Leszek; Marchewka, Damian; Roterman, Irena

    2014-01-01

    This work discusses the role of unstructured polypeptide chain fragments in shaping the protein's hydrophobic core. Based on the "fuzzy oil drop" model, which assumes an idealized distribution of hydrophobicity density described by the 3D Gaussian, we can determine which fragments make up the core and pinpoint residues whose location conflicts with theoretical predictions. We show that the structural influence of the water environment determines the positions of disordered fragments, leading to the formation of a hydrophobic core overlaid by a hydrophilic mantle. This phenomenon is further described by studying selected proteins which are known to be unstable and contain intrinsically disordered fragments. Their properties are established quantitatively, explaining the causative relation between the protein's structure and function and facilitating further comparative analyses of various structural models. © 2014 Elsevier Inc. All rights reserved.

  7. Tools for evaluating Veterinary Services: an external auditing model for the quality assurance process.

    PubMed

    Melo, E Correa

    2003-08-01

    The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following:
    - designing a model, or desirable image, against which a comparison can be made
    - establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis
    - establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.

  8. Contextual Advantage for State Discrimination

    NASA Astrophysics Data System (ADS)

    Schmid, David; Spekkens, Robert W.

    2018-02-01

    Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.

  9. A physical model describing the interaction of nuclear transport receptors with FG nucleoporin domain assemblies

    PubMed Central

    Zahn, Raphael; Osmanović, Dino; Ehret, Severin; Araya Callis, Carolina; Frey, Steffen; Stewart, Murray; You, Changjiang; Görlich, Dirk; Hoogenboom, Bart W; Richter, Ralf P

    2016-01-01

    The permeability barrier of nuclear pore complexes (NPCs) controls bulk nucleocytoplasmic exchange. It consists of nucleoporin domains rich in phenylalanine-glycine motifs (FG domains). As a bottom-up nanoscale model for the permeability barrier, we have used planar films produced with three different end-grafted FG domains, and quantitatively analyzed the binding of two different nuclear transport receptors (NTRs), NTF2 and Importin β, together with the concomitant film thickness changes. NTR binding caused only moderate changes in film thickness; the binding isotherms showed negative cooperativity and could all be mapped onto a single master curve. This universal NTR binding behavior – a key element for the transport selectivity of the NPC – was quantitatively reproduced by a physical model that treats FG domains as regular, flexible polymers, and NTRs as spherical colloids with a homogeneous surface, ignoring the detailed arrangement of interaction sites along FG domains and on the NTR surface. DOI: http://dx.doi.org/10.7554/eLife.14119.001 PMID:27058170

  10. Transmission of Bacterial Zoonotic Pathogens between Pets and Humans: The Role of Pet Food.

    PubMed

    Lambertini, Elisabetta; Buchanan, Robert L; Narrod, Clare; Pradhan, Abani K

    2016-01-01

    Recent Salmonella outbreaks associated with dry pet food and treats raised the level of concern for these products as vehicles of pathogen exposure for both pets and their owners. The need to characterize the microbiological and risk profiles of this class of products is currently not supported by sufficient specific data. This systematic review summarizes existing data on the main variables needed to support an ingredients-to-consumer quantitative risk model to (1) describe the microbial ecology of bacterial pathogens in the dry pet food production chain, (2) estimate pet exposure to pathogens through dry food consumption, and (3) assess human exposure and illness incidence due to contact with pet food and pets in the household. Risk models populated with the data summarized here will provide a tool to quantitatively address the emerging public health concerns associated with pet food and the effectiveness of mitigation measures. Results of such models can provide a basis for improvements in production processes, risk communication to consumers, and regulatory action.

  11. Describing Myxococcus xanthus Aggregation Using Ostwald Ripening Equations for Thin Liquid Films

    PubMed Central

    Bahar, Fatmagül; Pratt-Szeliga, Philip C.; Angus, Stuart; Guo, Jiaye; Welch, Roy D.

    2014-01-01

    When starved, a swarm of millions of Myxococcus xanthus cells coordinate their movement from outward swarming to inward coalescence. The cells then execute a synchronous program of multicellular development, arranging themselves into dome-shaped aggregates. Over the course of development, about half of the initial aggregates disappear, while others persist and mature into fruiting bodies. This work seeks to develop a quantitative model for aggregation that accurately simulates which aggregates will disappear and which will persist. We analyzed time-lapse movies of M. xanthus development, modeled aggregation using the equations that describe Ostwald ripening of droplets in thin liquid films, and predicted the disappearance and persistence of aggregates with an average accuracy of 85%. We then experimentally validated a prediction that is fundamental to this model by tracking individual fluorescent cells as they moved between aggregates and demonstrating that cell movement towards and away from aggregates correlates with aggregate disappearance. Describing development through this model may limit the number and type of molecular genetic signals needed to complete M. xanthus development, and it provides numerous additional testable predictions. PMID:25231319

  12. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and it is recognized as the main cause of human trichinellosis by the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with greater impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10⁻⁶ and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (PPinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allowed assessing the impact of different factors influencing the risk of acquiring trichinellosis. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. © 2015 Blackwell Verlag GmbH.

  13. Mass Spectrometry Based Identification of Geometric Isomers during Metabolic Stability Study of a New Cytotoxic Sulfonamide Derivatives Supported by Quantitative Structure-Retention Relationships

    PubMed Central

    Belka, Mariusz; Hewelt-Belka, Weronika; Sławiński, Jarosław; Bączek, Tomasz

    2014-01-01

    A set of 15 new sulphonamide derivatives presenting antitumor activity has been subjected to a metabolic stability study. The results showed that, besides products of biotransformation, some additional peaks occurred in the chromatograms. Tandem mass spectrometry revealed the same mass and fragmentation pathway, suggesting that geometric isomerization had occurred. To support this hypothesis, quantitative structure-retention relationships were applied. Human liver microsomes were used as an in vitro model of metabolism. The biotransformation reactions were tracked by a liquid chromatography assay, and fragmentation mass spectra were additionally recorded. In silico molecular modeling at a semi-empirical level was conducted as a starting point for molecular descriptor calculations. A quantitative structure-retention relationship model was built by applying multiple linear regression based on selected three-dimensional descriptors. The studied compounds revealed high metabolic stability, with a tendency to form hydroxylated biotransformation products; however, significant chemical instability under conditions simulating human body fluids was noticed. According to the literature and the MS data, geometric isomerization was suggested. The developed in silico model was able to describe the relationship between the geometry of isomer pairs and their chromatographic retention properties, thus supporting the hypothesis that the observed pairs of peaks are most likely geometric isomers. However, extensive structural investigations are needed to fully identify the isomers' geometry. An effort to describe MS fragmentation pathways of novel chemical structures is often not enough to propose structures of potential metabolites and products of other chemical reactions that can be observed in compound solutions in early drug discovery studies. 
The results indicate that the relatively inexpensive and not time- and labor-consuming in silico approach could be a good supportive tool assisting the identification of cis-trans isomers based on retention data. This methodology can be helpful during the structural identification of biotransformation and degradation products of new chemical entities, i.e., potential new drugs. PMID:24893169

  14. Coupling biology and oceanography in models.

    PubMed

    Fennel, W; Neumann, T

    2001-08-01

    The dynamics of marine ecosystems, i.e. the changes of observable chemical-biological quantities in space and time, are driven by biological and physical processes. Predictions of future developments of marine systems need a theoretical framework, i.e. models, solidly based on research and understanding of the different processes involved. The natural way to describe marine systems theoretically seems to be the embedding of chemical-biological models into circulation models. However, while circulation models are relatively advanced the quantitative theoretical description of chemical-biological processes lags behind. This paper discusses some of the approaches and problems in the development of consistent theories and indicates the beneficial potential of the coupling of marine biology and oceanography in models.

  15. A method for testing whether model predictions fall within a prescribed factor of true values, with an application to pesticide leaching

    USGS Publications Warehouse

    Parrish, Rudolph S.; Smith, Charles N.

    1990-01-01

    A quantitative method is described for testing whether model predictions fall within a specified factor of true values. The technique is based on classical theory for confidence regions on unknown population parameters and can be related to hypothesis testing in both univariate and multivariate situations. A capability index is defined that can be used as a measure of predictive capability of a model, and its properties are discussed. The testing approach and the capability index should facilitate model validation efforts and permit comparisons among competing models. An example is given for a pesticide leaching model that predicts chemical concentrations in the soil profile.
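A simplified, deterministic reading of the test is easy to sketch: a model is "capable at factor f" if every prediction lies within [t/f, t*f] of the corresponding true value t. The code below illustrates that idea only; it omits the paper's confidence-region and hypothesis-testing machinery, and the index defined here is an assumed stand-in for the paper's capability index.

```python
def capability_index(predicted, observed):
    """Smallest factor f such that every prediction p lies in [t/f, t*f]."""
    return max(max(p / t, t / p) for p, t in zip(predicted, observed))

def within_factor(predicted, observed, f):
    """True if all predictions fall within the prescribed factor f of truth."""
    return capability_index(predicted, observed) <= f

# Hypothetical soil-profile concentrations (observed) and model predictions.
obs = [0.80, 2.10, 5.00, 12.0]
pred = [1.10, 1.60, 6.30, 10.0]

ci = capability_index(pred, obs)   # worst-case multiplicative discrepancy
```

Here ci is 1.375, so this toy model passes a factor-of-2 criterion but fails a factor-of-1.3 one.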

  16. Information processing in bacteria: memory, computation, and statistical physics: a key issues review

    NASA Astrophysics Data System (ADS)

    Lan, Ganhui; Tu, Yuhai

    2016-05-01

    Living systems have to constantly sense their external environment and adjust their internal state in order to survive and reproduce. Biological systems, from systems as complex as the brain to a single E. coli cell, have to process these data in order to make appropriate decisions. How do biological systems sense external signals? How do they process the information? How do they respond to signals? Through years of intense study by biologists, many key molecular players and their interactions have been identified in different biological machineries that carry out these signaling functions. However, an integrated, quantitative understanding of the whole system is still lacking for most cellular signaling pathways, let alone the more complicated neural circuits. To study signaling processes in biology, the key thing to measure is the input-output relationship. The input is the signal itself, such as chemical concentration, external temperature, light (intensity and frequency), and more complex signals such as the face of a cat. The output can be protein conformational changes and covalent modifications (phosphorylation, methylation, etc), gene expression, cell growth and motility, as well as more complex output such as neuron firing patterns and behaviors of higher animals. Due to the inherent noise in biological systems, the measured input-output dependence is often noisy. These noisy data can be analysed by using powerful tools and concepts from information theory such as mutual information, channel capacity, and the maximum entropy hypothesis. This information theory approach has been successfully used to reveal the underlying correlations between key components of biological networks, to set bounds for network performance, and to understand possible network architecture in generating observed correlations.
Although the information theory approach provides a general tool in analysing noisy biological data and may be used to suggest possible network architectures in preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possibility of close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. 
coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also study the thermodynamic costs of adaptation for cells to maintain an accurate memory. The statistical physics based approach described here should be useful in understanding design principles for cellular biochemical circuits in general.

  17. Information processing in bacteria: memory, computation, and statistical physics: a key issues review.

    PubMed

    Lan, Ganhui; Tu, Yuhai

    2016-05-01

    Living systems have to constantly sense their external environment and adjust their internal state in order to survive and reproduce. Biological systems, from systems as complex as the brain to a single E. coli cell, have to process these data in order to make appropriate decisions. How do biological systems sense external signals? How do they process the information? How do they respond to signals? Through years of intense study by biologists, many key molecular players and their interactions have been identified in different biological machineries that carry out these signaling functions. However, an integrated, quantitative understanding of the whole system is still lacking for most cellular signaling pathways, let alone the more complicated neural circuits. To study signaling processes in biology, the key thing to measure is the input-output relationship. The input is the signal itself, such as chemical concentration, external temperature, light (intensity and frequency), and more complex signals such as the face of a cat. The output can be protein conformational changes and covalent modifications (phosphorylation, methylation, etc), gene expression, cell growth and motility, as well as more complex output such as neuron firing patterns and behaviors of higher animals. Due to the inherent noise in biological systems, the measured input-output dependence is often noisy. These noisy data can be analysed by using powerful tools and concepts from information theory such as mutual information, channel capacity, and the maximum entropy hypothesis. This information theory approach has been successfully used to reveal the underlying correlations between key components of biological networks, to set bounds for network performance, and to understand possible network architecture in generating observed correlations.
Although the information theory approach provides a general tool in analysing noisy biological data and may be used to suggest possible network architectures in preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network-the main players (nodes) and their interactions (links)-in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possibility of close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology. In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. 
coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also study the thermodynamic costs of adaptation for cells to maintain an accurate memory. The statistical physics based approach described here should be useful in understanding design principles for cellular biochemical circuits in general.

  18. Mechanistic quantitative structure-activity relationship model for the photoinduced toxicity of polycyclic aromatic hydrocarbons. 2: An empirical model for the toxicity of 16 polycyclic aromatic hydrocarbons to the duckweed Lemna gibba L. G-3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, X.D.; Krylov, S.N.; Ren, L.

    1997-11-01

    Photoinduced toxicity of polycyclic aromatic hydrocarbons (PAHs) occurs via photosensitization reactions (e.g., generation of singlet-state oxygen) and by photomodification (photooxidation and/or photolysis) of the chemicals to more toxic species. The quantitative structure-activity relationship (QSAR) described in the companion paper predicted, in theory, that photosensitization and photomodification additively contribute to toxicity. To substantiate this QSAR modeling exercise it was necessary to show that toxicity can be described by empirically derived parameters. The toxicity of 16 PAHs to the duckweed Lemna gibba was measured as inhibition of leaf production in simulated solar radiation (a light source with a spectrum similar to that of sunlight). A predictive model for toxicity was generated based on the theoretical model developed in the companion paper. The photophysical descriptors required of each PAH for modeling were efficiency of photon absorbance, relative uptake, quantum yield for triplet-state formation, and the rate of photomodification. The photomodification rates of the PAHs showed a moderate correlation to toxicity, whereas a derived photosensitization factor (PSF; based on absorbance, triplet-state quantum yield, and uptake) for each PAH showed only a weak, complex correlation to toxicity. However, summing the rate of photomodification and the PSF resulted in a strong correlation to toxicity that had predictive value. When the PSF and a derived photomodification factor (PMF; based on the photomodification rate and toxicity of the photomodified PAHs) were summed, an excellent explanatory model of toxicity was produced, substantiating the additive contributions of the two factors.

  19. The fluid mechanics of thrombus formation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Experimental data are presented for the growth of thrombi (blood clots) in a stagnation point flow of fresh blood. Thrombus shape, size and structure are shown to depend on local flow conditions. The evolution of a thrombus is described in terms of a physical model that includes platelet diffusion, a platelet aggregation mechanism, and diffusion and convection of the chemical species responsible for aggregation. Diffusion-controlled and convection-controlled regimes are defined by flow parameters and thrombus location, and the characteristic growth pattern in each regime is explained. Quantitative comparisons with an approximate theoretical model are presented, and a more general model is formulated.

  20. Evaluation and development of satellite inferences of convective storm intensity using combined case study and thunderstorm model simulations

    NASA Technical Reports Server (NTRS)

    Cotton, W. R.; Tripoli, G. J.

    1982-01-01

    Observational requirements for predicting convective storm development and intensity as suggested by recent numerical experiments are examined. Recent 3D numerical experiments are interpreted with regard to the relationship between overshooting tops and surface wind gusts. The development of software for emulating satellite-inferred cloud properties using 3D cloud model predicted data and the simulation of the Heymsfield (1981) Northern Illinois storm are described, as well as the development of a conceptual/semi-quantitative model of eastward-propagating mesoscale convective complexes forming to the lee of the Rocky Mountains.

  1. Norwalk virus: how infectious is it?

    PubMed

    Teunis, Peter F M; Moe, Christine L; Liu, Pengbo; Miller, Sara E; Lindesmith, Lisa; Baric, Ralph S; Le Pendu, Jacques; Calderon, Rebecca L

    2008-08-01

    Noroviruses are major agents of viral gastroenteritis worldwide. The infectivity of Norwalk virus, the prototype norovirus, has been studied in susceptible human volunteers. A new variant of the hit theory model of microbial infection was developed to estimate the variation in Norwalk virus infectivity, as well as the degree of virus aggregation, consistent with independent (electron microscopic) observations. Explicit modeling of viral aggregation allows us to express virus infectivity per single infectious unit (particle). Comparison of a primary and a secondary inoculum showed that passage through a human host does not change Norwalk virus infectivity. We estimate the average probability of infection for a single Norwalk virus particle to be close to 0.5, exceeding that reported for any other virus studied to date. Infected subjects had a dose-dependent probability of becoming ill, ranging from 0.1 (at a dose of 10³ NV genomes) to 0.7 (at 10⁸ virus genomes). A norovirus dose response model is important for understanding its transmission and essential for development of a quantitative risk model. Norwalk virus is a valuable model system to study virulence because genetic factors are known for both complete and partial protection; the latter can be quantitatively described as heterogeneity in dose response models.
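Hit-theory dose-response models of this kind are commonly summarized by the approximate beta-Poisson form P(inf) = 1 - (1 + d/beta)^(-alpha). The sketch below uses that textbook form with illustrative parameters, not the fitted Norwalk virus values from this study:

```python
def beta_poisson(dose, alpha, beta):
    """Approximate beta-Poisson dose-response: P(inf) = 1 - (1 + d/beta)**-alpha."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Illustrative (not fitted) shape parameters; the infection probability
# rises monotonically with dose and saturates toward 1.
doses = [1.0, 10.0, 1e3, 1e8]
probs = [beta_poisson(d, alpha=0.2, beta=10.0) for d in doses]
```

A fitted model of this shape is what lets a quantitative risk assessment translate an exposure dose into an infection probability.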

  2. Effects of a 20 year rain event: a quantitative microbial risk assessment of a case of contaminated bathing water in Copenhagen, Denmark.

    PubMed

    Andersen, S T; Erichsen, A C; Mark, O; Albrechtsen, H-J

    2013-12-01

    Quantitative microbial risk assessments (QMRAs) often lack data on water quality leading to great uncertainty in the QMRA because of the many assumptions. The quantity of waste water contamination was estimated and included in a QMRA on an extreme rain event leading to combined sewer overflow (CSO) to bathing water where an ironman competition later took place. Two dynamic models, (1) a drainage model and (2) a 3D hydrodynamic model, estimated the dilution of waste water from source to recipient. The drainage model estimated that 2.6% of waste water was left in the system before CSO and the hydrodynamic model estimated that 4.8% of the recipient bathing water came from the CSO, so on average there was 0.13% of waste water in the bathing water during the ironman competition. The total estimated incidence rate from a conservative estimate of the pathogenic load of five reference pathogens was 42%, comparable to 55% in an epidemiological study of the case. The combination of applying dynamic models and exposure data led to an improved QMRA that included an estimate of the dilution factor. This approach has not been described previously.
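The dilution chain in the abstract is a straightforward multiplication of the two model outputs; a sketch with the numbers taken from the abstract:

```python
# Drainage model: 2.6% of the waste water was still in the system at overflow,
# so the CSO itself was about 2.6% waste water (an interpretive assumption).
frac_wastewater_in_cso = 0.026
# Hydrodynamic model: 4.8% of the recipient bathing water came from the CSO.
frac_cso_in_bathing = 0.048

# Chained dilution from source to swimmer:
frac_wastewater_in_bathing = frac_wastewater_in_cso * frac_cso_in_bathing
# 0.026 * 0.048 = 0.001248, i.e. roughly the 0.13% quoted in the abstract
# (the study's models presumably carry more precision than these rounded inputs).
```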

  3. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers--fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition. © 2014 Elsevier Inc. All rights reserved.
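The ratiometric comparison method rests on fluorescence intensity scaling linearly with copy number, so an unknown is counted against a standard of known stoichiometry. A minimal sketch with hypothetical intensities and a hypothetical 32-copy standard:

```python
def copies_by_ratio(intensity_unknown, intensity_standard, copies_standard):
    """Ratiometric comparison: copy number scales linearly with intensity."""
    return copies_standard * intensity_unknown / intensity_standard

# Hypothetical numbers: a calibration standard known to carry 32 fluorophore
# copies, and background-subtracted spot intensities in arbitrary units.
n = copies_by_ratio(intensity_unknown=5400.0,
                    intensity_standard=1800.0,
                    copies_standard=32)
```

In practice both intensities would be averaged over many spots and corrected for background and photobleaching before taking the ratio.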

  4. Structures of glycans bound to receptors from saturation transfer difference (STD) NMR spectroscopy: quantitative analysis by using CORCEMA-ST.

    PubMed

    Enríquez-Navas, Pedro M; Guzzi, Cinzia; Muñoz-García, Juan C; Nieto, Pedro M; Angulo, Jesús

    2015-01-01

    Glycan-receptor interactions are of fundamental relevance for a large number of biological processes, and their kinetic properties (medium/weak binding affinities) make them well suited to study by ligand-observed NMR techniques, among which saturation transfer difference (STD) NMR spectroscopy has been shown to be a very robust and powerful approach. The quantitative analysis of the results from an STD NMR study of a glycan-receptor interaction is essential for translating the resulting spectral intensities into a 3D molecular model of the complex. This chapter describes how to carry out such a quantitative analysis by means of the Complete Relaxation and Conformational Exchange Matrix Approach for STD NMR (CORCEMA-ST), in general terms, and an example of a previous work on an antibody-glycan interaction is also shown.

  5. Shape and shear guide sperm cells spiraling upstream

    NASA Astrophysics Data System (ADS)

    Kantsler, Vasily; Dunkel, Jorn; Goldstein, Raymond E.

    2014-11-01

    A major puzzle in biology is how mammalian sperm determine and maintain the correct swimming direction during the various phases of the sexual reproduction process. Currently debated mechanisms for long-range sperm travel vary from peristaltic pumping to temperature sensing (thermotaxis) and direct response to fluid flow (rheotaxis), but little is known quantitatively about their relative importance. Here, we report the first quantitative experimental study of mammalian sperm rheotaxis. Using microfluidic devices, we investigate systematically the swimming behavior of human and bull sperm over a wide range of physiologically relevant shear rates and viscosities. Our measurements show that the interplay of fluid shear, steric surface interactions, and chirality of the flagellar beat leads to a stable upstream spiraling motion of sperm cells, thus providing a generic and robust rectification mechanism to support mammalian fertilization. To rationalize these findings, we identify a minimal mathematical model that is capable of describing quantitatively the experimental observations.

  6. Mechanistic modeling to predict the transporter- and enzyme-mediated drug-drug interactions of repaglinide.

    PubMed

    Varma, Manthena V S; Lai, Yurong; Kimoto, Emi; Goosen, Theunis C; El-Kattan, Ayman F; Kumar, Vikas

    2013-04-01

    Quantitative prediction of complex drug-drug interactions (DDIs) is challenging. Repaglinide is mainly metabolized by cytochrome-P-450 (CYP)2C8 and CYP3A4, and is also a substrate of organic anion transporting polypeptide (OATP)1B1. The purpose of this study was to develop a physiologically based pharmacokinetic (PBPK) model to predict the pharmacokinetics and DDIs of repaglinide. In vitro hepatic transport of repaglinide, gemfibrozil and gemfibrozil 1-O-β-glucuronide was characterized using sandwich-culture human hepatocytes. A PBPK model, implemented in Simcyp (Sheffield, UK), was developed utilizing in vitro transport and metabolic clearance data. In vitro studies suggested significant active hepatic uptake of repaglinide. The mechanistic model adequately described repaglinide pharmacokinetics, and successfully predicted DDIs with several OATP1B1 and CYP3A4 inhibitors (<10% error). Furthermore, the repaglinide-gemfibrozil interaction at therapeutic dose was closely predicted using an in vitro fraction metabolized for CYP2C8 (0.71), when primarily considering reversible inhibition of OATP1B1 and mechanism-based inactivation of CYP2C8 by gemfibrozil and gemfibrozil 1-O-β-glucuronide. This study demonstrated that hepatic uptake is rate-determining in the systemic clearance of repaglinide. The model quantitatively predicted several repaglinide DDIs, including the complex interactions with gemfibrozil. Both OATP1B1 and CYP2C8 inhibition contribute significantly to the repaglinide-gemfibrozil interaction, and need to be considered for quantitative rationalization of DDIs with either drug.

  7. Applications of the hybrid coordinate method to the TOPS autopilot

    NASA Technical Reports Server (NTRS)

    Fleischer, G. E.

    1978-01-01

    Preliminary results are presented from the application of the hybrid coordinate method to modeling TOPS (thermoelectric outer planet spacecraft) structural dynamics. Computer-simulated responses of the vehicle are included which illustrate the interaction of relatively flexible appendages with an autopilot control system. Comparisons were made between simplified single-axis models of the control loop, with spacecraft flexibility represented by hinged rigid bodies, and a very detailed three-axis spacecraft model whose flexible portions are described by modal coordinates. While single-axis system root loci provided reasonable qualitative indications of stability margins in this case, they were quantitatively optimistic when matched against responses of the detailed model.

  8. Analysis Tools for Interconnected Boolean Networks With Biological Applications.

    PubMed

    Chaves, Madalena; Tournier, Laurent

    2018-01-01

    Boolean networks with asynchronous updates are a class of logical models particularly well adapted to describe the dynamics of biological networks with uncertain measures. The state space of these models can be described by an asynchronous state transition graph, which represents all the possible exits from every single state, and gives a global image of all the possible trajectories of the system. In addition, the asynchronous state transition graph can be associated with an absorbing Markov chain, further providing a semi-quantitative framework where it becomes possible to compute probabilities for the different trajectories. For large networks, however, such direct analyses become computationally intractable, given the exponential dimension of the graph. Exploiting the general modularity of biological systems, we have introduced the novel concept of the asymptotic graph, computed as an interconnection of several asynchronous transition graphs and recovering all asymptotic behaviors of a large interconnected system from the behavior of its smaller modules. From a modeling point of view, the interconnection of networks is very useful to address, for instance, the interplay between known biological modules and to test different hypotheses on the nature of their mutual regulatory links. This paper develops two new features of this general methodology: a quantitative dimension is added to the asymptotic graph through the computation of relative probabilities for each final attractor, and a companion cross-graph is introduced to complement the method from a theoretical point of view.
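The asynchronous state transition graph itself is straightforward to build for a small network: from each state, every single-variable update that changes the state contributes one outgoing edge, and states with no outgoing edges are fixed-point attractors. The toy two-gene toggle switch below (mutual inhibition, a hypothetical network not taken from the paper) is a minimal sketch:

```python
from itertools import product

# Toy two-gene toggle switch (hypothetical rules): each gene represses the other.
rules = [
    lambda x, y: not y,   # next value of gene x
    lambda x, y: not x,   # next value of gene y
]

def async_successors(state):
    """States reachable from `state` by updating exactly one variable."""
    succ = set()
    for i, rule in enumerate(rules):
        nxt = list(state)
        nxt[i] = rule(*state)
        if tuple(nxt) != state:
            succ.add(tuple(nxt))
    return succ

# Asynchronous state transition graph over all 2**2 Boolean states.
stg = {s: async_successors(s) for s in product([False, True], repeat=2)}

# States with no outgoing edge are fixed-point attractors: here the two
# stable configurations of the switch, (x off, y on) and (x on, y off).
fixed_points = [s for s, succ in stg.items() if not succ]
```

Associating this graph with an absorbing Markov chain, as the paper does, amounts to assigning probabilities to the outgoing edges of each state and computing absorption probabilities into the attractors.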

  9. Boiling points of halogenated aliphatic compounds: a quantitative structure-property relationship for prediction and validation.

    PubMed

    Oberg, Tomas

    2004-01-01

    Halogenated aliphatic compounds have many technical uses, but substances within this group are also ubiquitous environmental pollutants that can affect the ozone layer and contribute to global warming. The establishment of quantitative structure-property relationships is of interest not only to fill in gaps in the available database but also to validate experimental data already acquired. The three-dimensional structures of 240 compounds were modeled with molecular mechanics prior to the generation of empirical descriptors. Two bilinear projection methods, principal component analysis (PCA) and partial-least-squares regression (PLSR), were used to identify outliers. PLSR was subsequently used to build a multivariate calibration model by extracting the latent variables that describe most of the covariation between the molecular structure and the boiling point. Boiling points were also estimated with an extension of the group contribution method of Stein and Brown.

  10. An overview of a multifactor-system theory of personality and individual differences: III. Life span development and the heredity-environment issue.

    PubMed

    Powell, A; Royce, J R

    1981-12-01

    In Part III of this three-part series on multifactor-system theory, multivariate, life-span development is approached from the standpoint of a quantitative and qualitative analysis of the ontogenesis of factors in each of the six systems. The pattern of quantitative development (described via the Gompertz equation and three developmental parameters) involves growth, stability, and decline, and qualitative development involves changes in the organization of factors (e.g., factor differentiation and convergence). Hereditary and environmental sources of variation are analyzed via the factor gene model and the concept of heredity-dominant factors, and the factor-learning model and environment-dominant factors. It is hypothesized that the sensory and motor systems are heredity dominant, that the style and value systems are environment dominant, and that the cognitive and affective systems are partially heredity dominant.
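The Gompertz equation referred to here is the standard sigmoidal growth law y(t) = A*exp(-b*exp(-c*t)); how the abstract's three developmental parameters map onto A, b, and c is an assumption of this sketch:

```python
import math

def gompertz(t, asymptote, displacement, rate):
    """Gompertz growth curve y(t) = A * exp(-b * exp(-c * t))."""
    return asymptote * math.exp(-displacement * math.exp(-rate * t))

# Illustrative parameters: growth from near zero toward a stable asymptote,
# matching the growth-then-stability part of the developmental pattern.
ys = [gompertz(t, asymptote=100.0, displacement=5.0, rate=0.3)
      for t in range(0, 41, 10)]
```

The decline phase described in the abstract is not captured by the bare Gompertz curve and would require an additional parameterization.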

  11. Microtubules soften due to cross-sectional flattening

    DOE PAGES

    Memet, Edvin; Hilitski, Feodor; Morris, Margaret A.; ...

    2018-06-01

    We use optical trapping to continuously bend an isolated microtubule while simultaneously measuring the applied force and the resulting filament strain, thus allowing us to determine its elastic properties over a wide range of applied strains. We find that, while in the low-strain regime, microtubules may be quantitatively described in terms of the classical Euler-Bernoulli elastic filament, above a critical strain they deviate from this simple elastic model, showing a softening response with increasing deformations. A three-dimensional thin-shell model, in which the increased mechanical compliance is caused by flattening and eventual buckling of the filament cross-section, captures this softening effect in the high strain regime and yields quantitative values of the effective mechanical properties of microtubules. Our results demonstrate that properties of microtubules are highly dependent on the magnitude of the applied strain and offer a new interpretation for the large variety in microtubule mechanical data measured by different methods.
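In the low-strain regime the abstract refers to, the filament behaves as a classical Euler-Bernoulli beam: a cantilevered filament of length L and flexural rigidity EI deflects by delta = F*L^3/(3*EI) under a tip force F. The numbers below are illustrative order-of-magnitude values for a microtubule, not measurements from the study:

```python
def tip_deflection(force, length, flexural_rigidity):
    """Euler-Bernoulli cantilever tip deflection: delta = F * L**3 / (3 * EI).

    Valid only for small strains, the regime in which the abstract says the
    classical filament model holds.
    """
    return force * length**3 / (3.0 * flexural_rigidity)

# Illustrative order-of-magnitude values for a microtubule (not from the study):
EI = 2.0e-23      # flexural rigidity, N m^2
L = 10e-6         # filament length, 10 micrometres
F = 1.0e-14       # applied tip force, 0.01 pN

delta = tip_deflection(F, L, EI)   # small deflection, delta/L well below 1
```

Above the critical strain, the study's thin-shell model replaces this linear relation with a softening force-deflection curve.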

  12. Microtubules soften due to cross-sectional flattening

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Memet, Edvin; Hilitski, Feodor; Morris, Margaret A.

    We use optical trapping to continuously bend an isolated microtubule while simultaneously measuring the applied force and the resulting filament strain, thus allowing us to determine its elastic properties over a wide range of applied strains. We find that, while in the low-strain regime, microtubules may be quantitatively described in terms of the classical Euler-Bernoulli elastic filament, above a critical strain they deviate from this simple elastic model, showing a softening response with increasing deformations. A three-dimensional thin-shell model, in which the increased mechanical compliance is caused by flattening and eventual buckling of the filament cross-section, captures this softening effect in the high strain regime and yields quantitative values of the effective mechanical properties of microtubules. Our results demonstrate that properties of microtubules are highly dependent on the magnitude of the applied strain and offer a new interpretation for the large variety in microtubule mechanical data measured by different methods.

  13. Predicting unfolding thermodynamics and stable intermediates for alanine-rich helical peptides with the aid of coarse-grained molecular simulation.

    PubMed

    Calero-Rubio, Cesar; Paik, Bradford; Jia, Xinqiao; Kiick, Kristi L; Roberts, Christopher J

    2016-10-01

    This report focuses on the molecular-level processes and thermodynamics of unfolding of a series of helical peptides using a coarse-grained (CG) molecular model. The CG model was refined to capture thermodynamics and structural changes as a function of temperature for a set of published peptide sequences. Circular dichroism spectroscopy (CD) was used to experimentally monitor the temperature-dependent conformational changes and stability of published peptides and new sequences introduced here. The model predictions were quantitatively or semi-quantitatively accurate in all cases. The simulations and CD results showed that, as expected, in most cases the unfolding of helical peptides is well described by a simple 2-state model, and conformational stability increased with increased length of the helices. A notable exception in a 19-residue helix was when two Ala residues were each replaced with Phe. This stabilized a partly unfolded intermediate state via hydrophobic contacts, and also promoted aggregates at higher peptide concentrations. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Qualitative and Quantitative Distinctions in Personality Disorder

    PubMed Central

    Wright, Aidan G. C.

    2011-01-01

    The “categorical-dimensional debate” has catalyzed a wealth of empirical advances in the study of personality pathology. However, this debate is merely one articulation of a broader conceptual question regarding whether to define and describe psychopathology as a quantitatively extreme expression of normal functioning or as qualitatively distinct in its process. In this paper I argue that dynamic models of personality (e.g., object-relations, cognitive-affective processing system) offer the conceptual scaffolding to reconcile these seemingly incompatible approaches to characterizing the relationship between normal and pathological personality. I propose that advances in personality assessment that sample behavior and experiences intensively provide the empirical techniques, whereas interpersonal theory offers an integrative theoretical framework, for accomplishing this goal. PMID:22804676

  15. The Relationship of Item-Level Response Times with Test-Taker and Item Variables in an Operational CAT Environment. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Swygert, Kimberly A.

    In this study, data from an operational computerized adaptive test (CAT) were examined in order to gather information concerning item response times in a CAT environment. The CAT under study included multiple-choice items measuring verbal, quantitative, and analytical reasoning. The analyses included the fitting of regression models describing the…

  16. Conducting quantitative synthesis when comparing medical interventions: AHRQ and the Effective Health Care Program.

    PubMed

    Fu, Rongwei; Gartlehner, Gerald; Grant, Mark; Shamliyan, Tatyana; Sedrakyan, Art; Wilt, Timothy J; Griffith, Lauren; Oremus, Mark; Raina, Parminder; Ismaila, Afisi; Santaguida, Pasqualina; Lau, Joseph; Trikalinos, Thomas A

    2011-11-01

    This article establishes recommendations for conducting quantitative synthesis, or meta-analysis, using study-level data in comparative effectiveness reviews (CERs) for the Evidence-based Practice Center (EPC) program of the Agency for Healthcare Research and Quality. We focused on recurrent issues in the EPC program, and the recommendations were developed through group discussion and consensus based on current knowledge in the literature. We first discussed considerations for deciding whether to combine studies, followed by discussions of indirect comparison and incorporation of indirect evidence. Then, we described our recommendations on choosing effect measures and statistical models, giving special attention to combining studies with rare events, and on testing and exploring heterogeneity. Finally, we briefly presented recommendations on combining studies of mixed design and on sensitivity analysis. Quantitative synthesis should be conducted in a transparent and consistent way. Inclusion of multiple alternative interventions in CERs increases the complexity of quantitative synthesis, but the basic issues remain crucial considerations for a CER. We will cover more issues in future versions and update and improve the recommendations as new research accumulates, to advance the goal of transparency and consistency. Copyright © 2011 Elsevier Inc. All rights reserved.
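
    As a minimal, generic illustration of study-level quantitative synthesis (fixed-effect inverse-variance pooling; the article's recommendations cover far more, including random-effects models and rare-event methods), one might write:

```python
import math

def pool_fixed(effects, variances):
    """Fixed-effect inverse-variance pooling of study-level effect
    estimates: weight each study by 1/variance."""
    w = [1.0 / v for v in variances]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))  # standard error of the pooled estimate
    return est, se

# Hypothetical log-odds-ratios and variances from three studies
est, se = pool_fixed([0.2, 0.5, 0.3], [0.04, 0.09, 0.06])
print(round(est, 3), round(se, 3))
```

    The pooled standard error is smaller than any single study's, which is the basic payoff of combining studies; whether combining is appropriate at all is the first question the recommendations address.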

  17. Quantitative Reasoning in Problem Solving

    ERIC Educational Resources Information Center

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to representing relationships among quantities and provide examples of problems and their solutions.

  18. Advances in Quantitative Proteomics of Microbes and Microbial Communities

    NASA Astrophysics Data System (ADS)

    Waldbauer, J.; Zhang, L.; Rizzo, A. I.

    2015-12-01

    Quantitative measurements of gene expression are key to developing a mechanistic, predictive understanding of how microbial metabolism drives many biogeochemical fluxes and responds to environmental change. High-throughput RNA-sequencing can afford a wealth of information about transcript-level expression patterns, but it is becoming clear that expression dynamics are often very different at the protein level where biochemistry actually occurs. These divergent dynamics between levels of biological organization necessitate quantitative proteomic measurements to address many biogeochemical questions. The protein-level expression changes that underlie shifts in the magnitude, or even the direction, of metabolic and biogeochemical fluxes can be quite subtle and test the limits of current quantitative proteomics techniques. Here we describe methodologies for high-precision, whole-proteome quantification that are applicable to both model organisms of biogeochemical interest that may not be genetically tractable, and to complex community samples from natural environments. Employing chemical derivatization of peptides with multiple isotopically-coded tags, this strategy is rapid and inexpensive, can be implemented on a wide range of mass spectrometric instrumentation, and is relatively insensitive to chromatographic variability. We demonstrate the utility of this quantitative proteomics approach in application to both isolates and natural communities of sulfur-metabolizing and photosynthetic microbes.
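
    Tag-based relative quantification of the kind described here ultimately reduces to ratios of isotope-channel intensities; a schematic sketch (hypothetical numbers, with the median as one common robust peptide-to-protein rollup, not necessarily the authors' pipeline):

```python
from statistics import median

def protein_ratio(peptide_ratios):
    """Protein-level relative abundance as the median of peptide-level
    channel ratios; a common robust choice, shown schematically."""
    return median(peptide_ratios)

# Hypothetical heavy/light ratios for peptides of one protein; the
# median resists a single outlier peptide (e.g. co-eluting interference)
print(protein_ratio([1.9, 2.0, 2.1, 8.0]))
```

    Detecting the subtle expression changes the abstract mentions depends on exactly this kind of robustness, since a single mis-quantified peptide can otherwise flip the apparent direction of a protein-level change.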

  19. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike those of its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. 
However, under these guidelines quantitative estimates of low-dose risks would not be developed for carcinogens having a non-linear mode of action; instead dose-response modelling would be used in the experimental range to calculate an LED10 (a statistical lower bound on the dose corresponding to a 10% increase in risk), and safety factors would be applied to the LED10 to determine acceptable exposure levels for humans. This approach is very similar to the one presently used by USEPA for non-carcinogens. Rather than using one approach for carcinogens believed to have a linear mode of action and a different approach for all other health effects, it is suggested herein that it would be more appropriate to use an approach conceptually similar to the 'LED10-safety factor' approach for all health effects, and not to routinely develop quantitative risk estimates from animal data.
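
    The multistage functional form underlying the LMS can be sketched directly; the coefficients below are illustrative, and the regulatory LMS additionally replaces the fitted linear coefficient with a statistical upper confidence bound, which this sketch does not compute.

```python
import math

def multistage_risk(d, q):
    """Multistage model: P(d) = 1 - exp(-(q0 + q1*d + q2*d^2 + ...))."""
    poly = sum(qi * d**i for i, qi in enumerate(q))
    return 1.0 - math.exp(-poly)

def extra_risk(d, q):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0, pd = multistage_risk(0.0, q), multistage_risk(d, q)
    return (pd - p0) / (1.0 - p0)

q = [0.01, 0.05, 0.002]  # illustrative q0, q1, q2 (not fitted bounds)
print(round(extra_risk(1.0, q), 4))
```

    At low dose the extra risk approaches q1*d, which is the low-dose linearity the abstract discusses: the linear coefficient alone drives the extrapolated risk, regardless of curvature in the experimental range.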

  20. Quantitation of dissolved gas content in emulsions and in blood using mass spectrometric detection.

    PubMed

    Grimley, Everett; Turner, Nicole; Newell, Clayton; Simpkins, Cuthbert; Rodriguez, Juan

    2011-06-01

    Quantitation of dissolved gases in blood or in other biological media is essential for understanding the dynamics of metabolic processes. Current detection techniques, while enabling rapid and convenient assessment of dissolved gases, provide only direct information on the partial pressure of gases dissolved in the aqueous fraction of the fluid. The more relevant quantity known as gas content, which refers to the total amount of the gas in all fractions of the sample, can be inferred from those partial pressures, but only indirectly through mathematical modeling. Here we describe a simple mass spectrometric technique for rapid and direct quantitation of gas content for a wide range of gases. The technique is based on a mass spectrometer detector that continuously monitors gases that are rapidly extracted from samples injected into a purge vessel. The accuracy and sample processing speed of the system is demonstrated with experiments that reproduce within minutes literature values for the solubility of various gases in water. The capability of the technique is further demonstrated through accurate determination of O(2) content in a lipid emulsion and in whole blood, using as little as 20 μL of sample. The approach to gas content quantitation described here should greatly expand the range of animals and conditions that may be used in studies of metabolic gas exchange, and facilitate the development of artificial oxygen carriers and resuscitation fluids. Copyright © 2011 Elsevier B.V. All rights reserved.
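
    The distinction between partial pressure and gas content can be illustrated with a Henry's-law estimate of the aqueous dissolved fraction alone; the solubility coefficient below is approximate, and total content in blood or an emulsion also includes gas in the other fractions, which is exactly what the purge-and-detect method measures directly rather than inferring.

```python
def dissolved_gas_content(p_partial_atm, solubility_ml_per_ml_atm, volume_ml):
    """Henry's-law sketch: gas dissolved in the aqueous fraction only,
    in mL of gas. Total gas content of a multi-phase sample must
    otherwise be inferred by modeling the remaining fractions."""
    return p_partial_atm * solubility_ml_per_ml_atm * volume_ml

# O2 in water near 25 degC: solubility roughly 0.031 mL gas per mL
# water per atm (approximate Bunsen-type coefficient)
print(round(dissolved_gas_content(0.21, 0.031, 1000.0), 2))
```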

  1. The Viareggio LPG railway accident: event reconstruction and modeling.

    PubMed

    Brambilla, Sara; Manca, Davide

    2010-10-15

    This manuscript describes in detail the LPG accident that occurred in Viareggio in June 2009 and its modeling. The accident investigation highlighted the uncertainty and complexity of assessing and modeling what happened in the congested environment close to the Viareggio railway station. Nonetheless, the analysis made it possible to understand the sequence of events, the way they influenced each other, and the different possible paths/evolutions. The paper describes suitable models for the quantitative assessment of the consequences of the most probable accidental dynamics and its outcomes. The main finding is that about 80 s after the beginning of the release the dense-gas cloud reached the surrounding houses, which were subsequently destroyed by internal explosions. This fact has two main implications. First, it shows that the adopted modeling framework can give a correct picture of what happened in Viareggio. Second, it confirms the need to develop effective mitigation measures because, in accidents of this kind, there is no time to apply protective emergency plans/actions. Copyright © 2010 Elsevier B.V. All rights reserved.

  2. Variable classification in the LSST era: exploring a model for quasi-periodic light curves

    NASA Astrophysics Data System (ADS)

    Zinn, J. C.; Kochanek, C. S.; Kozłowski, S.; Udalski, A.; Szymański, M. K.; Soszyński, I.; Wyrzykowski, Ł.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Skowron, J.; Mróz, P.; Pawlak, M.

    2017-06-01

    The Large Synoptic Survey Telescope (LSST) is expected to yield ~10⁷ light curves over the course of its mission, which will require a concerted effort in automated classification. Stochastic processes provide one means of quantitatively describing variability with the potential advantage over simple light-curve statistics that the parameters may be physically meaningful. Here, we survey a large sample of periodic, quasi-periodic and stochastic Optical Gravitational Lensing Experiment-III variables using the damped random walk (DRW; CARMA(1,0)) and quasi-periodic oscillation (QPO; CARMA(2,1)) stochastic process models. The QPO model is described by an amplitude, a period and a coherence time-scale, while the DRW has only an amplitude and a time-scale. We find that the periodic and quasi-periodic stellar variables are generally better described by a QPO than a DRW, while quasars are better described by the DRW model. There are ambiguities in interpreting the QPO coherence time due to non-sinusoidal light-curve shapes, signal-to-noise ratio, error mischaracterizations and cadence. Higher order implementations of the QPO model that better capture light-curve shapes are necessary for the coherence time to have its implied physical meaning. Independent of physical meaning, the extra parameter of the QPO model successfully distinguishes most of the classes of periodic and quasi-periodic variables we consider.
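
    A damped random walk of the kind used for classification here is simple to simulate exactly via its AR(1) updates (arbitrary illustrative parameters; the paper fits CARMA models to observed light curves rather than simulating them):

```python
import math, random

def simulate_drw(n, dt, tau, sigma, seed=1):
    """Damped random walk (CARMA(1,0)) on a regular grid: exact AR(1)
    updates with stationary standard deviation sigma and
    decorrelation time-scale tau."""
    rng = random.Random(seed)
    a = math.exp(-dt / tau)
    x = rng.gauss(0.0, sigma)  # draw from the stationary distribution
    out = [x]
    for _ in range(n - 1):
        x = a * x + rng.gauss(0.0, sigma * math.sqrt(1.0 - a * a))
        out.append(x)
    return out

lc = simulate_drw(5000, 1.0, 50.0, 0.2)
print(len(lc))  # → 5000
```

    The QPO (CARMA(2,1)) generalization adds an oscillatory kernel with a period and coherence time; the DRW's two parameters (amplitude, time-scale) are the limiting case the quasars in the sample prefer.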

  3. Calibrant-Free Analyte Quantitation via a Variable Velocity Flow Cell.

    PubMed

    Beck, Jason G; Skuratovsky, Aleksander; Granger, Michael C; Porter, Marc D

    2017-01-17

    In this paper, we describe a novel method for analyte quantitation that does not rely on calibrants, internal standards, or calibration curves but, rather, leverages the relationship between disparate and predictable surface-directed analyte flux to an array of sensing addresses and a measured resultant signal. To reduce this concept to practice, we fabricated two flow cells such that the mean linear fluid velocity, U, was varied systematically over an array of electrodes positioned along the flow axis. This resulted in a predictable variation of the address-directed flux of a redox analyte, ferrocenedimethanol (FDM). The resultant limiting currents measured at a series of these electrodes, and accurately described by a convective-diffusive transport model, provided a means to calculate an "unknown" concentration without the use of calibrants, internal standards, or a calibration curve. Furthermore, the experiment and concentration calculation only takes minutes to perform. Deviation in calculated FDM concentrations from true values was minimized to less than 0.5% when empirically derived values of U were employed.
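
    The core idea, that a known transport model converts each (current, velocity) pair into an independent concentration estimate, can be sketched with an assumed power-law flux dependence i_lim = k·C·U^(1/3) (a Levich-like stand-in for the authors' convective-diffusive model; k, alpha, and the data below are all hypothetical):

```python
def infer_concentration(currents, velocities, k, alpha=1.0 / 3.0):
    """Calibrant-free sketch: if a first-principles transport model
    predicts i_lim = k * C * U**alpha with k known, each (i, U) pair
    yields an independent estimate of C; average them."""
    estimates = [i / (k * U**alpha) for i, U in zip(currents, velocities)]
    return sum(estimates) / len(estimates)

# Hypothetical data consistent with C = 2.0 (arbitrary units), k = 1.5
U = [0.5, 1.0, 2.0, 4.0]
i = [1.5 * 2.0 * u ** (1 / 3) for u in U]
print(round(infer_concentration(i, U, k=1.5), 3))  # → 2.0
```

    The agreement of the per-electrode estimates with each other is itself a consistency check on the transport model, which is what replaces the calibration curve.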

  4. Interrelation of structure and operational states in cascading failure of overloading lines in power grids

    NASA Astrophysics Data System (ADS)

    Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying

    2017-09-01

    As the modern power system develops into a more intelligent and efficient version, i.e. the smart grid, or becomes the central backbone of the energy internet for free energy interactions, security concerns related to cascading failures have been raised, given their potentially catastrophic results. Research on topological analysis based on complex networks has contributed greatly to revealing structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, relying on inappropriate modeling assumptions, still cannot distinguish the effects of structure from those of the operational state, and so gives limited guidance for system operation. This paper reveals the interrelation between network structure and operational states in cascading failure and gives a quantitative evaluation integrating both perspectives. For structural analysis, cascading paths are identified by extended betweenness and quantitatively described by cascading drop and cascading gradient. The operational state along a cascading path is described by its loading level. The risk of cascading failure along a specific cascading path can then be quantitatively evaluated by considering these two factors, and the maximum cascading gradient over all possible cascading paths serves as an overall metric of the entire grid's cascading-failure characteristics. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; the simulation evidence suggests that the proposed model can identify the structural causes of cascading failure and is promising for guiding the protection of system operation in the future.
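
    The paper's extended-betweenness formulation is not reproduced here, but the generic overload-cascade mechanism it analyzes can be sketched with a toy load-shedding model: a failed line sheds its load onto survivors, which may then exceed their own capacities.

```python
def simulate_cascade(loads, capacities):
    """Toy overload cascade (a generic illustration, not the paper's
    extended-betweenness model): overloaded lines fail and shed their
    load equally onto surviving lines; iterate until stable."""
    loads = list(loads)
    alive = [True] * len(loads)
    while True:
        failed = [i for i, ok in enumerate(alive) if ok and loads[i] > capacities[i]]
        if not failed:
            return alive  # no further failures: cascade has stopped
        shed = sum(loads[i] for i in failed)
        for i in failed:
            alive[i] = False
        survivors = [i for i, ok in enumerate(alive) if ok]
        for i in survivors:
            loads[i] += shed / len(survivors)

print(simulate_cascade([1.2, 0.5, 0.5], [1.0, 1.0, 1.0]))
```

    The two runs in the test below show the structure/state interplay the abstract emphasizes: the same topology either absorbs a single failure or collapses entirely, depending only on the initial loading level.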

  5. Photogrammetry of the Human Brain: A Novel Method for Three-Dimensional Quantitative Exploration of the Structural Connectivity in Neurosurgery and Neurosciences.

    PubMed

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomic awareness of the structural connectivity of the brain is mandatory for neurosurgeons, to select the most effective approaches for brain resections. Although standard microdissection is a validated technique to investigate the different white matter (WM) pathways and to verify the results of tractography, the possibility of interactive exploration of the specimens and reliable acquisition of quantitative information has not been described. Photogrammetry is a well-established technique allowing an accurate metrology on highly defined three-dimensional (3D) models. The aim of this work is to propose the application of the photogrammetric technique for supporting the 3D exploration and the quantitative analysis on the cerebral WM connectivity. The main perisylvian pathways, including the superior longitudinal fascicle and the arcuate fascicle were exposed using the Klingler technique. The photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference magnetic resonance image of the specimen. All the acquisitions were coregistered into an open-source model. We analyzed 5 steps, including the cortical surface, the short intergyral fibers, the indirect posterior and anterior superior longitudinal fascicle, and the arcuate fascicle. The coregistration between the magnetic resonance imaging mesh and the point clouds models was highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during the postdissection analysis. These results open many new promising neuroscientific and educational perspectives and also optimize the quality of neurosurgical treatments. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Modeling analysis of pulsed magnetization process of magnetic core based on inverse Jiles-Atherton model

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Zhang, He; Liu, Siwei; Lin, Fuchang

    2018-05-01

    The J-A (Jiles-Atherton) model is widely used to describe the magnetization characteristics of magnetic cores in a low-frequency alternating field. However, this model is deficient in the quantitative analysis of the eddy current loss and residual loss in a high-frequency magnetic field. Based on the decomposition of magnetization intensity, an inverse J-A model is established which uses magnetic flux density B as an input variable. Static and dynamic core losses under high frequency excitation are separated based on the inverse J-A model. Optimized parameters of the inverse J-A model are obtained based on particle swarm optimization. The platform for the pulsed magnetization characteristic test is designed and constructed. The hysteresis curves of ferrite and Fe-based nanocrystalline cores at high magnetization rates are measured. The simulated and measured hysteresis curves are presented and compared. It is found that the inverse J-A model can be used to describe the magnetization characteristics at high magnetization rates and to separate the static loss and dynamic loss accurately.
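
    One building block shared by forward and inverse J-A implementations is the Langevin anhysteretic magnetization; a minimal sketch with illustrative parameters (not the paper's fitted values):

```python
import math

def m_anhysteretic(H, M, Ms, a, alpha):
    """Langevin anhysteretic magnetization used in the Jiles-Atherton
    model: M_an = Ms * (coth(He/a) - a/He), with effective field
    He = H + alpha*M. Parameters here are illustrative only."""
    He = H + alpha * M
    if abs(He) < 1e-12:
        return 0.0  # odd function: zero at zero effective field
    x = He / a
    return Ms * (1.0 / math.tanh(x) - 1.0 / x)

# Small-field response is roughly linear, slope Ms/(3a) for alpha = 0
print(round(m_anhysteretic(10.0, 0.0, 1.6e6, 1000.0, 0.0), 1))
```

    The full (inverse) model adds the hysteretic and eddy-current terms around this curve, which is what allows the static and dynamic losses to be separated as the abstract describes.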

  7. Improved partition equilibrium model for predicting analyte response in electrospray ionization mass spectrometry.

    PubMed

    Du, Lihong; White, Robert L

    2009-02-01

    A previously proposed partition equilibrium model for quantitative prediction of analyte response in electrospray ionization mass spectrometry is modified to yield an improved linear relationship. Analyte mass spectrometer response is modeled by a competition mechanism between analyte and background electrolytes that is based on partition equilibrium considerations. The correlation between analyte response and solution composition is described by the linear model over a wide concentration range and the improved model is shown to be valid for a wide range of experimental conditions. The behavior of an analyte in a salt solution, which could not be explained by the original model, is correctly predicted. The ion suppression effects of 16:0 lysophosphatidylcholine (LPC) on analyte signals are attributed to a combination of competition for excess charge and reduction of total charge due to surface tension effects. In contrast to the complicated mathematical forms that comprise the original model, the simplified model described here can more easily be employed to predict analyte mass spectrometer responses for solutions containing multiple components. Copyright (c) 2008 John Wiley & Sons, Ltd.
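
    The competition mechanism can be caricatured as species sharing the available excess charge in proportion to concentration-weighted affinities; this is an illustrative form in the spirit of partition-equilibrium models, not the paper's exact equations, and k values are hypothetical.

```python
def esi_response(c_analyte, c_background, k_analyte=1.0, k_background=1.0,
                 total_signal=1.0):
    """Charge-competition sketch of ESI response: the analyte's share of
    the excess charge is proportional to k_analyte * c_analyte relative
    to all competing species (illustrative form only)."""
    share = k_analyte * c_analyte
    total = share + k_background * c_background
    return total_signal * share / total

# Background electrolyte suppresses the analyte signal
print(esi_response(1.0, 0.0), esi_response(1.0, 1.0), esi_response(1.0, 9.0))
```

    Even this caricature reproduces the qualitative behavior the model addresses: signal proportional to analyte concentration at low background, and ion suppression as competing electrolytes claim a growing share of the charge.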

  8. Image analysis and modeling in medical image computing. Recent developments and advances.

    PubMed

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. 
Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.

  9. Kinetic Model of Growth of Arthropoda Populations

    NASA Astrophysics Data System (ADS)

    Ershov, Yu. A.; Kuznetsov, M. A.

    2018-05-01

    Kinetic equations were derived for calculating the growth of crustacean populations (Crustacea) based on the biological growth model suggested earlier using shrimp (Caridea) populations as an example. The development cycle of successive stages for populations can be represented in the form of quasi-chemical equations. The kinetic equations that describe the development cycle of crustaceans allow quantitative prediction of the development of populations depending on conditions. In contrast to extrapolation-simulation models, in the developed kinetic model of biological growth the kinetic parameters are the experimental characteristics of population growth. Verification and parametric identification of the developed model on the basis of the experimental data showed agreement with experiment within the error of the measurement technique.
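
    The quasi-chemical representation of successive developmental stages amounts to a chain of first-order "reactions"; a minimal Euler-integration sketch with made-up rate constants (the paper's stages and parameters are not reproduced here):

```python
def simulate_stages(k12, k23, n0=(1000.0, 0.0, 0.0), dt=0.01, steps=1000):
    """Quasi-chemical sketch: stages S1 -k12-> S2 -k23-> S3 treated as
    first-order kinetics, integrated with explicit Euler steps.
    Rates and initial numbers are illustrative."""
    n1, n2, n3 = n0
    for _ in range(steps):
        f12 = k12 * n1 * dt  # individuals maturing from stage 1 to 2
        f23 = k23 * n2 * dt  # individuals maturing from stage 2 to 3
        n1 -= f12
        n2 += f12 - f23
        n3 += f23
    return n1, n2, n3

print(tuple(round(n, 1) for n in simulate_stages(0.5, 0.3)))
```

    Total population is conserved by construction here; adding mortality or reproduction terms to each stage gives the kind of predictive population model the abstract describes.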

  10. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suited to the continental law system, is proposed to describe criminal cases. The model's logic adopts credibility logic to perform quantitative, evidence-based reasoning. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions from the credibility and supportability of the related evidence. Experiments show that the model can capture users' reasoning in a diagram and that the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006
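
    CDMBE's five relationship types and rule set are not reproduced here; as a generic illustration of combining each piece of evidence's credibility and supportability into an assumption's credibility, a noisy-OR aggregation might look like:

```python
def assumption_credibility(evidences):
    """Generic noisy-OR aggregation (an illustration, NOT CDMBE's actual
    rules): each independent evidence contributes
    credibility * supportability, and contributions accumulate."""
    p = 1.0
    for credibility, supportability in evidences:
        p *= 1.0 - credibility * supportability
    return 1.0 - p

# Two hypothetical supporting evidences
print(round(assumption_credibility([(0.8, 0.9), (0.6, 0.5)]), 3))
```

    Any such rule shares the qualitative property the model needs: adding a supporting evidence never lowers the assumption's credibility, and no finite set of imperfect evidences yields certainty.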

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jager, Yetta; Efroymson, Rebecca Ann; Sublette, K.

    Quantitative tools are needed to evaluate the ecological effects of increasing petroleum production. In this article, we describe two stochastic models for simulating the spatial distribution of brine spills on a landscape. One model uses general assumptions about the spatial arrangement of spills and their sizes; the second model distributes spills by siting rectangular well complexes and conditioning spill probabilities on the configuration of pipes. We present maps of landscapes with spills produced by the two methods and compare the ability of the models to reproduce a specified spill area. A strength of the models presented here is their ability to extrapolate from the existing landscape to simulate landscapes with a higher (or lower) density of oil wells.
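
    The first (general-assumptions) model can be caricatured as uniform random spill placement with a chosen size distribution; the exponential size distribution and all parameters below are assumptions for illustration, not the article's specification.

```python
import random

def simulate_spills(n_spills, mean_area, extent, seed=0):
    """Sketch of a general-assumptions spill model: place n_spills
    uniformly at random on a square landscape of side `extent`, with
    spill areas drawn from an exponential distribution (illustrative
    choice). Returns a list of (x, y, area) tuples."""
    rng = random.Random(seed)
    spills = []
    for _ in range(n_spills):
        x, y = rng.uniform(0, extent), rng.uniform(0, extent)
        area = rng.expovariate(1.0 / mean_area)  # mean = mean_area
        spills.append((x, y, area))
    return spills

spills = simulate_spills(100, 25.0, 1000.0)
print(len(spills), round(sum(s[2] for s in spills) / len(spills), 1))
```

    The second model in the article replaces the uniform placement with well-complex siting and pipe-conditioned spill probabilities; comparing both simulated landscapes against a specified total spill area is the calibration step the abstract describes.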

  12. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    PubMed

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. 
PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg.
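
    A quantitative localization signature is ultimately a feature vector, and pairwise similarity between proteins can then be computed directly; the correlation distance below is a generic stand-in for PLAST's actual measure.

```python
def corr_distance(a, b):
    """Pearson-correlation distance (1 - r) between two localization
    signatures represented as feature vectors; a generic stand-in for
    PLAST's similarity measure. Ranges from 0 (identical pattern)
    to 2 (perfectly anti-correlated pattern)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return 1.0 - cov / (va * vb)

# Proportional signatures are maximally similar regardless of scale
print(round(corr_distance([1, 2, 3, 4], [2, 4, 6, 8]), 3))
```

    A matrix of such pairwise distances over all proteins is what supports the co-localization and localization-function analyses the abstract reports, without ever assigning proteins to discrete compartments.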

  13. Quantitative Protein Localization Signatures Reveal an Association between Spatial and Functional Divergences of Proteins

    PubMed Central

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-01-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. 
PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg. PMID:24603469

  14. Three-dimensional structural modelling and calculation of electrostatic potentials of HLA Bw4 and Bw6 epitopes to explain the molecular basis for alloantibody binding: toward predicting HLA antigenicity and immunogenicity.

    PubMed

    Mallon, Dermot H; Bradley, J Andrew; Winn, Peter J; Taylor, Craig J; Kosmoliaptsis, Vasilis

    2015-02-01

    We have previously shown that qualitative assessment of surface electrostatic potential of HLA class I molecules helps explain serological patterns of alloantibody binding. We have now used a novel computational approach to quantitate differences in surface electrostatic potential of HLA B-cell epitopes and applied this to explain HLA Bw4 and Bw6 antigenicity. Protein structure models of HLA class I alleles expressing either the Bw4 or Bw6 epitope (defined by sequence motifs at positions 77 to 83) were generated using comparative structure prediction. The electrostatic potential in 3-dimensional space encompassing the Bw4/Bw6 epitope was computed by solving the Poisson-Boltzmann equation and quantitatively compared in a pairwise, all-versus-all fashion to produce distance matrices that cluster epitopes with similar electrostatics properties. Quantitative comparison of surface electrostatic potential at the carboxyl terminal of the α1-helix of HLA class I alleles, corresponding to amino acid sequence motif 77 to 83, produced clustering of HLA molecules in 3 principal groups according to Bw4 or Bw6 epitope expression. Remarkably, quantitative differences in electrostatic potential reflected known patterns of serological reactivity better than Bw4/Bw6 amino acid sequence motifs. Quantitative assessment of epitope electrostatic potential allowed the impact of known amino acid substitutions (HLA-B*07:02 R79G, R82L, G83R) that are critical for antibody binding to be predicted. We describe a novel approach for quantitating differences in HLA B-cell epitope electrostatic potential. Proof of principle is provided that this approach enables better assessment of HLA epitope antigenicity than amino acid sequence data alone, and it may allow prediction of HLA immunogenicity.
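
    The pairwise, all-versus-all comparison of epitope electrostatics can be sketched with a toy potential: point charges and a vacuum Coulomb sum stand in for the paper's Poisson-Boltzmann solution, and an RMS difference over shared sample points serves as the pairwise distance fed to clustering. All charges and positions below are hypothetical.

```python
import math

def coulomb_potential(charges, point):
    """Toy Coulomb potential (arbitrary units) at a sample point from
    point charges [(q, x, y, z), ...]; a schematic stand-in for a
    Poisson-Boltzmann solution."""
    return sum(q / math.dist(point, (x, y, z)) for q, x, y, z in charges)

def epitope_distance(charges_a, charges_b, grid):
    """RMS difference of potentials over shared sample points, used as a
    pairwise electrostatic 'distance' for clustering epitopes."""
    diffs = [(coulomb_potential(charges_a, p) - coulomb_potential(charges_b, p)) ** 2
             for p in grid]
    return math.sqrt(sum(diffs) / len(diffs))

grid = [(5.0, 0.0, 0.0), (0.0, 5.0, 0.0), (0.0, 0.0, 5.0)]
print(epitope_distance([(1.0, 0, 0, 0)], [(-1.0, 0, 0, 0)], grid))
```

    Collecting these distances into an all-versus-all matrix and clustering it is the step that, in the paper, separates Bw4- from Bw6-bearing alleles more faithfully than the sequence motifs alone.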

  15. Video methods in the quantification of children's exposures.

    PubMed

    Ferguson, Alesia C; Canales, Robert A; Beamer, Paloma; Auyeung, Willa; Key, Maya; Munninghoff, Amy; Lee, Kevin Tse-Wing; Robertson, Alexander; Leckie, James O

    2006-05-01

    In 1994, Stanford University's Exposure Research Group (ERG) conducted its first pilot study to collect micro-level activity time series (MLATS) data for young children. The pilot study involved videotaping four children of farm workers in the Salinas Valley of California and converting their videotaped activities to valuable text files of contact behavior using video-translation techniques. These MLATS are especially useful for describing intermittent dermal (i.e., second-by-second accounts of surfaces and objects contacted) and non-dietary ingestion (i.e., second-by-second accounts of objects or hands placed in the mouth) contact behavior. Second-by-second records of children's contact behavior are amenable to quantitative and statistical analysis and allow for more accurate model estimates of human exposure and dose to environmental contaminants. Activity pattern data for modeling inhalation exposure (i.e., accounts of microenvironments visited) can also be extracted from the MLATS data. Since the pilot study, ERG has collected an immense MLATS data set for 92 children using more developed and refined videotaping and video-translation methodologies. This paper describes all aspects required for the collection of MLATS, including subject recruitment techniques, videotaping and video-translation processes, and potential data analysis. This paper also describes the quality assurance steps employed for these new MLATS projects, including training, data management, and the application of interobserver and intraobserver agreement during video translation. The discussion of these issues and ERG's experiences in dealing with them can assist other groups in the conduct of research that employs these more quantitative techniques.

  16. Imaging Transgene Expression with Radionuclide Imaging Technologies

    PubMed Central

    Gambhir, SS; Herschman, HR; Cherry, SR; Barrio, JR; Satyamurthy, N; Toyokuni, T; Phelps, ME; Larson, SM; Balaton, J; Finn, R; Sadelain, M; Tjuvajev, J

    2000-01-01

    A variety of imaging technologies are being investigated as tools for studying gene expression in living subjects. Noninvasive, repetitive and quantitative imaging of gene expression will help both to facilitate human gene therapy trials and to allow for the study of animal models of molecular and cellular therapy. Radionuclide approaches using single photon emission computed tomography (SPECT) and positron emission tomography (PET) are the most mature of the current imaging technologies and offer many advantages for imaging gene expression compared to optical and magnetic resonance imaging (MRI)-based approaches. These advantages include relatively high sensitivity, full quantitative capability (for PET), and the ability to extend small animal assays directly into clinical human applications. We describe a PET scanner (microPET) designed specifically for studies of small animals. We review “marker/reporter gene” imaging approaches using the herpes simplex type 1 virus thymidine kinase (HSV1-tk) and the dopamine type 2 receptor (D2R) genes. We describe and contrast several radiolabeled probes that can be used with the HSV1-tk reporter gene both for SPECT and for PET imaging. We also describe the advantages/disadvantages of each of the assays developed and discuss future animal and human applications. PMID:10933072

  17. Automated quantitative gait analysis during overground locomotion in the rat: its application to spinal cord contusion and transection injuries.

    PubMed

    Hamers, F P; Lankhorst, A J; van Laar, T J; Veldhuis, W B; Gispen, W H

    2001-02-01

    Analysis of locomotion is an important tool in the study of peripheral and central nervous system damage. Most locomotor scoring systems in rodents are based either upon open field locomotion assessment (for example, the BBB score) or upon footprint analysis. The former yields a semiquantitative description of locomotion as a whole, whereas the latter generates quantitative data on several selected gait parameters. In this paper, we describe the use of a newly developed gait analysis method that allows easy quantitation of a large number of locomotion parameters during walkway crossing. We were able to extract data on interlimb coordination, swing duration, paw print areas (total over stance, and at 20-msec time resolution), stride length, and base of support. Similar data cannot be gathered by any single previously described method. We compare changes in gait parameters induced by two different models of spinal cord injury in rats: transection of the dorsal half of the spinal cord, and spinal cord contusion injury induced by the NYU or MASCIS device. Although we applied this method to rats with spinal cord injury, the usefulness of this method is not limited to rats or to the investigation of spinal cord injuries alone.

  18. Quantitative reflection spectroscopy at the human ocular fundus

    NASA Astrophysics Data System (ADS)

    Hammer, Martin; Schweitzer, Dietrich

    2002-01-01

    A new model of the reflection of the human ocular fundus, based on the adding-doubling method, an approximate solution of the radiative transport equation, is described. This model enables the calculation, from fundus reflection spectra, of the concentrations of xanthophyll in the retina, of melanin in the retinal pigment epithelium and the choroid, and of haemoglobin in the choroid. The concentration values found in 12 healthy subjects are in excellent agreement with published data. In individual cases of pathologic fundus alterations, possible benefits for ophthalmologic diagnostics are demonstrated.

  19. A cost-effectiveness comparison of existing and Landsat-aided snow water content estimation systems

    NASA Technical Reports Server (NTRS)

    Sharp, J. M.; Thomas, R. W.

    1975-01-01

    This study describes how Landsat imagery can be cost-effectively employed to augment an operational hydrologic model. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model presently used by the California Department of Water Resources. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the Landsat-aided approach.
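Double sampling combines a large, cheap auxiliary sample (here, satellite-derived estimates) with a small, expensive subsample of ground measurements. As a hedged sketch of the idea only (invented numbers, and ignoring the stratification used in the study), the regression estimator below adjusts the ground-measured subsample mean using the full auxiliary sample:

```python
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical setting: x = Landsat-derived snow-cover index for the full
# sample of units, y = ground-measured snow water content on a small subsample.
N_large, n_small = 500, 40
x_all = rng.normal(50.0, 10.0, N_large)                 # cheap auxiliary variable
idx = rng.choice(N_large, n_small, replace=False)
y_sub = 2.0 * x_all[idx] + rng.normal(0.0, 4.0, n_small)  # costly ground truth

# Regression (double sampling) estimator of the mean of y:
# subsample mean, corrected by the fitted slope times the gap between
# the full-sample and subsample means of x
b = np.polyfit(x_all[idx], y_sub, 1)[0]
y_bar_reg = y_sub.mean() + b * (x_all.mean() - x_all[idx].mean())
print(round(y_bar_reg, 1))
```

The cost-effectiveness question is then whether the variance reduction from the cheap auxiliary sample outweighs the cost of acquiring it.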

  20. Carolina Care at University of North Carolina Health Care: Implementing a Theory-Driven Care Delivery Model Across a Healthcare System.

    PubMed

    Tonges, Mary; Ray, Joel D; Herman, Suzanne; McCann, Meghan

    2018-04-01

    Patient satisfaction is a key component of healthcare organizations' performance. Providing a consistent, positive patient experience across a system can be challenging. This article describes an organization's approach to achieving this goal by implementing a successful model developed at the flagship academic healthcare center across an 8-hospital system. The Carolina Care at University of North Carolina Health Care initiative has resulted in substantive qualitative and quantitative benefits including higher patient experience scores for both overall rating and nurse communication.

  1. Adsorption-Induced Deformation of Hierarchically Structured Mesoporous Silica—Effect of Pore-Level Anisotropy

    PubMed Central

    2017-01-01

    The goal of this work is to understand adsorption-induced deformation of hierarchically structured porous silica exhibiting well-defined cylindrical mesopores. For this purpose, we performed an in situ dilatometry measurement on a calcined and sintered monolithic silica sample during the adsorption of N2 at 77 K. To analyze the experimental data, we extended the adsorption stress model to account for the anisotropy of cylindrical mesopores, i.e., we explicitly derived the adsorption stress tensor components in the axial and radial directions of the pore. For quantitative predictions of stresses and strains, we applied the theoretical framework of Derjaguin, Broekhoff, and de Boer for adsorption in mesopores and two mechanical models of silica rods with axially aligned pore channels: an idealized cylindrical tube model, which can be described analytically, and an ordered hexagonal array of cylindrical mesopores, whose mechanical response to adsorption stress was evaluated by 3D finite element calculations. The adsorption-induced strains predicted by both mechanical models are in good quantitative agreement, making the cylindrical tube the preferable model for adsorption-induced strains due to its simple analytical nature. The theoretical results are compared with the in situ dilatometry data on a hierarchically structured silica monolith composed of a network of mesoporous struts of MCM-41 type morphology. Analyzing the experimental adsorption and strain data with the proposed theoretical framework, we find the adsorption-induced deformation of the monolithic sample to be reasonably described by a superposition of axial and radial strains calculated on the mesopore level. The structural and mechanical parameters obtained from the model are in good agreement with expectations from independent measurements and literature, respectively. PMID:28547995

  2. Quantitative risk assessment for Escherichia coli O157:H7 in frozen ground beef patties consumed by young children in French households.

    PubMed

    Delignette-Muller, M L; Cornu, M

    2008-11-30

    A quantitative risk assessment for Escherichia coli O157:H7 in frozen ground beef patties consumed by children under 10 years of age in French households was conducted by a national study group following an outbreak that occurred in France in 2005. Our exposure assessment model incorporates results from French surveys on consumption frequency of ground beef patties, serving size and consumption preference, microbial destruction experiments, and microbial counts on patties sampled from the industrial batch responsible for the outbreak. Two different exposure models were proposed, for children under the age of 5 and for children between 5 and 10 years, respectively. For each of these two age groups, a single-hit dose-response model was proposed to describe the probability of hemolytic uremic syndrome (HUS) as a function of the ingested dose. For each group, the single parameter of this model was estimated by Bayesian inference, using the results of the exposure assessment and the epidemiological data collected during the outbreak. Results show that children under 5 years of age are roughly 5 times more susceptible to the pathogen than children over 5 years. The exposure and dose-response models were used in a scenario analysis in order to validate the use of the model and to propose appropriate guidelines to prevent new outbreaks. The impact of cooking preference was evaluated, showing that only well-done cooking notably reduces the HUS risk, without eliminating it. For each age group, a relation between the mean individual HUS risk per serving and the contamination level in a ground beef batch was proposed as a tool to help French risk managers.
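A single-hit dose-response model assumes each ingested organism independently causes illness with some small probability r, giving P(HUS) = 1 - exp(-r * dose). The sketch below uses illustrative r values, not the parameters estimated in the study:

```python
import math

def single_hit_hus_prob(dose_cfu, r):
    """Single-hit dose-response: each ingested cell independently triggers
    illness with probability r, so P(HUS) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_cfu)

# Illustrative (not fitted) parameters: the younger group is assumed
# roughly 5 times more susceptible, echoing the paper's qualitative finding.
r_under5, r_5to10 = 5e-4, 1e-4
for dose in (10, 100, 1000):
    print(dose,
          round(single_hit_hus_prob(dose, r_under5), 4),
          round(single_hit_hus_prob(dose, r_5to10), 4))
```

At low doses the risk is approximately linear in dose (P ≈ r * dose), which is why the risk-versus-contamination-level relation per age group is a natural management tool.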

  3. Adsorption-Induced Deformation of Hierarchically Structured Mesoporous Silica-Effect of Pore-Level Anisotropy.

    PubMed

    Balzer, Christian; Waag, Anna M; Gehret, Stefan; Reichenauer, Gudrun; Putz, Florian; Hüsing, Nicola; Paris, Oskar; Bernstein, Noam; Gor, Gennady Y; Neimark, Alexander V

    2017-06-06

    The goal of this work is to understand adsorption-induced deformation of hierarchically structured porous silica exhibiting well-defined cylindrical mesopores. For this purpose, we performed an in situ dilatometry measurement on a calcined and sintered monolithic silica sample during the adsorption of N2 at 77 K. To analyze the experimental data, we extended the adsorption stress model to account for the anisotropy of cylindrical mesopores, i.e., we explicitly derived the adsorption stress tensor components in the axial and radial directions of the pore. For quantitative predictions of stresses and strains, we applied the theoretical framework of Derjaguin, Broekhoff, and de Boer for adsorption in mesopores and two mechanical models of silica rods with axially aligned pore channels: an idealized cylindrical tube model, which can be described analytically, and an ordered hexagonal array of cylindrical mesopores, whose mechanical response to adsorption stress was evaluated by 3D finite element calculations. The adsorption-induced strains predicted by both mechanical models are in good quantitative agreement, making the cylindrical tube the preferable model for adsorption-induced strains due to its simple analytical nature. The theoretical results are compared with the in situ dilatometry data on a hierarchically structured silica monolith composed of a network of mesoporous struts of MCM-41 type morphology. Analyzing the experimental adsorption and strain data with the proposed theoretical framework, we find the adsorption-induced deformation of the monolithic sample to be reasonably described by a superposition of axial and radial strains calculated on the mesopore level. The structural and mechanical parameters obtained from the model are in good agreement with expectations from independent measurements and literature, respectively.

  4. Homeopathic potentization based on nanoscale domains.

    PubMed

    Czerlinski, George; Ypma, Tjalling

    2011-12-01

    The objective of this study was to present a simple descriptive and quantitative model of how high potencies in homeopathy arise. The model begins with the mechanochemical production of hydrogen and hydroxyl radicals from water and the electronic stabilization of the resulting nanodomains of water molecules. The life of these domains is initially limited to a few days, but may extend to years when the electromagnetic characteristic of a homeopathic agent is copied onto the domains. This information is transferred between the original agent and the nanodomains, and also between previously imprinted nanodomains and new ones. The differential equations previously used to describe these processes are replaced here by exponential expressions, corresponding to simplified model mechanisms. Magnetic stabilization is also involved, since these long-lived domains apparently require the presence of the geomagnetic field; our model incorporates this factor in the formation of the long-lived compound. Numerical simulation and graphs show that the potentization mechanism can be described quantitatively by a greatly simplified mechanism; the omitted factors affect only the fine structure of the kinetics. Measurements of pH changes upon absorption of different electromagnetic frequencies indicate that about 400 nanodomains polymerize to form one cooperating unit. Singlet excited states of some compounds lead to dramatic changes in their hydrogen ion dissociation constant, explaining this pH effect and suggesting that homeopathic information is imprinted as higher singlet excited states. A simple description is provided of the process of potentization in homeopathic dilutions. With the exception of minor details, this simple model replicates the results previously obtained from a more complex model. While excited states are short-lived in isolated molecules, they become long-lived in nanodomains that form coherent cooperative aggregates controlled by the geomagnetic field. These domains either slowly emit biophotons or perform specific biochemical work at their target.

  5. Equivalent Circuit for Magnetoelectric Read and Write Operations

    NASA Astrophysics Data System (ADS)

    Camsari, Kerem Y.; Faria, Rafatul; Hassan, Orchi; Sutton, Brian M.; Datta, Supriyo

    2018-04-01

    We describe an equivalent circuit model applicable to a wide variety of magnetoelectric phenomena and use SPICE simulations to benchmark this model against experimental data. We use this model to suggest a different mode of operation where the 1 and 0 states are represented not by states with net magnetization (like mx, my, or mz) but by different easy axes, quantitatively described by (mx^2 - my^2), which switches from 0 to 1 through the write voltage. This change is directly detected as a read signal through the inverse effect. The use of (mx^2 - my^2) to represent a bit is a radical departure from the standard convention of using the magnetization (m) to represent information. We then show how the equivalent circuit can be used to build a device exhibiting tunable randomness and suggest possibilities for extending it to nonvolatile memory with read and write capabilities, without the use of external magnetic fields or magnetic tunnel junctions.
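The key point of the (mx^2 - my^2) encoding is that it distinguishes easy axes rather than magnetization directions, so it is unchanged by magnetization reversal. A minimal illustration of that invariance:

```python
def easy_axis_bit(mx, my):
    """Bit encoded in the easy axis, not the magnetization direction:
    mx^2 - my^2 is +1 for either +x or -x magnetization and -1 for
    either +y or -y, i.e., it is invariant under m -> -m."""
    return mx**2 - my**2

# Both magnetization directions along x encode the same logical state,
# and likewise for y.
print(easy_axis_bit(1, 0), easy_axis_bit(-1, 0),
      easy_axis_bit(0, 1), easy_axis_bit(0, -1))
```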

  6. Dynamical and many-body correlation effects in the kinetic energy spectra of isotopes produced in nuclear multifragmentation

    NASA Astrophysics Data System (ADS)

    Souza, S. R.; Donangelo, R.; Lynch, W. G.; Tsang, M. B.

    2018-03-01

    The properties of the kinetic energy spectra of light isotopes produced in the breakup of a nuclear source and during the de-excitation of its products are examined. The initial stage, at which the hot fragments are created, is modeled by the statistical multifragmentation model, whereas the Weisskopf-Ewing evaporation treatment is adopted to describe the subsequent fragment de-excitation as the fragments follow classical trajectories dictated by the Coulomb repulsion among them. The energy spectra obtained are compared to available experimental data. The sensitivity of the results to the fusion cross section entering the evaporation treatment is investigated; its influence on the qualitative aspects of the energy spectra turns out to be small. Although these aspects can be fairly well described by the model, the underlying physics associated with the quantitative discrepancies remains to be understood.

  7. Experimental and analytical study of frictional anisotropy of nanotubes

    NASA Astrophysics Data System (ADS)

    Riedo, Elisa; Gao, Yang; Li, Tai-De; Chiu, Hsiang-Chih; Kim, Suenne; Klinke, Christian; Tosatti, Erio

    The frictional properties of carbon and boron nitride nanotubes (NTs) are very important in a variety of applications, including composite materials, carbon fibers, and micro/nano-electromechanical systems. Atomic force microscopy (AFM) is a powerful tool for investigating, with nanoscale resolution, the frictional properties of individual NTs. Here, we report on an experimental study of the frictional properties of different types of supported nanotubes by AFM. We also propose a quantitative model to describe and then predict the frictional properties of nanotubes sliding on a substrate along (longitudinal friction) or perpendicular to (transverse friction) their axis. This model provides a simple but general analytical relationship that describes the acquired experimental data well. As an example of potential applications, this experimental method combined with the proposed model can guide the design of better NT-ceramic composites, or the self-assembly of nanotubes on a surface in a given direction. M. Lucas et al., Nature Materials 8, 876-881 (2009).

  8. In silico quantitative structure-toxicity relationship study of aromatic nitro compounds.

    PubMed

    Pasha, Farhan Ahmad; Neaz, Mohammad Morshed; Cho, Seung Joo; Ansari, Mohiuddin; Mishra, Sunil Kumar; Tiwari, Sharvan

    2009-05-01

    Small molecules often have toxicities that are a function of molecular structural features, and minor variations in structural features can make a large difference in such toxicity. Consequently, in silico techniques may be used to correlate such molecular toxicities with their structural features. For nine different sets of aromatic nitro compounds with known observed toxicities against different targets, we developed ligand-based 2D quantitative structure-toxicity relationship models using 20 selected topological descriptors. Topological descriptors have several advantages, such as conformational independence and facile, less time-consuming computation, while still yielding good results. Multiple linear regression analysis was used to correlate variations in toxicity with molecular properties. The information index on molecular size, the lopping centric index, and the Kier flexibility index were identified as fundamental descriptors for different kinds of toxicity, showing that molecular size, branching, and molecular flexibility might be particularly important factors in quantitative structure-toxicity relationship analysis. This study revealed that topological descriptor-guided quantitative structure-toxicity relationship modeling provides a very useful, cost- and time-efficient in silico tool for describing small-molecule toxicities.
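The core of such a 2D QSTR model is ordinary multiple linear regression of toxicity on descriptor values. A self-contained sketch with synthetic data (the descriptor matrix and coefficients are invented, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n_compounds, n_descriptors = 30, 3  # e.g. size, centric, flexibility indices
X = rng.normal(size=(n_compounds, n_descriptors))       # descriptor matrix
true_coefs = np.array([1.5, -0.8, 0.3])                 # invented "true" effects
toxicity = X @ true_coefs + 0.05 * rng.normal(size=n_compounds)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n_compounds), X])
coefs, *_ = np.linalg.lstsq(A, toxicity, rcond=None)

# Goodness of fit (coefficient of determination)
pred = A @ coefs
r2 = 1 - np.sum((toxicity - pred) ** 2) / np.sum((toxicity - toxicity.mean()) ** 2)
print(coefs.round(2), round(r2, 3))
```

In practice the hard part is descriptor selection and validation (cross-validation, external test sets), not the regression itself.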

  9. A practical technique for quantifying the performance of acoustic emission systems on plate-like structures.

    PubMed

    Scholey, J J; Wilcox, P D; Wisnom, M R; Friswell, M I

    2009-06-01

    A model for quantifying the performance of acoustic emission (AE) systems on plate-like structures is presented. Employing a linear transfer-function approach, the model is applicable to both isotropic and anisotropic materials. The model requires several inputs, including source waveforms, phase velocity and attenuation. It is recognised that these variables may not be readily available, thus efficient measurement techniques are presented for obtaining phase velocity and attenuation in a form that can be exploited directly in the model. Inspired by previously documented methods, the application of these techniques is examined and some important implications for propagation characterisation in plates are discussed. Example measurements are made on isotropic and anisotropic plates and, where possible, comparisons with numerical solutions are made. By inputting experimentally obtained data into the model, quantitative system metrics are examined for different threshold values and sensor locations. By producing plots describing areas of hit success and source location error, the ability to measure the performance of different AE system configurations is demonstrated. This quantitative approach will help to place AE testing on a more solid foundation, underpinning its use in industrial AE applications.

  10. The Pathway for Oxygen: Tutorial Modelling on Oxygen Transport from Air to Mitochondrion: The Pathway for Oxygen.

    PubMed

    Bassingthwaighte, James B; Raymond, Gary M; Dash, Ranjan K; Beard, Daniel A; Nolan, Margaret

    2016-01-01

    The 'Pathway for Oxygen' is captured in a set of models describing quantitative relationships between fluxes and driving forces for the flux of oxygen from the external air source to the mitochondrial sink at cytochrome oxidase. The intervening processes involve convection, membrane permeation, diffusion of free and heme-bound O2 and enzymatic reactions. While this system's basic elements are simple: ventilation, alveolar gas exchange with blood, circulation of the blood, perfusion of an organ, uptake by tissue, and consumption by chemical reaction, integration of these pieces quickly becomes complex. This complexity led us to construct a tutorial on the ideas and principles; these first PathwayO2 models are simple but quantitative and cover: (1) a 'one-alveolus lung' with airway resistance, lung volume compliance, (2) bidirectional transport of solute gasses like O2 and CO2, (3) gas exchange between alveolar air and lung capillary blood, (4) gas solubility in blood, and circulation of blood through the capillary syncytium and back to the lung, and (5) blood-tissue gas exchange in capillaries. These open-source models are at Physiome.org and provide background for the many respiratory models there.
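The "one-alveolus lung" of item (1) is essentially a first-order system: airway resistance R and lung compliance C give dV/dt = (P(t) - V/C) / R. A hedged numerical sketch with invented parameter values (this is an illustration of the principle, not the Physiome.org model itself):

```python
import numpy as np

# Illustrative parameters (assumed, not the tutorial's):
R, C = 2.0, 0.1          # airway resistance (cmH2O.s/L), compliance (L/cmH2O)
dt, T = 0.01, 8.0        # time step (s), total simulated time (s)
t = np.arange(0, T, dt)

# Square-wave driving pressure: 5 cmH2O during inflation, 0 during deflation
P = 5.0 * (np.sin(2 * np.pi * t / 4.0) > 0)

# Forward-Euler integration of dV/dt = (P - V/C) / R
V = np.zeros_like(t)
for i in range(1, len(t)):
    V[i] = V[i - 1] + dt * (P[i - 1] - V[i - 1] / C) / R

print(round(V.max(), 3))  # tidal volume approaches P*C = 0.5 L per breath
```

Since the time constant RC is 0.2 s here, the alveolus fills almost completely within each 2 s inflation phase; stiffer lungs (smaller C) or obstructed airways (larger R) reduce or slow the volume excursion.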

  11. Development of a pharmacokinetic/pharmacodynamic/disease progression model in NC/Nga mice for development of novel anti-atopic dermatitis drugs.

    PubMed

    Baek, In-Hwan; Lee, Byung-Yo; Chae, Jung-Woo; Song, Gyu Yong; Kang, Wonku; Kwon, Kwang-Il

    2014-11-01

    1. JHL45, a novel immune modulator against atopic dermatitis (AD), was synthesized from decursin isolated from Angelica gigas. Our goal was to evaluate this lead compound using quantitative modeling approaches for novel anti-AD drug development. 2. We tested the anti-inflammatory effect of JHL45 by in vitro screening and characterized its in vitro pharmacokinetic (PK) properties. The dose-dependent efficacy of JHL45 was characterized using a pharmacokinetic/pharmacodynamic/disease progression (PK/PD/DIS) model in NC/Nga mice. 3. JHL45 has drug-like properties and pharmacological effects when administered orally to treat AD. The developed PK/PD/DIS model described well the rapid metabolism of JHL45, the double-peak phenomenon in the PK of decursinol, and the inhibition of IgE generation by the compounds in NC/Nga mice. A quantitative model was also developed and used to elucidate the complex interactions between serum IgE concentration and AD symptoms. 4. Our findings indicate that JHL45 has good physicochemical properties and powerful pharmacological effects when administered orally for the treatment of AD in rodents.

  12. The Pathway for Oxygen: Tutorial Modelling on Oxygen Transport from Air to Mitochondrion

    PubMed Central

    Bassingthwaighte, James B.; Raymond, Gary M.; Dash, Ranjan K.; Beard, Daniel A.; Nolan, Margaret

    2016-01-01

    The ‘Pathway for Oxygen’ is captured in a set of models describing quantitative relationships between fluxes and driving forces for the flux of oxygen from the external air source to the mitochondrial sink at cytochrome oxidase. The intervening processes involve convection, membrane permeation, diffusion of free and heme-bound O2 and enzymatic reactions. While this system’s basic elements are simple: ventilation, alveolar gas exchange with blood, circulation of the blood, perfusion of an organ, uptake by tissue, and consumption by chemical reaction, integration of these pieces quickly becomes complex. This complexity led us to construct a tutorial on the ideas and principles; these first PathwayO2 models are simple but quantitative and cover: 1) a ‘one-alveolus lung’ with airway resistance, lung volume compliance, 2) bidirectional transport of solute gasses like O2 and CO2, 3) gas exchange between alveolar air and lung capillary blood, 4) gas solubility in blood, and circulation of blood through the capillary syncytium and back to the lung, and 5) blood-tissue gas exchange in capillaries. These open-source models are at Physiome.org and provide background for the many respiratory models there. PMID:26782201

  13. Mapping of quantitative trait loci using the skew-normal distribution.

    PubMed

    Fernandes, Elisabete; Pacheco, António; Penha-Gonçalves, Carlos

    2007-11-01

    In standard interval mapping (IM) of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. When this assumption of normality is violated, the most commonly adopted strategy is to use the previous model after data transformation. However, an appropriate transformation may not exist or may be difficult to find. Also this approach can raise interpretation issues. An interesting alternative is to consider a skew-normal mixture model in standard IM, and the resulting method is here denoted as skew-normal IM. This flexible model that includes the usual symmetric normal distribution as a special case is important, allowing continuous variation from normality to non-normality. In this paper we briefly introduce the main peculiarities of the skew-normal distribution. The maximum likelihood estimates of parameters of the skew-normal distribution are obtained by the expectation-maximization (EM) algorithm. The proposed model is illustrated with real data from an intercross experiment that shows a significant departure from the normality assumption. The performance of the skew-normal IM is assessed via stochastic simulation. The results indicate that the skew-normal IM has higher power for QTL detection and better precision of QTL location as compared to standard IM and nonparametric IM.
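The skew-normal family can be fitted to a phenotype sample in a few lines. The paper estimates the mixture parameters with an EM algorithm; the sketch below instead fits a single skew-normal component by direct maximum likelihood via SciPy, just to illustrate the distribution and parameter recovery on synthetic data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic skewed phenotype sample (a stand-in for one QTL genotype class)
data = stats.skewnorm.rvs(a=4, loc=10, scale=2, size=2000, random_state=rng)

# SciPy fits (shape a, location, scale) by direct MLE rather than EM,
# but recovers the same skew-normal parameterization
a_hat, loc_hat, scale_hat = stats.skewnorm.fit(data)
print(round(a_hat, 1), round(loc_hat, 1), round(scale_hat, 1))
```

Because the normal distribution is the a = 0 special case, the fitted shape parameter gives a direct read on how far the trait departs from normality, which is exactly what makes the skew-normal mixture attractive for interval mapping without transforming the data.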

  14. Flow, Transport, and Reaction in Porous Media: Percolation Scaling, Critical-Path Analysis, and Effective Medium Approximation

    NASA Astrophysics Data System (ADS)

    Hunt, Allen G.; Sahimi, Muhammad

    2017-12-01

    We describe the most important developments in the application of three theoretical tools to the modeling of the morphology of porous media and of flow and transport processes in them. One tool is percolation theory. Although it was over 40 years ago that the possibility of using percolation theory to describe flow and transport processes in porous media was first raised, new models and concepts, as well as new variants of the original percolation model, are still being developed for various applications to flow phenomena in porous media. The other two approaches, closely related to percolation theory, are critical-path analysis, which is applicable when porous media are highly heterogeneous, and the effective medium approximation (poor man's percolation), which provides a simple and, under certain conditions, quantitatively correct description of transport in porous media in which percolation-type disorder is relevant. Applications to topics in geosciences include predictions of the hydraulic conductivity and air permeability, solute and gas diffusion that are particularly important in ecohydrological applications and land-surface interactions, and multiphase flow in porous media, as well as non-Gaussian solute transport and flow morphologies associated with imbibition into unsaturated fractures. We describe new applications of percolation theory of solute transport to chemical weathering and soil formation, geomorphology, and elemental cycling through the terrestrial Earth surface. Wherever quantitatively accurate predictions of such quantities are relevant, so are the techniques presented here. Whenever possible, the theoretical predictions are compared with the relevant experimental data. In practically all cases, the agreement between the theoretical predictions and the data is excellent. Also discussed are possible future directions in the application of such concepts to many other phenomena in geosciences.
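The basic percolation phenomenon underlying these methods, the emergence of a sample-spanning cluster near a critical occupation probability, can be demonstrated with a short simulation (site percolation on a square lattice, whose threshold is approximately 0.5927):

```python
import numpy as np
from scipy import ndimage

def spans(grid):
    """True if an occupied cluster connects the top row to the bottom row."""
    labels, _ = ndimage.label(grid)  # 4-connected cluster labeling
    top = labels[0][labels[0] > 0]
    bottom = labels[-1][labels[-1] > 0]
    return bool(np.intersect1d(top, bottom).size)

rng = np.random.default_rng(3)
L, trials = 64, 40
results = {}
for p in (0.45, 0.59, 0.75):  # below, near, and above the ~0.5927 threshold
    results[p] = np.mean([spans(rng.random((L, L)) < p) for _ in range(trials)])
print(results)
```

Below the threshold essentially no realization spans; above it essentially all do, which is the sharp connectivity transition that critical-path analysis exploits for transport in heterogeneous media.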

  15. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The simulation results obtained are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  16. Novel mathematic models for quantitative transitivity of quality-markers in extraction process of the Buyanghuanwu decoction.

    PubMed

    Zhang, Yu-Tian; Xiao, Mei-Feng; Deng, Kai-Wen; Yang, Yan-Tao; Zhou, Yi-Qun; Zhou, Jin; He, Fu-Yuan; Liu, Wen-Long

    2018-06-01

    Nowadays, to research and formulate an efficient extraction system for Chinese herbal medicine, scientists have long faced a great challenge in quality management, and the transitivity of Q-markers in quantitative analysis of TCM was recently proposed by Prof. Liu. In order to improve the quality of extraction from raw medicinal materials for clinical preparations, a series of integrated mathematical models for the transitivity of Q-markers in quantitative analysis of TCM were established. Buyanghuanwu decoction (BYHWD) is a common TCM prescription used to prevent and treat ischemic heart and brain diseases. In this paper, we selected BYHWD as the extraction experimental subject to study the quantitative transitivity of TCM. Based on Fick's law and the Noyes-Whitney equation, novel kinetic models were established for the extraction of active components. Kinetic equations of the extraction models were fitted, and the inherent parameters of the material pieces and the Q-marker quantitative transfer coefficients were calculated; these served as indexes to evaluate the transitivity of Q-markers in quantitative analysis of the extraction process of BYHWD. HPLC was applied to screen and analyze the potential Q-markers in the extraction process. Fick's law and the Noyes-Whitney equation were adopted for mathematically modeling the extraction process. Kinetic parameters were fitted and calculated with the Statistical Program for Social Sciences (SPSS) 20.0 software. The transfer efficiency was described and evaluated by the potential Q-marker transfer trajectory via the transitivity availability AUC, extraction ratio P, and decomposition ratio D, respectively. Q-markers were identified using AUC, P, and D. Astragaloside IV, laetrile, paeoniflorin, and ferulic acid were studied as potential Q-markers from BYHWD.
    The relevant technological parameters were represented by the mathematical models, which adequately illustrate the inherent properties of the raw material preparation and the effect on Q-marker transitivity in equilibrium processing. AUC, P, and D for the potential Q-markers AST-IV, laetrile, paeoniflorin, and FA were obtained, with results of 289.9 mAu s, 46.24%, 22.35%; 1730 mAu s, 84.48%, 1.963%; 5600 mAu s, 70.22%, 0.4752%; and 7810 mAu s, 24.29%, 4.235%, respectively. The results showed that the suitable Q-markers in our study were laetrile and paeoniflorin, which exhibited acceptable traceability and transitivity in the extraction process of TCMs. Therefore, these novel mathematical models might be developed into a new standard to control TCM quality from raw medicinal materials to product manufacturing. Copyright © 2018 Elsevier GmbH. All rights reserved.
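    The Noyes-Whitney rate law invoked for the extraction kinetics has a simple closed form. As a hedged sketch (generic first-order dissolution toward saturation; the paper's models additionally account for decomposition, and the names here are mine):

```python
import math

def noyes_whitney_concentration(t, k, c_s, c0=0.0):
    """Analytic solution of the Noyes-Whitney rate law dC/dt = k*(Cs - C):
    first-order approach of the extract concentration C to saturation Cs."""
    return c_s + (c0 - c_s) * math.exp(-k * t)

def extraction_ratio(t, k):
    """Fraction of the saturation concentration reached by time t (with C0 = 0)."""
    return 1.0 - math.exp(-k * t)
```

    Fitting k to measured concentration-time data for each marker compound is what yields the kinetic parameters that such transfer indexes are built from.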

  17. Development of quantitative radioactive methodologies on paper to determine important lateral-flow immunoassay parameters.

    PubMed

    Mosley, Garrett L; Nguyen, Phuong; Wu, Benjamin M; Kamei, Daniel T

    2016-08-07

    The lateral-flow immunoassay (LFA) is a well-established diagnostic technology that has recently seen significant advancements due in part to the rapidly expanding fields of paper diagnostics and paper-fluidics. As LFA-based diagnostics become more complex, it becomes increasingly important to quantitatively determine important parameters during the design and evaluation process. However, current experimental methods for determining these parameters have certain limitations when applied to LFA systems. In this work, we describe our novel methods of combining paper and radioactive measurements to determine nanoprobe molarity, the number of antibodies per nanoprobe, and the forward and reverse rate constants for nanoprobe binding to immobilized target on the LFA test line. Using a model LFA system that detects the presence of the protein transferrin (Tf), we demonstrate the application of our methods, which involve quantitative experimentation and mathematical modeling. We also compare the results of our rate constant experiments with traditional experiments to demonstrate how our methods more appropriately capture the influence of the LFA environment on the binding interaction. Our novel experimental approaches can therefore more efficiently guide the research process for LFA design, leading to more rapid advancement of the field of paper-based diagnostics.
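    The forward and reverse rate constants determined here govern a standard 1:1 binding rate law. A minimal sketch (assumed generic Langmuir-type kinetics with illustrative parameter values, not the authors' model code) integrates it and checks against the analytic steady state:

```python
def simulate_binding(kon, koff, c_probe, b_max, dt, n_steps):
    """Forward-Euler integration of 1:1 binding to an immobilized target:
    dB/dt = kon * C * (Bmax - B) - koff * B, with probe concentration C held constant."""
    b = 0.0
    for _ in range(n_steps):
        b += dt * (kon * c_probe * (b_max - b) - koff * b)
    return b

def equilibrium_bound(kon, koff, c_probe, b_max):
    """Analytic steady state of the same rate law."""
    return b_max * kon * c_probe / (kon * c_probe + koff)
```

    With kon = 1e5 M⁻¹s⁻¹, koff = 1e-3 s⁻¹, and 10 nM probe (placeholder values), half the test-line sites are occupied at equilibrium; the authors' point is that kon and koff measured off-paper can differ from their effective values in the LFA environment.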

  18. Systems microscopy: an emerging strategy for the life sciences.

    PubMed

    Lock, John G; Strömblad, Staffan

    2010-05-01

    Dynamic cellular processes occurring in time and space are fundamental to all physiology and disease. To understand complex and dynamic cellular processes therefore demands the capacity to record and integrate quantitative multiparametric data from the four spatiotemporal dimensions within which living cells self-organize, and to subsequently use these data for the mathematical modeling of cellular systems. To this end, a raft of complementary developments in automated fluorescence microscopy, cell microarray platforms, quantitative image analysis and data mining, combined with multivariate statistics and computational modeling, now coalesce to produce a new research strategy, "systems microscopy", which facilitates systems biology analyses of living cells. Systems microscopy provides the crucial capacities to simultaneously extract and interrogate multiparametric quantitative data at resolution levels ranging from the molecular to the cellular, thereby elucidating a more comprehensive and richly integrated understanding of complex and dynamic cellular systems. The unique capacities of systems microscopy suggest that it will become a vital cornerstone of systems biology, and here we describe the current status and future prospects of this emerging field, as well as outlining some of the key challenges that remain to be overcome. Copyright 2010 Elsevier Inc. All rights reserved.

  19. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    PubMed

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.

  20. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction.

    PubMed

    Cobbs, Gary

    2012-08-16

    Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a selected part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes that annealing of complementary target strands and annealing of target and primers are both reversible reactions and reach a dynamic equilibrium. Model 2 assumes all annealing reactions are irreversible and the equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of the kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate-constant values applying to all curves and each curve having a unique value for the initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the literature.
They also give better estimates of initial target concentration. Model 1 was found to be slightly more robust than model 2, giving better estimates of initial target concentration when parameters were estimated from qPCR curves with very different initial target concentrations. Both models may be used to estimate the initial absolute concentration of target sequence when a standard curve is not available. It is argued that the kinetic approach to modeling and interpreting quantitative PCR data has the potential to give more precise estimates of the true initial target concentrations than other methods currently used for analysis of qPCR data. The two models presented here give a unified model of the qPCR process in that they explain the shape of the qPCR curve for a wide variety of initial target concentrations.
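    For contrast with the kinetic models above, the constant-efficiency exponential model that the abstract criticizes can be written in a few lines (a textbook sketch under the usual assumptions; the names are mine, not the paper's):

```python
def qpcr_signal(n0, efficiency, cycle):
    """Constant-efficiency exponential model of qPCR amplification:
    N_c = N_0 * (1 + E)^c, with efficiency 0 < E <= 1."""
    return n0 * (1.0 + efficiency) ** cycle

def n0_from_cq(cq, threshold, efficiency):
    """Invert the exponential model: initial copy number from the quantification cycle."""
    return threshold / (1.0 + efficiency) ** cq
```

    The kinetic models in the paper drop the constant-E assumption, letting the per-cycle efficiency fall as primers are consumed, which is why they fit the plateau region that the exponential model cannot.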

  1. MONALISA for stochastic simulations of Petri net models of biochemical systems.

    PubMed

    Balazki, Pavel; Lindauer, Klaus; Einloft, Jens; Ackermann, Jörg; Koch, Ina

    2015-07-10

    The concept of Petri nets (PN) is widely used in systems biology and allows modeling of complex biochemical systems such as metabolic systems, signal transduction pathways, and gene expression networks. In particular, PN allow topological analysis based on structural properties, which is important and useful when quantitative (kinetic) data are incomplete or unknown. When the kinetic parameters are known, simulation of the time evolution of such models can help to study the dynamic behavior of the underlying system. If the number of involved entities (molecules) is low, a stochastic simulation should be preferred over the classical deterministic approach of solving ordinary differential equations. The Stochastic Simulation Algorithm (SSA) is a common method for such simulations. The combination of qualitative and semi-quantitative PN modeling and stochastic analysis techniques provides a valuable approach in the field of systems biology. Here, we describe the implementation of stochastic analysis in a PN environment. We extended MONALISA, an open-source software tool for the creation, visualization, and analysis of PN, by several stochastic simulation methods. The simulation module offers four simulation modes, among them the stochastic mode with constant firing rates and Gillespie's algorithm in exact and approximate versions. The simulator is operated through a user-friendly graphical interface and accepts input data such as concentrations and reaction rate constants that are common parameters in the biological context. The key features of the simulation module are visualization of simulations, interactive plotting, export of results into a text file, mathematical expressions for describing simulation parameters, and up to 500 parallel simulations of the same parameter set. To illustrate the method we discuss a model of insulin receptor recycling as a case study.
We present software that combines the modeling power of Petri nets with stochastic simulation of dynamic processes in a user-friendly environment supported by an intuitive graphical interface. The program offers a valuable alternative to modeling with ordinary differential equations, especially when simulating single-cell experiments with low molecule counts. The ability to use mathematical expressions provides additional flexibility in describing the simulation parameters. The open-source distribution allows further extensions by third-party developers. The software is cross-platform and is licensed under the Artistic License 2.0.
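    Gillespie's exact algorithm, referenced above, is easy to sketch for the simplest possible network, an irreversible decay A → B (an illustration only; MONALISA's implementation handles arbitrary Petri nets):

```python
import math
import random

def gillespie_decay(n0, k, t_end, seed=0):
    """Exact SSA for the irreversible reaction A -> B with rate constant k.
    Returns the jump times and the copy number of A after each firing."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    times, counts = [t], [n]
    while n > 0:
        propensity = k * n
        # Exponentially distributed waiting time to the next firing.
        t += -math.log(1.0 - rng.random()) / propensity
        if t > t_end:
            break
        n -= 1
        times.append(t)
        counts.append(n)
    return times, counts
```

    For low copy numbers the sampled trajectories fluctuate visibly around the deterministic solution n0·exp(-k·t), which is exactly the regime where the abstract recommends stochastic over ODE-based simulation.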

  2. Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yubin; Yuan, Zhen, E-mail: zhenyuan@umac.mo

    Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors' two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, ex vivo, and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media when the diffusion approximation was not effective at describing the photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed by the property profiles, where the authors discovered that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the authors discovered that the errors ranged from 0 for relatively small-size targets to 26% for relatively large-size targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues.
In particular, their methods are able to resolve the intrinsic difficulties that occur when quantitative PAT is conducted by combining conventional PAT with the diffusion approximation or with radiation transport modeling.
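    The role Monte Carlo plays in such schemes can be illustrated with the simplest photon-transport estimate, an absorption-only slab (a toy sketch, not the authors' MC code, which must also model scattering):

```python
import math
import random

def mc_slab_transmission(mu_a, thickness, n_photons, seed=0):
    """Absorption-only Monte Carlo: the fraction of photons whose sampled free
    path exceeds the slab thickness. The expectation is exp(-mu_a * thickness)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        # Free path drawn from an exponential distribution with mean 1/mu_a.
        free_path = -math.log(1.0 - rng.random()) / mu_a
        transmitted += free_path > thickness
    return transmitted / n_photons
```

    Because MC tracks individual photon paths rather than a diffusion equation, it stays accurate in the high-absorption regime where, as the abstract notes, the diffusion approximation breaks down.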

  3. Valuing national effects of digital health investments: an applied method.

    PubMed

    Hagens, Simon; Zelmer, Jennifer; Frazer, Cassandra; Gheorghiu, Bobby; Leaver, Chad

    2015-01-01

    This paper describes an approach which has been applied to value national outcomes of investments by federal, provincial and territorial governments, clinicians and healthcare organizations in digital health. Hypotheses are used to develop a model, which is revised and populated based upon the available evidence. Quantitative national estimates and qualitative findings are produced and validated through structured peer review processes. This methodology has been applied in four studies since 2008.

  4. Another Curriculum Requirement? Quantitative Reasoning in Economics: Some First Steps

    ERIC Educational Resources Information Center

    O'Neill, Patrick B.; Flynn, David T.

    2013-01-01

    In this paper, we describe first steps toward focusing on quantitative reasoning in an intermediate microeconomic theory course. We find student attitudes toward quantitative aspects of economics improve over the duration of the course (as we would hope). Perhaps more importantly, student attitude toward quantitative reasoning improves, in…

  5. Quantitative Courses in a Liberal Education Program: A Case Study

    ERIC Educational Resources Information Center

    Wismath, Shelly L.; Mackay, D. Bruce

    2012-01-01

    This essay argues for the importance of quantitative reasoning skills as part of a liberal education and describes the successful introduction of a mathematics-based quantitative skills course at a small Canadian university. Today's students need quantitative problem-solving skills, to function as adults, professionals, consumers, and citizens in…

  6. Quantitative PET/CT scanner performance characterization based upon the society of nuclear medicine and molecular imaging clinical trials network oncology clinical simulator phantom.

    PubMed

    Sunderland, John J; Christian, Paul E

    2015-01-01

    The Clinical Trials Network (CTN) of the Society of Nuclear Medicine and Molecular Imaging (SNMMI) operates a PET/CT phantom imaging program using the CTN's oncology clinical simulator phantom, designed to validate scanners at sites that wish to participate in oncology clinical trials. Since its inception in 2008, the CTN has collected 406 well-characterized phantom datasets from 237 scanners at 170 imaging sites covering the spectrum of commercially available PET/CT systems. The combined and collated phantom data describe a global profile of quantitative performance and variability of PET/CT data used in both clinical practice and clinical trials. Individual sites filled and imaged the CTN oncology PET phantom according to detailed instructions. Standard clinical reconstructions were requested and submitted. The phantom itself contains uniform regions suitable for scanner calibration assessment, lung fields, and 6 hot spheric lesions with diameters ranging from 7 to 20 mm at a 4:1 contrast ratio with primary background. The CTN Phantom Imaging Core evaluated the quality of the phantom fill and imaging and measured background standardized uptake values to assess scanner calibration and maximum standardized uptake values of all 6 lesions to review quantitative performance. Scanner make-and-model-specific measurements were pooled and then subdivided by reconstruction to create scanner-specific quantitative profiles. Different makes and models of scanners predictably demonstrated different quantitative performance profiles including, in some cases, small calibration bias. Differences in site-specific reconstruction parameters increased the quantitative variability among similar scanners, with postreconstruction smoothing filters being the most influential parameter. 
Quantitative assessment of this intrascanner variability over this large collection of phantom data gives, for the first time, estimates of reconstruction variance introduced into trials from allowing trial sites to use their preferred reconstruction methodologies. Predictably, time-of-flight-enabled scanners exhibited less size-based partial-volume bias than non-time-of-flight scanners. The CTN scanner validation experience over the past 5 y has generated a rich, well-curated phantom dataset from which PET/CT make-and-model and reconstruction-dependent quantitative behaviors were characterized for the purposes of understanding and estimating scanner-based variances in clinical trials. These results should make it possible to identify and recommend make-and-model-specific reconstruction strategies to minimize measurement variability in cancer clinical trials. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  7. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  8. A Quantitative Review and Meta-Models of the Variability and Factors Affecting Oral Drug Absorption-Part I: Gastrointestinal pH.

    PubMed

    Abuhelwa, Ahmad Y; Foster, David J R; Upton, Richard N

    2016-09-01

    This study aimed to conduct a quantitative meta-analysis for the values of, and variability in, gastrointestinal (GI) pH in the different GI segments; characterize the effect of food on the values and variability in these parameters; and present quantitative meta-models of distributions of GI pH to help inform models of oral drug absorption. The literature was systematically reviewed for the values of, and the variability in, GI pH under fed and fasted conditions. The GI tract was categorized into the following 10 distinct regions: stomach (proximal, mid-distal), duodenum (proximal, mid-distal), jejunum and ileum (proximal, mid, and distal small intestine), and colon (ascending, transverse, and descending colon). Meta-analysis used the "metafor" package of the R language. The time course of postprandial stomach pH was modeled using NONMEM. Food significantly influenced the estimated meta-mean stomach and duodenal pH but had no significant influence on small intestinal and colonic pH. The time course of postprandial pH was described using an exponential model. Increased meal caloric content increased the extent and duration of postprandial gastric pH buffering. The different parts of the small intestine had significantly different pH. Colonic pH was significantly different for the descending but not for the ascending and transverse colon. Knowledge of GI pH is important for the formulation design of pH-dependent dosage forms and in understanding the dissolution and absorption of orally administered drugs. The meta-models of GI pH may also be used as part of semi-physiological pharmacokinetic models to characterize the effect of GI pH on in vivo drug release and pharmacokinetics.
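    An exponential time-course model of the kind described for postprandial gastric pH can be sketched directly (the parameter values below are illustrative placeholders, not the meta-analysis estimates):

```python
import math

def postprandial_gastric_ph(t_h, ph_fasted, ph_fed_peak, k_per_h):
    """Exponential return of gastric pH to the fasted baseline after a meal:
    pH(t) = pH_fasted + (pH_peak - pH_fasted) * exp(-k * t)."""
    return ph_fasted + (ph_fed_peak - ph_fasted) * math.exp(-k_per_h * t_h)
```

    A larger caloric load would correspond to a higher peak and/or a smaller rate constant k, i.e. greater extent and duration of pH buffering, consistent with the finding above.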

  9. Population pharmacokinetic characterization of BAY 81-8973, a full-length recombinant factor VIII: lessons learned - importance of including samples with factor VIII levels below the quantitation limit.

    PubMed

    Garmann, D; McLeay, S; Shah, A; Vis, P; Maas Enriquez, M; Ploeger, B A

    2017-07-01

    The pharmacokinetics (PK), safety and efficacy of BAY 81-8973, a full-length, unmodified, recombinant human factor VIII (FVIII), were evaluated in the LEOPOLD trials. The aim of this study was to develop a population PK model based on pooled data from the LEOPOLD trials and to investigate the importance of including samples with FVIII levels below the limit of quantitation (BLQ) to estimate half-life. The analysis included 1535 PK observations (measured by the chromogenic assay) from 183 male patients with haemophilia A aged 1-61 years from the 3 LEOPOLD trials. The limit of quantitation was 1.5 IU/dL for the majority of samples. Population PK models that included or excluded BLQ samples were used for FVIII half-life estimations, and simulations were performed using both estimates to explore the influence on the time below a determined FVIII threshold. In the data set used, approximately 16.5% of samples were BLQ, which is not uncommon for FVIII PK data sets. The structural model to describe the PK of BAY 81-8973 was a two-compartment model similar to that seen for other FVIII products. If BLQ samples were excluded from the model, FVIII half-life estimates were longer compared with a model that included BLQ samples. It is essential to assess the importance of BLQ samples when performing population PK estimates of half-life for any FVIII product. Exclusion of BLQ data from half-life estimations based on population PK models may result in an overestimation of half-life and underestimation of time under a predetermined FVIII threshold, resulting in potential underdosing of patients. © 2017 Bayer AG. Haemophilia Published by John Wiley & Sons Ltd.

  10. Non-invasive assessment of cerebral microcirculation with diffuse optics and coherent hemodynamics spectroscopy

    NASA Astrophysics Data System (ADS)

    Fantini, Sergio; Sassaroli, Angelo; Kainerstorfer, Jana M.; Tgavalekos, Kristen T.; Zang, Xuan

    2016-03-01

    We describe the general principles and initial results of coherent hemodynamics spectroscopy (CHS), a new technique for the quantitative assessment of cerebral hemodynamics on the basis of dynamic near-infrared spectroscopy (NIRS) measurements. The two components of CHS are (1) dynamic measurements of coherent cerebral hemodynamics in the form of oscillations at multiple frequencies (frequency domain) or temporal transients (time domain), and (2) their quantitative analysis with a dynamic mathematical model that relates the concentration and oxygen saturation of hemoglobin in tissue to cerebral blood volume (CBV), cerebral blood flow (CBF), and the cerebral metabolic rate of oxygen (CMRO2). In particular, CHS can provide absolute measurements and dynamic monitoring of CBF, and quantitative measures of cerebral autoregulation. We report initial results of CBF measurements in hemodialysis patients, where we found a lower CBF (54 +/- 16 ml/(100 g-min)) compared to a group of healthy controls (95 +/- 11 ml/(100 g-min)). We also report CHS measurements of cerebral autoregulation, where a quantitative index of autoregulation (its cutoff frequency) was found to be significantly greater in healthy subjects during hyperventilation (0.034 +/- 0.005 Hz) than during normal breathing (0.017 +/- 0.002 Hz). We also present our approach to depth-resolved CHS, based on multi-distance, frequency-domain NIRS data and a two-layer diffusion model, to enhance sensitivity to cerebral tissue. CHS offers a potentially powerful approach to the quantitative assessment and continuous monitoring of local brain perfusion at the microcirculation level, with prospective brain mapping capabilities of research and clinical significance.

  11. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  13. Dynamic regulation of heart rate during acute hypotension: new insight into baroreflex function

    NASA Technical Reports Server (NTRS)

    Zhang, R.; Behbehani, K.; Crandall, C. G.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    2001-01-01

    To examine the dynamic properties of baroreflex function, we measured beat-to-beat changes in arterial blood pressure (ABP) and heart rate (HR) during acute hypotension induced by thigh cuff deflation in 10 healthy subjects under supine resting conditions and during progressive lower body negative pressure (LBNP). The quantitative, temporal relationship between ABP and HR was fitted by a second-order autoregressive (AR) model. The frequency response was evaluated by transfer function analysis. Results: HR changes during acute hypotension appear to be controlled by an ABP error signal between baseline and induced hypotension. The quantitative relationship between changes in ABP and HR is characterized by a second-order AR model with a pure time delay of 0.75 s containing low-pass filter properties. During LBNP, the change in HR/change in ABP during induced hypotension significantly decreased, as did the numerator coefficients of the AR model and transfer function gain. Conclusions: 1) Beat-to-beat HR responses to dynamic changes in ABP may be controlled by an error signal rather than directional changes in pressure, suggesting a "set point" mechanism in short-term ABP control. 2) The quantitative relationship between dynamic changes in ABP and HR can be described by a second-order AR model with a pure time delay. 3) The ability of the baroreflex to evoke a HR response to transient changes in pressure was reduced during LBNP, which was due primarily to a reduction of the static gain of the baroreflex.
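    The second-order AR structure with a pure time delay described above can be written compactly. The following sketch (hypothetical coefficients; the paper's 0.75-s delay corresponds to a small number of samples at beat-to-beat resolution) simulates the HR response to a step ABP error signal:

```python
def simulate_hr_ar2(abp_error, a1, a2, b0, delay):
    """Second-order autoregressive model with a pure time delay:
    HR[n] = a1*HR[n-1] + a2*HR[n-2] + b0*err[n - delay]."""
    hr = [0.0, 0.0]
    for n in range(2, len(abp_error)):
        err = abp_error[n - delay] if n >= delay else 0.0
        hr.append(a1 * hr[-1] + a2 * hr[-2] + b0 * err)
    return hr
```

    With a1 = 0.5, a2 = -0.1, b0 = 1.0 and a unit step error, the response settles at b0 / (1 - a1 - a2) ≈ 1.67; reducing b0, as observed during LBNP, lowers this steady-state gain.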

  14. Drug-disease modeling in the pharmaceutical industry - where mechanistic systems pharmacology and statistical pharmacometrics meet.

    PubMed

    Helmlinger, Gabriel; Al-Huniti, Nidal; Aksenov, Sergey; Peskov, Kirill; Hallow, Karen M; Chu, Lulu; Boulton, David; Eriksson, Ulf; Hamrén, Bengt; Lambert, Craig; Masson, Eric; Tomkinson, Helen; Stanski, Donald

    2017-11-15

    Modeling & simulation (M&S) methodologies are established quantitative tools, which have proven to be useful in supporting the research, development (R&D), regulatory approval, and marketing of novel therapeutics. Applications of M&S help design efficient studies and interpret their results in context of all available data and knowledge to enable effective decision-making during the R&D process. In this mini-review, we focus on two sets of modeling approaches: population-based models, which are well-established within the pharmaceutical industry today, and fall under the discipline of clinical pharmacometrics (PMX); and systems dynamics models, which encompass a range of models of (patho-)physiology amenable to pharmacological intervention, of signaling pathways in biology, and of substance distribution in the body (today known as physiologically-based pharmacokinetic models) - which today may be collectively referred to as quantitative systems pharmacology models (QSP). We next describe the convergence - or rather selected integration - of PMX and QSP approaches into 'middle-out' drug-disease models, which retain selected mechanistic aspects, while remaining parsimonious, fit-for-purpose, and able to address variability and the testing of covariates. We further propose development opportunities for drug-disease systems models, to increase their utility and applicability throughout the preclinical and clinical spectrum of pharmaceutical R&D. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Numerical Modeling of Ophthalmic Response to Space

    NASA Technical Reports Server (NTRS)

    Nelson, E. S.; Myers, J. G.; Mulugeta, L.; Vera, J.; Raykin, J.; Feola, A.; Gleason, R.; Samuels, B.; Ethier, C. R.

    2015-01-01

    To investigate ophthalmic changes in spaceflight, we would like to predict the impact of blood dysregulation and elevated intracranial pressure (ICP) on Intraocular Pressure (IOP). Unlike other physiological systems, there are very few lumped parameter models of the eye. The eye model described here is novel in its inclusion of the human choroid and retrobulbar subarachnoid space (rSAS), which are key elements in investigating the impact of increased ICP and ocular blood volume. Some ingenuity was required in modeling the blood and rSAS compartments due to the lack of quantitative data on essential hydrodynamic quantities, such as net choroidal volume and blood flowrate, inlet and exit pressures, and material properties, such as compliances between compartments.

  16. A novel description of FDG excretion in the renal system: application to metformin-treated models

    NASA Astrophysics Data System (ADS)

    Garbarino, S.; Caviglia, G.; Sambuceti, G.; Benvenuto, F.; Piana, M.

    2014-05-01

    This paper introduces a novel compartmental model describing the excretion of 18F-fluoro-deoxyglucose (FDG) in the renal system, together with a maximum-likelihood-based numerical method for its reduction. The approach accounts for variations in FDG concentration due to water re-absorption in the renal tubules and for the increase of the bladder’s volume during the FDG excretion process. From the computational viewpoint, the reconstruction of the tracer kinetic parameters is obtained by solving the maximum likelihood problem iteratively, using a non-stationary steepest descent approach that explicitly accounts for the Poisson nature of nuclear medicine data. The reliability of the method is validated against two sets of synthetic data generated under realistic conditions. Finally, we apply this model to describe FDG excretion in animal models treated with metformin. In particular, we show that our approach allows quantitative estimation of the reduction of FDG de-phosphorylation induced by metformin.
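The Poisson-aware iterative fitting can be illustrated with a multiplicative (Richardson-Lucy/MLEM-style) update, used here as a simple stand-in for the authors' non-stationary steepest descent; the linear forward model and parameter values are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: maximum-likelihood fitting of kinetic parameters k
# from Poisson-distributed counts y with a linear forward model A @ k.
rng = np.random.default_rng(1)
A = rng.uniform(0.5, 1.5, (50, 3))       # assumed forward model (positive)
k_true = np.array([2.0, 0.5, 1.0])       # "true" kinetic parameters
y = rng.poisson(A @ k_true)              # simulated count data

k = np.ones(3)                           # positive initial guess
for _ in range(500):
    lam = A @ k
    # Multiplicative update preserves positivity and monotonically
    # increases the Poisson log-likelihood (MLEM form).
    k *= (A.T @ (y / lam)) / A.sum(axis=0)
```

The key point mirrored from the abstract is that the objective is the Poisson likelihood of the counts, not a least-squares misfit.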

  17. ICT & OTs: a model of information and communications technology acceptance and utilisation by occupational therapists (part 2).

    PubMed

    Schaper, Louise; Pervan, Graham

    2007-01-01

    The research reported in this paper describes the development, empirical validation and analysis of a model of technology acceptance by Australian occupational therapists. The study described involved the collection of quantitative data through a national survey. The theoretical significance of this work is that it uses a thoroughly constructed research model, with one of the largest sample sizes ever tested (n=1605), to extend technology acceptance research into the health sector. Results provide strong support for the model. This work reveals the complexity of the constructs and relationships that influence technology acceptance and highlights the need to include sociotechnical and system issues in studies of technology acceptance in healthcare to improve information system implementation success in this arena. The results of this study have practical and theoretical implications for health informaticians and researchers in the field of health informatics and information systems, tertiary educators, Commonwealth and State Governments and the allied health professions.

  18. Noise from Supersonic Coaxial Jets. Part 1; Mean Flow Predictions

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Morris, Philip J.

    1997-01-01

    Recent theories for supersonic jet noise have used an instability wave noise generation model to predict radiated noise. This model requires a known mean flow that has typically been described by simple analytic functions for single jet mean flows. The mean flow of supersonic coaxial jets is not described easily in terms of analytic functions. To provide these profiles at all axial locations, a numerical scheme is developed to calculate the mean flow properties of a coaxial jet. The Reynolds-averaged, compressible, parabolic boundary layer equations are solved using a mixing length turbulence model. Empirical correlations are developed to account for the effects of velocity and temperature ratios and Mach number on the shear layer spreading. Both normal velocity profile and inverted velocity profile coaxial jets are considered. The mixing length model is modified in each case to obtain reasonable results when the two stream jet merges into a single fully developed jet. The mean flow calculations show both good qualitative and quantitative agreement with measurements in single and coaxial jet flows.
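The mixing-length closure underlying the mean-flow calculation can be sketched as an eddy viscosity nu_t = l_m**2 * |du/dy|; the tanh shear-layer profile and the 7%-of-layer-width mixing length below are illustrative assumptions, not the paper's calibrated correlations.

```python
import numpy as np

# Mixing-length turbulence closure sketch on a shear-layer-like profile.
y = np.linspace(0.0, 1.0, 101)           # cross-stream coordinate
u = np.tanh(5.0 * (y - 0.5))             # assumed mean velocity profile
dudy = np.gradient(u, y)                 # mean shear

l_m = 0.07 * 1.0                         # mixing length ~ 7% of layer width
nu_t = l_m**2 * np.abs(dudy)             # eddy viscosity, largest at the
                                         # center of the shear layer
```

The eddy viscosity peaks where the shear is strongest, which is the qualitative behavior the empirical spreading-rate corrections then adjust for velocity ratio, temperature ratio, and Mach number.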

  19. TG study of the Li0.4Fe2.4Zn0.2O4 ferrite synthesis

    NASA Astrophysics Data System (ADS)

    Lysenko, E. N.; Nikolaev, E. V.; Surzhikov, A. P.

    2016-02-01

    In this paper, a kinetic analysis of Li-Zn ferrite synthesis was performed using the thermogravimetry (TG) method through the simultaneous application of non-linear regression to several measurements run at different heating rates (multivariate non-linear regression). Using the TG curves obtained at four heating rates and the Netzsch Thermokinetics software package, kinetic models with a minimal number of adjustable parameters were selected to quantitatively describe the reaction of Li-Zn ferrite synthesis. The experimental TG curves clearly suggest a two-step process for the ferrite synthesis, and therefore a model-fitting kinetic analysis based on multivariate non-linear regression was conducted. The complex reaction was described by a scheme consisting of two sequential reaction steps. The best results were obtained using the Jander three-dimensional diffusion model for the first step and the Ginstling-Brounshtein model for the second step. The kinetic parameters of the lithium-zinc ferrite synthesis reaction were determined and discussed.
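The Jander three-dimensional diffusion model used for the first step has the integral form g(alpha) = [1 - (1 - alpha)**(1/3)]**2 = k*t. The sketch below generates a conversion curve from an assumed rate constant and recovers it by linear regression; the rate constant and time grid are illustrative, not values from the paper.

```python
import numpy as np

# Jander 3D diffusion model: g(alpha) = [1 - (1 - alpha)^(1/3)]^2 = k*t.
k_true = 0.02                            # assumed rate constant (1/min)
t = np.linspace(1.0, 40.0, 40)           # time grid, min
g = k_true * t                           # model-predicted g(alpha)
alpha = 1.0 - (1.0 - np.sqrt(g))**3      # invert g to get conversion alpha(t)

# Recover k by linear regression of g(alpha) against t (zero intercept)
g_back = (1.0 - (1.0 - alpha)**(1.0 / 3.0))**2
k_fit = float(np.sum(g_back * t) / np.sum(t * t))
```

In a model-fitting analysis like the one described, candidate g(alpha) forms (Jander, Ginstling-Brounshtein, etc.) are compared by how well each linearizes the measured conversion data.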

  20. Exploring electrical resistance: a novel kinesthetic model helps to resolve some misconceptions

    NASA Astrophysics Data System (ADS)

    Cottle, Dan; Marshall, Rick

    2016-09-01

    A simple ‘hands on’ physical model is described which displays behaviour analogous to some aspects of the free electron theory of metals. Using it, students can get a real feel for what is going on inside a metallic conductor. Ohm's law, the temperature dependence of resistivity, the dependence of resistance on geometry, how the conduction electrons respond to a potential difference, and the concepts of mean free path and drift speed of the conduction electrons can all be explored. Some quantitative results obtained using the model are compared with the predictions of Drude’s free electron theory of electrical conduction.
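Drude-theory predictions of the kind compared against the kinesthetic model can be computed directly from sigma = n*e^2*tau/m and v_drift = e*E*tau/m; the relaxation time and field strength below are assumed order-of-magnitude values for copper, not measurements.

```python
# Drude free-electron estimates for copper (illustrative values).
e = 1.602e-19      # C, elementary charge
m = 9.109e-31      # kg, electron mass
n = 8.5e28         # m^-3, conduction-electron density of copper
tau = 2.5e-14      # s, assumed mean free time between collisions

sigma = n * e**2 * tau / m        # electrical conductivity, S/m
E = 0.1                           # V/m, assumed applied field
v_drift = e * E * tau / m         # m/s, electron drift speed
```

The striking classroom result is that the drift speed is tiny (well under a millimetre per second) even though the conductivity is enormous, because the carrier density n is so large.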

  1. Multidimensional analysis of data obtained in experiments with X-ray emulsion chambers and extensive air showers

    NASA Technical Reports Server (NTRS)

    Chilingaryan, A. A.; Galfayan, S. K.; Zazyan, M. Z.; Dunaevsky, A. M.

    1985-01-01

    Nonparametric statistical methods are used to carry out a quantitative comparison of the model and the experimental data. The same methods enable one to select the events initiated by heavy nuclei and to calculate the proportion of the corresponding events. For this purpose it is necessary to have well-established data on artificial events that describe the experiment sufficiently well. At present, the model with small scaling violation in the fragmentation region is the closest to the experiments. Therefore, the treatment of gamma families obtained in the 'Pamir' experiment is currently being carried out with these models.

  2. Baseline Error Analysis and Experimental Validation for Height Measurement of Formation Insar Satellite

    NASA Astrophysics Data System (ADS)

    Gao, X.; Li, T.; Zhang, X.; Geng, X.

    2018-04-01

    In this paper, we propose a stochastic model of InSAR height measurement that accounts for the interferometric geometry. The model directly describes the relationship between baseline error and height-measurement error. A simulation analysis using TanDEM-X parameters was then carried out to quantitatively evaluate the influence of baseline error on height measurement. Furthermore, a full emulation validation of the InSAR stochastic model was performed on the basis of the SRTM DEM and TanDEM-X parameters. The spatial distribution characteristics and error propagation behaviour of InSAR height measurement were fully evaluated.
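The first-order propagation of perpendicular-baseline error into height error can be sketched from the standard InSAR height relation h = lambda*R*sin(theta)*phi / (4*pi*B_perp); the geometry numbers below are illustrative stand-ins, not actual TanDEM-X parameters.

```python
import math

# First-order sensitivity of InSAR height to perpendicular-baseline error.
wavelength = 0.031           # m, X-band (assumed)
slant_range = 600e3          # m, slant range (assumed)
theta = math.radians(35.0)   # look angle (assumed)
b_perp = 200.0               # m, perpendicular baseline (assumed)
phi = 40.0                   # rad, unwrapped interferometric phase (assumed)

h = wavelength * slant_range * math.sin(theta) * phi / (4.0 * math.pi * b_perp)

# Since dh/dB_perp = -h / B_perp, a baseline error dB maps to dh = -h*dB/B_perp
db = 0.01                    # m, assumed 1 cm baseline error
dh = -h * db / b_perp        # resulting height error, m
```

The sensitivity scales with h/B_perp, which is why millimetre-level baseline knowledge matters for metre-scale topography from short baselines.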

  3. Structure and Function of Iron-Loaded Synthetic Melanin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yiwen; Xie, Yijun; Wang, Zhao

    We describe a synthetic method for increasing and controlling the iron loading of synthetic melanin nanoparticles and use the resulting materials to perform a systematic quantitative investigation of their structure-property relationship. A comprehensive analysis by magnetometry, electron paramagnetic resonance, and nuclear magnetic relaxation dispersion reveals the complexities of their magnetic behavior and how these intraparticle magnetic interactions manifest in useful material properties such as their performance as MRI contrast agents. This analysis allows predictions of the optimal iron loading through a quantitative modeling of antiferromagnetic coupling that arises from proximal iron ions. This study provides a detailed understanding of this complex class of synthetic biomaterials and gives insight into interactions and structures prevalent in naturally occurring melanins.

  4. Quantitative Thermochemical Measurements in High-Pressure Gaseous Combustion

    NASA Technical Reports Server (NTRS)

    Kojima, Jun J.; Fischer, David G.

    2012-01-01

    We present our strategic experiments and thermochemical analyses of combustion flow using subframe burst gating (SBG) Raman spectroscopy. This unconventional laser diagnostic technique has a promising ability to enhance the accuracy of quantitative scalar measurements in a point-wise single-shot fashion. In the presentation, we briefly describe an experimental methodology that generates a transferable calibration standard for routine implementation of the diagnostics in hydrocarbon flames. The diagnostic technology was applied to simultaneous measurements of temperature and chemical species in a swirl-stabilized turbulent flame with gaseous methane fuel at elevated pressure (17 atm). Statistical analyses of the space-/time-resolved thermochemical data provide insights into the nature of the mixing process and its impact on the subsequent combustion process in the model combustor.

  5. Nonequilibrium fluctuations in metaphase spindles: polarized light microscopy, image registration, and correlation functions

    NASA Astrophysics Data System (ADS)

    Brugués, Jan; Needleman, Daniel J.

    2010-02-01

    Metaphase spindles are highly dynamic, nonequilibrium, steady-state structures. We study the internal fluctuations of spindles by computing spatio-temporal correlation functions of movies obtained from quantitative polarized light microscopy. These correlation functions are only physically meaningful if corrections are made for the net motion of the spindle. We describe our image registration algorithm in detail and we explore its robustness. Finally, we discuss the expression used for the estimation of the correlation function in terms of the nematic order of the microtubules which make up the spindle. Ultimately, studying the form of these correlation functions will provide a quantitative test of the validity of coarse-grained models of spindle structure inspired from liquid crystal physics.
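After registration removes the net motion of the spindle, correlation functions can be estimated from the residual fluctuations; the sketch below computes a normalized temporal correlation from a synthetic AR(1) intensity series standing in for registered polarized-light data.

```python
import numpy as np

# Synthetic registered intensity fluctuations (AR(1), assumed) at one point.
rng = np.random.default_rng(3)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + rng.normal()

x -= x.mean()                            # remove the mean before correlating

def corr(lag):
    # Normalized temporal correlation estimate at a positive lag.
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

c1, c5 = corr(1), corr(5)                # decays with increasing lag
```

Without the registration (mean-removal) step, net drift would dominate the estimate and mask the internal fluctuations, which is the point the abstract emphasizes.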

  6. Quantitative Studies on the Propagation and Extinction of Near-Limit Premixed Flames Under Normal and Microgravity

    NASA Technical Reports Server (NTRS)

    Dong, Y.; Spedding, G. R.; Egolfopoulos, F. N.; Miller, F. J.

    2003-01-01

    The main objective of this research is to introduce accurate fluid-mechanics measurement diagnostics in the 2.2-s drop tower for the determination of the detailed flow field at the states of extinction. These results are important as they can then be compared with confidence against detailed numerical simulations, providing important insight into near-limit phenomena that are controlled by poorly understood kinetics and thermal radiation processes. Past qualitative studies did enhance our general understanding of the subject. However, quantitative studies are essential for the validation of existing models that can subsequently be used to describe near-limit phenomena that can initiate catastrophic events in micro- and/or reduced-gravity environments.

  7. Quantitative analysis of the flexibility effect of cisplatin on circular DNA

    NASA Astrophysics Data System (ADS)

    Ji, Chao; Zhang, Lingyun; Wang, Peng-Ye

    2013-10-01

    We study the effects of cisplatin on the circular configuration of DNA using atomic force microscopy (AFM) and observe that the DNA gradually transforms from a circle-like structure to a complex configuration with an intersection and interwound structures. An algorithm is developed to extract the configuration profiles of circular DNA from AFM images, and the radius of gyration is used to describe the flexibility of circular DNA. The quantitative analysis of the circular DNA demonstrates that the radius of gyration gradually decreases, and two processes in the change of flexibility of circular DNA are found as the cisplatin concentration increases. Furthermore, a model is proposed and discussed to explain the mechanism underlying the complicated interaction between DNA and cisplatin.
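The radius of gyration used to quantify flexibility is the root-mean-square distance of the contour points from their centroid. The sketch below computes it for a hypothetical circular contour, for which Rg equals the circle radius; a traced AFM profile would simply replace the synthetic points.

```python
import numpy as np

# Hypothetical digitized contour: 100 points on a 50 nm circle, standing
# in for an AFM-traced DNA profile.
theta = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
r = 50.0                                             # nm, assumed radius
points = np.column_stack([r * np.cos(theta), r * np.sin(theta)])

# Rg^2 = mean squared distance of contour points from their centroid
centroid = points.mean(axis=0)
rg = float(np.sqrt(np.mean(np.sum((points - centroid)**2, axis=1))))
```

As cisplatin compacts the molecule, the traced points move inward relative to the centroid, so Rg computed this way decreases, matching the trend reported.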

  8. The Systems Biology Markup Language (SBML) Level 3 Package: Qualitative Models, Version 1, Release 1.

    PubMed

    Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš

    2015-09-04

    Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.
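The kind of logical model the SBML Qualitative Models package encodes, discrete activity levels plus transition rules, can be sketched as a small synchronous Boolean network; the three-gene circuit below is invented for illustration and does not use SBML syntax itself.

```python
# A minimal Boolean regulatory network of the kind SBML-qual encodes:
# each species has a discrete level (0/1) and a transition rule.
def step(state):
    a, b, c = state["A"], state["B"], state["C"]
    return {
        "A": int(not c),       # C represses A (invented rule)
        "B": int(a),           # A activates B (invented rule)
        "C": int(a and b),     # A and B jointly activate C (invented rule)
    }

# Synchronous updating from an initial state
state = {"A": 1, "B": 0, "C": 0}
trajectory = [state]
for _ in range(4):
    state = step(state)
    trajectory.append(state)
```

In SBML-qual terms, the dictionary keys correspond to qualitative species and the rules to transitions; tools consuming the package would serialize this structure to XML rather than Python.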

  9. Methodological Issues in Examining Measurement Equivalence in Patient Reported Outcomes Measures: Methods Overview to the Two-Part Series, “Measurement Equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS®) Short Forms”

    PubMed Central

    Teresi, Jeanne A.; Jones, Richard N.

    2017-01-01

    The purpose of this article is to introduce the methods used and challenges confronted by the authors of this two-part series of articles describing the results of analyses of measurement equivalence of the short form scales from the Patient Reported Outcomes Measurement Information System® (PROMIS®). Qualitative and quantitative approaches used to examine differential item functioning (DIF) are reviewed briefly. Qualitative methods focused on generation of DIF hypotheses. The basic quantitative approaches used all rely on a latent variable model, and examine parameters either derived directly from item response theory (IRT) or from structural equation models (SEM). A key methods focus of these articles is to describe state-of-the-art approaches to examination of measurement equivalence in eight domains: physical health, pain, fatigue, sleep, depression, anxiety, cognition, and social function. These articles represent the first time that DIF has been examined systematically in the PROMIS short form measures, particularly among ethnically diverse groups. This is also the first set of analyses to examine the performance of PROMIS short forms in patients with cancer. Latent variable model state-of-the-art methods for examining measurement equivalence are introduced briefly in this paper to orient readers to the approaches adopted in this set of papers. Several methodological challenges underlying (DIF-free) anchor item selection and model assumption violations are presented as a backdrop for the articles in this two-part series on measurement equivalence of PROMIS measures. PMID:28983448

  10. Methodological Issues in Examining Measurement Equivalence in Patient Reported Outcomes Measures: Methods Overview to the Two-Part Series, "Measurement Equivalence of the Patient Reported Outcomes Measurement Information System® (PROMIS®) Short Forms".

    PubMed

    Teresi, Jeanne A; Jones, Richard N

    2016-01-01

    The purpose of this article is to introduce the methods used and challenges confronted by the authors of this two-part series of articles describing the results of analyses of measurement equivalence of the short form scales from the Patient Reported Outcomes Measurement Information System® (PROMIS®). Qualitative and quantitative approaches used to examine differential item functioning (DIF) are reviewed briefly. Qualitative methods focused on generation of DIF hypotheses. The basic quantitative approaches used all rely on a latent variable model, and examine parameters either derived directly from item response theory (IRT) or from structural equation models (SEM). A key methods focus of these articles is to describe state-of-the-art approaches to examination of measurement equivalence in eight domains: physical health, pain, fatigue, sleep, depression, anxiety, cognition, and social function. These articles represent the first time that DIF has been examined systematically in the PROMIS short form measures, particularly among ethnically diverse groups. This is also the first set of analyses to examine the performance of PROMIS short forms in patients with cancer. Latent variable model state-of-the-art methods for examining measurement equivalence are introduced briefly in this paper to orient readers to the approaches adopted in this set of papers. Several methodological challenges underlying (DIF-free) anchor item selection and model assumption violations are presented as a backdrop for the articles in this two-part series on measurement equivalence of PROMIS measures.

  11. Quantitative study of interactions between oxygen lone pair and aromatic rings: substituent effect and the importance of closeness of contact.

    PubMed

    Gung, Benjamin W; Zou, Yan; Xu, Zhigang; Amicangelo, Jay C; Irwin, Daniel G; Ma, Shengqian; Zhou, Hong-Cai

    2008-01-18

    Current models describe aromatic rings as polar groups, based on the fact that benzene and hexafluorobenzene are known to have large, permanent quadrupole moments. This report describes a quantitative study of the interactions between oxygen lone pairs and aromatic rings. We found that even electron-rich aromatic rings and oxygen lone pairs exhibit attractive interactions. Free energies of interaction were determined using the triptycene scaffold, and the equilibrium constants were determined by low-temperature 1H NMR spectroscopy. An X-ray structure analysis for one of the model compounds confirms the close proximity between the oxygen and the center of the aromatic ring. Theoretical calculations at the MP2/aug-cc-pVTZ level corroborate the experimental results. The origin of the attractive interactions was explored by using aromatic rings with a wide range of substituents. The interactions between an oxygen lone pair and an aromatic ring are attractive at van der Waals' distance even with electron-donating substituents. Electron-withdrawing groups increase the strength of the attractive interactions. The results from this study can be only partly rationalized by the current models of aromatic systems. Electrostatic-based models are consistent with the fact that stronger electron-withdrawing groups lead to stronger attractions, but fail to predict or rationalize the fact that weak attractions exist even between electron-rich arenes and oxygen lone pairs. The conclusion from this study is that aromatic rings cannot be treated as a simple quadrupolar functional group at van der Waals' distance. Dispersion forces and local dipoles should also be considered.

  12. Developing a logic model for youth mental health: participatory research with a refugee community in Beirut

    PubMed Central

    Afifi, Rema A; Makhoul, Jihad; El Hajj, Taghreed; Nakkash, Rima T

    2011-01-01

    Although logic models are now touted as an important component of health promotion planning, implementation and evaluation, there are few published manuscripts that describe the process of logic model development, and fewer which do so with community involvement, despite the increasing emphasis on participatory research. This paper describes a process leading to the development of a logic model for a youth mental health promotion intervention using a participatory approach in a Palestinian refugee camp in Beirut, Lebanon. First, a needs assessment, including quantitative and qualitative data collection was carried out with children, parents and teachers. The second phase was identification of a priority health issue and analysis of determinants. The final phase in the construction of the logic model involved development of an intervention. The process was iterative and resulted in a more grounded depiction of the pathways of influence informed by evidence. Constructing a logic model with community input ensured that the intervention was more relevant to community needs, feasible for implementation and more likely to be sustainable. PMID:21278370

  13. An improved Brass correlational fertility model.

    PubMed

    Zhang, E; Chen, J

    1995-01-01

    Demographers have for years tried to establish a mathematical model capable of accurately describing patterns of fertility change. William Brass's Gompertz correlational fertility model is based upon a standard age-specific fertility pattern correlated to the age-specific fertility rate of the area under study with the purpose of simulating the actual age-specific fertility rate of the area. While the Brass correlational fertility model has solved many problems in quantitative studies of fertility and has been applied in population simulation and prediction, it has been unsatisfactory in analyzing fertility changes in China. The authors therefore developed a parity-specific correlational model to better reflect the situation of rapid fertility decline in China. This modified model better describes the impact of current family planning policy in China. Moreover, satisfactory results can be obtained by simulating and analyzing fertility in recent years, and major parameters can be identified by using demographically definite and readily manageable indicators. These indicators can clearly reflect the goals of the country's family planning policy, such as the average age at child-bearing, median age at child-bearing, early reproduction ratio, and percentage of the second child.

  14. People's Risk Recognition Preceding Evacuation and Its Role in Demand Modeling and Planning.

    PubMed

    Urata, Junji; Pel, Adam J

    2018-05-01

    Evacuation planning and management involves estimating the travel demand in the event that such action is required. This is usually done as a function of people's decision to evacuate, which we show is strongly linked to their risk awareness. We use an empirical data set on tsunami evacuation behavior to demonstrate that risk recognition is not synonymous with objective risk, but is instead determined by a combination of factors including risk education, information, and sociodemographics, and that it changes dynamically over time. Based on these findings, we formulate an ordered logit model to describe risk recognition, combined with a latent class model to describe evacuation choices. Our proposed evacuation choice model, with its risk recognition class, can quantitatively evaluate the influence of disaster mitigation measures, risk education, and risk information. The results obtained from the risk recognition model show that risk information has the greater impact, in the sense that it leads people to recognize that they are at high risk. The results of the evacuation choice model show that people who are unaware of their risk take longer to evacuate. © 2017 Society for Risk Analysis.
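The ordered logit component can be sketched as follows: cumulative probabilities of ordered risk-recognition levels (e.g. low/medium/high) are logistic in a linear predictor, P(level <= j) = sigmoid(tau_j - beta*x). The thresholds, coefficient, and covariate values below are invented for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def ordered_logit_probs(x, beta, taus):
    # Cumulative probabilities at each threshold, then the top category
    cdf = [sigmoid(t - beta * x) for t in taus] + [1.0]
    # Category probabilities are differences of adjacent cumulatives
    return [cdf[0]] + [cdf[j] - cdf[j - 1] for j in range(1, len(cdf))]

# Hypothetical covariate x = amount of risk information received:
# more information shifts probability mass toward higher recognition.
p_low_info = ordered_logit_probs(x=0.0, beta=1.5, taus=[-0.5, 1.0])
p_high_info = ordered_logit_probs(x=2.0, beta=1.5, taus=[-0.5, 1.0])
```

The qualitative behavior matches the abstract's finding: increasing the information covariate raises the probability of the "high risk recognized" category.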

  15. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    PubMed

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and found to be acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
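The ordering idea, minimising patient flow from already-switched (new-system) areas to not-yet-switched (old-system) areas, can be sketched on a toy flow matrix; the ward names and transfer counts are invented, and a three-ward instance is small enough to solve by exhaustive search rather than the paper's method.

```python
from itertools import permutations

# Hypothetical patient-transfer counts between three clinical areas.
flow = {
    ("ED", "Ward"): 30, ("Ward", "ED"): 2,
    ("Ward", "ICU"): 5,  ("ICU", "Ward"): 8,
    ("ED", "ICU"): 4,    ("ICU", "ED"): 1,
}

def new_to_old_cost(order):
    # Areas earlier in the order run the new system while later ones still
    # run the old one; count transfers in that risky new -> old direction.
    cost = 0
    for i, src in enumerate(order):
        for dst in order[i + 1:]:
            cost += flow.get((src, dst), 0)
    return cost

best = min(permutations(["ED", "Ward", "ICU"]), key=new_to_old_cost)
```

With these invented numbers the search switches the Ward first and the ED last, because the large ED-to-Ward flow is safest once both ends are on the new system.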

  16. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.

  17. A quantitative study on magnesium alloy stent biodegradation.

    PubMed

    Gao, Yuanming; Wang, Lizhen; Gu, Xuenan; Chu, Zhaowei; Guo, Meng; Fan, Yubo

    2018-06-06

    Insufficient scaffolding time caused by rapid corrosion is the main problem of magnesium alloy stents (MAS). The finite element method has been used to investigate MAS corrosion; however, related studies have mostly treated element corrosion as one-dimensional. Multi-dimensional corrosion significantly influences the mechanical integrity of MAS structures such as edges and corners. In this study, the effects of multi-dimensional corrosion were studied quantitatively by experiment, and a phenomenological corrosion model was then developed to capture these effects. We performed immersion tests with magnesium alloy (AZ31B) cubes having different numbers of exposed surfaces to analyze the dimensional differences. The corrosion rates of the cubes were found to be almost proportional to their exposed-surface numbers, especially when pitting corrosion is not pronounced. The cubes also represent the hexahedral elements used in simulation. In conclusion, the corrosion rate of each element accelerates with increasing corrosion-surface number in multi-dimensional corrosion. The damage ratios among elements of the same size are proportional to the ratios of their corrosion-surface numbers under uniform corrosion. A finite element simulation using the proposed model provided further details of the changes in morphology and mechanics over the scaffolding time by removing 25.7% of the MAS elements. The proposed corrosion model reflects the effects of multi-dimensionality on corrosion and can be used to predict the degradation process of MAS quantitatively. Copyright © 2018 Elsevier Ltd. All rights reserved.
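The reported proportionality between corrosion rate and exposed-surface number suggests an element-damage rule of the following form; the base rate, time span, and element types below are hypothetical, and only the proportionality mirrors the paper's finding.

```python
# Element damage accrues in proportion to the number of exposed faces;
# an element is removed once its damage reaches 1 (fully corroded).
base_rate = 0.01                         # assumed damage per day per face
elements = {"corner": 3, "edge": 2, "face": 1}   # exposed faces per type
damage = {name: 0.0 for name in elements}
removed = []

for day in range(1, 40):                 # simulate 39 days
    for name, faces in elements.items():
        if name in removed:
            continue
        damage[name] += base_rate * faces
        if damage[name] >= 1.0:
            removed.append(name)         # corner elements fail first
```

Corner elements, with three corrosion surfaces, accumulate damage three times faster than interior-face elements, which is the geometric effect one-dimensional corrosion models miss.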

  18. Preclinical Magnetic Resonance Fingerprinting (MRF) at 7 T: Effective Quantitative Imaging for Rodent Disease Models

    PubMed Central

    Gao, Ying; Chen, Yong; Ma, Dan; Jiang, Yun; Herrmann, Kelsey A.; Vincent, Jason A.; Dell, Katherine M.; Drumm, Mitchell L.; Brady-Kalnay, Susann M.; Griswold, Mark A.; Flask, Chris A.; Lu, Lan

    2015-01-01

    High field, preclinical magnetic resonance imaging (MRI) scanners are now commonly used to quantitatively assess disease status and efficacy of novel therapies in a wide variety of rodent models. Unfortunately, conventional MRI methods are highly susceptible to respiratory and cardiac motion artifacts resulting in potentially inaccurate and misleading data. We have developed an initial preclinical, 7.0 T MRI implementation of the highly novel Magnetic Resonance Fingerprinting (MRF) methodology that has been previously described for clinical imaging applications. The MRF technology combines a priori variation in the MRI acquisition parameters with dictionary-based matching of acquired signal evolution profiles to simultaneously generate quantitative maps of T1 and T2 relaxation times and proton density. This preclinical MRF acquisition was constructed from a Fast Imaging with Steady-state Free Precession (FISP) MRI pulse sequence to acquire 600 MRF images with both evolving T1 and T2 weighting in approximately 30 minutes. This initial high field preclinical MRF investigation demonstrated reproducible and differentiated estimates of in vitro phantoms with different relaxation times. In vivo preclinical MRF results in mouse kidneys and brain tumor models demonstrated an inherent resistance to respiratory motion artifacts as well as sensitivity to known pathology. These results suggest that MRF methodology may offer the opportunity for quantification of numerous MRI parameters for a wide variety of preclinical imaging applications. PMID:25639694

  19. Preclinical MR fingerprinting (MRF) at 7 T: effective quantitative imaging for rodent disease models.

    PubMed

    Gao, Ying; Chen, Yong; Ma, Dan; Jiang, Yun; Herrmann, Kelsey A; Vincent, Jason A; Dell, Katherine M; Drumm, Mitchell L; Brady-Kalnay, Susann M; Griswold, Mark A; Flask, Chris A; Lu, Lan

    2015-03-01

    High-field preclinical MRI scanners are now commonly used to quantitatively assess disease status and the efficacy of novel therapies in a wide variety of rodent models. Unfortunately, conventional MRI methods are highly susceptible to respiratory and cardiac motion artifacts resulting in potentially inaccurate and misleading data. We have developed an initial preclinical 7.0-T MRI implementation of the highly novel MR fingerprinting (MRF) methodology which has been described previously for clinical imaging applications. The MRF technology combines a priori variation in the MRI acquisition parameters with dictionary-based matching of acquired signal evolution profiles to simultaneously generate quantitative maps of T1 and T2 relaxation times and proton density. This preclinical MRF acquisition was constructed from a fast imaging with steady-state free precession (FISP) MRI pulse sequence to acquire 600 MRF images with both evolving T1 and T2 weighting in approximately 30 min. This initial high-field preclinical MRF investigation demonstrated reproducible and differentiated estimates of in vitro phantoms with different relaxation times. In vivo preclinical MRF results in mouse kidneys and brain tumor models demonstrated an inherent resistance to respiratory motion artifacts as well as sensitivity to known pathology. These results suggest that MRF methodology may offer the opportunity for the quantification of numerous MRI parameters for a wide variety of preclinical imaging applications. Copyright © 2015 John Wiley & Sons, Ltd.
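    The dictionary-matching step at the heart of MRF can be illustrated with a toy example (the signal model below is a made-up stand-in, not the FISP sequence; parameter values are hypothetical): each dictionary entry is a simulated signal evolution for a candidate (T1, T2) pair, and the acquired signal is assigned the parameters of the entry with the largest normalized inner product.

```python
# Toy MRF-style dictionary matching: match an acquired signal evolution to
# precomputed entries via normalized inner products, reading off (T1, T2).
import math

def signal(t1, t2, n=50):
    # Hypothetical signal model, for illustration only.
    return [math.exp(-i / t2) * (1 - math.exp(-i / t1)) for i in range(1, n + 1)]

def normalize(v):
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v]

dictionary = {(t1, t2): normalize(signal(t1, t2))
              for t1 in (5.0, 10.0, 20.0) for t2 in (2.0, 4.0, 8.0)}

acquired = normalize(signal(10.0, 4.0))  # "measured" evolution
best = max(dictionary, key=lambda p: sum(a * b for a, b in zip(acquired, dictionary[p])))
assert best == (10.0, 4.0)  # matching recovers the underlying T1 and T2
```

    In the real method the dictionary is simulated from the Bloch equations over the full pseudorandomized acquisition schedule, which is what makes the simultaneous T1/T2/proton-density maps possible.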

  20. Photoelectron angular distributions for states of any mixed character: An experiment-friendly model for atomic, molecular, and cluster anions

    NASA Astrophysics Data System (ADS)

    Khuseynov, Dmitry; Blackstone, Christopher C.; Culberson, Lori M.; Sanov, Andrei

    2014-09-01

    We present a model for laboratory-frame photoelectron angular distributions in direct photodetachment from (in principle) any molecular orbital using linearly polarized light. A transparent mathematical approach is used to generalize the Cooper-Zare central-potential model to anionic states of any mixed character. In the limit of atomic-anion photodetachment, the model reproduces the Cooper-Zare formula. In the case of an initial orbital described as a superposition of s and p-type functions, the model yields the previously obtained s-p mixing formula. The formalism is further advanced using the Hanstorp approximation, whereas the relative scaling of the partial-wave cross-sections is assumed to follow the Wigner threshold law. The resulting model describes the energy dependence of photoelectron anisotropy for any atomic, molecular, or cluster anions, usually without requiring a direct calculation of the transition dipole matrix elements. As a benchmark case, we apply the p-d variant of the model to the experimental results for NO- photodetachment and show that the observed anisotropy trend is described well using physically meaningful values of the model parameters. Overall, the presented formalism delivers insight into the photodetachment process and affords a new quantitative strategy for analyzing the photoelectron angular distributions and characterizing mixed-character molecular orbitals using photoelectron imaging spectroscopy of negative ions.

  1. Photoelectron angular distributions for states of any mixed character: an experiment-friendly model for atomic, molecular, and cluster anions.

    PubMed

    Khuseynov, Dmitry; Blackstone, Christopher C; Culberson, Lori M; Sanov, Andrei

    2014-09-28

    We present a model for laboratory-frame photoelectron angular distributions in direct photodetachment from (in principle) any molecular orbital using linearly polarized light. A transparent mathematical approach is used to generalize the Cooper-Zare central-potential model to anionic states of any mixed character. In the limit of atomic-anion photodetachment, the model reproduces the Cooper-Zare formula. In the case of an initial orbital described as a superposition of s and p-type functions, the model yields the previously obtained s-p mixing formula. The formalism is further advanced using the Hanstorp approximation, whereas the relative scaling of the partial-wave cross-sections is assumed to follow the Wigner threshold law. The resulting model describes the energy dependence of photoelectron anisotropy for any atomic, molecular, or cluster anions, usually without requiring a direct calculation of the transition dipole matrix elements. As a benchmark case, we apply the p-d variant of the model to the experimental results for NO(-) photodetachment and show that the observed anisotropy trend is described well using physically meaningful values of the model parameters. Overall, the presented formalism delivers insight into the photodetachment process and affords a new quantitative strategy for analyzing the photoelectron angular distributions and characterizing mixed-character molecular orbitals using photoelectron imaging spectroscopy of negative ions.
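    For orientation, the Cooper-Zare central-potential formula referred to above (quoted here in its standard textbook form, not taken from this abstract) gives the anisotropy parameter for photodetachment from an orbital of angular momentum $\ell$, with $\chi_{\ell\pm1}$ the radial dipole matrix elements of the two allowed partial waves and $\delta$ their relative phase shift:

```latex
\beta_\ell(\varepsilon) =
\frac{\ell(\ell-1)\,\chi_{\ell-1}^{2}
      + (\ell+1)(\ell+2)\,\chi_{\ell+1}^{2}
      - 6\,\ell(\ell+1)\,\chi_{\ell-1}\chi_{\ell+1}\cos\delta}
     {(2\ell+1)\left[\ell\,\chi_{\ell-1}^{2} + (\ell+1)\,\chi_{\ell+1}^{2}\right]}
```

    In the Hanstorp approximation mentioned in the abstract, the ratio of the matrix elements is taken to grow linearly with photoelectron energy, $\chi_{\ell+1}/\chi_{\ell-1} = A_\ell\,\varepsilon$, consistent with the Wigner threshold scaling of the partial-wave cross-sections, $\sigma_{\ell'} \propto \varepsilon^{\ell'+1/2}$.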

  2. Assessing the durability and efficiency of landscape-based strategies to deploy plant resistance to pathogens

    PubMed Central

    Rey, Jean-François; Barrett, Luke G.; Thrall, Peter H.

    2018-01-01

    Genetically-controlled plant resistance can reduce the damage caused by pathogens. However, pathogens have the ability to evolve and overcome such resistance. This often occurs quickly after resistance is deployed, resulting in significant crop losses and a continuing need to develop new resistant cultivars. To tackle this issue, several strategies have been proposed to constrain the evolution of pathogen populations and thus increase genetic resistance durability. These strategies mainly rely on varying different combinations of resistance sources across time (crop rotations) and space. The spatial scale of deployment can vary from multiple resistance sources occurring in a single cultivar (pyramiding), in different cultivars within the same field (cultivar mixtures) or in different fields (mosaics). However, experimental comparison of the efficiency (i.e. ability to reduce disease impact) and durability (i.e. ability to limit pathogen evolution and delay resistance breakdown) of landscape-scale deployment strategies presents major logistical challenges. Therefore, we developed a spatially explicit stochastic model able to assess the epidemiological and evolutionary outcomes of the four major deployment options described above, including both qualitative resistance (i.e. major genes) and quantitative resistance traits against several components of pathogen aggressiveness: infection rate, latent period duration, propagule production rate, and infectious period duration. This model, implemented in the R package landsepi, provides a new and useful tool to assess the performance of a wide range of deployment options, and helps investigate the effect of landscape, epidemiological and evolutionary parameters. This article describes the model and its parameterisation for rust diseases of cereal crops, caused by fungi of the genus Puccinia. To illustrate the model, we use it to assess the epidemiological and evolutionary potential of the combination of a major gene and different traits of quantitative resistance. The comparison of the four major deployment strategies described above will be the objective of future studies. PMID:29649208
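    The flavor of such spatially explicit stochastic simulations can be conveyed with a deliberately minimal toy (this is not the landsepi model; the landscape, infection probabilities, and resistance effect below are all hypothetical): a pathogen spreads along a row of fields, and a mosaic of resistant fields lowers the local infection rate.

```python
# Toy stochastic epidemic on a 1-D landscape of fields, comparing a fully
# susceptible landscape with a mosaic in which every other field carries a
# (hypothetical) resistance that reduces the infection probability.
import random

def epidemic(resistant_mask, p_infect=0.6, steps=30, seed=1):
    rng = random.Random(seed)
    n = len(resistant_mask)
    infected = [False] * n
    infected[0] = True                      # epidemic starts in field 0
    for _ in range(steps):
        for i in range(n):
            if infected[i]:
                continue
            has_infected_neighbour = any(
                0 <= j < n and infected[j] for j in (i - 1, i + 1))
            p = p_infect * (0.3 if resistant_mask[i] else 1.0)
            if has_infected_neighbour and rng.random() < p:
                infected[i] = True
    return sum(infected)

n = 40
monos = [epidemic([False] * n, seed=s) for s in range(10)]
mosaics = [epidemic([i % 2 == 1 for i in range(n)], seed=s) for s in range(10)]
assert sum(mosaics) < sum(monos)  # the mosaic deployment slows the epidemic
```

    The real model additionally tracks pathogen genotypes, so that resistance breakdown (durability), not just epidemic size (efficiency), can be scored.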

  3. Beyond the Förster formulation for resonance energy transfer: the role of dark states.

    PubMed

    Sissa, C; Manna, A K; Terenziani, F; Painelli, A; Pati, S K

    2011-07-28

    Resonance Energy Transfer (RET) is investigated in pairs of charge-transfer (CT) chromophores. CT chromophores are an interesting class of π conjugated chromophores decorated with one or more electron-donor and acceptor groups in polar (D-π-A), quadrupolar (D-π-A-π-D or A-π-D-π-A) or octupolar (D(-π-A)(3) or A(-π-D)(3)) structures. Essential-state models accurately describe low-energy linear and nonlinear spectra of CT-chromophores and proved very useful to describe spectroscopic effects of electrostatic interchromophore interactions in multichromophoric assemblies. Here we apply the same approach to describe RET between CT-chromophores. The results are quantitatively validated by an extensive comparison with time-dependent density functional theory (TDDFT) calculations, confirming that essential-state models offer a simple and reliable approach for the calculation of electrostatic interchromophore interactions. This is an important result since it sets the basis for more refined treatments of RET: essential-state models are in fact easily extended to account for molecular vibrations in truly non-adiabatic approaches and to account for inhomogeneous broadening effects due to polar solvation. Optically forbidden (dark) states of quadrupolar and octupolar chromophores offer an interesting opportunity to verify the reliability of the dipolar approximation. In striking contrast with the dipolar approximation that strictly forbids RET towards or from dark states, our results demonstrate that dark states can take an active role in RET with interaction energies that, depending on the relative orientation of the chromophores, can be even larger than those relevant to allowed states. Essential-state models, whose predictions are quantitatively confirmed by TDDFT results, allow us to relate RET interaction energies towards allowed and dark states to the supramolecular symmetry of the RET-pair, offering reliable design strategies to optimize RET-interactions. This journal is © the Owner Societies 2011

  4. Molecular systems biology of ErbB1 signaling: bridging the gap through multiscale modeling and high-performance computing.

    PubMed

    Shih, Andrew J; Purvis, Jeremy; Radhakrishnan, Ravi

    2008-12-01

    The complexity in intracellular signaling mechanisms relevant for the conquest of many diseases resides at different levels of organization with scales ranging from the subatomic realm relevant to catalytic functions of enzymes to the mesoscopic realm relevant to the cooperative association of molecular assemblies and membrane processes. Consequently, the challenge of representing and quantifying functional or dysfunctional modules within the networks remains due to the current limitations in our understanding of mesoscopic biology, i.e., how the components assemble into functional molecular ensembles. A multiscale approach is necessary to treat a hierarchy of interactions ranging from molecular (nm, ns) to signaling (microm, ms) length and time scales, which necessitates the development and application of specialized modeling tools. Complementary to multiscale experimentation (encompassing structural biology, mechanistic enzymology, cell biology, and single molecule studies) multiscale modeling offers a powerful and quantitative alternative for the study of functional intracellular signaling modules. Here, we describe the application of a multiscale approach to signaling mediated by the ErbB1 receptor which constitutes a network hub for the cell's proliferative, migratory, and survival programs. Through our multiscale model, we mechanistically describe how point-mutations in the ErbB1 receptor can profoundly alter signaling characteristics leading to the onset of oncogenic transformations. Specifically, we describe how the point mutations induce cascading fragility mechanisms at the molecular scale as well as at the scale of the signaling network to preferentially activate the survival factor Akt. We provide a quantitative explanation for how the hallmark of preferential Akt activation in cell-lines harboring the constitutively active mutant ErbB1 receptors causes these cell-lines to be addicted to ErbB1-mediated generation of survival signals. Consequently, inhibition of ErbB1 activity leads to a remarkable therapeutic response in the addicted cell lines.

  5. The Green House Model of Nursing Home Care in Design and Implementation.

    PubMed

    Cohen, Lauren W; Zimmerman, Sheryl; Reed, David; Brown, Patrick; Bowers, Barbara J; Nolet, Kimberly; Hudak, Sandra; Horn, Susan

    2016-02-01

    To describe the Green House (GH) model of nursing home (NH) care, and examine how GH homes vary from the model, one another, and their founding (or legacy) NH. Data include primary quantitative and qualitative data and secondary quantitative data, derived from 12 GH/legacy NH organizations February 2012-September 2014. This mixed methods, cross-sectional study used structured interviews to obtain information about presence of, and variation in, GH-relevant structures and processes of care. Qualitative questions explored reasons for variation in model implementation. Interview data were analyzed using related-sample tests, and qualitative data were iteratively analyzed using a directed content approach. GH homes showed substantial variation in practices to support resident choice and decision making; neither GH nor legacy homes provided complete choice, and all GH homes excluded residents from some key decisions. GH homes were most consistent with the model and one another in elements to create a real home, such as private rooms and baths and open kitchens, and in staff-related elements, such as self-managed work teams and consistent, universal workers. Although variation in model implementation complicates evaluation, if expansion is to continue, it is essential to examine GH elements and their outcomes. © Health Research and Educational Trust.

  6. Non Linear Programming (NLP) Formulation for Quantitative Modeling of Protein Signal Transduction Pathways

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Lauffenburger, Douglas A.; Alexopoulos, Leonidas G.

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, yielding quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem owing to the lack of data relative to the size of the signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms. PMID:23226239

  7. Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015

    PubMed Central

    Sobkowicz, Pawel

    2016-01-01

    We present results of an abstract, agent based model of opinion dynamics simulations based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society, corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions)—which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third party alternative. We provide quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown, that when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be ‘invaded’ by a newcomer third party very quickly, while the second remains immune to such invasion. PMID:27171226
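    A stripped-down agent-based caricature of the asymmetry result can be written in a few lines (this is not the paper's E/I/O model; the contact and commitment probabilities are hypothetical): a newcomer third opinion invades a support base quickly when its members are weakly committed, and barely at all when they are strongly committed.

```python
# Toy agent-based opinion model: a "new" opinion tries to convert agents who
# currently hold an "old" one. Emotional commitment is modeled, crudely, as
# the probability that a contacted agent resists persuasion.
import random

def invade(commitment, n=1000, p_contact=0.2, steps=50, seed=7):
    rng = random.Random(seed)
    opinions = ["old"] * n
    opinions[0] = "new"
    for _ in range(steps):
        for i in range(n):
            if opinions[i] == "old" and rng.random() < p_contact:
                if rng.random() > commitment:   # persuasion succeeds
                    opinions[i] = "new"
    return opinions.count("new")

converted_B = invade(commitment=0.3)   # weakly committed support base
converted_A = invade(commitment=0.95)  # strongly committed support base
assert converted_A < converted_B  # only the less committed base is 'invaded'
```

    The full model couples emotion, information, and opinion on a social network, but this asymmetry in susceptibility is the mechanism behind the one-sided invasion described in the abstract.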

  8. Emerging Infectious Diseases and Blood Safety: Modeling the Transfusion-Transmission Risk.

    PubMed

    Kiely, Philip; Gambhir, Manoj; Cheng, Allen C; McQuilten, Zoe K; Seed, Clive R; Wood, Erica M

    2017-07-01

    While the transfusion-transmission (TT) risk associated with the major transfusion-relevant viruses such as HIV is now very low, during the last 20 years there has been a growing awareness of the threat to blood safety from emerging infectious diseases, a number of which are known to be, or are potentially, transfusion transmissible. Two published models for estimating the transfusion-transmission risk from EIDs, referred to as the Biggerstaff-Petersen model and the European Upfront Risk Assessment Tool (EUFRAT), respectively, have been applied to several EIDs in outbreak situations. We describe and compare the methodological principles of both models, highlighting their similarities and differences. We also discuss the appropriateness of comparing results from the two models. Quantitating the TT risk of EIDs can inform decisions about risk mitigation strategies and their cost-effectiveness. Finally, we present a qualitative risk assessment for Zika virus (ZIKV), an EID agent that has caused several outbreaks since 2007. In the latest and largest ever outbreak, several probable cases of transfusion-transmitted ZIKV infection have been reported, indicating that it is transfusion-transmissible and therefore a risk to blood safety. We discuss why quantitative modeling of the TT risk of ZIKV is currently problematic. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  9. Prediction of the retention of s-triazines in reversed-phase high-performance liquid chromatography under linear gradient-elution conditions.

    PubMed

    D'Archivio, Angelo Antonio; Maggi, Maria Anna; Ruggieri, Fabrizio

    2014-08-01

    In this paper, a multilayer artificial neural network is used to model simultaneously the effect of solute structure and eluent concentration profile on the retention of s-triazines in reversed-phase high-performance liquid chromatography under linear gradient elution. The retention data of 24 triazines, including common herbicides and their metabolites, are collected under 13 different elution modes, covering the following experimental domain: starting acetonitrile volume fraction ranging between 40 and 60% and gradient slope ranging between 0 and 1% acetonitrile/min. The gradient parameters together with five selected molecular descriptors, identified by quantitative structure-retention relationship modelling applied to individual separation conditions, are the network inputs. Predictive performance of this model is evaluated on six external triazines and four unseen separation conditions. For comparison, retention of triazines is modelled by both quantitative structure-retention relationships and response surface methodology, which describe separately the effect of molecular structure and gradient parameters on the retention. Although applied to a wider variable domain, the network provides a performance comparable to that of the above "local" models and retention times of triazines are modelled with accuracy generally better than 7%. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    PubMed

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, yielding quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem owing to the lack of data relative to the size of the signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell-type-specific pathways in normal and transformed hepatocytes using medium- and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state-of-the-art optimization algorithms.
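    The recasting of pathway training as an ordinary nonlinear optimization can be sketched for a single edge (a toy under stated assumptions: a Hill-type transfer function, synthetic data generated with a known parameter, and a crude grid search standing in for a real NLP solver):

```python
# Toy version of the NLP idea: fit the parameter k of a Hill-type transfer
# function f(x) = x^n / (k^n + x^n) to synthetic readouts by minimizing the
# sum of squared errors over a 1-D parameter grid.

def hill(x, k, n=2):
    return x ** n / (k ** n + x ** n)

# Synthetic "phosphoproteomic" readouts generated with k = 0.5:
inputs = [0.1, 0.25, 0.5, 0.75, 1.0]
data = [hill(x, 0.5) for x in inputs]

def sse(k):
    return sum((hill(x, k) - y) ** 2 for x, y in zip(inputs, data))

# A real NLP solver would optimize many such parameters simultaneously,
# subject to network constraints; here a grid scan suffices.
k_best = min((0.01 * i for i in range(1, 101)), key=sse)
assert abs(k_best - 0.5) < 1e-9
```

    The pre/post-processing step the abstract mentions serves exactly this fit: edges whose parameters cannot move the objective under the available experimental conditions are pruned before optimization.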

  11. Modelling the Active Hearing Process in Mosquitoes

    NASA Astrophysics Data System (ADS)

    Avitabile, Daniele; Homer, Martin; Jackson, Joe; Robert, Daniel; Champneys, Alan

    2011-11-01

    A simple microscopic mechanistic model is described of the active amplification within the Johnston's organ of the mosquito species Toxorhynchites brevipalpis. The model is based on the description of the antenna as a forced-damped oscillator coupled to a set of active threads (ensembles of scolopidia) that provide an impulsive force when they twitch. This twitching is in turn controlled by channels that are opened and closed if the antennal oscillation reaches a critical amplitude. The model matches both qualitatively and quantitatively with recent experiments. New results are presented using mathematical homogenization techniques to derive a mesoscopic model as a simple oscillator with nonlinear force and damping characteristics. It is shown how the results from this new model closely resemble those from the microscopic model as the number of threads approach physiologically correct values.
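    The microscopic picture can be sketched numerically (a minimal forward-Euler toy with illustrative parameters, not values fitted to T. brevipalpis): a forced-damped oscillator receives an impulsive pro-velocity "twitch" force whenever its displacement exceeds a critical amplitude, and the active runs reach larger amplitudes than the passive ones.

```python
# Forced-damped oscillator with amplitude-gated "twitch" forcing, mimicking
# the active threads (scolopidia ensembles) described above. All parameter
# values are hypothetical and chosen only for illustration.
import math

def simulate(active=True, steps=20000, dt=1e-3):
    m, c, k = 1.0, 0.4, 1.0          # mass, damping, stiffness
    f0, w = 0.1, 1.0                 # weak harmonic forcing at resonance
    x_threshold, twitch = 0.05, 0.3  # channel-opening amplitude, twitch force
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(steps):
        f = f0 * math.cos(w * i * dt)
        if active and abs(x) > x_threshold:
            f += twitch * (1 if v > 0 else -1)  # twitch pumps energy in
        a = (f - c * v - k * x) / m
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

# Active amplification boosts the antennal response amplitude:
assert simulate(active=True) > simulate(active=False)
```

    The homogenized mesoscopic model mentioned in the abstract replaces the discrete twitches with effective nonlinear force and damping terms in the same oscillator equation.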

  12. Supraglacial channel inception: Modeling and processes

    NASA Astrophysics Data System (ADS)

    Mantelli, E.; Camporeale, C.; Ridolfi, L.

    2015-09-01

    Supraglacial drainage systems play a key role in glacial hydrology. Nevertheless, physical processes leading to spatial organization in supraglacial networks are still an open issue. In the present work we thus address from a quantitative point of view the question of what is the physics leading to widely observed patterns made up of evenly spaced channels. To this aim, we set up a novel mathematical model describing a condition antecedent channel formation, i.e., the down-glacier flow of a distributed meltwater film. We then perform a linear stability analysis to assess whether the ice-water interface undergoes a morphological instability compatible with observed patterns. The instability is detected, its features depending on glacier surface slope, ice friction factor, and water as well as ice thermal conditions. By contrast, in our model channel spacing is solely hydrodynamically driven and relies on the interplay between pressure perturbations, flow depth response, and Reynolds stresses. Geometrical features of the predicted pattern are quantitatively consistent with available field data. The hydrodynamic origin of supraglacial channel morphogenesis suggests that alluvial patterns might share the same physical controls.

  13. A quantitative approach to the topology of large-scale structure. [for galactic clustering computation

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III; Weinberg, David H.; Melott, Adrian L.

    1987-01-01

    A quantitative measure of the topology of large-scale structure: the genus of density contours in a smoothed density distribution, is described and applied. For random phase (Gaussian) density fields, the mean genus per unit volume exhibits a universal dependence on threshold density, with a normalizing factor that can be calculated from the power spectrum. If large-scale structure formed from the gravitational instability of small-amplitude density fluctuations, the topology observed today on suitable scales should follow the topology in the initial conditions. The technique is illustrated by applying it to simulations of galaxy clustering in a flat universe dominated by cold dark matter. The technique is also applied to a volume-limited sample of the CfA redshift survey and to a model in which galaxies reside on the surfaces of polyhedral 'bubbles'. The topology of the evolved mass distribution and 'biased' galaxy distribution in the cold dark matter models closely matches the topology of the density fluctuations in the initial conditions. The topology of the observational sample is consistent with the random phase, cold dark matter model.
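    The universal threshold dependence referred to above has a simple closed form for a Gaussian (random-phase) field: the genus per unit volume is g(ν) = N (1 − ν²) exp(−ν²/2), where ν is the density threshold in units of the standard deviation and the normalization N is fixed by the power spectrum (set to 1 in this sketch).

```python
# Genus-per-unit-volume curve for a Gaussian random field. Positive genus
# indicates sponge-like (multiply connected) contours; negative genus
# indicates isolated clusters or voids.
import math

def genus_density(nu, norm=1.0):
    return norm * (1.0 - nu ** 2) * math.exp(-nu ** 2 / 2.0)

assert genus_density(1.0) == 0.0 and genus_density(-1.0) == 0.0
assert genus_density(0.0) > 0    # sponge-like topology at the median density
assert genus_density(2.0) < 0    # isolated clusters at high thresholds
```

    Comparing the measured genus curve of a galaxy sample against this Gaussian prediction is precisely the test applied to the CfA survey and the cold dark matter simulations in the abstract.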

  14. Transition-state theory predicts clogging at the microscale

    NASA Astrophysics Data System (ADS)

    Laar, T. Van De; Klooster, S. Ten; Schroën, K.; Sprakel, J.

    2016-06-01

    Clogging is one of the main failure mechanisms encountered in industrial processes such as membrane filtration. Our understanding of the factors that govern the build-up of fouling layers and the emergence of clogs is largely incomplete, so that prevention of clogging remains an immense and costly challenge. In this paper we use a microfluidic model combined with quantitative real-time imaging to explore the influence of pore geometry and particle interactions on suspension clogging in constrictions, two crucial factors which remain relatively unexplored. We find a distinct dependence of the clogging rate on the entrance angle to a membrane pore which we explain quantitatively by deriving a model, based on transition-state theory, which describes the effect of viscous forces on the rate with which particles accumulate at the channel walls. With the same model we can also predict the effect of the particle interaction potential on the clogging rate. In both cases we find excellent agreement between our experimental data and theory. A better understanding of these clogging mechanisms and the influence of design parameters could form a stepping stone to delay or prevent clogging by rational membrane design.
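    The transition-state picture invoked above can be made concrete schematically (numbers are illustrative and given in units of kT; this is a sketch of the idea, not the paper's fitted model): particles deposit at the channel wall at a rate proportional to exp(−E/kT), where the barrier set by the particle interaction potential is lowered by the work done by viscous forces.

```python
# Arrhenius/transition-state sketch of clogging: deposition rate at the wall
# ~ exp(-(barrier - viscous work) / kT). Stronger viscous forcing lowers the
# effective barrier and so accelerates clog formation.
import math

def deposition_rate(barrier_kT, viscous_work_kT, attempt_rate=1.0):
    return attempt_rate * math.exp(-(barrier_kT - viscous_work_kT))

slow_flow = deposition_rate(barrier_kT=10.0, viscous_work_kT=1.0)
fast_flow = deposition_rate(barrier_kT=10.0, viscous_work_kT=4.0)
assert fast_flow > slow_flow  # stronger viscous forcing accelerates clogging
```

    The same expression shows why the entrance angle and the interaction potential both enter through the effective barrier, as the experiments in the abstract found.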

  15. Coordinated encoding between cell types in the retina: insights from the theory of phase transitions

    NASA Astrophysics Data System (ADS)

    Sharpee, Tatyana

    2015-03-01

    In this talk I will describe how the emergence of some types of neurons in the brain can be quantitatively described by the theory of transitions between different phases of matter. The two key parameters that control the separation of neurons into subclasses are the mean and standard deviation of noise levels among neurons in the population. The mean noise level plays the role of temperature in the classic theory of phase transitions, whereas the standard deviation is equivalent to pressure, in the case of liquid-gas transitions, or to magnetic field for magnetic transitions. Our results account for properties of two recently discovered types of salamander OFF retinal ganglion cells, as well as the absence of multiple types of ON cells. We further show that, across visual stimulus contrasts, retinal circuits continued to operate near the critical point whose quantitative characteristics matched those expected near a liquid-gas critical point and described by the nearest-neighbor Ising model in three dimensions. Because the retina needs to operate under changing stimulus conditions, the observed parameters of cell types corresponded to metastable states in the region between the spinodal line and the line describing maximally informative solutions. Such properties of neural circuits can maximize information transmission in a given environment while retaining the ability to quickly adapt to a new environment. NSF CAREER award 1254123 and NIH R01EY019493

  16. Estimation of postfire nutrient loss in the Florida everglades.

    PubMed

    Qian, Y; Miao, S L; Gu, B; Li, Y C

    2009-01-01

    Postfire nutrient release into ecosystem via plant ash is critical to the understanding of fire impacts on the environment. Factors determining a postfire nutrient budget are prefire nutrient content in the combustible biomass, burn temperature, and the amount of combustible biomass. Our objective was to quantitatively describe the relationships between nutrient losses (or concentrations in ash) and burning temperature in laboratory controlled combustion and to further predict nutrient losses in field fire by applying predictive models established based on laboratory data. The percentage losses of total nitrogen (TN), total carbon (TC), and material mass showed a significant linear correlation with a slope close to 1, indicating that TN or TC loss occurred predominantly through volatilization during combustion. Data obtained in laboratory experiments suggest that the losses of TN, TC, as well as the ratio of ash total phosphorus (TP) concentration to leaf TP concentration have strong relationships with burning temperature and these relationships can be quantitatively described by nonlinear equations. The potential use of these nonlinear models relating nutrient loss (or concentration) to temperature in predicting nutrient concentrations in field ash appear to be promising. During a prescribed fire in the northern Everglades, 73.1% of TP was estimated to be retained in ash while 26.9% was lost to the atmosphere, agreeing well with the distribution of TP during previously reported wild fires. The use of predictive models would greatly reduce the cost associated with measuring field ash nutrient concentrations.
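    The key laboratory relationship can be sketched with a small least-squares fit (the data points below are synthetic stand-ins for the laboratory burns, not the paper's measurements): percentage TN loss tracks percentage mass loss with a slope close to 1, the signature of volatilization-dominated loss.

```python
# Ordinary least-squares slope of TN loss vs. mass loss during combustion.
# A near-unit slope indicates that nitrogen is lost predominantly by
# volatilization, as described above. Data are hypothetical.
mass_loss = [20.0, 40.0, 60.0, 80.0]   # % of material mass lost
tn_loss = [22.0, 39.0, 61.0, 78.0]     # % of total nitrogen lost

n = len(mass_loss)
mx = sum(mass_loss) / n
my = sum(tn_loss) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(mass_loss, tn_loss))
         / sum((x - mx) ** 2 for x in mass_loss))
assert abs(slope - 1.0) < 0.1  # near-unit slope: loss mainly by volatilization
```

    The paper's nonlinear temperature models play the analogous role for predicting field ash concentrations without direct measurement.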

  17. Mathematical modeling of efficacy and safety for anticancer drugs clinical development.

    PubMed

    Lavezzi, Silvia Maria; Borella, Elisa; Carrara, Letizia; De Nicolao, Giuseppe; Magni, Paolo; Poggesi, Italo

    2018-01-01

    Drug attrition in oncology clinical development is higher than in other therapeutic areas. In this context, pharmacometric modeling represents a useful tool to explore drug efficacy in earlier phases of clinical development, anticipating overall survival using quantitative model-based metrics. Furthermore, modeling approaches can be used to characterize earlier the safety and tolerability profile of drug candidates, and, thus, the risk-benefit ratio and the therapeutic index, supporting the design of optimal treatment regimens and accelerating the whole process of clinical drug development. Areas covered: Herein, the most relevant mathematical models used in clinical anticancer drug development during the last decade are described. Less recent models were considered in the review if they represent a standard for the analysis of certain types of efficacy or safety measures. Expert opinion: Several mathematical models have been proposed to predict overall survival from earlier endpoints and validate their surrogacy in demonstrating drug efficacy in place of overall survival. An increasing number of mathematical models have also been developed to describe the safety findings. Modeling has been extensively used in anticancer drug development to individualize dosing strategies based on patient characteristics, and design optimal dosing regimens balancing efficacy and safety.

  18. Integrated approach to modeling long-term durability of concrete engineered barriers in LLRW disposal facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, J.H.; Roy, D.M.; Mann, B.

    1995-12-31

    This paper describes an integrated approach to developing a predictive computer model for long-term performance of concrete engineered barriers utilized in LLRW and ILRW disposal facilities. The model development concept consists of three major modeling schemes: hydration modeling of the binder phase, pore solution speciation, and transport modeling in the concrete barrier and service environment. Although still in its inception, the model development approach demonstrated that the chemical and physical properties of complex cementitious materials and their interactions with service environments can be described quantitatively. Applying the integrated model development approach to modeling alkali (Na and K) leaching from a concrete pad barrier in an above-grade tumulus disposal unit, it is predicted that, in a near-surface land disposal facility where water infiltration through the facility is normally minimal, the alkalis control the pore solution pH of the concrete barriers for much longer than most previous concrete barrier degradation studies assumed. The results also imply that a highly alkaline condition created by the alkali leaching will result in alteration of the soil mineralogy in the vicinity of the disposal facility.

  19. Quantitative pharmacological analysis of antagonist binding kinetics at CRF1 receptors in vitro and in vivo

    PubMed Central

    Ramsey, Simeon J; Attkins, Neil J; Fish, Rebecca; van der Graaf, Piet H

    2011-01-01

    BACKGROUND AND PURPOSE A series of novel non-peptide corticotropin releasing factor type-1 receptor (CRF1) antagonists were found to display varying degrees of insurmountable and non-competitive behaviour in functional in vitro assays. We describe how we attempted to relate this behaviour to ligand receptor-binding kinetics in a quantitative manner and how this resulted in the development and implementation of an efficient pharmacological screening method based on principles described by Motulsky and Mahan. EXPERIMENTAL APPROACH A non-equilibrium binding kinetic assay was developed to determine the receptor binding kinetics of non-peptide CRF1 antagonists. Nonlinear, mixed-effects modelling was used to obtain estimates of the compounds' association and dissociation rates. We present an integrated pharmacokinetic–pharmacodynamic (PKPD) approach, whereby the time course of in vivo CRF1 receptor binding of novel compounds can be predicted on the basis of in vitro assays. KEY RESULTS The non-competitive antagonist behaviour appeared to be correlated with the CRF1 receptor off-rate kinetics. The integrated PKPD model suggested that, at least in a qualitative manner, the in vitro assay can be used to triage and select compounds for further in vivo investigations. CONCLUSIONS AND IMPLICATIONS This study provides evidence for a link between ligand offset kinetics and insurmountable/non-competitive antagonism at the CRF1 receptor. The exact molecular pharmacological nature of this association remains to be determined. In addition, we have developed a quantitative framework to study and integrate in vitro and in vivo receptor binding kinetic behaviour of CRF1 receptor antagonists in an efficient manner in a drug discovery setting. PMID:21449919
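
    The kinetic ideas in this record can be illustrated with textbook pseudo-first-order association kinetics, a simplified stand-in for the full Motulsky-Mahan competition model used by the authors; all rate values below are hypothetical.

```python
import numpy as np

# Textbook pseudo-first-order association kinetics: at ligand concentration L,
# fractional occupancy approaches equilibrium with observed rate
# kobs = kon*L + koff. This is a simplified stand-in for the Motulsky-Mahan
# competition model used in the paper; rate values are hypothetical.
def occupancy(t, L, kon, koff):
    kobs = kon * L + koff
    b_eq = kon * L / kobs                  # fractional occupancy at equilibrium
    return b_eq * (1.0 - np.exp(-kobs * t))

kon, koff = 1.0e6, 1.0e-3                  # illustrative rates (M^-1 s^-1, s^-1)
t = np.linspace(0.0, 3600.0, 200)
trace = occupancy(t, 10e-9, kon, koff)     # binding trace at 10 nM ligand

# Residence time (1/koff) is the quantity linked above to
# insurmountable/non-competitive antagonist behaviour
residence_min = 1.0 / koff / 60.0
```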

  20. Identification of Synchronous Machine Stability Parameters: An On-Line Time-Domain Approach.

    NASA Astrophysics Data System (ADS)

    Le, Loc Xuan

    1987-09-01

    A time-domain modeling approach is described which enables the stability-study parameters of the synchronous machine to be determined directly from input-output data measured at the terminals of the machine operating under normal conditions. The transient responses due to system perturbations are used to identify the parameters of the equivalent circuit models. The described models are verified by comparing their responses with the machine responses generated from the transient stability models of a small three-generator multi-bus power system and of a single-machine infinite-bus power network. The least-squares method is used for the solution of the model parameters. As a precaution against ill-conditioned problems, the singular value decomposition (SVD) is employed for its inherent numerical stability. In order to identify the equivalent-circuit parameters uniquely, the solution of a linear optimization problem with non-linear constraints is required. Here, the SVD appears to offer a simple solution to this otherwise difficult problem. Furthermore, the SVD yields solutions with small bias and, therefore, physically meaningful parameters even in the presence of noise in the data. The question concerning the need for a more advanced model of the synchronous machine which describes subtransient and even sub-subtransient behavior is dealt with sensibly by the concept of condition number. The concept provides a quantitative measure for determining whether such an advanced model is indeed necessary. Finally, the recursive SVD algorithm is described for real-time parameter identification and tracking of slowly time-variant parameters. The algorithm is applied to identify the dynamic equivalent power system model.
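
    The SVD-based least-squares step described above can be sketched as follows; the regressor matrix and parameter values are illustrative stand-ins, not the synchronous-machine model.

```python
import numpy as np

# Least-squares parameter identification via the SVD, with the condition
# number used (as in the abstract) to judge whether the data support a richer
# model. The regressor matrix and true parameters here are illustrative.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)
A = np.column_stack([np.ones_like(t), t, t**2])    # hypothetical regressors
x_true = np.array([0.5, -1.2, 3.0])
b = A @ x_true + rng.normal(0.0, 1e-3, t.size)     # noisy "measurements"

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_hat = Vt.T @ ((U.T @ b) / s)                     # pseudo-inverse solution
condition_number = s[0] / s[-1]                    # small: well-conditioned
```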

  1. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
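
    A minimal sketch of a two-way exchange model of the kind compared above, with forward rate kPL and reverse rate kLP; T1 relaxation and the Heaviside inflow term used in the study are omitted, and the rate values are illustrative.

```python
import numpy as np

# Minimal two-way pyruvate <-> lactate exchange model (forward kPL, reverse
# kLP), integrated with explicit Euler steps. T1 relaxation and the Heaviside
# inflow profile used in the study are omitted; rates are illustrative.
def simulate(kPL=0.05, kLP=0.01, dt=0.1, n_steps=2000):
    P, L = 1.0, 0.0                   # start with all signal in pyruvate
    L_t = []
    for _ in range(n_steps):
        dP = -kPL * P + kLP * L
        dL = kPL * P - kLP * L
        P, L = P + dP * dt, L + dL * dt
        L_t.append(L)
    return P, L, np.array(L_t)

P_end, L_end, L_t = simulate()
ratio = L_end / (P_end + L_end)       # tends to kPL / (kPL + kLP) at equilibrium
```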

  2. Quantification of HCV RNA in Liver Tissue by bDNA Assay.

    PubMed

    Dailey, P J; Collins, M L; Urdea, M S; Wilber, J C

    1999-01-01

    Sherlock and Dooley have described two of the three major challenges involved in quantitatively measuring any analyte in tissue samples: the distribution of the analyte in the tissue, and the standard of reference, or denominator, with which to make comparisons between tissue samples. The third challenge for quantitative measurement of an analyte in tissue is to ensure reproducible and quantitative recovery of the analyte on extraction from tissue samples. This chapter describes a method that can be used to measure HCV RNA quantitatively in liver biopsy and tissue samples using the bDNA assay. All three of these challenges (distribution, denominator, and recovery) apply to the measurement of HCV RNA in liver biopsies.

  3. Cobalt and scandium partitioning versus iron content for crystalline phases in ultramafic nodules

    USGS Publications Warehouse

    Glassley, W.E.; Piper, D.Z.

    1978-01-01

    Fractionation of Co and Sc between garnets, olivines, and clino- and orthopyroxenes, separated from a suite of Salt Lake Crater ultramafic nodules that equilibrated at the same T and P, is strongly dependent on Fe contents. This observation suggests that petrogenetic equilibrium models of partial melting and crystal fractionation must take into account effects of magma composition, if they are to describe quantitatively geochemical evolutionary trends. © 1978.

  4. Using Popular Culture to Teach Quantitative Reasoning

    ERIC Educational Resources Information Center

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  5. Modeling the Afferent Dynamics of the Baroreflex Control System

    PubMed Central

    Mahdi, Adam; Sturdy, Jacob; Ottesen, Johnny T.; Olufsen, Mette S.

    2013-01-01

    In this study we develop a modeling framework for predicting baroreceptor (BR) firing rate as a function of blood pressure. We test models within this framework both quantitatively and qualitatively using data from rats. The models describe three components: arterial wall deformation, stimulation of mechanoreceptors located in the BR nerve-endings, and modulation of the action potential frequency. The three sub-systems are modeled individually following well-established biological principles. The first submodel, predicting arterial wall deformation, uses blood pressure as an input and outputs circumferential strain. The mechanoreceptor stimulation model uses circumferential strain as an input, predicting receptor deformation as an output. Finally, the neural model takes receptor deformation as an input, predicting the BR firing rate as an output. Our results show that the nonlinear dependence of firing rate on pressure can be accounted for by taking into account the nonlinear elastic properties of the artery wall. This was observed when testing the models using multiple experiments with a single set of parameters. We find that to model the response to a square pressure stimulus, giving rise to post-excitatory depression, it is necessary to include an integrate-and-fire model, which allows the firing rate to cease when the stimulus falls below a given threshold. We show that our modeling framework in combination with sensitivity analysis and parameter estimation can be used to test and compare models. Finally, we demonstrate that our preferred model can exhibit all known dynamics and that it is advantageous to combine qualitative and quantitative analysis methods. PMID:24348231
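
    The integrate-and-fire mechanism described in this record can be sketched as follows: firing ceases once the stimulus falls below threshold, reproducing post-excitatory depression after a square pressure step. The neuron model and all parameter values are illustrative, not the paper's calibrated model.

```python
import numpy as np

# Minimal leaky integrate-and-fire sketch: a leaky integrator driven by a
# pressure-like stimulus spikes while the drive is supra-threshold and falls
# silent when it is not. Parameters are illustrative, not the paper's.
def fire_count(stimulus, dt=1e-3, tau=0.02, threshold=1.0):
    """Count spikes produced by a leaky integrator driven by `stimulus`."""
    v, spikes = 0.0, 0
    for s in stimulus:
        v += (dt / tau) * (s - v)        # leaky integration toward the stimulus
        if v >= threshold:
            spikes += 1
            v = 0.0                      # reset after each spike
    return spikes

t = np.arange(0.0, 2.0, 1e-3)
square = np.where(t < 1.0, 2.0, 0.5)     # supra- then sub-threshold pressure
during_step = fire_count(square[:1000])
after_step = fire_count(square[1000:])   # firing ceases below threshold
```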

  6. Modeling Cape- and Ridge-Associated Marine Sand Deposits; A Focus on the U.S. Atlantic Continental Shelf

    USGS Publications Warehouse

    Bliss, James D.; Williams, S. Jeffress; Bolm, Karen S.

    2009-01-01

    Cape- and ridge-associated marine sand deposits, which accumulate on storm-dominated continental shelves that are undergoing Holocene marine transgression, are particularly notable in a segment of the U.S. Atlantic Continental Shelf that extends southward from the east tip of Long Island, N.Y., and eastward from Cape May at the south end of the New Jersey shoreline. These sand deposits commonly contain sand suitable for shore protection in the form of beach nourishment. Increasing demand for marine sand raises questions about both short- and long-term potential supply and the sustainability of beach nourishment with the prospects of accelerating sea-level rise and increasing storm activity. To address these important issues, quantitative assessments of the volume of marine sand resources are needed. Currently, the U.S. Geological Survey is undertaking these assessments through its national Marine Aggregates and Resources Program (URL http://woodshole.er.usgs.gov/project-pages/aggregates/). In this chapter, we present a hypothetical example of a quantitative assessment of cape-and ridge-associated marine sand deposits in the study area, using proven tools of mineral-resource assessment. Applying these tools requires new models that summarize essential data on the quantity and quality of these deposits. Two representative types of model are descriptive models, which consist of a narrative that allows for a consistent recognition of cape-and ridge-associated marine sand deposits, and quantitative models, which consist of empirical statistical distributions that describe significant deposit characteristics, such as volume and grain-size distribution. 
Variables of the marine sand deposits considered for quantitative modeling in this study include area, thickness, mean grain size, grain sorting, volume, proportion of sand-dominated facies, and spatial density, of which spatial density is particularly helpful in estimating the number of undiscovered deposits within an assessment area. A Monte Carlo simulation that combines the volume of sand-dominated-facies models with estimates of the hypothetical probable number of undiscovered deposits provides a probabilistic approach to estimating marine sand resources within parts of the U.S. Atlantic Continental Shelf and other comparable marine shelves worldwide.
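
    The Monte Carlo combination of deposit counts and volumes described above can be sketched as follows; the spatial density, tract area, and lognormal volume parameters are hypothetical placeholders, not assessment results.

```python
import numpy as np

# Monte Carlo sketch of the assessment approach: combine a probabilistic
# number of undiscovered deposits (from a spatial density) with a per-deposit
# volume distribution. All numerical values are hypothetical placeholders.
rng = np.random.default_rng(42)
n_trials = 10_000
tract_area_km2 = 2_000.0
deposits_per_km2 = 0.004                 # hypothetical spatial density

totals = np.empty(n_trials)
for i in range(n_trials):
    n_deposits = rng.poisson(deposits_per_km2 * tract_area_km2)
    # hypothetical lognormal volume model, million m^3 per deposit
    volumes = rng.lognormal(mean=2.0, sigma=0.8, size=n_deposits)
    totals[i] = volumes.sum()

# Probabilistic resource estimate reported as percentiles
p10, p50, p90 = np.percentile(totals, [10, 50, 90])
```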

  7. Toward Multiscale Models of Cyanobacterial Growth: A Modular Approach

    PubMed Central

    Westermark, Stefanie; Steuer, Ralf

    2016-01-01

    Oxygenic photosynthesis dominates global primary productivity ever since its evolution more than three billion years ago. While many aspects of phototrophic growth are well understood, it remains a considerable challenge to elucidate the manifold dependencies and interconnections between the diverse cellular processes that together facilitate the synthesis of new cells. Phototrophic growth involves the coordinated action of several layers of cellular functioning, ranging from the photosynthetic light reactions and the electron transport chain, to carbon-concentrating mechanisms and the assimilation of inorganic carbon. It requires the synthesis of new building blocks by cellular metabolism, protection against excessive light, as well as diurnal regulation by a circadian clock and the orchestration of gene expression and cell division. Computational modeling allows us to quantitatively describe these cellular functions and processes relevant for phototrophic growth. As yet, however, computational models are mostly confined to the inner workings of individual cellular processes, rather than describing the manifold interactions between them in the context of a living cell. Using cyanobacteria as model organisms, this contribution seeks to summarize existing computational models that are relevant to describe phototrophic growth and seeks to outline their interactions and dependencies. Our ultimate aim is to understand cellular functioning and growth as the outcome of a coordinated operation of diverse yet interconnected cellular processes. PMID:28083530

  8. Designing and encoding models for synthetic biology.

    PubMed

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-08-06

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology 'loop'.

  9. A surface curvature oscillation model for vapour-liquid-solid growth of periodic one-dimensional nanostructures

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Wang, Jian-Tao; Cao, Ze-Xian; Zhang, Wen-Jun; Lee, Chun-Sing; Lee, Shuit-Tong; Zhang, Xiao-Hong

    2015-03-01

    While the vapour-liquid-solid process has been widely used for growing one-dimensional nanostructures, quantitative understanding of the process is still far from adequate. For example, the origins for the growth of periodic one-dimensional nanostructures are not fully understood. Here we observe that morphologies in a wide range of periodic one-dimensional nanostructures can be described by two quantitative relationships: first, inverse of the periodic spacing along the length direction follows an arithmetic sequence; second, the periodic spacing in the growth direction varies linearly with the diameter of the nanostructure. We further find that these geometric relationships can be explained by considering the surface curvature oscillation of the liquid sphere at the tip of the growing nanostructure. The work reveals the requirements of vapour-liquid-solid growth. It can be applied for quantitative understanding of vapour-liquid-solid growth and to design experiments for controlled growth of nanostructures with custom-designed morphologies.

  10. Calcium tracer kinetics show decreased irreversible flow to bone in glucocorticoid treated patients.

    PubMed

    Goans, R E; Weiss, G H; Abrams, S A; Perez, M D; Yergey, A L

    1995-06-01

    Osteopenia resulting from pharmacologic doses of glucocorticoids is well known. Previously, there has been no satisfactory quantitative model describing the kinetics of calcium flow in subjects on chronic steroid use. A mathematical model of calcium isotope interaction with bone is described and applied to determine an estimate of kinetic parameters characterizing these changes. Calcium tracer dilution kinetics after a bolus injection of 42Ca were measured in 14 subjects with juvenile dermatomyositis, 6 on prednisone regimens and 8 on treatment regimens without prednisone. Irreversible tracer loss from plasma to bone is found to be significantly reduced (P = 0.043) in the glucocorticoid-treated patients compared with patients on nonsteroid regimens. Reversible flow to bone is noted to be similar in the two groups. These results suggest a direct effect of glucocorticoids on osteoblast function.

  11. Advances in cleavage fracture modelling in steels: Micromechanical, numerical and multiscale aspects

    NASA Astrophysics Data System (ADS)

    Pineau, André; Tanguy, Benoît

    2010-04-01

    Brittle cleavage fracture remains one of the major concerns for structural integrity assessment. The main characteristics of this mode of failure in relation to the stress field ahead of a crack tip are described in the introduction. Emphasis is laid on the physical origins of scatter and the size effect observed in ferritic steels. It is shown that cleavage fracture is controlled by physical events occurring at different scales: initiation at (sub)micrometric particles, propagation across grain boundaries (10-50 microns) and final fracture at the centimetric scale. The first two scales are detailed in this paper. The statistical origin of cleavage is described quantitatively from both microstructural defects and stress-strain heterogeneities due to crystalline plasticity at the grain scale. Existing models are applied to the prediction of the variation of Charpy fracture toughness with temperature.

  12. Model of Pressure Distribution in Vortex Flow Controls

    NASA Astrophysics Data System (ADS)

    Mielczarek, Szymon; Sawicki, Jerzy M.

    2015-06-01

    Vortex valves belong to the category of hydrodynamic flow controls. They are important and theoretically interesting devices, so complex from a hydraulic point of view that probably for this reason no rational concept of their operation has been proposed so far. In consequence, the functioning of vortex valves is described by CFD methods (computer-aided simulation of technical objects) or by means of simple empirical relations (using a discharge coefficient or hydraulic loss coefficient). Such a rational model of the considered device is proposed in this paper. It has a simple algebraic form but is well grounded physically. The basic quantitative relationship that describes the valve operation, i.e. the dependence between the flow discharge and the circumferential pressure head caused by the rotation, has been verified empirically. Conformity between the calculated and measured parameters of the device supports acceptance of the proposed concept.

  13. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  14. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    PubMed

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  15. A generalized population dynamics model for reproductive interference with absolute density dependence.

    PubMed

    Kyogoku, Daisuke; Sota, Teiji

    2017-05-17

    Interspecific mating interactions, or reproductive interference, can affect population dynamics, species distribution and abundance. Previous population dynamics models have assumed that the impact of frequency-dependent reproductive interference depends on the relative abundances of species. However, this assumption could be an oversimplification inappropriate for making quantitative predictions. Therefore, a more general model to forecast population dynamics in the presence of reproductive interference is required. Here we developed a population dynamics model to describe the absolute density dependence of reproductive interference, which appears likely when encounter rate between individuals is important. Our model (i) can produce diverse shapes of isoclines depending on parameter values and (ii) predicts weaker reproductive interference when absolute density is low. These novel characteristics can create conditions where coexistence is stable and independent from the initial conditions. We assessed the utility of our model in an empirical study using an experimental pair of seed beetle species, Callosobruchus maculatus and Callosobruchus chinensis. Reproductive interference became stronger with increasing total beetle density even when the frequencies of the two species were kept constant. Our model described the effects of absolute density and showed a better fit to the empirical data than the existing model overall.

  16. Modeling of contact tracing in social networks

    NASA Astrophysics Data System (ADS)

    Tsimring, Lev S.; Huerta, Ramón

    2003-07-01

    Spreading of certain infections in complex networks is effectively suppressed by using intelligent strategies for epidemic control. One such standard epidemiological strategy consists in tracing contacts of infected individuals. In this paper, we use a recently introduced generalization of the standard susceptible-infectious-removed stochastic model for epidemics in sparse random networks which incorporates an additional (traced) state. We describe a deterministic mean-field description which yields quantitative agreement with stochastic simulations on random graphs. We also discuss the role of contact tracing in epidemics control in small-world and scale-free networks. Effectiveness of contact tracing grows as the rewiring probability is reduced.
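
    A schematic mean-field version of an SIR model with an added traced compartment can be written as ODEs and stepped with Euler integration; the specific equations and rate values below are illustrative assumptions, not the paper's model.

```python
# Schematic mean-field SIR dynamics with an extra traced compartment:
# infectious individuals leave by recovery (gamma) or by contact tracing
# (kappa); traced individuals recover at the same rate. These equations and
# rates are an illustrative sketch, not the paper's exact mean-field model.
def run(beta=0.4, gamma=0.1, kappa=0.05, dt=0.01, n_steps=20_000):
    S, I, T, R = 0.999, 0.001, 0.0, 0.0
    for _ in range(n_steps):
        dS = -beta * S * I
        dI = beta * S * I - (gamma + kappa) * I
        dT = kappa * I - gamma * T
        dR = gamma * (I + T)
        S, I, T, R = S + dS * dt, I + dI * dt, T + dT * dt, R + dR * dt
    return S, I, T, R

S_end, I_end, T_end, R_end = run()
# Stronger tracing (larger kappa) leaves more of the population uninfected
S_traced = run(kappa=0.3)[0]
```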

  17. Experimental Keratitis Due to Pseudomonas aeruginosa: Model for Evaluation of Antimicrobial Drugs

    PubMed Central

    Davis, Starkey D.; Chandler, John W.

    1975-01-01

    An improved method for experimental keratitis due to Pseudomonas aeruginosa is described. Essential features of the method are use of inbred guinea pigs, intracorneal injection of bacteria, subconjunctival injection of antibiotics, “blind” evaluation of results, and statistical analysis of data. Untreated ocular infections were most severe 5 to 7 days after infection. Sterilized bacterial suspensions caused no abnormalities on day 5. Tobramycin and polymyxin B were more active than gentamicin against two strains of Pseudomonas. This model is suitable for many types of quantitative studies on experimental keratitis. PMID:810084

  18. Imaging 2D optical diffuse reflectance in skeletal muscle

    NASA Astrophysics Data System (ADS)

    Ranasinghesagara, Janaka; Yao, Gang

    2007-04-01

    We discovered a unique pattern of optical reflectance from fresh prerigor skeletal muscles, which cannot be described using existing theories. A numerical fitting function was developed to quantify the equi-intensity contours of acquired reflectance images. Using this model, we studied the changes of the reflectance profile during stretching and the rigor process. We found that the prominent anisotropic features diminished after rigor completion. These results suggested that muscle sarcomere structures play important roles in modulating light propagation in whole muscle. When incorporating sarcomere diffraction in a Monte Carlo model, we showed that the resulting reflectance profiles quantitatively resembled the experimental observations.

  19. An information-carrying and knowledge-producing molecular machine. A Monte-Carlo simulation.

    PubMed

    Kuhn, Christoph

    2012-02-01

    The concept called Knowledge is a measure of the quality of genetically transferred information. Its usefulness is demonstrated quantitatively in a Monte-Carlo simulation of critical steps in an origin-of-life model. The model describes the origin of a bio-like genetic apparatus by a long sequence of physical-chemical steps: it starts with the presence of a self-replicating oligomer and a specifically structured environment in time and space that allow for the formation of aggregates such as assembler-hairpins devices and, at a later stage, an assembler-hairpins-enzyme device, a first translation machine.

  20. ChloroKB: A Web Application for the Integration of Knowledge Related to Chloroplast Metabolic Network

    PubMed Central

    Gloaguen, Pauline; Alban, Claude; Ravanel, Stéphane; Seigneurin-Berny, Daphné; Matringe, Michel; Ferro, Myriam; Bruley, Christophe; Rolland, Norbert; Vandenbrouck, Yves

    2017-01-01

    Higher plants, as autotrophic organisms, are effective sources of molecules. They hold great promise for metabolic engineering, but the behavior of plant metabolism at the network level is still incompletely described. Although structural models (stoichiometry matrices) and pathway databases are extremely useful, they cannot describe the complexity of the metabolic context, and new tools are required to visually represent integrated biocurated knowledge for use by both humans and computers. Here, we describe ChloroKB, a Web application (http://chlorokb.fr/) for visual exploration and analysis of the Arabidopsis (Arabidopsis thaliana) metabolic network in the chloroplast and related cellular pathways. The network was manually reconstructed through extensive biocuration to provide transparent traceability of experimental data. Proteins and metabolites were placed in their biological context (spatial distribution within cells, connectivity in the network, participation in supramolecular complexes, and regulatory interactions) using CellDesigner software. The network contains 1,147 reviewed proteins (559 localized exclusively in plastids, 68 in at least one additional compartment, and 520 outside the plastid), 122 proteins awaiting biochemical/genetic characterization, and 228 proteins for which genes have not yet been identified. The visual presentation is intuitive and browsing is fluid, providing instant access to the graphical representation of integrated processes and to a wealth of refined qualitative and quantitative data. ChloroKB will be a significant support for structural and quantitative kinetic modeling, for biological reasoning, when comparing novel data with established knowledge, for computer analyses, and for educational purposes. ChloroKB will be enhanced by continuous updates following contributions from plant researchers. PMID:28442501

  1. A New Poisson-Nernst-Planck Model with Ion-Water Interactions for Charge Transport in Ion Channels.

    PubMed

    Chen, Duan

    2016-08-01

    In this work, we propose a new Poisson-Nernst-Planck (PNP) model with ion-water interactions for biological charge transport in ion channels. Due to the narrow geometries of these membrane proteins, ion-water interaction is critical for both the dielectric properties of water molecules in the channel pore and the transport dynamics of mobile ions. We model the ion-water interaction energy based on realistic experimental observations in an efficient mean-field approach. Variation of a total energy functional of the biological system yields a new PNP-type continuum model. Numerical simulations show that the proposed model with ion-water interaction energy has new features: it quantitatively describes the dielectric properties of water molecules in narrow pores and makes it possible to model the selectivity of some ion channels.

  2. A strategy for understanding noise-induced annoyance

    NASA Astrophysics Data System (ADS)

    Fidell, S.; Green, D. M.; Schultz, T. J.; Pearsons, K. S.

    1988-08-01

    This report provides a rationale for development of a systematic approach to understanding noise-induced annoyance. Two quantitative models are developed to explain: (1) the prevalence of annoyance due to residential exposure to community noise sources; and (2) the intrusiveness of individual noise events. Both models deal explicitly with the probabilistic nature of annoyance, and assign clear roles to acoustic and nonacoustic determinants of annoyance. The former model provides a theoretical foundation for empirical dosage-effect relationships between noise exposure and community response, while the latter model differentiates between the direct and immediate annoyance of noise intrusions and response bias factors that influence the reporting of annoyance. The assumptions of both models are identified, and the nature of the experimentation necessary to test hypotheses derived from the models is described.

  3. Psychophysically based model of surface gloss perception

    NASA Astrophysics Data System (ADS)

    Ferwerda, James A.; Pellacini, Fabio; Greenberg, Donald P.

    2001-06-01

    In this paper we introduce a new model of surface appearance that is based on quantitative studies of gloss perception. We use image synthesis techniques to conduct experiments that explore the relationships between the physical dimensions of glossy reflectance and the perceptual dimensions of glossy appearance. The product of these experiments is a psychophysically based model of surface gloss, with dimensions that are both physically and perceptually meaningful and scales that reflect our sensitivity to gloss variations. We demonstrate that the model can be used to describe and control the appearance of glossy surfaces in synthetic images, allowing prediction of gloss matches and quantification of gloss differences. This work represents some initial steps toward developing psychophysical models of the goniometric aspects of surface appearance to complement widely used colorimetric models.

  4. A primer on thermodynamic-based models for deciphering transcriptional regulatory logic.

    PubMed

    Dresch, Jacqueline M; Richards, Megan; Ay, Ahmet

    2013-09-01

    A rigorous analysis of transcriptional regulation at the DNA level is crucial to the understanding of many biological systems. Mathematical modeling has offered researchers a new approach to understanding this central process. In particular, thermodynamic-based modeling represents the most biophysically informed approach aimed at connecting DNA level regulatory sequences to the expression of specific genes. The goal of this review is to give biologists a thorough description of the steps involved in building, analyzing, and implementing a thermodynamic-based model of transcriptional regulation. The data requirements for this modeling approach are described, the derivation for a specific regulatory region is shown, and the challenges and future directions for the quantitative modeling of gene regulation are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
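
    The core of a thermodynamic-based model of transcriptional regulation is a Boltzmann-weighted sum over binding configurations. As a minimal sketch covering a single binding site only (the function name, cooperativity weight, and values are illustrative assumptions, not taken from the review), the probability that the site is occupied is:

```python
def activation_probability(tf_conc, K, w=1.0):
    """Fractional occupancy of one transcription-factor binding site in a
    simple thermodynamic model: the unbound state has statistical weight 1,
    the bound state has weight [TF] * K * w (w = an interaction weight)."""
    bound_weight = tf_conc * K * w
    return bound_weight / (1.0 + bound_weight)

# At [TF] * K = 1 the site is bound half the time.
p = activation_probability(tf_conc=2.0, K=0.5)  # -> 0.5
```

    Real models of regulatory regions enumerate every configuration of multiple activator and repressor sites and weight each configuration by the product of its binding constants and interaction terms; the single-site case above is the simplest instance of that scheme.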

  5. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    USGS Publications Warehouse

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  6. Relevance and limitations of crowding, fractal, and polymer models to describe nuclear architecture.

    PubMed

    Huet, Sébastien; Lavelle, Christophe; Ranchon, Hubert; Carrivain, Pascal; Victor, Jean-Marc; Bancaud, Aurélien

    2014-01-01

    Chromosome architecture plays an essential role in all nuclear functions, and its physical description has attracted considerable interest over the last few years among the biophysics community. This research at the frontier of physics and biology has been stimulated by the demand for quantitative analysis of molecular biology experiments, which provide comprehensive data on chromosome folding, and of live-cell imaging experiments that enable researchers to visualize selected chromosome loci in living or fixed cells. In this review our goal is to survey several non-mutually exclusive models that have emerged to describe the folding of DNA in the nucleus, the dynamics of proteins in the nucleoplasm, and the movements of chromosome loci. We focus on three classes of models, namely molecular crowding, fractal, and polymer models; draw comparisons; and discuss their merits and limitations in the context of chromosome structure and dynamics and of nuclear protein navigation in the nucleoplasm. Finally, we identify future challenges on the roadmap to a unified model of the nuclear environment. © 2014 Elsevier Inc. All rights reserved.

  7. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE PAGES

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; ...

    2016-08-29

    In this contribution, the advantages and limitations of two computational techniques that can be used to investigate nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches are compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to help them determine a proper and efficient methodology for investigating the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.

  8. The Adaptation of the Immigrant Second Generation in America: Theoretical Overview and Recent Evidence

    PubMed Central

    Portes, Alejandro; Fernández-Kelly, Patricia; Haller, William

    2013-01-01

    This paper summarises a research program on the new immigrant second generation initiated in the early 1990s and completed in 2006. The four field waves of the Children of Immigrants Longitudinal Study (CILS) are described and the main theoretical models emerging from it are presented and graphically summarised. After considering critical views of this theory, we present the most recent results from this longitudinal research program in the form of quantitative models predicting downward assimilation in early adulthood and qualitative interviews identifying ways to escape it by disadvantaged children of immigrants. Quantitative results strongly support the predicted effects of exogenous variables identified by segmented assimilation theory and identify the intervening factors during adolescence that mediate their influence on adult outcomes. Qualitative evidence gathered during the last stage of the study points to three factors that can lead to exceptional educational achievement among disadvantaged youths. All three indicate the positive influence of selective acculturation. Implications of these findings for theory and policy are discussed. PMID:23626483

  9. What makes a house a home? An evaluation of a supported housing project for individuals with long-term psychiatric backgrounds.

    PubMed

    Boydell, K M; Everett, B

    1992-01-01

    Supported housing (as distinct from supportive housing) emphasizes the values of consumer choice; independence; participation; permanence; normalcy; and flexible, ongoing supports. As a model, it has only recently become popular in the literature and therefore little is known of its effectiveness in serving people with long-term psychiatric backgrounds. In 1989, Homeward Projects, a community mental health agency located in Metropolitan Toronto, established a supported housing project. Homeward included an evaluative component in its program from the outset. In order to give equal weight to the tenants' opinions, both quantitative and qualitative methodologies were employed. In the quantitative component, residential milieu, social support, and service delivery were examined. The qualitative component involved an ethnographic study which allowed the tenants to voice their experiences of living in such a setting. Results provided a rich understanding of the model. Overall, the tenants eventually came to describe their house as a home.

  10. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta

    In this contribution, the advantages and limitations of two computational techniques that can be used to investigate nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches are compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to help them determine a proper and efficient methodology for investigating the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.

  11. Surface temperature/heat transfer measurement using a quantitative phosphor thermography system

    NASA Technical Reports Server (NTRS)

    Buck, G. M.

    1991-01-01

    A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.
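
    The heat-transfer reduction described in the appendix typically rests on the semi-infinite-solid conduction solution. A minimal sketch, assuming a constant-flux surface condition and illustrative material values (this is not the report's exact data-reduction procedure):

```python
import math

def heat_flux_semi_infinite(delta_T, t, k, rho, c):
    """Infer a constant surface heat flux q from the surface temperature
    rise delta_T of a semi-infinite solid after time t, using
        delta_T = 2 * q * sqrt(t / (pi * k * rho * c))
    solved for q.  k, rho, c are the model material's thermal
    conductivity, density, and specific heat."""
    return 0.5 * delta_T * math.sqrt(math.pi * k * rho * c / t)

# Ceramic-like properties: k = 1 W/m/K, rho = 2000 kg/m^3, c = 800 J/kg/K
q = heat_flux_semi_infinite(delta_T=20.0, t=2.0, k=1.0, rho=2000.0, c=800.0)
```

    Time-resolved reductions (e.g., the Cook-Felderman method) instead integrate the whole temperature history, which matters when the flux varies during the run.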

  12. Effect of Turbulence in Wind-Tunnel Measurements

    NASA Technical Reports Server (NTRS)

    Dryden, H L; Kuethe, A M

    1931-01-01

    This paper gives some quantitative measurements of wind tunnel turbulence and its effect on the air resistance of spheres and airship models, measurements made possible by the hot-wire anemometer and associated apparatus. The apparatus in its original form was described in Technical Report No. 320, and some modifications are presented in an appendix to the present paper. One important result of the investigation is a curve by means of which measurements of the air resistance of spheres can be interpreted to give the turbulence quantitatively. Another is the definite proof that the discrepancies in the results on the N. P. L. standard airship models are due mainly to differences in the turbulence of the wind tunnels in which the tests were made. An attempt is made to interpret the observed results in terms of the boundary layer theory, and for this purpose a brief account is given of the physical bases of this theory and of conceptions that have been obtained by analogy with the laws of flow in pipes.
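
    The sphere-drag curve mentioned above underlies the conventional "turbulence factor" of a wind tunnel: the critical Reynolds number at which a sphere's drag coefficient drops is lower in a more turbulent stream. A minimal sketch of that ratio (the 385,000 free-air reference value and the names here are assumptions of this illustration, not figures from the report):

```python
def turbulence_factor(re_critical, re_free_air=385_000.0):
    """Ratio of the critical sphere Reynolds number in turbulence-free
    air to the critical Reynolds number measured in the tunnel.  Higher
    tunnel turbulence lowers re_critical and so raises the factor."""
    return re_free_air / re_critical

# A tunnel whose sphere drag crisis occurs at Re = 275,000:
tf = turbulence_factor(275_000.0)  # -> 1.4
```

    Multiplying test Reynolds numbers by this factor gives an "effective" Reynolds number for comparing results between tunnels of different turbulence levels.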

  13. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    PubMed

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  14. [Some comments on ecological field].

    PubMed

    Wang, D

    2000-06-01

    Based on data from plant ecological field studies, this paper reviews the concept of the ecological field, field eigenfunctions, graphs of the ecological field, and the application of ecological field theory to explaining plant interactions. It is suggested that the basic character of the ecological field is material, and that at the current research level it is not certain whether the ecological field is a specific kind of field distinct from general physical fields. The author comments on the formulation and parameter estimation of the basic field function (the ecological potential model) of the ecological field; both models have their own characteristics and advantages under specific conditions. The author emphasizes that the ecological field has even broader significance as an ecological methodology, and that applying ecological field theory to describe the types and processes of plant interactions has three characteristics: it is quantitative, synthetic, and intuitive. Field graphing may provide a new approach to ecological studies; in particular, applying ecological field theory may give an appropriate quantitative explanation for the dynamic processes of plant populations (coexistence and interference competition).

  15. Terrain shape index: quantifying effect of minor landforms on tree height

    Treesearch

    W. Henry McNab

    1989-01-01

    In the southern Appalachians, the distribution and growth of trees are highly correlated with local topography, but the relationships have been difficult to describe quantitatively. A quantitative expression of the geometric shape of the land surface (terrain shape index) is described and correlated with inventory tree heights and site quality. Application of the index...
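
    One common formulation of a terrain shape index is the mean elevation of points on the plot boundary relative to the plot center, scaled by the plot radius. The sketch below follows that formulation; the variable names and eight-point boundary sampling are illustrative assumptions, not necessarily McNab's exact procedure:

```python
def terrain_shape_index(center_elev, boundary_elevs, radius):
    """Mean relative elevation of boundary points, divided by plot radius.
    Negative values indicate convex terrain (ridges, knobs); positive
    values indicate concave terrain (coves); ~0 indicates planar slopes."""
    n = len(boundary_elevs)
    return sum(e - center_elev for e in boundary_elevs) / (n * radius)

# A plot centered on a small knob: eight boundary points all 2 m below center.
tsi = terrain_shape_index(100.0, [98.0] * 8, radius=10.0)  # -> -0.2
```

    Because the index is dimensionless, plots of different sizes can be compared directly when relating landform shape to tree height or site quality.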

  16. Anatomic and Quantitative Temporal Bone CT for Preoperative Assessment of Branchio-Oto-Renal Syndrome.

    PubMed

    Ginat, D T; Ferro, L; Gluth, M B

    2016-12-01

    We describe the temporal bone computed tomography (CT) findings of an unusual case of branchio-oto-renal syndrome with ectopic ossicles that are partially located in the middle cranial fossa. We also describe quantitative temporal bone CT assessment pertaining to cochlear implantation in the setting of anomalous cochlear anatomy associated with this syndrome.

  17. Licancabur Volcano, Bolivia and life in the Atacama: Environmental physics and analogies to Mars

    NASA Astrophysics Data System (ADS)

    Hock, Andrew Nelson

    Although there is no perfect environmental analog to Mars on Earth, quantitative study of relevant terrestrial field sites can serve as the basis for physical models and technology development to aid future exploration. This dissertation describes original field and laboratory research on two terrestrial analog sites: Licancabur Volcano, Bolivia, and the Atacama Desert, Chile. Atop Licancabur, at an elevation of nearly 6,000 meters above sea level, sits the highest volcanic lake on Earth. Prior to this work, little was known about the lake, its waters, the role of volcanism or its potential relationship to locales on Mars. In the first part of this work, I describe observations of the lake resulting from several years of field study, including data on meteorological conditions and solar irradiance. These and other measurements provide the basis for (1) the first quantitative mass and energy balance model of the lake, and (2) the first determination of the altitude effect on solar visible and ultraviolet flux from the high altitude summit. Under the observed conditions, model results indicate: lake waters are primarily meteoric in origin and evaporating rapidly; volcanic input is not required to explain observations of lake water temperature or year-end model results. Nearby, Chile's Atacama Desert is known to be one of the driest, most inhospitable environments on Earth. There, environmental similarities to Mars provide an apt testing ground for new astrobiological exploration technologies. In the latter part of this work, I present results from my work with the Life In The Atacama (LITA) Mars rover field experiment. In particular, I report on the development of a new data analysis tool named the LITA Data Scoring System (DSS). Subject to the user-defined constraints, the DSS was used to facilitate targeting, analysis and mapping of rover science results relevant to potential habitability and evidence for life at three desert field sites. 
Although experimental in nature, the DSS demonstrated the utility of this type of tool for future astrobiology rovers. The quantitative environmental and operational analogies to Mars are discussed in the conclusion, where they form the basis for recommendations on future avenues of research.

  18. Engineering Digestion: Multiscale Processes of Food Digestion.

    PubMed

    Bornhorst, Gail M; Gouseti, Ourania; Wickham, Martin S J; Bakalis, Serafim

    2016-03-01

    Food digestion is a complex, multiscale process that has recently become of interest to the food industry due to the developing links between food and health or disease. Food digestion can be studied by using either in vitro or in vivo models, each having certain advantages or disadvantages. The recent interest in food digestion has resulted in a large number of studies in this area, yet few have provided an in-depth, quantitative description of digestion processes. To provide a framework to develop these quantitative comparisons, a summary is given here of the parallels between digestion processes and unit operations in the food and chemical industries. Characterization parameters and phenomena are suggested for each step of digestion. In addition to the quantitative characterization of digestion processes, the multiscale aspect of digestion must also be considered. In both food systems and the gastrointestinal tract, multiple length scales are involved in food breakdown, mixing, and absorption. These different length scales influence digestion processes independently as well as through interrelated mechanisms. To facilitate optimized development of functional food products, a multiscale, engineering approach may be taken to describe food digestion processes. A framework for this approach is described in this review, as well as examples that demonstrate the importance of process characterization as well as the multiple, interrelated length scales in the digestion process. © 2016 Institute of Food Technologists®

  19. A multiplexed quantitative proteomics approach for investigating protein expression in the developing central nervous system.

    PubMed

    Orme, Rowan P; Gates, Monte A; Fricker-Gates, Rosemary A

    2010-08-15

    Cell transplantation using stem cell-derived neurons is commonly viewed as a candidate therapy for neurodegenerative diseases. However, methods for differentiating stem cells into homogenous populations of neurons suitable for transplant remain elusive. This suggests that there are as yet unknown signalling factors working in vivo to specify neuronal cell fate during development. These factors could be manipulated to better differentiate stem cells into neural populations useful for therapeutic transplantation. Here a quantitative proteomics approach is described for investigating cell signalling in the developing central nervous system (CNS), using the embryonic ventral mesencephalon as a model. Briefly, total protein was extracted from embryonic ventral midbrain tissue before, during and after the birth of dopaminergic neurons, and digested using trypsin. Two-dimensional liquid chromatography, coupled with tandem mass spectrometry, was then used to identify proteins from the tryptic peptides. Isobaric tagging for relative and absolute quantification (iTRAQ) reagents were used to label the tryptic peptides and facilitate relative quantitative analysis. The success of the experiment was confirmed by the identification of proteins known to be expressed in the developing ventral midbrain, as well as by Western blotting, and immunolabelling of embryonic tissue sections. This method of protein discovery improves upon previous attempts to identify novel signalling factors through microarray analysis. Importantly, the methods described here could be applied to virtually any aspect of development. (c) 2010 Elsevier B.V. All rights reserved.

  20. Research impact in the community-based health sciences: an analysis of 162 case studies from the 2014 UK Research Excellence Framework.

    PubMed

    Greenhalgh, Trisha; Fahy, Nick

    2015-09-21

    The 2014 UK Research Excellence Framework (REF2014) generated a unique database of impact case studies, each describing a body of research and impact beyond academia. We sought to explore the nature and mechanism of impact in a sample of these. The study design was manual content analysis of a large sample of impact case studies (producing mainly quantitative data), plus in-depth interpretive analysis of a smaller sub-sample (for qualitative detail), thereby generating both breadth and depth. For all 162 impact case studies submitted to sub-panel A2 in REF2014, we extracted data on study design(s), stated impacts and audiences, mechanisms of impact, and efforts to achieve impact. We analysed four case studies (selected as exemplars of the range of approaches to impact) in depth, including contacting the authors for their narratives of impact efforts. Most impact case studies described quantitative research (most commonly, trials) and depicted a direct, linear link between research and impact. Research was said to have influenced a guideline in 122 case studies, changed policy in 88, changed practice in 84, improved morbidity in 44 and reduced mortality in 25. Qualitative and participatory research designs were rare, and only one case study described a co-production model of impact. Eighty-two case studies described strong and ongoing linkages with policymakers, but only 38 described targeted knowledge translation activities. In 40 case studies, no active efforts to achieve impact were described. Models of good implementation practice were characterised by an ethical commitment by researchers, strong institutional support and a proactive, interdisciplinary approach to impact activities. REF2014 both inspired and documented significant efforts by UK researchers to achieve impact. 
But in contrast with the published evidence on research impact (which depicts much as occurring indirectly through non-linear mechanisms), this sub-panel seems to have captured mainly direct and relatively short-term impacts one step removed from patient outcomes. Limited impacts on morbidity and mortality, and researchers' relatively low emphasis on the processes and interactions through which indirect impacts may occur, are concerns. These findings have implications for multi-stakeholder research collaborations such as UK National Institute for Health Research Collaborations for Leadership in Applied Health Research and Care, which are built on non-linear models of impact.

Top