Sample records for quantitative mechanistically based

  1. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis (FMEA) of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
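
    As an illustration of how a simulation-derived coefficient of variation could feed a quantitative FMEA re-scoring, the sketch below uses hypothetical per-tablet coating masses and generic severity/occurrence/detection scores; it is not the authors' simulation or their actual scoring scheme.

    ```python
    import numpy as np

    # Hypothetical per-tablet coating masses (mg) from a simulated pan-coating run.
    coating_mass = np.array([19.2, 20.1, 18.7, 21.0, 19.8, 20.4, 18.9, 20.7])

    # Coefficient of variation: the abstract's measure of coating mass uniformity.
    cv = coating_mass.std(ddof=1) / coating_mass.mean()

    # Generic FMEA risk priority number (severity x occurrence x detection);
    # the occurrence score is re-derived from the simulated CV using illustrative thresholds.
    occurrence = 2 if cv < 0.05 else 5 if cv < 0.10 else 8
    severity, detection = 7, 4
    rpn = severity * occurrence * detection
    print(f"CV = {cv:.3f}, RPN = {rpn}")
    ```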

  2. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.

    PubMed

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M

    2016-05-05

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.

  3. MECHANISTIC-BASED DISINFECTION AND DISINFECTION BYPRODUCT MODELS

    EPA Science Inventory

    We propose developing a mechanistic-based numerical model for chlorine decay and regulated DBP (THM and HAA) formation derived from (free) chlorination; the model framework will allow future modifications for other DBPs and chloramination. Predicted chlorine residual and DBP r...

  4. Life at the Common Denominator: Mechanistic and Quantitative Biology for the Earth and Space Sciences

    NASA Technical Reports Server (NTRS)

    Hoehler, Tori M.

    2010-01-01

    The remarkable challenges and possibilities of the coming few decades will compel the biogeochemical and astrobiological sciences to characterize the interactions between biology and its environment in a fundamental, mechanistic, and quantitative fashion. The clear need for integrative and scalable biology-environment models is exemplified in the Earth sciences by the challenge of effectively addressing anthropogenic global change, and in the space sciences by the challenge of mounting a well-constrained yet sufficiently adaptive and inclusive search for life beyond Earth. Our understanding of the life-planet interaction is still, however, largely empirical. A variety of approaches seek to move from empirical to mechanistic descriptions. One approach focuses on the relationship between biology and energy, which is at once universal (all life requires energy), unique (life manages energy flow in a fashion not seen in abiotic systems), and amenable to characterization and quantification in thermodynamic terms. Simultaneously, a focus on energy flow addresses a critical point of interface between life and its geological, chemical, and physical environment. Characterizing and quantifying this relationship for life on Earth will support the development of integrative and predictive models for biology-environment dynamics. Understanding this relationship at its most fundamental level holds potential for developing concepts of habitability and biosignatures that can optimize astrobiological exploration strategies and are extensible to all life.

  5. Mechanistic applicability domain classification of a local lymph node assay dataset for skin sensitization.

    PubMed

    Roberts, David W; Patlewicz, Grace; Kern, Petra S; Gerberick, Frank; Kimber, Ian; Dearman, Rebecca J; Ryan, Cindy A; Basketter, David A; Aptula, Aynur O

    2007-07-01

    The goal of eliminating animal testing in the predictive identification of chemicals with the intrinsic ability to cause skin sensitization is an important target, the attainment of which has recently been brought into even sharper relief by the EU Cosmetics Directive and the requirements of the REACH legislation. Development of alternative methods requires that the chemicals used to evaluate and validate novel approaches comprise not only confirmed skin sensitizers and non-sensitizers but also substances that span the full chemical mechanistic spectrum associated with skin sensitization. To this end, a recently published database of more than 200 chemicals tested in the mouse local lymph node assay (LLNA) has been examined in relation to various chemical reaction mechanistic domains known to be associated with sensitization. It is demonstrated here that the dataset does cover the main reaction mechanistic domains. In addition, it is shown that assignment to a reaction mechanistic domain is a critical first step in a strategic approach to understanding, ultimately on a quantitative basis, how chemical properties influence the potency of skin sensitizing chemicals. This understanding is necessary if reliable non-animal approaches, including (quantitative) structure-activity relationships ((Q)SARs), read-across, and experimental chemistry-based models, are to be developed.

  6. Base course resilient modulus for the mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2011-11-01

    The Mechanistic-Empirical Pavement Design Guidelines (MEPDG) recommend use of modulus in lieu of structural number for base layer thickness design. Modulus is nonlinear with respect to effective confinement stress, loading strain, and moisture. For d...

  7. Blinded Prospective Evaluation of Computer-Based Mechanistic Schizophrenia Disease Model for Predicting Drug Response

    PubMed Central

    Geerts, Hugo; Spiros, Athan; Roberts, Patrick; Twyman, Roy; Alphs, Larry; Grace, Anthony A.

    2012-01-01

    The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published ‘Quantitative Systems Pharmacology’ computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively, in a blinded fashion, the clinical outcome of two experimental antipsychotic drugs: JNJ37822681, a highly selective low-affinity dopamine D2 antagonist, and ocaperidone, a very high affinity dopamine D2 antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine, and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects, and can be a helpful tool for drug discovery and development. PMID:23251349

  8. MECHANISTIC DATA & CANCER RISK ASSESSMENT: THE NEED FOR QUANTITATIVE MOLECULAR ENDPOINTS

    EPA Science Inventory

    The cancer risk assessment process as currently proposed by the U.S. Environmental Protection Agency allows for the use of mechanistic data to inform the low dose tumor response in humans and in laboratory animals. The aim is to reduce the reliance on defaults that introduce a re...

  9. Mechanistic-empirical Pavement Design Guide Implementation

    DOT National Transportation Integrated Search

    2010-06-01

    The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and associated computer software provides a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...

  10. Quantitative prediction of repaglinide-rifampicin complex drug interactions using dynamic and static mechanistic models: delineating differential CYP3A4 induction and OATP1B1 inhibition potential of rifampicin.

    PubMed

    Varma, Manthena V S; Lin, Jian; Bi, Yi-An; Rotter, Charles J; Fahmi, Odette A; Lam, Justine L; El-Kattan, Ayman F; Goosen, Theunis C; Lai, Yurong

    2013-05-01

    Repaglinide is mainly metabolized by cytochrome P450 enzymes CYP2C8 and CYP3A4, and it is also a substrate of a hepatic uptake transporter, organic anion transporting polypeptide (OATP)1B1. The purpose of this study is to predict the dosing time-dependent pharmacokinetic interactions of repaglinide with rifampicin, using mechanistic models. In vitro hepatic transport of repaglinide, characterized using sandwich-cultured human hepatocytes, and intrinsic metabolic parameters were used to build a dynamic whole-body physiologically-based pharmacokinetic (PBPK) model. The PBPK model adequately described repaglinide plasma concentration-time profiles and successfully predicted area under the plasma concentration-time curve ratios of repaglinide (within ± 25% error), dosed (staggered 0-24 hours) after rifampicin treatment, when primarily considering induction of CYP3A4 and reversible inhibition of OATP1B1 by rifampicin. Further, a static mechanistic "extended net-effect" model incorporating transport and metabolic disposition parameters of repaglinide and the interaction potency of rifampicin was devised. Predictions based on the static model were similar to those observed in the clinic (average error ∼19%) and to those based on the PBPK model. Both models suggested that the combined effect of increased gut extraction and decreased hepatic uptake caused minimal change in repaglinide systemic exposure when repaglinide is dosed simultaneously with or 1 hour after the rifampicin dose. On the other hand, the isolated induction effect resulting from temporal separation of the two drugs translated to an approximately 5-fold reduction in repaglinide systemic exposure. In conclusion, both dynamic and static mechanistic models are instrumental in delineating the quantitative contribution of transport and metabolism to the dosing time-dependent repaglinide-rifampicin interactions.
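
    A deliberately simplified, static "net-effect"-style sketch of the offsetting mechanisms described above is shown below. It assumes oral clearance scales multiplicatively with OATP1B1 uptake activity and CYP3A4 activity, and it uses hypothetical parameter values (fm_cyp3a4, induction_fold, I_u, Ki); the paper's extended net-effect model includes additional terms (e.g., gut extraction) not reproduced here.

    ```python
    # Illustrative only: not the paper's fitted model or parameter values.
    fm_cyp3a4 = 0.5          # fraction of repaglinide metabolism via CYP3A4 (hypothetical)
    induction_fold = 4.0     # fold-increase in CYP3A4 activity after rifampicin (hypothetical)
    I_u, Ki = 1.0, 0.5       # unbound rifampicin conc. and OATP1B1 Ki, umol/L (hypothetical)

    def aucr(uptake_inhibited: bool, induced: bool) -> float:
        # Remaining OATP1B1 uptake activity under reversible inhibition.
        uptake = 1.0 / (1.0 + I_u / Ki) if uptake_inhibited else 1.0
        # Net metabolic activity with partial CYP3A4 induction.
        metab = fm_cyp3a4 * (induction_fold if induced else 1.0) + (1 - fm_cyp3a4)
        # AUC ratio ~ inverse of the overall clearance change under the multiplicative assumption.
        return 1.0 / (uptake * metab)

    # Simultaneous dosing: uptake inhibition offsets induction; staggered dosing: induction alone.
    print(aucr(uptake_inhibited=True, induced=True))    # near 1: offsetting effects
    print(aucr(uptake_inhibited=False, induced=True))   # well below 1: reduced exposure
    ```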

  11. Improvements In AF Ablation Outcome Will Be Based More On Technological Advancement Versus Mechanistic Understanding.

    PubMed

    Jiang, Chen-Yang, MD; Jiang, Ru-Hong, MS

    2014-01-01

    Atrial fibrillation (AF) is one of the most common cardiac arrhythmias. Catheter ablation has proven more effective than antiarrhythmic drugs in preventing clinical recurrence of AF; however, long-term outcomes remain unsatisfactory. Ablation strategies have evolved based on progress in mechanistic understanding, and technologies have advanced continuously. This article reviews current mechanistic concepts and technological advancements in AF treatment, and summarizes their impact on the improvement of AF ablation outcomes.

  12. Kinetic Profiling of Catalytic Organic Reactions as a Mechanistic Tool.

    PubMed

    Blackmond, Donna G

    2015-09-02

    The use of modern kinetic tools to obtain virtually continuous reaction progress data over the course of a catalytic reaction opens up a vista that provides mechanistic insights into both simple and complex catalytic networks. Reaction profiles offer a rate/concentration scan that tells the story of a batch reaction time course in a qualitative "fingerprinting" manner as well as in quantitative detail. Reaction progress experiments may be mathematically designed to elucidate catalytic rate laws from only a fraction of the number of experiments required in classical kinetic measurements. The information gained from kinetic profiles provides clues to direct further mechanistic analysis by other approaches. Examples from a variety of catalytic reactions spanning two decades of the author's work help to delineate nuances on a central mechanistic theme.
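
    The sketch below illustrates the kind of rate/concentration "fingerprint" that reaction-progress data can yield: numerically differentiating a hypothetical concentration-time profile and estimating an apparent reaction order. The data and the first-order behaviour are assumed for illustration, not taken from the review.

    ```python
    import numpy as np

    # Hypothetical reaction-progress data: substrate concentration [A] (M) vs time (min)
    # for a batch catalytic reaction; in practice such data come from FTIR or calorimetry.
    t = np.array([0, 5, 10, 20, 30, 45, 60, 90, 120], dtype=float)
    A = np.array([1.00, 0.82, 0.68, 0.47, 0.33, 0.19, 0.11, 0.04, 0.01])

    rate = -np.gradient(A, t)          # instantaneous rate from the progress curve

    # A rate-vs-concentration scan: a log-log slope near 1 would suggest
    # first-order behaviour in [A] under these (assumed) conditions.
    order_estimate = np.polyfit(np.log(A[:-1]), np.log(rate[:-1]), 1)[0]
    print(f"apparent order in A ~ {order_estimate:.2f}")
    ```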

  13. Development of a Mechanistic-Based Healing Model for Self-Healing Glass Seals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Wei; Stephens, Elizabeth V.; Sun, Xin

    Self-healing glass, a recent development in hermetic sealant materials, has the ability to effectively repair damage when heated to elevated temperatures, thereby extending its service life. Since crack-healing morphological changes in the glass material are usually temperature and stress dependent, quantitative studies to determine the effects of thermo-mechanical conditions on the healing behavior of self-healing glass sealants are extremely useful for the design and optimization of the sealing systems within SOFCs. The goal of this task is to develop a mechanistic-based healing model to quantify the stress- and temperature-dependent healing behavior. A two-step healing mechanism was developed and implemented into finite element (FE) models through user subroutines. An integrated experimental/kinetic Monte Carlo (kMC) simulation methodology was used to calibrate the model parameters. The crack healing model is able to investigate the effects of various thermo-mechanical factors and therefore to determine the critical conditions under which the healing mechanism will be activated. Furthermore, the predicted results can be used to formulate the continuum damage-healing model and to assist SOFC stack-level simulations in predicting and evaluating the effectiveness and performance of various engineering seal designs.

  14. Quantitative AOP-based predictions for two aromatase inhibitors evaluating the influence of bioaccumulation on prediction accuracy

    EPA Science Inventory

    The adverse outcome pathway (AOP) framework can be used to support the use of mechanistic toxicology data as a basis for risk assessment. For certain risk contexts this includes defining, quantitative linkages between the molecular initiating event (MIE) and subsequent key events...

  15. Assessing causal mechanistic interactions: a peril ratio index of synergy based on multiplicativity.

    PubMed

    Lee, Wen-Chung

    2013-01-01

    The assessment of interactions in epidemiology has traditionally been based on risk-ratio, odds-ratio or rate-ratio multiplicativity. However, many epidemiologists fail to recognize that this is mainly for statistical convenience and will often misinterpret a statistically significant interaction as a genuine mechanistic interaction. The author adopts an alternative metric system for risk, the 'peril'. A peril is an exponentiated cumulative rate, or simply, the inverse of a survival (risk complement) or one plus an odds. The author proposes a new index based on multiplicativity of peril ratios, the 'peril ratio index of synergy based on multiplicativity' (PRISM). Under the assumption of no redundancy, PRISM can be used to assess synergisms in the sufficient-cause sense, i.e., causal co-actions or causal mechanistic interactions. It has a less stringent threshold for detecting a synergy than the earlier index of 'relative excess risk due to interaction'. Under the new PRISM criterion, many situations in which the traditional indices show no evidence of interaction in fact correspond to bona fide positive or negative synergisms.
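
    A minimal numerical sketch of the peril metric and a PRISM-style calculation is given below, using hypothetical risks for a 2x2 exposure design; the subscript convention (PR11 for joint exposure, PR10 and PR01 for single exposures) is assumed here rather than quoted from the paper.

    ```python
    # Illustrative PRISM-style computation from hypothetical risks.
    def peril(risk: float) -> float:
        # peril = inverse of the survival (risk complement); equivalently exp(cumulative rate).
        return 1.0 / (1.0 - risk)

    r00, r10, r01, r11 = 0.05, 0.10, 0.12, 0.30   # hypothetical risks by exposure status

    pr10 = peril(r10) / peril(r00)   # peril ratio, exposure 1 only
    pr01 = peril(r01) / peril(r00)   # peril ratio, exposure 2 only
    pr11 = peril(r11) / peril(r00)   # peril ratio, joint exposure

    prism = pr11 / (pr10 * pr01)     # >1 suggests positive synergy under the no-redundancy assumption
    print(f"PRISM = {prism:.2f}")
    ```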

  16. Rational and mechanistic perspectives on reinforcement learning.

    PubMed

    Chater, Nick

    2009-12-01

    This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: mechanistic and rational. Reinforcement learning is often viewed in mechanistic terms--as describing the operation of aspects of an agent's cognitive and neural machinery. Yet it can also be viewed as a rational level of description, specifically, as describing a class of methods for learning from experience, using minimal background knowledge. This paper considers how rational and mechanistic perspectives differ, and what types of evidence distinguish between them. Reinforcement learning research in the cognitive and brain sciences is often implicitly committed to the mechanistic interpretation. Here the opposite view is put forward: that accounts of reinforcement learning should apply at the rational level, unless there is strong evidence for a mechanistic interpretation. Implications of this viewpoint for reinforcement-based theories in the cognitive and brain sciences are discussed.

  17. Assessing Causal Mechanistic Interactions: A Peril Ratio Index of Synergy Based on Multiplicativity

    PubMed Central

    Lee, Wen-Chung

    2013-01-01

    The assessment of interactions in epidemiology has traditionally been based on risk-ratio, odds-ratio or rate-ratio multiplicativity. However, many epidemiologists fail to recognize that this is mainly for statistical convenience and will often misinterpret a statistically significant interaction as a genuine mechanistic interaction. The author adopts an alternative metric system for risk, the ‘peril’. A peril is an exponentiated cumulative rate, or simply, the inverse of a survival (risk complement) or one plus an odds. The author proposes a new index based on multiplicativity of peril ratios, the ‘peril ratio index of synergy based on multiplicativity’ (PRISM). Under the assumption of no redundancy, PRISM can be used to assess synergisms in the sufficient-cause sense, i.e., causal co-actions or causal mechanistic interactions. It has a less stringent threshold for detecting a synergy than the earlier index of ‘relative excess risk due to interaction’. Under the new PRISM criterion, many situations in which the traditional indices show no evidence of interaction in fact correspond to bona fide positive or negative synergisms. PMID:23826299

  18. Mechanistic analysis of challenge-response experiments.

    PubMed

    Shotwell, M S; Drake, K J; Sidorov, V Y; Wikswo, J P

    2013-09-01

    We present an application of mechanistic modeling and nonlinear longitudinal regression in the context of biomedical response-to-challenge experiments, a field where these methods are underutilized. In this type of experiment, a system is studied by imposing an experimental challenge, and then observing its response. The combination of mechanistic modeling and nonlinear longitudinal regression has brought new insight, and revealed an unexpected opportunity for optimal design. Specifically, the mechanistic aspect of our approach enables the optimal design of experimental challenge characteristics (e.g., intensity, duration). This article lays some groundwork for this approach. We consider a series of experiments wherein an isolated rabbit heart is challenged with intermittent anoxia. The heart responds to the challenge onset, and recovers when the challenge ends. The mean response is modeled by a system of differential equations that describe a candidate mechanism for cardiac response to anoxia challenge. The cardiac system behaves more variably when challenged than when at rest. Hence, observations arising from this experiment exhibit complex heteroscedasticity and sharp changes in central tendency. We present evidence that an asymptotic statistical inference strategy may fail to adequately account for statistical uncertainty. Two alternative methods are critiqued qualitatively (i.e., for utility in the current context), and quantitatively using an innovative Monte-Carlo method. We conclude with a discussion of the exciting opportunities in optimal design of response-to-challenge experiments. © 2013, The International Biometric Society.

  19. Predictive and mechanistic multivariate linear regression models for reaction development

    PubMed Central

    Santiago, Celine B.; Guo, Jing-Yao

    2018-01-01

    Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711
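
    A minimal sketch of the descriptor-based MLR workflow described above is shown below, fitting a hypothetical reaction outcome to three hypothetical physical-organic descriptors by ordinary least squares; the descriptor names in the comments are examples, not the review's parameter set.

    ```python
    import numpy as np

    # Hypothetical descriptor matrix (columns: e.g. Sterimol B5, NBO charge, IR stretch)
    # and a hypothetical outcome (e.g. ddG of selectivity, kcal/mol).
    X = np.array([
        [1.70, -0.45, 1715.0],
        [1.95, -0.40, 1722.0],
        [2.30, -0.38, 1730.0],
        [2.60, -0.33, 1741.0],
        [2.85, -0.30, 1748.0],
    ])
    y = np.array([0.8, 1.1, 1.6, 2.1, 2.4])

    # Standardize descriptors, then fit y = b0 + Xs.b by least squares.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    A = np.column_stack([np.ones(len(y)), Xs])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("intercept and normalized coefficients:", np.round(coef, 2))
    ```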

  20. Assessing Metal Levels in Children from the Mechanistic Indicators of Childhood Asthma(MICA) study

    EPA Science Inventory

    Toxic and essential metals levels can be used as health indicators. Here, we quantitatively compare and contrast toxic and essential metals levels in vacuum dust, urine, and fingernail samples of 109 children in Detroit, Michigan as part of The Mechanistic Indicators of Childhood...

  1. The use of mechanistic evidence in drug approval.

    PubMed

    Aronson, Jeffrey K; La Caze, Adam; Kelly, Michael P; Parkkinen, Veli-Pekka; Williamson, Jon

    2018-06-11

    The role of mechanistic evidence tends to be under-appreciated in current evidence-based medicine (EBM), which focusses on clinical studies, tending to restrict attention to randomized controlled trials (RCTs) when they are available. The EBM+ programme seeks to redress this imbalance by suggesting methods for evaluating mechanistic studies alongside clinical studies. Drug approval is a problematic case for the view that mechanistic evidence should be taken into account, because RCTs are almost always available. Nevertheless, we argue that mechanistic evidence is central to all the key tasks in the drug approval process: in drug discovery and development; assessing pharmaceutical quality; devising dosage regimens; assessing efficacy, harms, external validity, and cost-effectiveness; evaluating adherence; and extending product licences. We recommend that, when preparing for meetings in which any aspect of drug approval is to be discussed, mechanistic evidence should be systematically analysed and presented to the committee members alongside analyses of clinical studies. © 2018 The Authors Journal of Evaluation in Clinical Practice Published by John Wiley & Sons Ltd.

  2. Bridging paradigms: hybrid mechanistic-discriminative predictive models.

    PubMed

    Doyle, Orla M; Tsaneva-Atanasova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa

    2013-03-01

    Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.

  3. Pregnancy-induced changes in pharmacokinetics: a mechanistic-based approach.

    PubMed

    Anderson, Gail D

    2005-01-01

    Observational studies have documented that women take a variety of medications during pregnancy. It is well known that pregnancy can induce changes in the plasma concentrations of some drugs. The use of mechanistic-based approaches to drug interactions has significantly increased our ability to predict clinically significant drug interactions and improve clinical care. This same method can also be used to improve our understanding regarding the effect of pregnancy on the pharmacokinetics of drugs. Limited studies suggest bioavailability of drugs is not altered during pregnancy. Increased plasma volume and protein binding changes can alter the apparent volume of distribution (Vd) of drugs. Through changes in Vd and clearance, pregnancy can cause increases or decreases in the terminal elimination half-life of drugs. Whether a change in dosage is needed during pregnancy depends on whether the drug is excreted unchanged by the kidneys and on which metabolic isoenzyme is involved in its metabolism. The renal excretion of unchanged drugs is increased during pregnancy. The metabolism of drugs catalysed by select cytochrome P450 (CYP) isoenzymes (i.e. CYP3A4, CYP2D6 and CYP2C9) and uridine diphosphate glucuronosyltransferase (UGT) isoenzymes (i.e. UGT1A4 and UGT2B7) is increased during pregnancy. Dosages of drugs predominantly metabolised by these isoenzymes or excreted by the kidneys unchanged may need to be increased during pregnancy in order to avoid loss of efficacy. In contrast, CYP1A2 and CYP2C19 activity is decreased during pregnancy, suggesting that dosage reductions may be needed to minimise potential toxicity of their substrates. There are limitations to the available data. This analysis is based primarily on observational studies, many including small numbers of women. For some isoenzymes, the effect of pregnancy on only one drug has been evaluated. The full time course of pharmacokinetic changes during pregnancy is often not studied. The
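
    The half-life relationship implied above, t1/2 = ln(2) x Vd / CL, can be sketched directly; the non-pregnant and pregnant values below are hypothetical and only illustrate how a larger Vd and a higher clearance jointly change the terminal half-life.

    ```python
    import math

    # Terminal elimination half-life from volume of distribution and clearance.
    def half_life(vd_L: float, cl_L_per_h: float) -> float:
        return math.log(2) * vd_L / cl_L_per_h

    t_nonpregnant = half_life(vd_L=40.0, cl_L_per_h=5.0)
    # Pregnancy example: larger Vd (plasma volume expansion) and higher CL
    # (e.g., induced CYP3A4/2D6/2C9 or increased renal excretion). Values are hypothetical.
    t_pregnant = half_life(vd_L=55.0, cl_L_per_h=9.0)
    print(f"{t_nonpregnant:.1f} h -> {t_pregnant:.1f} h")
    ```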

  4. Development of a Mechanistically Based, Basin-Scale Stream Temperature Model: Applications to Cumulative Effects Modeling

    Treesearch

    Douglas Allen; William Dietrich; Peter Baker; Frank Ligon; Bruce Orr

    2007-01-01

    We describe a mechanistically-based stream model, BasinTemp, which assumes that direct shortwave radiation, moderated by riparian and topographic shading, controls stream temperatures during the hottest part of the year. The model was developed to support a temperature TMDL for the South Fork Eel basin in Northern California and couples a GIS and a 1-D energy balance...

  5. Radiation-Induced Carcinogenesis: Mechanistically Based Differences between Gamma-Rays and Neutrons, and Interactions with DMBA

    PubMed Central

    Shuryak, Igor; Brenner, David J.; Ullrich, Robert L.

    2011-01-01

    Different types of ionizing radiation produce different dependences of cancer risk on radiation dose/dose rate. Sparsely ionizing radiation (e.g. γ-rays) generally produces linear or upwardly curving dose responses at low doses, and the risk decreases when the dose rate is reduced (direct dose rate effect). Densely ionizing radiation (e.g. neutrons) often produces downwardly curving dose responses, where the risk initially grows with dose, but eventually stabilizes or decreases. When the dose rate is reduced, the risk increases (inverse dose rate effect). These qualitative differences suggest qualitative differences in carcinogenesis mechanisms. We hypothesize that the dominant mechanism for induction of many solid cancers by sparsely ionizing radiation is initiation of stem cells to a pre-malignant state, but for densely ionizing radiation the dominant mechanism is radiation-bystander-effect mediated promotion of already pre-malignant cell clone growth. Here we present a mathematical model based on these assumptions and test it using data on the incidence of dysplastic growths and tumors in the mammary glands of mice exposed to high or low dose rates of γ-rays and neutrons, either with or without pre-treatment with the chemical carcinogen 7,12-dimethylbenz[a]anthracene (DMBA). The model provides a mechanistic and quantitative explanation which is consistent with the data and may provide useful insight into human carcinogenesis. PMID:22194850

  6. Descriptive vs. mechanistic network models in plant development in the post-genomic era.

    PubMed

    Davila-Velderrain, J; Martinez-Garcia, J C; Alvarez-Buylla, E R

    2015-01-01

    Network modeling is now a widespread practice in systems biology, as well as in integrative genomics, and it constitutes a rich and diverse scientific research field. A conceptually clear understanding of the reasoning behind the main existing modeling approaches, and their associated technical terminologies, is required to avoid confusion and accelerate the transition towards an undeniably necessary, more quantitative, multidisciplinary approach to biology. Herein, we focus on two main network-based modeling approaches that are commonly used depending on the information available and the intended goals: inference-based methods and system dynamics approaches. As far as data-based network inference methods are concerned, they enable the discovery of potential functional influences among molecular components. On the other hand, experimentally grounded network dynamical models have been shown to be perfectly suited for the mechanistic study of developmental processes. How do these two perspectives relate to each other? In this chapter, we describe and compare both approaches and then apply them to a given specific developmental module. Along with the step-by-step practical implementation of each approach, we also focus on discussing their respective goals, utility, assumptions, and associated limitations. We use the gene regulatory network (GRN) involved in Arabidopsis thaliana Root Stem Cell Niche patterning as our illustrative example. We show that descriptive models based on functional genomics data can provide important background information consistent with experimentally supported functional relationships integrated in mechanistic GRN models. The rationale of analysis and modeling can be applied to any other well-characterized functional developmental module in multicellular organisms, such as plants and animals.

  7. Integration of biotic ligand models (BLM) and bioaccumulation kinetics into a mechanistic framework for metal uptake in aquatic organisms.

    PubMed

    Veltman, Karin; Huijbregts, Mark A J; Hendriks, A Jan

    2010-07-01

    Both biotic ligand models (BLM) and bioaccumulation models aim to quantify metal exposure based on mechanistic knowledge, but key factors included in the description of metal uptake differ between the two approaches. Here, we present a quantitative comparison of both approaches and show that BLM and bioaccumulation kinetics can be merged into a common mechanistic framework for metal uptake in aquatic organisms. Our results show that metal-specific absorption efficiencies calculated from BLM parameters for freshwater fish are highly comparable (i.e., within a factor of 2.4 for silver, cadmium, copper, and zinc) to bioaccumulation absorption efficiencies for predominantly marine fish. Conditional affinity constants are significantly related to the metal-specific covalent index. Additionally, the affinity constants of calcium, cadmium, copper, sodium, and zinc are significantly comparable across aquatic species, including molluscs, daphnids, and fish. This suggests that affinity constants can be estimated from the covalent index, and constants can be extrapolated across species. A new model is proposed that integrates the combined effect of metal chemodynamics, such as speciation, competition, and ligand affinity, and species characteristics, such as size, on metal uptake by aquatic organisms. An important direction for further research is the quantitative comparison of the proposed model with acute toxicity values for organisms belonging to different size classes.
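
    A minimal biotic-ligand-style sketch of competitive metal binding is given below: the fraction of biotic ligand sites occupied by the free metal ion in the presence of competing Ca2+ and H+. Affinity constants and free-ion activities are hypothetical, not values from the paper.

    ```python
    # Illustrative biotic-ligand-style calculation with hypothetical constants.
    K_metal, K_Ca, K_H = 10**7.5, 10**3.5, 10**5.5   # conditional affinity constants (L/mol)
    M_free, Ca, H = 1e-8, 1e-3, 10**-7.0             # free-ion activities (mol/L)

    # Fraction of biotic ligand sites bound by the metal, with Ca2+ and H+ competition.
    f_occupied = (K_metal * M_free) / (1 + K_metal * M_free + K_Ca * Ca + K_H * H)
    print(f"fraction of biotic ligand bound by metal = {f_occupied:.3f}")
    ```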

  8. Mechanistic insights into lithium ion battery electrolyte degradation - a quantitative NMR study.

    PubMed

    Wiemers-Meyer, S; Winter, M; Nowak, S

    2016-09-29

    The changes in electrolyte composition on the molecular level and the reaction mechanisms of electrolyte degradation upon thermal aging are monitored by quantitative NMR spectroscopy, revealing similar rates of degradation for pristine and already aged electrolytes. The data analysis is not in favor of an autocatalytic reaction mechanism based on OPF3 but rather indicates that the degradation of LiPF6 in carbonate based solvents proceeds via a complex sequence of "linear" reactions rather than a cyclic reaction pattern which is determined by the amount of water present in the samples. All investigated electrolytes are reasonably stable at temperatures of up to 60 °C in the presence of minor amounts or absence of water hence indicating that chemical instability of electrolyte components against water is decisive for degradation and an increase in temperature ("thermal aging") just accelerates the degradation impact of water.

  9. Putting the psychology back into psychological models: mechanistic versus rational approaches.

    PubMed

    Sakamoto, Yasuaki; Jones, Matt; Love, Bradley C

    2008-09-01

    Two basic approaches to explaining the nature of the mind are the rational and the mechanistic approaches. Rational analyses attempt to characterize the environment and the behavioral outcomes that humans seek to optimize, whereas mechanistic models attempt to simulate human behavior using processes and representations analogous to those used by humans. We compared these approaches with regard to their accounts of how humans learn the variability of categories. The mechanistic model departs in subtle ways from rational principles. In particular, the mechanistic model incrementally updates its estimates of category means and variances through error-driven learning, based on discrepancies between new category members and the current representation of each category. The model yields a prediction, which we verify, regarding the effects of order manipulations that the rational approach does not anticipate. Although both rational and mechanistic models can successfully postdict known findings, we suggest that psychological advances are driven primarily by consideration of process and representation and that rational accounts trail these breakthroughs.
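
    The error-driven (delta-rule) updating of category means and variances described above can be sketched as follows; learning rates and the data stream are hypothetical, and the point is only that incremental updates of this kind are sensitive to presentation order.

    ```python
    import numpy as np

    # Delta-rule style incremental estimates of a category's mean and variance.
    def update(mean, var, x, lr_mean=0.1, lr_var=0.1):
        mean_new = mean + lr_mean * (x - mean)             # move mean toward the new member
        var_new = var + lr_var * ((x - mean) ** 2 - var)   # move variance toward squared error
        return mean_new, var_new

    rng = np.random.default_rng(0)
    mean, var = 0.0, 1.0
    for x in rng.normal(loc=5.0, scale=2.0, size=200):     # hypothetical category members
        mean, var = update(mean, var, x)
    print(f"mean ~ {mean:.2f}, sd ~ {var**0.5:.2f}")       # estimates drift toward 5 and 2
    ```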

  10. Heterogeneity of pulmonary perfusion as a mechanistic image-based phenotype in emphysema susceptible smokers.

    PubMed

    Alford, Sara K; van Beek, Edwin J R; McLennan, Geoffrey; Hoffman, Eric A

    2010-04-20

    Recent evidence suggests that endothelial dysfunction and pathology of pulmonary vascular responses may serve as a precursor to smoking-associated emphysema. Although it is known that emphysematous destruction leads to vasculature changes, less is known about early regional vascular dysfunction which may contribute to and precede emphysematous changes. We sought to test the hypothesis, via multidetector row CT (MDCT) perfusion imaging, that smokers showing early signs of emphysema susceptibility have a greater heterogeneity in regional perfusion parameters than emphysema-free smokers and persons who had never smoked (NS). Assuming that all smokers have a consistent inflammatory response, increased perfusion heterogeneity in emphysema-susceptible smokers would be consistent with the notion that these subjects may have the inability to block hypoxic vasoconstriction in patchy, small regions of inflammation. Dynamic ECG-gated MDCT perfusion scans with a central bolus injection of contrast were acquired in 17 NS, 12 smokers with normal CT imaging studies (SNI), and 12 smokers with subtle CT findings of centrilobular emphysema (SCE). All subjects had normal spirometry. Quantitative image analysis determined regional perfusion parameters, pulmonary blood flow (PBF), and mean transit time (MTT). Mean and coefficient of variation were calculated, and statistical differences were assessed with one-way ANOVA. MDCT-based MTT and PBF measurements demonstrate globally increased heterogeneity in SCE subjects compared with NS and SNI subjects but demonstrate similarity between NS and SNI subjects. These findings demonstrate a functional lung-imaging measure that provides a more mechanistically oriented phenotype that differentiates smokers with and without evidence of emphysema susceptibility.
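
    The group comparison described above (heterogeneity of regional perfusion compared across NS, SNI, and SCE subjects by one-way ANOVA) can be sketched as follows with hypothetical per-subject coefficients of variation.

    ```python
    from scipy.stats import f_oneway

    # Hypothetical per-subject CV of regional pulmonary blood flow, by group.
    cv_ns  = [0.28, 0.31, 0.26, 0.30, 0.29]   # never-smokers
    cv_sni = [0.30, 0.27, 0.32, 0.29, 0.31]   # smokers, normal CT
    cv_sce = [0.41, 0.38, 0.45, 0.40, 0.43]   # smokers, subtle centrilobular emphysema

    F, p = f_oneway(cv_ns, cv_sni, cv_sce)    # one-way ANOVA across the three groups
    print(f"F = {F:.1f}, p = {p:.2g}")
    ```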

  11. Mechanistic Indicators of Childhood Asthma (MICA) Study

    EPA Science Inventory

    The Mechanistic Indicators of Childhood Asthma (MICA) Study has been designed to incorporate state-of-the-art technologies to examine the physiological and environmental factors that interact to increase the risk of asthmatic responses. MICA is primarily a clinically-based obser...

  12. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    PubMed

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  13. Mechanistic materials modeling for nuclear fuel performance

    DOE PAGES

    Tonks, Michael R.; Andersson, David; Phillpot, Simon R.; ...

    2017-03-15

    Fuel performance codes are critical tools for the design, certification, and safety analysis of nuclear reactors. However, their ability to predict fuel behavior under abnormal conditions is severely limited by their considerable reliance on empirical materials models correlated to burn-up (a measure of the number of fission events that have occurred, but not a unique measure of the history of the material). In this paper, we propose a different paradigm for fuel performance codes to employ mechanistic materials models that are based on the current state of the evolving microstructure rather than burn-up. In this approach, a series of state variables are stored at material points and define the current state of the microstructure. The evolution of these state variables is defined by mechanistic models that are functions of fuel conditions and other state variables. The material properties of the fuel and cladding are determined from microstructure/property relationships that are functions of the state variables and the current fuel conditions. Multiscale modeling and simulation is being used in conjunction with experimental data to inform the development of these models. Finally, this mechanistic, microstructure-based approach has the potential to provide a more predictive fuel performance capability, but will require a team of researchers to complete the required development and to validate the approach.

  14. Supporting Mechanistic Reasoning in Domain-Specific Contexts

    ERIC Educational Resources Information Center

    Weinberg, Paul J.

    2017-01-01

    Mechanistic reasoning is an epistemic practice central within science, technology, engineering, and mathematics disciplines. Although there has been some work on mechanistic reasoning in the research literature and standards documents, much of this work targets domain-general characterizations of mechanistic reasoning; this study provides…

  15. Modeling of the pyruvate production with Escherichia coli: comparison of mechanistic and neural networks-based models.

    PubMed

    Zelić, B; Bolf, N; Vasić-Racki, D

    2006-06-01

    Three different models were used to describe the pyruvate production process from glucose and acetate using the genetically modified Escherichia coli YYC202 ldhA::Kan strain: the unstructured mechanistic black-box model, the input-output neural network-based model, and the externally recurrent neural network model. The experimental data were taken from the recently described batch and fed-batch experiments [Zelić B, Study of the process development for Escherichia coli-based pyruvate production. PhD Thesis, University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb, Croatia, July 2003. (In English); Zelić et al. Bioproc Biosyst Eng 26:249-258 (2004); Zelić et al. Eng Life Sci 3:299-305 (2003); Zelić et al. Biotechnol Bioeng 85:638-646 (2004)]. The neural networks were built out of the experimental data obtained in the fed-batch pyruvate production experiments with the constant glucose feed rate. The model validation was performed using the experimental results obtained from the batch and fed-batch pyruvate production experiments with the constant acetate feed rate. The dynamics of the substrate and product concentration changes were estimated using two neural network-based models for biomass and pyruvate. It was shown that neural networks could be used for the modeling of complex microbial fermentation processes, even in conditions in which mechanistic unstructured models cannot be applied.
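
    A minimal unstructured mechanistic sketch of the kind of black-box batch model compared above (Monod-type growth with yield-coupled substrate consumption and product formation) is given below; the kinetic parameters and yields are hypothetical, not the fitted values for the E. coli YYC202 strain.

    ```python
    from scipy.integrate import solve_ivp

    # Hypothetical kinetic parameters: max growth rate (1/h), Monod constant (g/L),
    # biomass yield (gX/gS) and product yield (gP/gS).
    mu_max, Ks, Yxs, Yps = 0.4, 0.5, 0.3, 0.6

    def batch(t, y):
        X, S, P = y                                         # biomass, substrate, product (g/L)
        mu = mu_max * S / (Ks + S) if S > 0 else 0.0        # Monod-type specific growth rate
        dX = mu * X
        dS = -dX / Yxs                                      # substrate consumed for growth
        dP = Yps * (-dS)                                    # product formed from consumed substrate
        return [dX, dS, dP]

    sol = solve_ivp(batch, (0, 24), [0.1, 20.0, 0.0], max_step=0.1)
    print(f"final biomass {sol.y[0, -1]:.1f} g/L, product {sol.y[2, -1]:.1f} g/L")
    ```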

  16. Mechanistic modeling to predict the transporter- and enzyme-mediated drug-drug interactions of repaglinide.

    PubMed

    Varma, Manthena V S; Lai, Yurong; Kimoto, Emi; Goosen, Theunis C; El-Kattan, Ayman F; Kumar, Vikas

    2013-04-01

    Quantitative prediction of complex drug-drug interactions (DDIs) is challenging. Repaglinide is mainly metabolized by cytochrome-P-450 (CYP)2C8 and CYP3A4, and is also a substrate of organic anion transporting polypeptide (OATP)1B1. The purpose of this study is to develop a physiologically based pharmacokinetic (PBPK) model to predict the pharmacokinetics and DDIs of repaglinide. In vitro hepatic transport of repaglinide, gemfibrozil and gemfibrozil 1-O-β-glucuronide was characterized using sandwich-cultured human hepatocytes. A PBPK model, implemented in Simcyp (Sheffield, UK), was developed utilizing in vitro transport and metabolic clearance data. In vitro studies suggested significant active hepatic uptake of repaglinide. The mechanistic model adequately described repaglinide pharmacokinetics, and successfully predicted DDIs with several OATP1B1 and CYP3A4 inhibitors (<10% error). Furthermore, the repaglinide-gemfibrozil interaction at a therapeutic dose was closely predicted using the in vitro fraction metabolized by CYP2C8 (0.71), when primarily considering reversible inhibition of OATP1B1 and mechanism-based inactivation of CYP2C8 by gemfibrozil and gemfibrozil 1-O-β-glucuronide. This study demonstrated that hepatic uptake is rate-determining in the systemic clearance of repaglinide. The model quantitatively predicted several repaglinide DDIs, including the complex interactions with gemfibrozil. Both OATP1B1 and CYP2C8 inhibition contribute significantly to the repaglinide-gemfibrozil interaction, and need to be considered for quantitative rationalization of DDIs with either drug.

  17. Combining correlative and mechanistic habitat suitability models to improve ecological compensation.

    PubMed

    Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud

    2015-02-01

    Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to inappropriate methods used to forecast required compensation in environmental impact assessments. These assessments are mostly descriptive and only valid at limited spatial and temporal scales. However, habitat suitability models developed to predict the impacts of environmental changes on potential species' distributions should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or synoptically to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. On the contrary, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation. Moreover, among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data compared to correlative models. Hence we present two approaches which combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be advantageously used to assess ecological compensation requirement efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.

  18. Base course resilient modulus for the mechanistic-empirical pavement design guide : [summary].

    DOT National Transportation Integrated Search

    2011-01-01

    Elastic modulus determination is often used in designing pavements and evaluating pavement performance. The Mechanistic-Empirical Pavement Design Guide (MEPDG) has become an important source of guidance for pavement design and rehabilitation. MEPDG r...

  19. Assessing uncertainty in mechanistic models

    Treesearch

    Edwin J. Green; David W. MacFarlane; Harry T. Valentine

    2000-01-01

    Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...

  20. Mechanistic species distribution modelling as a link between physiology and conservation.

    PubMed

    Evans, Tyler G; Diamond, Sarah E; Kelly, Morgan W

    2015-01-01

    Climate change conservation planning relies heavily on correlative species distribution models that estimate future areas of occupancy based on environmental conditions encountered in present-day ranges. The approach benefits from rapid assessment of vulnerability over a large number of organisms, but can have poor predictive power when transposed to novel environments and reveals little in the way of causal mechanisms that define changes in species distribution or abundance. Having conservation planning rely largely on this single approach also increases the risk of policy failure. Mechanistic models that are parameterized with physiological information are expected to be more robust when extrapolating distributions to future environmental conditions and can identify physiological processes that set range boundaries. Implementation of mechanistic species distribution models requires knowledge of how environmental change influences physiological performance, and because this information is currently restricted to a comparatively small number of well-studied organisms, use of mechanistic modelling in the context of climate change conservation is limited. In this review, we propose that the need to develop mechanistic models that incorporate physiological data presents an opportunity for physiologists to contribute more directly to climate change conservation and advance the field of conservation physiology. We begin by describing the prevalence of species distribution modelling in climate change conservation, highlighting the benefits and drawbacks of both mechanistic and correlative approaches. Next, we emphasize the need to expand mechanistic models and discuss potential metrics of physiological performance suitable for integration into mechanistic models. We conclude by summarizing other factors, such as the need to consider demography, limiting broader application of mechanistic models in climate change conservation. Ideally, modellers, physiologists and

  1. LASSIM-A network inference toolbox for genome-wide mechanistic modeling.

    PubMed

    Magnusson, Rasmus; Mariotti, Guido Pio; Köpsén, Mattias; Lövfors, William; Gawel, Danuta R; Jörnsten, Rebecka; Linde, Jörg; Nordling, Torbjörn E M; Nyman, Elin; Schulze, Sylvie; Nestor, Colm E; Zhang, Huan; Cedersund, Gunnar; Benson, Mikael; Tjärnberg, Andreas; Gustafsson, Mika

    2017-06-01

    Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM), which is a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODE) for gene regulatory networks (GRNs). LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady state and dynamic response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps, where it first infers a non-linear ODE system of the pre-specified core gene expression. Second, LASSIM in parallel optimizes the parameters that model the regulation of peripheral genes by core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2 specific bindings, time-series together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could monitor those data significantly better than comparable models that used the same Th2 bindings. In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models with truly

  2. Quantitative sensory testing of neuropathic pain patients: potential mechanistic and therapeutic implications.

    PubMed

    Pfau, Doreen B; Geber, Christian; Birklein, Frank; Treede, Rolf-Detlef

    2012-06-01

    Quantitative sensory testing (QST) is a widely accepted tool to investigate somatosensory changes in pain patients. Many different protocols have been developed in clinical pain research within recent years. In this review, we provide an overview of QST and tested neuroanatomical pathways, including peripheral and central structures. Based on research studies using animal and human surrogate models of neuropathic pain, possible underlying mechanisms of chronic pain are discussed. Clinically, QST may be useful for 1) the identification of subgroups of patients with different underlying pain mechanisms; 2) prediction of therapeutic outcomes; and 3) quantification of therapeutic interventions in pain therapy. Combined with sensory mapping, QST may provide useful information on the site of neural damage and on mechanisms of positive and negative somatosensory abnormalities. The use of QST in individual patients for diagnostic purposes leading to individualized therapy is an interesting concept, but needs further validation.

  3. Food for Thought ... Mechanistic Validation

    PubMed Central

    Hartung, Thomas; Hoffmann, Sebastian; Stephens, Martin

    2013-01-01

    Validation of new approaches in regulatory toxicology is commonly defined as the independent assessment of the reproducibility and relevance (the scientific basis and predictive capacity) of a test for a particular purpose. In large ring trials, the emphasis to date has been mainly on reproducibility and predictive capacity (comparison to the traditional test), with less attention given to the scientific or mechanistic basis. Assessing predictive capacity is difficult for novel approaches (which are based on mechanism), such as pathways of toxicity or the complex networks within the organism (systems toxicology). This is highly relevant for implementing Toxicology for the 21st Century, either by high-throughput testing in the ToxCast/Tox21 project or omics-based testing in the Human Toxome Project. This article explores the mostly neglected assessment of a test's scientific basis, which moves mechanism and causality to the foreground when validating/qualifying tests. Such mechanistic validation faces the problem of establishing causality in complex systems. However, pragmatic adaptations of the Bradford Hill criteria, as well as bioinformatic tools, are emerging. As critical infrastructures of the organism are perturbed by a toxic mechanism, we argue that by focusing on the target of toxicity and its vulnerability, in addition to the way it is perturbed, we can anchor the identification of the mechanism and its verification. PMID:23665802

  4. RNA-Seq-based toxicogenomic assessment of fresh frozen and formalin-fixed tissues yields similar mechanistic insights.

    PubMed

    Auerbach, Scott S; Phadke, Dhiral P; Mav, Deepak; Holmgren, Stephanie; Gao, Yuan; Xie, Bin; Shin, Joo Heon; Shah, Ruchir R; Merrick, B Alex; Tice, Raymond R

    2015-07-01

    Formalin-fixed, paraffin-embedded (FFPE) pathology specimens represent a potentially vast resource for transcriptomic-based biomarker discovery. We present here a comparison of results from a whole transcriptome RNA-Seq analysis of RNA extracted from fresh frozen and FFPE livers. The samples were derived from rats exposed to aflatoxin B1 (AFB1) and a corresponding set of control animals. Principal components analysis indicated that samples were separated into the two groups representing presence or absence of chemical exposure, both in fresh frozen and FFPE sample types. Sixty-five percent of the differentially expressed transcripts (AFB1 vs. controls) in fresh frozen samples were also differentially expressed in FFPE samples (overlap significance: P < 0.0001). Genomic signature and gene set analysis of AFB1 differentially expressed transcript lists indicated highly similar results between fresh frozen and FFPE at the level of chemogenomic signatures (i.e., single chemical/dose/duration elicited transcriptomic signatures), mechanistic and pathology signatures, biological processes, canonical pathways and transcription factor networks. Overall, our results suggest that similar hypotheses about the biological mechanism of toxicity would be formulated from fresh frozen and FFPE samples. These results indicate that phenotypically anchored archival specimens represent a potentially informative resource for signature-based biomarker discovery and mechanistic characterization of toxicity. Copyright © 2014 John Wiley & Sons, Ltd.
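
    One common way to quantify overlap significance between two differentially expressed gene lists, as reported above, is a hypergeometric test; the counts below are hypothetical and chosen only to mirror the reported ~65% overlap.

    ```python
    from scipy.stats import hypergeom

    N = 12000        # transcripts tested in both sample types (hypothetical)
    n_frozen = 1500  # differentially expressed in fresh frozen (hypothetical)
    n_ffpe = 1400    # differentially expressed in FFPE (hypothetical)
    overlap = 975    # differentially expressed in both (~65% of the fresh-frozen list)

    # Probability of observing at least this much overlap by chance.
    p = hypergeom.sf(overlap - 1, N, n_frozen, n_ffpe)
    print(f"hypergeometric overlap P = {p:.2e}")
    ```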

  5. Planning for climate change: the need for mechanistic systems-based approaches to study climate change impacts on diarrheal diseases

    PubMed Central

    Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara

    2016-01-01

    Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. PMID:26799810

  6. The Impact of Situation-Based Learning to Students’ Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Latifah, T.; Cahya, E.; Suhendra

    2017-09-01

    Nowadays, the use of quantities can be seen almost everywhere, and quantitative thinking, such as quantitative reasoning and quantitative literacy, has become part of daily life. However, many people today are still not fully equipped for quantitative thinking, and many individuals lack the quantitative skills needed to perform well in today’s society. Based on this issue, the research aims to improve students’ quantitative literacy in junior high school. The qualitative analysis of written student work and video observations during the experiment reveals that situation-based learning has an impact on students’ quantitative literacy.

  7. Predicting subsurface uranium transport: Mechanistic modeling constrained by experimental data

    NASA Astrophysics Data System (ADS)

    Ottman, Michael; Schenkeveld, Walter D. C.; Kraemer, Stephan

    2017-04-01

    Depleted uranium (DU) munitions and their widespread use throughout conflict zones around the world pose a persistent health threat to the inhabitants of those areas long after the conclusion of active combat. However, little emphasis has been put on developing a comprehensive, quantitative tool for use in remediation and hazard avoidance planning in a wide range of environments. In this context, we report experimental data on U interaction with soils and sediments. Here, we strive to improve existing risk assessment modeling paradigms by incorporating a variety of experimental data into a mechanistic U transport model for subsurface environments. Twenty different soils and sediments from a variety of environments were chosen to represent a range of geochemical parameters that are relevant to U transport. The parameters included pH, organic matter content, CaCO3, Fe content and speciation, and clay content. pH ranged from 3 to 10, organic matter content from 6 to 120 g kg-1, CaCO3 from 0 to 700 g kg-1, amorphous Fe content from 0.3 to 6 g kg-1 and clay content from 4 to 580 g kg-1. Sorption experiments were then performed, and linear isotherms were constructed. Sorption experiment results show that, among separate sets of sediments and soils, U sorptive affinity is inversely correlated with both soil pH and CaCO3 concentration. The geological materials with the highest and lowest sorptive affinities for U differed in CaCO3 and organic matter concentrations, as well as clay content and pH. In a further step, we are testing if transport behavior in saturated porous media can be predicted based on adsorption isotherms and generic geochemical parameters, and comparing these modeling predictions with the results from column experiments. The comparison of these two data sets will examine if U transport can be effectively predicted from reactive transport modeling that incorporates the generic geochemical parameters. This work will serve to show
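    A linear sorption isotherm of the kind described above can be fit directly to batch data; the minimal sketch below (with hypothetical concentrations, bulk density, and porosity, not the study's measurements) estimates Kd and the corresponding retardation factor for 1-D saturated transport.

```python
import numpy as np

# Hypothetical batch sorption data for one soil: equilibrium aqueous
# concentration Cw (mg/L) and sorbed concentration Cs (mg/kg).
Cw = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # mg/L
Cs = np.array([6.1, 11.8, 24.5, 47.9, 97.0])   # mg/kg

# Linear isotherm Cs = Kd * Cw, least squares forced through the origin.
Kd = float(Cw @ Cs / (Cw @ Cw))                # L/kg

# Retardation factor for 1-D saturated transport: R = 1 + rho_b * Kd / theta,
# with illustrative bulk density and porosity.
rho_b, theta = 1.5, 0.35                       # kg/L, dimensionless
R = 1.0 + rho_b * Kd / theta

print(f"Kd = {Kd:.1f} L/kg, retardation factor R = {R:.0f}")
```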

  8. Exploring the pros and cons of mechanistic case diagrams for problem-based learning

    PubMed Central

    2017-01-01

    Purpose: The mechanistic case diagram (MCD) has been recommended for increasing the depth of understanding of disease, but few articles describe its specific methods. We report our experience of constructing MCDs in the fullest possible depth in order to identify the pros and cons of using MCDs in this way. Methods: During problem-based learning, we gave guidelines for MCDs covering mechanistic exploration from subcellular processes to clinical features, laid out in as much detail as possible. To understand the students’ attitudes and depth of study using MCDs, we analyzed the results of an open-format questionnaire about the experience of making MCDs and examined the resulting products. Results: From the questionnaire responses, we found several favorable outcomes, the major one being deeper insight and a more comprehensive understanding of disease, facilitated by the process of making a well-organized diagram. The main disadvantages of these guidelines were the perceived workload and the difficulty of finding mechanisms. Students suggested ways to overcome these problems: careful reading of comprehensive texts, additional guidance from staff about the depth and focus of mechanisms, and cooperative group work. From the analysis of the maps, we recognized that there should be allowance for diversity in the appearance of the maps and for many hypothetical connections, which may reflect an incomplete understanding of the underlying mechanisms. Conclusion: The more detailed an MCD task is, the more deeply students become acquainted with the knowledge. However, this advantage must be balanced against the many ensuing difficulties of the work, and deliberate support plans should be prepared. PMID:28870018

  9. DEVELOPMENT AND VALIDATION OF A MECHANISTIC GROUND SPRAYER MODEL

    EPA Science Inventory

    In the last ten years the Spray Drift Task Force (SDTF), U.S. Environmental Protection Agency (EPA), USDA Agricultural Research Service, and USDA Forest Service cooperated in the refinement and evaluation of a mechanistically-based aerial spray model (contained within AGDISP and ...

  10. Application of chemical reaction mechanistic domains to an ecotoxicity QSAR model, the KAshinhou Tool for Ecotoxicity (KATE).

    PubMed

    Furuhama, A; Hasunuma, K; Aoki, Y; Yoshioka, Y; Shiraishi, H

    2011-01-01

    The validity of chemical reaction mechanistic domains defined by skin sensitisation in the Quantitative Structure-Activity Relationship (QSAR) ecotoxicity system, KAshinhou Tool for Ecotoxicity (KATE), March 2009 version, has been assessed and an external validation of the current KATE system carried out. In the case of the fish end-point, the group of chemicals with substructures reactive to skin sensitisation always exhibited higher root mean square errors (RMSEs) than chemicals without reactive substructures under identical C- or log P-judgements in KATE. However, in the case of the Daphnia end-point this was not so, and the group of chemicals with reactive substructures did not always have higher RMSEs: the Schiff base mechanism did not function as a high error detector. In addition to the RMSE findings, the presence of outliers suggested that the KATE classification rules need to be reconsidered, particularly for the amine group. Examination of the dependency of the organism on the toxic action of chemicals in fish and Daphnia revealed that some of the reactive substructures could be applied to the improvement of the KATE system. It was concluded that the reaction mechanistic domains of toxic action for skin sensitisation could provide useful complementary information in predicting acute aquatic ecotoxicity, especially at the fish end-point.

  11. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    PubMed

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

    Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, fulfilled. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of changes in the expression and post-translational modifications of selected proteins. A quantitative proteomics approach offers the possibility of quantitatively characterizing the entire proteome of a biological system, in terms of both protein titers and their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  12. Combined In Situ Illumination-NMR-UV/Vis Spectroscopy: A New Mechanistic Tool in Photochemistry.

    PubMed

    Seegerer, Andreas; Nitschke, Philipp; Gschwind, Ruth M

    2018-06-18

    Synthetic applications in photochemistry are booming. Despite great progress in the development of new reactions, mechanistic investigations are still challenging. Therefore, we present a fully automated in situ combination of NMR spectroscopy, UV/Vis spectroscopy, and illumination to allow simultaneous and time-resolved detection of paramagnetic and diamagnetic species. This optical fiber-based setup enables the first acquisition of combined UV/Vis and NMR spectra in photocatalysis, as demonstrated on a conPET process. Furthermore, the broad applicability of combined UV/Vis-NMR spectroscopy for light-induced processes is demonstrated on a structural and quantitative analysis of a photoswitch, including rate modulation and stabilization of transient species by temperature variation. Owing to the flexibility regarding the NMR hardware, temperature, and light sources, we expect wide-ranging applications of this setup in various research fields. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  13. Planning for climate change: The need for mechanistic systems-based approaches to study climate change impacts on diarrheal diseases.

    PubMed

    Mellor, Jonathan E; Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara

    2016-04-01

    Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Drug-disease modeling in the pharmaceutical industry - where mechanistic systems pharmacology and statistical pharmacometrics meet.

    PubMed

    Helmlinger, Gabriel; Al-Huniti, Nidal; Aksenov, Sergey; Peskov, Kirill; Hallow, Karen M; Chu, Lulu; Boulton, David; Eriksson, Ulf; Hamrén, Bengt; Lambert, Craig; Masson, Eric; Tomkinson, Helen; Stanski, Donald

    2017-11-15

    Modeling & simulation (M&S) methodologies are established quantitative tools, which have proven to be useful in supporting the research, development (R&D), regulatory approval, and marketing of novel therapeutics. Applications of M&S help design efficient studies and interpret their results in context of all available data and knowledge to enable effective decision-making during the R&D process. In this mini-review, we focus on two sets of modeling approaches: population-based models, which are well-established within the pharmaceutical industry today, and fall under the discipline of clinical pharmacometrics (PMX); and systems dynamics models, which encompass a range of models of (patho-)physiology amenable to pharmacological intervention, of signaling pathways in biology, and of substance distribution in the body (today known as physiologically-based pharmacokinetic models) - which today may be collectively referred to as quantitative systems pharmacology models (QSP). We next describe the convergence - or rather selected integration - of PMX and QSP approaches into 'middle-out' drug-disease models, which retain selected mechanistic aspects, while remaining parsimonious, fit-for-purpose, and able to address variability and the testing of covariates. We further propose development opportunities for drug-disease systems models, to increase their utility and applicability throughout the preclinical and clinical spectrum of pharmaceutical R&D. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Rational and Mechanistic Perspectives on Reinforcement Learning

    ERIC Educational Resources Information Center

    Chater, Nick

    2009-01-01

    This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…

  16. Intriguing mechanistic labyrinths in gold(i) catalysis

    PubMed Central

    Obradors, Carla

    2014-01-01

    Many mechanistically intriguing reactions have been developed in the last decade using gold(i) as catalyst. Here we review the main mechanistic proposals in gold-catalysed activation of alkynes and allenes, in which this metal plays a central role by stabilising a variety of complex cationic intermediates. PMID:24176910

  17. Mechanistic-empirical design concepts for continuously reinforced concrete pavements in Illinois.

    DOT National Transportation Integrated Search

    2009-04-01

    The Illinois Department of Transportation (IDOT) currently has an existing jointed plain concrete pavement (JPCP) design based on mechanistic-empirical (M-E) principles. However, their continuously reinforced concrete pavement (CRCP) design proce...

  18. Mechanistic exploration of a bi-directional PDT-based combination in pancreatic cancer (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Huang, Huang-Chiao; Mallidi, Srivalleesha; Liu, Joyce; Chiang, Chun-Te; Mai, Zhiming; Goldschmidt, Ruth; Rizvi, Imran; Ebrahim-Zadeh, Neema; Hasan, Tayyaba

    2016-03-01

    It is increasingly evident that the most effective cancer treatments will involve interactive regimens that target multiple non-overlapping pathways, preferably such that each component enhances the others to improve outcomes while minimizing systemic toxicities. Toward this goal, we developed a combination of photodynamic therapy and irinotecan, which mechanistically cooperate with each other, beyond their individual tumor destruction pathways, to cause synergistic reduction in orthotopic pancreatic tumor burden. A three-way mechanistic basis of the observed synergism will be discussed: (i) PDT downregulates drug efflux transporters to increase intracellular irinotecan levels. (ii) Irinotecan reduces the expression of a hypoxia-induced marker, which is upregulated by PDT. (iii) PDT downregulates irinotecan-induced survivin expression to amplify the apoptotic and anti-proliferative effects. The clinical translation potential of the combination will also be highlighted.

  19. Transitioning from AOP to IATA - Exploiting mechanistic ...

    EPA Pesticide Factsheets

    Slide presentation at a satellite meeting of the QSAR2016 Meeting on How to Transition from AOP to IATA - Exploiting mechanistic insight for practical decision making.

  20. Quantitative and Mechanistic Assessment of Model Lipophilic Drugs in Micellar Solutions in the Transport Kinetics Across MDR1-MDCK Cell Monolayers.

    PubMed

    Ho, Norman F H; Nielsen, James; Peterson, Michelle; Burton, Philip S

    2016-02-01

    An approach to characterizing P-glycoprotein (Pgp) interaction potential for sparingly water-soluble compounds was developed using bidirectional transport kinetics in MDR1-MDCK cell monolayers. Paclitaxel, solubilized in a dilute polysorbate 80 (PS80) micellar solution, was used as a practical example. Although the passage of paclitaxel across the cell monolayer was initially governed by the thermodynamic activity of the micelle-solubilized drug solution, Pgp inhibition was sustained by the thermodynamic activity (i.e., critical micelle concentration) of the PS80 micellar solution bathing the apical (ap) membrane. The mechanistic understanding of the experimental strategies and treatment of data was supported by a biophysical model expressed in the form of transport events occurring at the ap and basolateral (bl) membranes in series, while accommodating the vectorial directions of the transcellular kinetics. The derived equations permitted the stepwise quantitative delineation of the Pgp efflux activity (inhibited and uninhibited by PS80) and the passive permeability coefficient of the ap membrane, the passive permeability at the bl membrane and, finally, the distinct coupling of these with efflux pump activity to identify the rate-determining steps and mechanisms. The Jmax/KM* for paclitaxel was on the order of 10^-4 cm/s, and the ap- and bl-membrane passive permeability coefficients were asymmetric, with bl-membrane permeability significantly greater than ap. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
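    The bidirectional transport measurements underlying such an analysis reduce to apparent permeability coefficients and an efflux ratio; the minimal sketch below uses hypothetical cumulative receiver amounts and insert area, not the published paclitaxel data or the authors' full biophysical model.

```python
import numpy as np

A = 1.13     # monolayer area, cm^2 (assumed Transwell insert size)
C0 = 10.0    # initial donor concentration, nmol/cm^3

t = np.array([0, 30, 60, 90, 120]) * 60.0       # time, s
Q_ab = np.array([0.0, 0.8, 1.7, 2.5, 3.3])      # nmol in receiver, apical -> basolateral
Q_ba = np.array([0.0, 4.1, 8.0, 12.2, 16.1])    # nmol in receiver, basolateral -> apical

def papp(t, Q, area, c0):
    """Apparent permeability Papp = (dQ/dt) / (A * C0); slope from a linear fit."""
    dQdt = np.polyfit(t, Q, 1)[0]               # nmol/s
    return dQdt / (area * c0)                   # cm/s

p_ab = papp(t, Q_ab, A, C0)
p_ba = papp(t, Q_ba, A, C0)
print(f"Papp A->B = {p_ab:.2e} cm/s, B->A = {p_ba:.2e} cm/s, "
      f"efflux ratio = {p_ba / p_ab:.1f}")
```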

  1. Kinetic and Mechanistic Examination of Acid–Base Bifunctional Aminosilica Catalysts in Aldol and Nitroaldol Condensations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collier, Virginia E.; Ellebracht, Nathan C.; Lindy, George I.

    The kinetic and mechanistic understanding of cooperatively catalyzed aldol and nitroaldol condensations is probed using a series of mesoporous silicas functionalized with aminosilanes to provide bifunctional acid–base character. Mechanistically, a Hammett analysis is performed to determine the effects of electron-donating and electron-withdrawing groups of para-substituted benzaldehyde derivatives on the catalytic activity of each condensation reaction. This information is also used to discuss the validity of previously proposed catalytic mechanisms and to propose a revised mechanism with plausible reaction intermediates. For both reactions, electron-withdrawing groups increase the observed rates of reaction, though resonance effects play an important, yet subtle, role in the nitroaldol condensation, in which a p-methoxy electron-donating group is also able to stabilize the proposed carbocation intermediate. Additionally, activation energies and pre-exponential factors are calculated via the Arrhenius analysis of two catalysts with similar amine loadings: one catalyst had silanols available for cooperative interactions (acid–base catalysis), while the other was treated with a silanol-capping reagent to prevent such cooperativity (base-only catalysis). The values obtained for activation energies and pre-exponential factors in each reaction are discussed in the context of the proposed mechanisms and the importance of cooperative interactions in each reaction. The catalytic activity decreases for all reactions when the silanols are capped with trimethylsilyl groups, and higher temperatures are required to make accurate rate measurements, emphasizing the vital role the weakly acidic silanols play in the catalytic cycles. The results indicate that loss of acid sites is more detrimental to the catalytic activity of the aldol condensation than the nitroaldol condensation, as evidenced by the significant decrease in the pre-exponential factor for the aldol

  2. Kinetic and Mechanistic Examination of Acid–Base Bifunctional Aminosilica Catalysts in Aldol and Nitroaldol Condensations

    DOE PAGES

    Collier, Virginia E.; Ellebracht, Nathan C.; Lindy, George I.; ...

    2015-12-09

    The kinetic and mechanistic understanding of cooperatively catalyzed aldol and nitroaldol condensations is probed using a series of mesoporous silicas functionalized with aminosilanes to provide bifunctional acid–base character. Mechanistically, a Hammett analysis is performed to determine the effects of electron-donating and electron-withdrawing groups of para-substituted benzaldehyde derivatives on the catalytic activity of each condensation reaction. This information is also used to discuss the validity of previously proposed catalytic mechanisms and to propose a revised mechanism with plausible reaction intermediates. For both reactions, electron-withdrawing groups increase the observed rates of reaction, though resonance effects play an important, yet subtle, role in the nitroaldol condensation, in which a p-methoxy electron-donating group is also able to stabilize the proposed carbocation intermediate. Additionally, activation energies and pre-exponential factors are calculated via the Arrhenius analysis of two catalysts with similar amine loadings: one catalyst had silanols available for cooperative interactions (acid–base catalysis), while the other was treated with a silanol-capping reagent to prevent such cooperativity (base-only catalysis). The values obtained for activation energies and pre-exponential factors in each reaction are discussed in the context of the proposed mechanisms and the importance of cooperative interactions in each reaction. The catalytic activity decreases for all reactions when the silanols are capped with trimethylsilyl groups, and higher temperatures are required to make accurate rate measurements, emphasizing the vital role the weakly acidic silanols play in the catalytic cycles. The results indicate that loss of acid sites is more detrimental to the catalytic activity of the aldol condensation than the nitroaldol condensation, as evidenced by the significant decrease in the pre-exponential factor for the aldol
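    The Arrhenius analysis mentioned in both records is a linear fit of ln k against 1/T; a minimal sketch with hypothetical rate constants (not the reported catalyst data) follows.

```python
import numpy as np

R = 8.314                                         # gas constant, J mol^-1 K^-1
T = np.array([323.0, 333.0, 343.0, 353.0])        # temperatures, K (illustrative)
k = np.array([1.2e-4, 3.1e-4, 7.4e-4, 1.6e-3])    # observed rate constants (illustrative)

# ln k = ln A - Ea / (R T): fit ln k against 1/T with a straight line.
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R            # activation energy, J/mol
A_pre = np.exp(intercept)  # pre-exponential factor, same units as k

print(f"Ea = {Ea / 1000:.0f} kJ/mol, pre-exponential factor A = {A_pre:.2e}")
```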

  3. Mechanistic ecohydrological modeling with Tethys-Chloris: an attempt to unravel complexity

    NASA Astrophysics Data System (ADS)

    Fatichi, S.; Ivanov, V. Y.; Caporali, E.

    2010-12-01

    The role of vegetation in controlling and mediating hydrological states and fluxes at the level of individual processes has been largely explored, which has led to an improved understanding of mechanisms and patterns in ecohydrological systems. Nonetheless, relatively few efforts have been directed toward the development of continuous, complex, mechanistic ecohydrological models operating at the watershed scale. This study presents a novel ecohydrological model, Tethys-Chloris (T&C), and aims to discuss current limitations and perspectives of the mechanistic approach in ecohydrology. The model attempts to synthesize the state-of-the-art knowledge on individual processes and mechanisms drawn from various disciplines such as hydrology, plant physiology, ecology, and biogeochemistry. The model reproduces all essential components of the hydrological cycle, resolving the mass and energy budgets at the hourly scale; it includes energy and mass exchanges in the atmospheric boundary layer; a module of saturated and unsaturated soil water dynamics; two layers of vegetation; and a module of snowpack evolution. The vegetation component parsimoniously parameterizes essential plant life-cycle processes, including photosynthesis, phenology, carbon allocation, tissue turnover, and soil biogeochemistry. Quantitative metrics of model performance are discussed and highlight the capabilities of T&C in reproducing ecohydrological dynamics. The simulated patterns mimic the outcome of hydrological dynamics with high realism, given the uncertainty of imposed boundary conditions and limited data availability. Furthermore, highly satisfactory results are obtained without significant (e.g., automated) calibration efforts despite the large phase-space dimensionality of the model. A significant investment into model design and development leads to such desirable behavior. This suggests that while using the presented tool for high-precision predictions can be still problematic, the

  4. Quantitative learning strategies based on word networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way that vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process can significantly improve English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and further investigate strategies in English learning. In this paper, quantitative English learning strategies based on word networks and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for unlearned words and the dynamic updating of the network are studied and analyzed. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and segmented strategies outperform other strategies. The results provide opportunities for researchers and practitioners to reconsider how English is taught and how vocabularies are designed quantitatively, balancing efficiency and learning costs based on the word network.
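    A learning weight of this kind can be computed from a word co-occurrence network; the sketch below (a toy graph and an assumed weighting rule combining word frequency with support from already-learned neighbours, not the authors' exact strategy) illustrates the idea using networkx.

```python
import networkx as nx

# Toy co-occurrence network: edge weight = how often two words appear together (assumed data).
G = nx.Graph()
G.add_weighted_edges_from([
    ("the", "cat", 5), ("cat", "sat", 3), ("sat", "mat", 2),
    ("the", "mat", 4), ("cat", "chased", 1), ("chased", "mouse", 1),
])
freq = {"the": 100, "cat": 20, "sat": 15, "mat": 10, "chased": 4, "mouse": 6}

learned = {"the", "cat"}   # words already acquired

def learning_weight(word, alpha=0.5):
    """Combine raw word frequency with co-occurrence support from learned neighbours."""
    support = sum(G[word][nb]["weight"] for nb in G.neighbors(word) if nb in learned)
    return alpha * freq[word] + (1 - alpha) * support

candidates = sorted((w for w in G if w not in learned),
                    key=learning_weight, reverse=True)
print("suggested learning order:", candidates)
```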

  5. In Silico, Experimental, Mechanistic Model for Extended-Release Felodipine Disposition Exhibiting Complex Absorption and a Highly Variable Food Interaction

    PubMed Central

    Kim, Sean H. J.; Jackson, Andre J.; Hunt, C. Anthony

    2014-01-01

    The objective of this study was to develop and explore new, in silico experimental methods for deciphering complex, highly variable absorption and food interaction pharmacokinetics observed for a modified-release drug product. Toward that aim, we constructed an executable software analog of study participants to whom product was administered orally. The analog is an object- and agent-oriented, discrete event system, which consists of grid spaces and event mechanisms that map abstractly to different physiological features and processes. Analog mechanisms were made sufficiently complicated to achieve prespecified similarity criteria. An equation-based gastrointestinal transit model with nonlinear mixed effects analysis provided a standard for comparison. Subject-specific parameterizations enabled each executed analog’s plasma profile to mimic features of the corresponding six individual pairs of subject plasma profiles. All achieved prespecified, quantitative similarity criteria, and outperformed the gastrointestinal transit model estimations. We observed important subject-specific interactions within the simulation and mechanistic differences between the two models. We hypothesize that mechanisms, events, and their causes occurring during simulations had counterparts within the food interaction study: they are working, evolvable, concrete theories of dynamic interactions occurring within individual subjects. The approach presented provides new, experimental strategies for unraveling the mechanistic basis of complex pharmacological interactions and observed variability. PMID:25268237

  6. Mechanistic modelling of the inhibitory effect of pH on microbial growth.

    PubMed

    Akkermans, Simen; Van Impe, Jan F

    2018-06-01

    Modelling and simulation of microbial dynamics as a function of processing, transportation and storage conditions is a useful tool to improve microbial food safety and quality. The goal of this research is to improve an existing methodology for building mechanistic predictive models based on the environmental conditions. The effect of environmental conditions on microbial dynamics is often described by combining the separate effects in a multiplicative way (gamma concept). This idea was extended further in this work by including the effects of the lag and stationary growth phases on microbial growth rate as independent gamma factors. A mechanistic description of the stationary phase as a function of pH was included, based on a novel class of models that consider product inhibition. Experimental results on Escherichia coli growth dynamics indicated that also the parameters of the product inhibition equations can be modelled with the gamma approach. This work has extended a modelling methodology, resulting in predictive models that are (i) mechanistically inspired, (ii) easily identifiable with a limited work load and (iii) easily extended to additional environmental conditions. Copyright © 2017. Published by Elsevier Ltd.
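    The gamma concept expresses the growth rate as an optimal rate multiplied by independent environmental factors; a minimal sketch using a Rosso-type cardinal pH model (illustrative cardinal values, not the fitted E. coli parameters or the product-inhibition extension described above) follows.

```python
def gamma_pH(pH, pH_min=4.0, pH_opt=7.0, pH_max=9.5):
    """Rosso-type cardinal pH model: 1 at pH_opt, 0 at or outside the cardinal limits."""
    if pH <= pH_min or pH >= pH_max:
        return 0.0
    num = (pH - pH_min) * (pH - pH_max)
    den = (pH - pH_min) * (pH - pH_max) - (pH - pH_opt) ** 2
    return num / den

def mu(pH, temp_factor=1.0, aw_factor=1.0, mu_opt=2.3):
    """Gamma concept: growth rate = mu_opt times independent gamma factors (mu_opt in 1/h, illustrative)."""
    return mu_opt * gamma_pH(pH) * temp_factor * aw_factor

print(f"mu at pH 6.0: {mu(6.0):.2f} 1/h; at pH 4.5: {mu(4.5):.2f} 1/h")
```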

  7. Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain

    PubMed Central

    Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.

    2011-01-01

    We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568

  8. Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…

  9. Recent advances in mathematical modeling of developmental abnormalities using mechanistic information.

    PubMed

    Kavlock, R J

    1997-01-01

    During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them has pointed to the need for even more sophisticated modifications for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade combined with advances in computing power and computational models should eventually enable these as yet hypothetical models to be brought into use.
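    A benchmark dose calculation of the kind referred to above can be sketched as a dose-response fit followed by inversion at a chosen benchmark response; the example below uses hypothetical quantal data and a simple least-squares log-logistic fit (regulatory BMD software fits by maximum likelihood and also reports a lower confidence bound, the BMDL, which is omitted here).

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical quantal dose-response data: dose, animals tested, animals affected.
dose = np.array([0.0, 5.0, 25.0, 100.0, 400.0])
n    = np.array([50, 50, 50, 50, 50])
aff  = np.array([2, 4, 11, 27, 44])
p_obs = aff / n

def log_logistic(d, g, a, b):
    """Background g plus a log-logistic increase with dose; P(0) = g."""
    d = np.asarray(d, dtype=float)
    p = np.full_like(d, g)
    pos = d > 0
    p[pos] = g + (1 - g) / (1 + np.exp(-a - b * np.log(d[pos])))
    return p

(g, a, b), _ = curve_fit(log_logistic, dose, p_obs, p0=[0.05, -5.0, 1.0],
                         bounds=([0, -20, 0], [0.5, 20, 10]))

bmr = 0.10   # 10% extra risk
bmd = np.exp((-a - np.log(1.0 / bmr - 1.0)) / b)
print(f"fitted background = {g:.3f}, BMD at 10% extra risk = {bmd:.1f}")
```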

  10. Classification of cassava genotypes based on qualitative and quantitative data.

    PubMed

    Oliveira, E J; Oliveira Filho, O S; Santos, V S

    2015-02-02

    We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative traits (continuous). We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura; we evaluated these accessions for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five and four groups based on quantitative trait and joint analysis, respectively. The smaller number of groups identified based on joint analysis may be related to the nature of the data. On the other hand, quantitative data are more subject to environmental effects in the phenotype expression; this results in the absence of genetic differences, thereby contributing to greater differentiation among accessions. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data implied that analysis of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when there are several phenotypic traits, such as in the case of genetic resources and breeding programs.

  11. Portable smartphone based quantitative phase microscope

    NASA Astrophysics Data System (ADS)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a smartphone-based quantitative phase microscope using the transport of intensity equation method. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; a red blood cell smear, a Pap smear, broad bean epidermis sections and monocot root were then also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool that may in the future be adopted in remote healthcare and medical diagnosis.
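    Phase recovery with the transport of intensity equation reduces, for approximately uniform in-focus intensity, to an FFT-based Poisson solve on the axial intensity derivative; a minimal sketch under that paraxial, uniform-intensity assumption (not the authors' Android implementation) follows.

```python
import numpy as np

def tie_phase(I_minus, I_plus, I0, dz, wavelength, pixel):
    """Recover phase from two defocused images via the transport-of-intensity equation,
    assuming a uniform in-focus intensity I0: laplacian(phi) = -(k/I0) dI/dz."""
    k = 2 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2 * dz)            # finite-difference axial derivative
    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    lap = -4 * np.pi ** 2 * (FX ** 2 + FY ** 2)     # Fourier symbol of the Laplacian
    lap[0, 0] = 1.0                                 # avoid division by zero at DC
    rhs = -k / I0 * dIdz
    phi_hat = np.fft.fft2(rhs) / lap
    phi_hat[0, 0] = 0.0                             # phase is defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))
```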

  12. Mechanistic Physiologically Based Pharmacokinetic Modeling of the Dissolution and Food Effect of a Biopharmaceutics Classification System IV Compound-The Venetoclax Story.

    PubMed

    Emami Riedmaier, Arian; Lindley, David J; Hall, Jeffrey A; Castleberry, Steven; Slade, Russell T; Stuart, Patricia; Carr, Robert A; Borchardt, Thomas B; Bow, Daniel A J; Nijsen, Marjoleen

    2018-01-01

    Venetoclax, a selective B-cell lymphoma-2 inhibitor, is a biopharmaceutics classification system class IV compound. The aim of this study was to develop a physiologically based pharmacokinetic (PBPK) model to mechanistically describe absorption and disposition of an amorphous solid dispersion formulation of venetoclax in humans. A mechanistic PBPK model was developed incorporating measured amorphous solubility, dissolution, metabolism, and plasma protein binding. A middle-out approach was used to define permeability. Model predictions of oral venetoclax pharmacokinetics were verified against clinical studies of fed and fasted healthy volunteers, and clinical drug interaction studies with strong CYP3A inhibitor (ketoconazole) and inducer (rifampicin). Model verification demonstrated accurate prediction of the observed food effect following a low-fat diet. Ratios of predicted versus observed Cmax and area under the curve of venetoclax were within 0.8- to 1.25-fold of observed ratios for strong CYP3A inhibitor and inducer interactions, indicating that the venetoclax elimination pathway was correctly specified. The verified venetoclax PBPK model is one of the first examples mechanistically capturing absorption, food effect, and exposure of an amorphous solid dispersion formulated compound. This model allows evaluation of untested drug-drug interactions, especially those primarily occurring in the intestine, and paves the way for future modeling of biopharmaceutics classification system IV compounds. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  13. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  14. A mechanistic physicochemical model of carbon dioxide transport in blood.

    PubMed

    O'Neill, David P; Robbins, Peter A

    2017-02-01

    A number of mathematical models have been produced that, given the Pco2 and Po2 of blood, will calculate the total concentrations for CO2 and O2 in blood. However, all these models contain at least some empirical features, and thus do not represent all of the underlying physicochemical processes in an entirely mechanistic manner. The aim of this study was to develop a physicochemical model of CO2 carriage by the blood to determine whether our understanding of the physical chemistry of the major chemical components of blood together with their interactions is sufficiently strong to predict the physiological properties of CO2 carriage by whole blood. Standard values are used for the ionic composition of the blood, the plasma albumin concentration, and the hemoglobin concentration. All Km values required for the model are taken from the literature. The distribution of bicarbonate, chloride, and H+ ions across the red blood cell membrane follows that of a Gibbs-Donnan equilibrium. The system of equations that results is solved numerically using constraints for mass balance and electroneutrality. The model reproduces the phenomena associated with CO2 carriage, including the magnitude of the Haldane effect, very well. The structural nature of the model allows various hypothetical scenarios to be explored. Here we examine the effects of 1) removing the ability of hemoglobin to form carbamino compounds; 2) allowing a degree of Cl- binding to deoxygenated hemoglobin; and 3) removing the chloride (Hamburger) shift. The insights gained could not have been obtained from empirical models. This study is the first to incorporate a mechanistic model of chloride-bicarbonate exchange between the erythrocyte and plasma into a full physicochemical model of the carriage of carbon dioxide in blood. The mechanistic nature of the model allowed a theoretical study of the quantitative significance for carbon dioxide transport of carbamino compound formation; the putative binding
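    The core numerical step of such a model is solving chemical equilibrium subject to electroneutrality; the deliberately simplified, plasma-only sketch below (bicarbonate and water equilibria only, with an illustrative effective strong ion difference; no hemoglobin, albumin, or carbamino species as in the full model) shows the pattern.

```python
import math
from scipy.optimize import brentq

alpha = 0.0307e-3    # CO2 solubility in plasma, mol L^-1 mmHg^-1
K1 = 10.0 ** -6.1    # apparent first dissociation constant of carbonic acid at 37 C
Kw = 4.4e-14         # water ion product at 37 C, mol^2 L^-2

def electroneutrality(h, pco2, sid_eff):
    """Charge balance with only H+, OH-, and HCO3- as variable ions."""
    hco3 = K1 * alpha * pco2 / h
    oh = Kw / h
    return sid_eff + h - hco3 - oh

def plasma_acid_base(pco2_mmHg, sid_eff=0.024):
    # sid_eff: effective strong ion difference net of non-bicarbonate buffers (illustrative)
    h = brentq(electroneutrality, 1e-9, 1e-6, args=(pco2_mmHg, sid_eff))
    hco3_mM = K1 * alpha * pco2_mmHg / h * 1000.0
    return -math.log10(h), hco3_mM

print(plasma_acid_base(40.0))   # roughly pH 7.4 and ~24 mmol/L with these illustrative constants
```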

  15. A mechanistic physicochemical model of carbon dioxide transport in blood

    PubMed Central

    O’Neill, David P.

    2017-01-01

    A number of mathematical models have been produced that, given the Pco2 and Po2 of blood, will calculate the total concentrations for CO2 and O2 in blood. However, all these models contain at least some empirical features, and thus do not represent all of the underlying physicochemical processes in an entirely mechanistic manner. The aim of this study was to develop a physicochemical model of CO2 carriage by the blood to determine whether our understanding of the physical chemistry of the major chemical components of blood together with their interactions is sufficiently strong to predict the physiological properties of CO2 carriage by whole blood. Standard values are used for the ionic composition of the blood, the plasma albumin concentration, and the hemoglobin concentration. All Km values required for the model are taken from the literature. The distribution of bicarbonate, chloride, and H+ ions across the red blood cell membrane follows that of a Gibbs-Donnan equilibrium. The system of equations that results is solved numerically using constraints for mass balance and electroneutrality. The model reproduces the phenomena associated with CO2 carriage, including the magnitude of the Haldane effect, very well. The structural nature of the model allows various hypothetical scenarios to be explored. Here we examine the effects of 1) removing the ability of hemoglobin to form carbamino compounds; 2) allowing a degree of Cl− binding to deoxygenated hemoglobin; and 3) removing the chloride (Hamburger) shift. The insights gained could not have been obtained from empirical models. NEW & NOTEWORTHY This study is the first to incorporate a mechanistic model of chloride-bicarbonate exchange between the erythrocyte and plasma into a full physicochemical model of the carriage of carbon dioxide in blood. The mechanistic nature of the model allowed a theoretical study of the quantitative significance for carbon dioxide transport of carbamino compound formation; the putative

  16. Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fang; Liu, Tao; Qian, Weijun

    2011-07-22

    Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics has become increasingly applied for a broad range of biological applications due to growing capabilities for broad proteome coverage and good accuracy in quantification. Herein, we review the current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.

  17. Reproducibility and quantitation of amplicon sequencing-based detection

    PubMed Central

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-01-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
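    The replicate-overlap statistic quoted above can be computed from OTU sets; a minimal sketch with toy OTU lists follows (the denominator convention used here, the union of both replicates, is an assumption and may differ from the authors' definition).

```python
def otu_overlap(rep_a, rep_b):
    """Percentage of OTUs shared between two technical replicates,
    relative to all OTUs detected in either replicate."""
    a, b = set(rep_a), set(rep_b)
    return 100.0 * len(a & b) / len(a | b)

rep1 = {"OTU_001", "OTU_002", "OTU_003", "OTU_007", "OTU_010"}
rep2 = {"OTU_001", "OTU_003", "OTU_011", "OTU_012"}
print(f"technical replicate overlap: {otu_overlap(rep1, rep2):.1f}%")
```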

  18. INCORPORATION OF MECHANISTIC INFORMATION IN THE ARSENIC PBPK MODEL DEVELOPMENT PROCESS

    EPA Science Inventory

    INCORPORATING MECHANISTIC INSIGHTS IN A PBPK MODEL FOR ARSENIC

    Elaina M. Kenyon, Michael F. Hughes, Marina V. Evans, David J. Thomas, U.S. EPA; Miroslav Styblo, University of North Carolina; Michael Easterling, Analytical Sciences, Inc.

    A physiologically based phar...

  19. Composite Nanomechanics: A Mechanistic Properties Prediction

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Handler, Louis M.; Manderscheid, Jane M.

    2007-01-01

    A unique mechanistic theory is described to predict the properties of nanocomposites. The theory is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to predict 25 properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. Most of the results show smooth distributions. Results for matrix-dependent properties show bimodal through-the-thickness distributions with discontinuous changes from mode to mode.
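    The substructuring starts from standard micromechanics relations; as a minimal illustration (textbook rule-of-mixtures estimates with hypothetical constituent properties, not the cited code's 25-property output), the longitudinal and transverse moduli of a unidirectional ply can be estimated as follows.

```python
# Rule-of-mixtures estimates for a unidirectional ply; constituent properties are illustrative.
Ef, Em = 1000.0, 3.5   # fiber and matrix moduli, GPa (e.g., a stiff nanofiber in epoxy)
Vf = 0.3               # fiber volume fraction
Vm = 1.0 - Vf          # matrix volume fraction

E1 = Vf * Ef + Vm * Em                  # longitudinal modulus (Voigt, parallel loading)
E2 = Ef * Em / (Vf * Em + Vm * Ef)      # transverse modulus (Reuss, series loading)

print(f"E1 = {E1:.0f} GPa, E2 = {E2:.1f} GPa")
```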

  20. Mechanistic, Mathematical Model to Predict the Dynamics of Tissue Genesis in Bone Defects via Mechanical Feedback and Mediation of Biochemical Factors

    PubMed Central

    Moore, Shannon R.; Saidel, Gerald M.; Knothe, Ulf; Knothe Tate, Melissa L.

    2014-01-01

    The link between mechanics and biology in the generation and the adaptation of bone has been well studied in the context of skeletal development and fracture healing. Yet, the prediction of tissue genesis within - and the spatiotemporal healing of - postnatal defects necessitates a quantitative evaluation of mechano-biological interactions using experimental and clinical parameters. To address this current gap in knowledge, this study aims to develop a mechanistic mathematical model of tissue genesis using bone morphogenetic protein (BMP) to represent a class of factors that may coordinate bone healing. Specifically, we developed a mechanistic, mathematical model to predict the dynamics of tissue genesis by periosteal progenitor cells within a long bone defect surrounded by periosteum and stabilized via an intramedullary nail. The emergent material properties and mechanical environment associated with nascent tissue genesis influence the strain stimulus sensed by progenitor cells within the periosteum. Using a mechanical finite element model, periosteal surface strains are predicted as a function of emergent, nascent tissue properties. Strains are then input to a mechanistic mathematical model, where mechanical regulation of BMP-2 production mediates rates of cellular proliferation, differentiation and tissue production, to predict healing outcomes. A parametric approach enables the spatial and temporal prediction of endochondral tissue regeneration, assessed as areas of cartilage and mineralized bone, as functions of radial distance from the periosteum and time. Comparing model results to histological outcomes from two previous studies of periosteum-mediated bone regeneration in a common ovine model, it was shown that mechanistic models incorporating mechanical feedback successfully predict patterns (spatial) and trends (temporal) of bone tissue regeneration. The novel model framework presented here integrates a mechanistic feedback system based on the

  1. Testing mechanistic models of growth in insects.

    PubMed

    Maino, James L; Kearney, Michael R

    2015-11-22

    Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory to many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg(-1)) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes. © 2015 The Author(s).

  2. Mechanistic equivalent circuit modelling of a commercial polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.

    2018-03-01

    Electrochemical impedance spectroscopy (EIS) has been widely used in the fuel cell field since it allows deconvolving the different physicochemical processes that affect fuel cell performance. Typically, EIS spectra are modelled using electric equivalent circuits. In this work, EIS spectra of an individual cell of a commercial PEM fuel cell stack were obtained experimentally. The goal was to obtain a mechanistic electric equivalent circuit in order to model the experimental EIS spectra. A mechanistic electric equivalent circuit is a semiempirical modelling technique based on obtaining an equivalent circuit that not only correctly fits the experimental spectra, but whose elements have a mechanistic physical meaning. In order to obtain the aforementioned electric equivalent circuit, 12 different models with defined physical meanings were proposed. These equivalent circuits were fitted to the obtained EIS spectra. A two-step selection process was performed. In the first step, a group of 4 circuits was preselected out of the initial list of 12, based on general fitting indicators such as the determination coefficient and the fitted parameter uncertainty. In the second step, one of the 4 preselected circuits was selected on account of the consistency of the fitted parameter values with the physical meaning of each parameter.
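    Fitting a candidate equivalent circuit to an EIS spectrum is a complex-valued least-squares problem; the sketch below fits a simple Randles-type circuit (ohmic resistance in series with a charge-transfer resistance in parallel with a constant-phase element) to a synthetic spectrum, which is far simpler than the 12 candidate circuits considered in the study.

```python
import numpy as np
from scipy.optimize import least_squares

def randles(params, f):
    """Impedance of a simple Randles-type circuit: R_ohm + (R_ct parallel with a CPE)."""
    r_ohm, r_ct, q, n = params
    w = 2 * np.pi * f
    z_cpe = 1.0 / (q * (1j * w) ** n)
    return r_ohm + (r_ct * z_cpe) / (r_ct + z_cpe)

def residuals(params, f, z_meas):
    z = randles(params, f)
    return np.concatenate([z.real - z_meas.real, z.imag - z_meas.imag])

# Synthetic "measured" spectrum for demonstration (true parameters: 0.02, 0.15, 0.5, 0.9).
f = np.logspace(-1, 4, 40)
z_meas = randles([0.02, 0.15, 0.5, 0.9], f)

fit = least_squares(residuals, x0=[0.01, 0.1, 1.0, 0.8], args=(f, z_meas),
                    bounds=([0, 0, 0, 0.5], [1, 1, 100, 1]))
print("fitted R_ohm, R_ct, Q, n:", np.round(fit.x, 3))
```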

  3. Mechanistic-empirical evaluation of the Mn/ROAD low volume road test sections.

    DOT National Transportation Integrated Search

    1998-05-01

    The purpose of this study was to use Mn/ROAD mainline flexible pavement data to verify, refine, and modify the Illinois Department of Transportation (IDOT) Mechanistic-Empirical (M-E) based flexible pavement design procedures and concepts.

  4. Integration of QSAR and SAR methods for the mechanistic interpretation of predictive models for carcinogenicity

    PubMed Central

    Fjodorova, Natalja; Novič, Marjana

    2012-01-01

    The knowledge-based Toxtree expert system (SAR approach) was integrated with the statistically based counter propagation artificial neural network (CP ANN) model (QSAR approach) to contribute to a better mechanistic understanding of a carcinogenicity model for non-congeneric chemicals, using Dragon descriptors and carcinogenic potency for rats as a response. The transparency of the CP ANN algorithm was demonstrated using an intrinsic mapping technique, specifically Kohonen maps. Chemical structures were represented by Dragon descriptors that express the structural and electronic features of molecules, such as their shape and electronic surroundings related to the reactivity of molecules. It was illustrated how the descriptors are correlated with particular structural alerts (SAs) for carcinogenicity that have a recognized mechanistic link to carcinogenic activity. Moreover, the Kohonen mapping technique enables one to examine the separation of carcinogens and non-carcinogens (for rats) within a family of chemicals with a particular SA for carcinogenicity. The mechanistic interpretation of models is important for the evaluation of the safety of chemicals. PMID:24688639

  5. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  6. Mechanistic-empirical design, implementation, and monitoring for flexible pavements : a project summary.

    DOT National Transportation Integrated Search

    2014-05-01

    This document is a summary of tasks performed for Project ICT-R27-060. Mechanistic-empirical (M-E)-based flexible pavement design concepts and procedures were developed in previous Illinois Cooperative Highway Research Program projects (IHR-510...

  7. Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation: Report of an FDA Public Workshop

    PubMed Central

    Duan, J; Kesisoglou, F; Novakovic, J; Amidon, GL; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R

    2017-01-01

    On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled “Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation.” The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and the drug product information, into a systemic mathematical whole-body framework. PMID:28571121

  8. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  9. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  10. Mechanistic investigation of the formation of H2 from HCOOH with a dinuclear Ru model complex for formate hydrogen lyase.

    PubMed

    Tokunaga, Taisuke; Yatabe, Takeshi; Matsumoto, Takahiro; Ando, Tatsuya; Yoon, Ki-Seok; Ogo, Seiji

    2017-01-01

    We report the mechanistic investigation of catalytic H2 evolution from formic acid in water using a formate-bridged dinuclear Ru complex as a formate hydrogen lyase model. The mechanistic study is based on isotope-labeling experiments involving a hydrogen isotope exchange reaction.

  11. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, involved sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  12. From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses

    PubMed Central

    Zenker, Sven; Rubin, Jonathan; Clermont, Gilles

    2007-01-01

    We outline possible steps toward translating this computational approach to the bedside, to supplement today's evidence-based medicine with a quantitatively founded model-based medicine that integrates mechanistic knowledge with patient-specific information. PMID:17997590

  13. Synthesis and mechanistic studies of curcumin analog-based oximes as potential anticancer agents.

    PubMed

    Qin, Hua-Li; Leng, Jing; Youssif, Bahaa G M; Amjad, Muhammad Wahab; Raja, Maria Abdul Ghafoor; Hussain, Muhammad Ajaz; Hussain, Zahid; Kazmi, Syeda Naveed; Bukhari, Syed Nasir Abbas

    2017-09-01

    The incidence of cancer can be decreased by chemoprevention using either natural or synthetic agents. Apart from synthetic compounds, numerous natural products have exhibited promising potential to inhibit carcinogenesis in vivo. In this study, α,β-unsaturated carbonyl-based anticancer compounds were used as starting materials to synthesize new oxime analogs. The findings from the antiproliferative assay using seven different human cancer cell lines provided a clear picture of the structure-activity relationship. The oxime analogs 7a and 8a showed strong antiproliferative activity against the cell lines. The mechanistic effects of the compounds on EGFR-TK kinases, tubulin polymerization, and BRAF V600E were investigated. In addition, the efficacy of the compounds in reversing the efflux-mediated resistance developed by cancer cells was also studied. The compounds 5a and 6a displayed potent activity against various targets, such as BRAF V600E and EGFR-TK kinases, and also exhibited strong antiproliferative activity against different cell lines, hence showing potential as multifunctional anticancer agents. © 2017 John Wiley & Sons A/S.

  14. Does Homework Really Matter for College Students in Quantitatively-Based Courses?

    ERIC Educational Resources Information Center

    Young, Nichole; Dollman, Amanda; Angel, N. Faye

    2016-01-01

    This investigation was initiated by two students in an Advanced Computer Applications course. They sought to examine the influence of graded homework on final grades in quantitatively-based business courses. They were provided with data from three quantitatively-based core business courses over a period of five years for a total of 10 semesters of…

  15. [Mechanistic modelling allows to assess pathways of DNA lesion interactions underlying chromosome aberration formation].

    PubMed

    Eĭdel'man, Iu A; Slanina, S V; Sal'nikov, I V; Andreev, S G

    2012-12-01

    The knowledge of radiation-induced chromosomal aberration (CA) mechanisms is required in many fields of radiation genetics, radiation biology, biodosimetry, etc. However, these mechanisms are yet to be quantitatively characterised. One of the reasons is that the relationships between primary lesions of DNA/chromatin/chromosomes and dose-response curves for CA are unknown, because the pathways of lesion interactions in an interphase nucleus are currently inaccessible to direct experimental observation. This article presents a comparative analysis of two fundamentally different scenarios for the formation of simple and complex interchromosomal exchange aberrations: lesion interactions at the surface of chromosome territories versus interactions throughout the whole space of the nucleus. The analysis was based on quantitative mechanistic modelling of the different levels of structure and process involved in CA formation: chromosome structure in an interphase nucleus, and the induction, repair and interactions of DNA lesions. It was shown that the restricted diffusion of chromosomal loci, predicted by computational modelling of chromosome organization, makes lesion interactions throughout the whole space of the nucleus impossible. At the same time, the predicted features of subchromosomal dynamics agree well with in vivo observations and do not contradict the mechanism of CA formation at the surface of chromosome territories. On the other hand, the "surface mechanism" of CA formation, despite certain merits, proved insufficient to explain the high frequency of complex exchange aberrations observed by the mFISH technique. The alternative mechanism, CA formation at nuclear centres, is expected to be sufficient to explain frequent complex exchanges.

  16. Mechanistic and "natural" body metaphors and their effects on attitudes to hormonal contraception.

    PubMed

    Walker, Susan

    2012-01-01

    A small, self-selected convenience sample of male and female contraceptive users in the United Kingdom (n = 34) was interviewed between 2006 and 2008 concerning their feelings about the body and their contraceptive attitudes and experiences. The interviewees were a sub-sample of respondents (n = 188) who completed a paper-based questionnaire on similar topics and who were recruited through a poster placed in a family planning clinic, web-based advertisements on workplace and university websites, and direct approaches to social groups. The bodily metaphors used when discussing contraception were analyzed using an interpretative phenomenological analytical approach facilitated by Atlas.ti software. The dominant bodily metaphor was mechanistic (i.e., "body as machine"). A subordinate but influential bodily metaphor was the "natural" body, which had connotations of connection to nature and a quasi-sacred bodily order. Interviewees drew upon this "natural" metaphorical image in the context of discussing their anxieties about hormonal contraception. Drawing upon a "natural," non-mechanistic body image in the context of contraceptive decision-making contributed to reluctance to use a hormonal form of contraception. This research suggests that clinicians could improve communication and advice about contraception by recognizing that some users may draw upon non-mechanistic body imagery.

  17. Development of traffic data input resources for the mechanistic empirical pavement design process.

    DOT National Transportation Integrated Search

    2011-12-12

    The Mechanistic-Empirical Pavement Design Guide (MEPDG) for New and Rehabilitated Pavement Structures uses nationally based traffic data inputs and recommends that state DOTs develop their own site-specific and regional values. To support the MEP...

  18. A framework for conducting mechanistic based reliability assessments of components operating in complex systems

    NASA Astrophysics Data System (ADS)

    Wallace, Jon Michael

    2003-10-01

    Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The next two steps are the most distinctive: one efficiently characterizes and quantifies the system-driven local parameter space, and the subsequent step uses this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Whereas existing joint probability models require preliminary distribution and correlation information about the responses, these models combine statistical information about the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank

  19. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  20. Mechanistic-Empirical (M-E) Design Implementation & Monitoring for Flexible Pavements : 2018 PROJECT SUMMARY

    DOT National Transportation Integrated Search

    2018-06-01

    This document is a summary of the tasks performed for Project ICT-R27-149-1. Mechanistic-empirical (M-E)-based flexible pavement design concepts and procedures were previously developed in Illinois Cooperative Highway Research Program projects IHR-...

  1. A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.

    PubMed

    Revell, Christopher; Somveille, Marius

    2017-08-29

    In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird species and other highly mobile species, improving our understanding of the relative importance of the various factors driving migration and making predictions that could be useful for conservation.
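
    The albatross data and the study's calibrated parameters are not reproduced here. The sketch below only illustrates the general recipe suggested by the abstract: an agent moving on an environmental potential landscape with a Boltzmann-like decision rule drawn from statistical mechanics. The grid, landscape shape, temperature parameter, and starting point are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical environmental potential landscape (lower values = more favourable).
ny, nx = 50, 80
y, x = np.mgrid[0:ny, 0:nx]
potential = ((x - 60) ** 2 + (y - 25) ** 2) / 500.0   # a single attractive basin

def step(pos, potential, temperature=1.0):
    """Move to a neighbouring cell with probability proportional to exp(-dPhi / T)."""
    i, j = pos
    moves = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    candidates, weights = [], []
    for di, dj in moves:
        ni, nj = i + di, j + dj
        if 0 <= ni < potential.shape[0] and 0 <= nj < potential.shape[1]:
            d_phi = potential[ni, nj] - potential[i, j]
            candidates.append((ni, nj))
            weights.append(np.exp(-d_phi / temperature))
    weights = np.array(weights) / np.sum(weights)
    return candidates[rng.choice(len(candidates), p=weights)]

# Simulate one trajectory from a given starting location.
pos = (5, 5)
trajectory = [pos]
for _ in range(300):
    pos = step(pos, potential, temperature=0.5)
    trajectory.append(pos)
print("final position:", trajectory[-1])
```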

  2. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    PubMed

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. Firstly, the solubility of the MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with the methamphetamine (MET) antibody in a non-covalent manner and then manufactured into the LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the test-line signal and the MET concentration over the range 62.5 to 1500 ng/mL. The sensitivity of the LFS was evaluated by conjugating MWCNTs with HCG antibody, and the MWCNT-based conjugate was 10 times more sensitive than the classical colloidal gold nanoparticle conjugate. Taken together, our data demonstrate that MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection and can be used in forensic analysis.

  3. Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation: Report of an FDA Public Workshop.

    PubMed

    Zhang, X; Duan, J; Kesisoglou, F; Novakovic, J; Amidon, G L; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R

    2017-08-01

    On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled "Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation." The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and the drug product information, into a systemic mathematical whole-body framework. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  4. Development of local calibration factors and design criteria values for mechanistic-empirical pavement design.

    DOT National Transportation Integrated Search

    2015-08-01

    A mechanistic-empirical (ME) pavement design procedure allows for analyzing and selecting pavement structures based on predicted distress progression resulting from stresses and strains within the pavement over its design life. The Virginia Depar...

  5. Transporter-Enzyme Interplay: Deconvoluting Effects of Hepatic Transporters and Enzymes on Drug Disposition Using Static and Dynamic Mechanistic Models.

    PubMed

    Varma, Manthena V; El-Kattan, Ayman F

    2016-07-01

    A large body of evidence suggests that hepatic uptake transporters, the organic anion-transporting polypeptides (OATPs), are of high clinical relevance in determining the pharmacokinetics of substrate drugs; on this basis, recent regulatory guidances to industry recommend appropriate assessment of investigational drugs for potential drug interactions. We recently proposed an extended clearance classification system (ECCS) framework in which the systemic clearance of class 1B and 3B drugs is likely determined by hepatic uptake. The ECCS framework therefore predicts the possibility of drug-drug interactions (DDIs) involving OATPs and the effects of genetic variants of SLCO1B1 early in discovery and facilitates decision making in candidate selection and progression. Although OATP-mediated uptake is often the rate-determining process in the hepatic clearance of substrate drugs, metabolic and/or biliary components also contribute to the overall hepatic disposition and, more importantly, to liver exposure. Clinical evidence suggests that alteration in biliary efflux transport or metabolic enzymes associated with genetic polymorphism leads to changes in the pharmacodynamic response to statins, for which the pharmacological target resides in the liver. Perpetrator drugs may show inhibitory and/or induction effects on transporters and enzymes simultaneously. It is therefore important to adopt models that frame these multiple processes in a mechanistic sense for quantitative DDI predictions and to deconvolute the effects of individual processes on the plasma and hepatic exposure. In vitro data-informed mechanistic static and physiologically based pharmacokinetic models have proven useful in rationalizing and predicting transporter-mediated DDIs and the complex DDIs involving transporter-enzyme interplay. © 2016, The American College of Clinical Pharmacology.
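
    The review above discusses static and PBPK models in general terms rather than a single equation. As a minimal, hedged illustration, the sketch below computes a basic static R value of the kind commonly used to flag potential OATP-mediated interactions (R = 1 + fu,p x I_in,max / Ki, with the hepatic-inlet concentration estimated from dose, Cmax, ka, and hepatic blood flow); all drug parameters are hypothetical and the threshold quoted in the comment is only indicative.

```python
def oatp_r_value(dose_mg, cmax_uM, fu_p, ka_per_h, fa_fg, mw_g_per_mol,
                 ki_uM, q_h_L_per_h=97.0):
    """Basic static R value for hepatic uptake (OATP) inhibition.

    R = 1 + fu,p * I_in,max / Ki, with I_in,max estimated as
    Cmax + Fa*Fg * ka * Dose / Qh (all concentrations in micromolar).
    """
    dose_umol = dose_mg / mw_g_per_mol * 1000.0
    i_in_max = cmax_uM + fa_fg * ka_per_h * dose_umol / q_h_L_per_h
    return 1.0 + fu_p * i_in_max / ki_uM

# Hypothetical perpetrator drug parameters, for illustration only.
r = oatp_r_value(dose_mg=600, cmax_uM=5.0, fu_p=0.05, ka_per_h=1.0,
                 fa_fg=1.0, mw_g_per_mol=500.0, ki_uM=1.0)
print(f"R = {r:.2f} (values above ~1.1 typically trigger further evaluation)")
```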

  6. Generative mechanistic explanation building in undergraduate molecular and cellular biology

    NASA Astrophysics Data System (ADS)

    Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.

    2017-09-01

    When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among scientists, we created and applied a theoretical framework to explore the strategies students use to construct explanations for 'novel' biological phenomena. Specifically, we explored how students navigated the multi-level nature of complex biological systems using generative mechanistic reasoning. Interviews were conducted with introductory and upper-division biology students at a large public university in the United States. Results of qualitative coding revealed key features of students' explanation building. Students used modular thinking to consider the functional subdivisions of the system, which they 'filled in' to varying degrees with mechanistic elements. They also hypothesised the involvement of mechanistic entities and instantiated abstract schema to adapt their explanations to unfamiliar biological contexts. Finally, we explored the flexible thinking that students used to hypothesise the impact of mutations on multi-leveled biological systems. Results revealed a number of ways that students drew mechanistic connections between molecules, functional modules (sets of molecules with an emergent function), cells, tissues, organisms and populations.

  7. Novel Uses of In Vitro Data to Develop Quantitative Biological Activity Relationship Models for in Vivo Carcinogenicity Prediction.

    PubMed

    Pradeep, Prachi; Povinelli, Richard J; Merrill, Stephen J; Bozdag, Serdar; Sem, Daniel S

    2015-04-01

    The availability of large in vitro datasets enables better insight into the mode of action of chemicals and better identification of potential mechanism(s) of toxicity. Several studies have shown that not all in vitro assays can contribute as equal predictors of in vivo carcinogenicity for development of hybrid Quantitative Structure Activity Relationship (QSAR) models. We propose two novel approaches for the use of mechanistically relevant in vitro assay data in the identification of relevant biological descriptors and the development of Quantitative Biological Activity Relationship (QBAR) models for carcinogenicity prediction. We demonstrate that in vitro assay data can be used to develop QBAR models for in vivo carcinogenicity prediction via two case studies corroborated with firm scientific rationale. The case studies demonstrate the similarities between QBAR and QSAR modeling in: (i) the selection of relevant descriptors to be used in the machine learning algorithm, and (ii) the development of a computational model that maps chemical or biological descriptors to a toxic endpoint. The results of both case studies show: (i) improved accuracy and sensitivity, which are especially desirable under regulatory requirements, and (ii) overall adherence with the OECD/REACH guidelines. Such mechanism-based models can be used along with QSAR models for prediction of mechanistically complex toxic endpoints. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
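
    The descriptors and learning algorithms used in the two case studies are not reproduced here. The sketch below shows the generic QBAR idea described above, mapping a matrix of in vitro assay responses (biological descriptors) to a binary carcinogenicity label with a random-forest classifier; the data are synthetic and the model choice is an assumption, not the authors' method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(1)

# Synthetic stand-in for an in vitro assay matrix: rows = chemicals,
# columns = biological descriptors (assay potencies, hit calls, etc.).
n_chem, n_assays = 200, 25
X = rng.normal(size=(n_chem, n_assays))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_chem) > 0).astype(int)

model = RandomForestClassifier(n_estimators=300, random_state=0)
y_pred = cross_val_predict(model, X, y, cv=5)

print("accuracy:   ", accuracy_score(y, y_pred))
print("sensitivity:", recall_score(y, y_pred))   # emphasised under regulatory use
```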

  8. Mechanistic models of biofilm growth in porous media

    NASA Astrophysics Data System (ADS)

    Jaiswal, Priyank; Al-Hadrami, Fathiya; Atekwana, Estella A.; Atekwana, Eliot A.

    2014-07-01

    Nondestructive acoustic methods can be used to monitor in situ biofilm growth in porous media. In practice, however, acoustic methods remain underutilized due to the lack of models that can translate acoustic data into rock properties in the context of biofilm. In this paper we present mechanistic models of biofilm growth in porous media. The models are used to quantitatively interpret the arrival times and amplitudes recorded in the 29-day-long Davis et al. (2010) physical-scale biostimulation experiment in terms of biofilm morphologies and saturation. The models address the sediment's elastic behavior using the lower Hashin-Shtrikman bound for grain mixing and Gassmann substitution for fluid saturation. The time-lapse P-wave velocity (VP, a function of arrival times) is explained by a combination of two rock models (morphologies): "load bearing", which treats the biofilm as an additional mineral in the rock matrix, and "pore filling", which treats the biofilm as an additional fluid phase in the pores. The time-lapse attenuation (QP⁻¹, a function of amplitudes), on the other hand, can be explained adequately in two ways: first, through squirt flow, where energy is lost from relative motion between the rock matrix and the pore fluid, and second, through an empirical function of porosity (φ), permeability (κ), and grain size. The squirt-flow model fitting results in higher internal φ (7% versus 5%) and more oblate pores (0.33 versus 0.67 aspect ratio) for the load-bearing morphology versus the pore-filling morphology. The empirical model fitting results in up to a 10% increase in κ at the initial stages of the load-bearing morphology. The two morphologies, which exhibit distinct mechanical and hydraulic behavior, could be a function of pore-throat size. The biofilm mechanistic models developed in this study can be used for the interpretation of seismic data critical for the evaluation of biobarriers in bioremediation, microbial enhanced oil recovery, and CO2
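
    The full rock-physics workflow (Hashin-Shtrikman mixing, the two biofilm morphologies, and squirt-flow attenuation) is more involved than a short example can show. The sketch below illustrates only the Gassmann fluid-substitution step used to translate a change in pore-fluid modulus, such as biofilm-altered pore water, into a P-wave velocity; all moduli, densities, and the "biofilm-like" fluid modulus are hypothetical placeholder values.

```python
import numpy as np

def gassmann_k_sat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus from Gassmann fluid substitution (moduli in GPa)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def p_velocity(k_sat, mu, rho):
    """P-wave velocity in m/s from moduli in GPa and bulk density in kg/m^3."""
    return np.sqrt((k_sat + 4.0 / 3.0 * mu) * 1e9 / rho)

# Hypothetical quartz-sand frame saturated first with water, then with a
# slightly stiffer "biofilm-like" pore fluid (illustrative values only).
k_min, mu, phi = 36.6, 5.0, 0.35             # mineral modulus (GPa), shear (GPa), porosity
k_dry = 3.0                                  # dry-frame bulk modulus (GPa)
rho = (1 - phi) * 2650.0 + phi * 1000.0      # bulk density (kg/m^3)

for label, k_fl in [("water-saturated", 2.25), ("biofilm-filled pores", 2.6)]:
    k_sat = gassmann_k_sat(k_dry, k_min, k_fl, phi)
    print(label, "Vp =", round(p_velocity(k_sat, mu, rho), 1), "m/s")
```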

  9. "Ratio via Machina": Three Standards of Mechanistic Explanation in Sociology

    ERIC Educational Resources Information Center

    Aviles, Natalie B.; Reed, Isaac Ariail

    2017-01-01

    Recently, sociologists have expended much effort in attempts to define social mechanisms. We intervene in these debates by proposing that sociologists in fact have a choice to make between three standards of what constitutes a good mechanistic explanation: substantial, formal, and metaphorical mechanistic explanation. All three standards are…

  10. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity

    PubMed Central

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A.; Bradford, William D.; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S.; Li, Rong

    2015-01-01

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. PMID:25823586

  11. Evolutionary and mechanistic theories of aging.

    PubMed

    Hughes, Kimberly A; Reynolds, Rose M

    2005-01-01

    Senescence (aging) is defined as a decline in performance and fitness with advancing age. Senescence is a nearly universal feature of multicellular organisms, and understanding why it occurs is a long-standing problem in biology. Here we present a concise review of both evolutionary and mechanistic theories of aging. We describe the development of the general evolutionary theory, along with the mutation accumulation, antagonistic pleiotropy, and disposable soma versions of the evolutionary model. The review of the mechanistic theories focuses on the oxidative stress resistance, cellular signaling, and dietary control mechanisms of life span extension. We close with a discussion of how an approach that makes use of both evolutionary and molecular analyses can address a critical question: Which of the mechanisms that can cause variation in aging actually do cause variation in natural populations?

  12. A Cyclic-Plasticity-Based Mechanistic Approach for Fatigue Evaluation of 316 Stainless Steel Under Arbitrary Loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barua, Bipul; Mohanty, Subhasish; Listwan, Joseph T.

    In this paper, a cyclic-plasticity-based, fully mechanistic fatigue modeling approach is presented. It is based on the time-dependent stress-strain evolution of the material over the entire fatigue life, rather than on the end-of-life information typically used in empirical S-N curve based fatigue evaluation approaches. Previously, we presented constant-amplitude fatigue-test-based material models for 316 SS base, 508 LAS base, and 316 SS-316 SS weld metal, which are used in nuclear reactor components such as pressure vessels, nozzles, and surge line pipes. However, we found that models based on constant-amplitude fatigue data have limitations in capturing the stress-strain evolution under arbitrary fatigue loading. To address this limitation, we present a more advanced approach that can be used to model the cyclic stress-strain evolution and fatigue life not only under constant-amplitude but also under arbitrary (random/variable) fatigue loading. The related material model and analytical model results are presented for 316 SS base metal. Two methodologies (based either on time/cycle or on accumulated plastic strain energy) for tracking the material parameters at a given time/cycle are discussed, and the associated analytical model results are presented. From the material model and analytical cyclic plasticity model results, it is found that the proposed cyclic plasticity model can predict all the important stages of material behavior during the entire fatigue life of the specimens with more than 90% accuracy.

  13. A Cyclic-Plasticity-Based Mechanistic Approach for Fatigue Evaluation of 316 Stainless Steel Under Arbitrary Loading

    DOE PAGES

    Barua, Bipul; Mohanty, Subhasish; Listwan, Joseph T.; ...

    2017-12-05

    In this paper, a cyclic-plasticity-based, fully mechanistic fatigue modeling approach is presented. It is based on the time-dependent stress-strain evolution of the material over the entire fatigue life, rather than on the end-of-life information typically used in empirical S-N curve based fatigue evaluation approaches. Previously, we presented constant-amplitude fatigue-test-based material models for 316 SS base, 508 LAS base, and 316 SS-316 SS weld metal, which are used in nuclear reactor components such as pressure vessels, nozzles, and surge line pipes. However, we found that models based on constant-amplitude fatigue data have limitations in capturing the stress-strain evolution under arbitrary fatigue loading. To address this limitation, we present a more advanced approach that can be used to model the cyclic stress-strain evolution and fatigue life not only under constant-amplitude but also under arbitrary (random/variable) fatigue loading. The related material model and analytical model results are presented for 316 SS base metal. Two methodologies (based either on time/cycle or on accumulated plastic strain energy) for tracking the material parameters at a given time/cycle are discussed, and the associated analytical model results are presented. From the material model and analytical cyclic plasticity model results, it is found that the proposed cyclic plasticity model can predict all the important stages of material behavior during the entire fatigue life of the specimens with more than 90% accuracy.

  14. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion with theoretical considerations and with real-data examples of disease profiling.
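
    As a small illustration of the replicate-number calculation mentioned above, the sketch below uses statsmodels to solve for the number of biological replicates per group needed to detect a given standardized fold change in a two-group comparison; the fold change, variability, significance level, and power are placeholder assumptions.

```python
import numpy as np
from statsmodels.stats.power import TTestIndPower

# Hypothetical design question: how many replicates per group are needed to
# detect a 1.5-fold change when the SD of log2 intensities is about 0.5?
effect_size = np.log2(1.5) / 0.5          # standardized difference (Cohen's d)

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=effect_size, alpha=0.05,
                                   power=0.8, alternative="two-sided")
print("replicates per group:", int(np.ceil(n_per_group)))
```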

  15. Explanation and inference: mechanistic and functional explanations guide property generalization.

    PubMed

    Lombrozo, Tania; Gwynne, Nicholas Z

    2014-01-01

    The ability to generalize from the known to the unknown is central to learning and inference. Two experiments explore the relationship between how a property is explained and how that property is generalized to novel species and artifacts. The experiments contrast the consequences of explaining a property mechanistically, by appeal to parts and processes, with the consequences of explaining the property functionally, by appeal to functions and goals. The findings suggest that properties that are explained functionally are more likely to be generalized on the basis of shared functions, with a weaker relationship between mechanistic explanations and generalization on the basis of shared parts and processes. The influence of explanation type on generalization holds even though all participants are provided with the same mechanistic and functional information, and whether an explanation type is freely generated (Experiment 1), experimentally provided (Experiment 2), or experimentally induced (Experiment 2). The experiments also demonstrate that explanations and generalizations of a particular type (mechanistic or functional) can be experimentally induced by providing sample explanations of that type, with a comparable effect whether the sample explanations come from the same domain or from a different domain. These results suggest that explanations serve as a guide to generalization, and contribute to a growing body of work supporting the value of distinguishing mechanistic and functional explanations.

  16. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    PubMed

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
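
    The paper's full pipeline (external quadratic regression, isomeric-interference correction, and noise/LOQ filtering) is summarized above. The sketch below reproduces only the calibration and limit-of-quantitation steps on a synthetic intensity image; the standards, intensities, and thresholds are hypothetical, and regressing density on intensity is a simplifying assumption rather than the authors' exact procedure.

```python
import numpy as np

# Hypothetical external standards: known surface densities and measured intensities.
density_std = np.array([0.0, 0.5, 1.0, 2.0, 4.0])      # e.g. pmol per mm^2
intensity_std = np.array([2.0, 55.0, 120.0, 260.0, 600.0])

# Quadratic regression of density on intensity (the external calibration curve).
coeffs = np.polyfit(intensity_std, density_std, deg=2)

# Synthetic ion-intensity image (stand-in for a SIMS raster of the colony biofilm).
rng = np.random.default_rng(2)
intensity_img = rng.uniform(0.0, 650.0, size=(64, 64))

density_img = np.polyval(coeffs, intensity_img)

# Filter for the lower/upper limits of quantitation defined by the standards.
lloq, uloq = density_std[1], density_std[-1]
density_img = np.where((density_img >= lloq) & (density_img <= uloq),
                       density_img, np.nan)
print("quantifiable pixels:", np.count_nonzero(~np.isnan(density_img)))
```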

  17. The Combined Use of Correlative and Mechanistic Species Distribution Models Benefits Low Conservation Status Species.

    PubMed

    Rougier, Thibaud; Lassalle, Géraldine; Drouineau, Hilaire; Dumoulin, Nicolas; Faure, Thierry; Deffuant, Guillaume; Rochard, Eric; Lambert, Patrick

    2015-01-01

    Species can respond to climate change by tracking appropriate environmental conditions in space, resulting in a range shift. Species Distribution Models (SDMs) can help forecast such range shift responses. Both correlative and mechanistic SDMs have been built for only a few species; allis shad (Alosa alosa), an endangered anadromous fish, is one of them. The main purpose of this study was to provide a framework for joint analyses of correlative and mechanistic SDM projections in order to strengthen conservation measures for species of conservation concern. Guidelines for joint representation and subsequent interpretation of model outputs were defined and applied. The present joint analysis was based on the novel mechanistic model GR3D (Global Repositioning Dynamics of Diadromous fish Distribution), which was parameterized on allis shad and then used to predict its future distribution along the European Atlantic coast under different climate change scenarios (RCP 4.5 and RCP 8.5). We then used a correlative SDM for this species to forecast its distribution across the same geographic area and under the same climate change scenarios. First, projections from the correlative and mechanistic models provided congruent trends in probability of habitat suitability and population dynamics. This agreement was preferentially interpreted as referring to the species' vulnerability to climate change. Accordingly, climate change could not be listed as a major threat to allis shad. The congruence in predicted range limits between the SDM projections was the next point of interest. Where differences were noticed, they required a deeper understanding of the niche modelled by each approach. In this respect, the relative position of the northern range limit in the two methods strongly suggested that a key biological process related to intraspecific variability was potentially lacking in the mechanistic SDM. Based on our knowledge, we hypothesized that local adaptations to cold

  18. The Combined Use of Correlative and Mechanistic Species Distribution Models Benefits Low Conservation Status Species

    PubMed Central

    Rougier, Thibaud; Lassalle, Géraldine; Drouineau, Hilaire; Dumoulin, Nicolas; Faure, Thierry; Deffuant, Guillaume; Rochard, Eric; Lambert, Patrick

    2015-01-01

    Species can respond to climate change by tracking appropriate environmental conditions in space, resulting in a range shift. Species Distribution Models (SDMs) can help forecast such range shift responses. Both correlative and mechanistic SDMs have been built for only a few species; allis shad (Alosa alosa), an endangered anadromous fish, is one of them. The main purpose of this study was to provide a framework for joint analyses of correlative and mechanistic SDM projections in order to strengthen conservation measures for species of conservation concern. Guidelines for joint representation and subsequent interpretation of model outputs were defined and applied. The present joint analysis was based on the novel mechanistic model GR3D (Global Repositioning Dynamics of Diadromous fish Distribution), which was parameterized on allis shad and then used to predict its future distribution along the European Atlantic coast under different climate change scenarios (RCP 4.5 and RCP 8.5). We then used a correlative SDM for this species to forecast its distribution across the same geographic area and under the same climate change scenarios. First, projections from the correlative and mechanistic models provided congruent trends in probability of habitat suitability and population dynamics. This agreement was preferentially interpreted as referring to the species' vulnerability to climate change. Accordingly, climate change could not be listed as a major threat to allis shad. The congruence in predicted range limits between the SDM projections was the next point of interest. Where differences were noticed, they required a deeper understanding of the niche modelled by each approach. In this respect, the relative position of the northern range limit in the two methods strongly suggested that a key biological process related to intraspecific variability was potentially lacking in the mechanistic SDM. Based on our knowledge, we hypothesized that local adaptations to cold

  19. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop the QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
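
    The CTV predictor's descriptors and algorithms are not reproduced here. The sketch below only shows how a cross-validation-based Q² and a mean error in log10 units, the performance metrics quoted above, can be computed for a generic QSAR-style regression model on synthetic data; the model choice and data are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)

# Synthetic stand-in: rows = chemicals, columns = structural descriptors,
# y = log10 of a regulatory toxicity value.
X = rng.normal(size=(300, 40))
y = X[:, :6] @ rng.normal(size=6) + rng.normal(scale=0.7, size=300)

model = RandomForestRegressor(n_estimators=300, random_state=0)
y_cv = cross_val_predict(model, X, y, cv=5)

q2 = 1.0 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
mae_log10 = np.mean(np.abs(y - y_cv))
print(f"cross-validated Q2 = {q2:.2f}, mean error = {mae_log10:.2f} log10 units")
```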

  20. Slow erosion of a quantitative apple resistance to Venturia inaequalis based on an isolate-specific Quantitative Trait Locus.

    PubMed

    Caffier, Valérie; Le Cam, Bruno; Al Rifaï, Mehdi; Bellanger, Marie-Noëlle; Comby, Morgane; Denancé, Caroline; Didelot, Frédérique; Expert, Pascale; Kerdraon, Tifenn; Lemarquand, Arnaud; Ravon, Elisa; Durel, Charles-Eric

    2016-10-01

    Quantitative plant resistance affects the aggressiveness of pathogens and is usually considered more durable than qualitative resistance. However, the efficiency of a quantitative resistance based on an isolate-specific Quantitative Trait Locus (QTL) is expected to decrease over time due to the selection of isolates with a high level of aggressiveness on resistant plants. To test this hypothesis, we surveyed scab incidence over an eight-year period in an orchard planted with susceptible and quantitatively resistant apple genotypes. We sampled 79 Venturia inaequalis isolates from this orchard at three dates and we tested their level of aggressiveness under controlled conditions. Isolates sampled on resistant genotypes triggered higher lesion density and exhibited a higher sporulation rate on apple carrying the resistance allele of the QTL T1 compared to isolates sampled on susceptible genotypes. Due to this ability to select aggressive isolates, we expected the QTL T1 to be non-durable. However, our results showed that the quantitative resistance based on the QTL T1 remained efficient in orchard over an eight-year period, with only a slow decrease in efficiency and no detectable increase of the aggressiveness of fungal isolates over time. We conclude that knowledge on the specificity of a QTL is not sufficient to evaluate its durability. Deciphering molecular mechanisms associated with resistance QTLs, genetic determinants of aggressiveness and putative trade-offs within pathogen populations is needed to help in understanding the erosion processes. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Comparative evaluation of statistical and mechanistic models of Escherichia coli at beaches in southern Lake Michigan

    USGS Publications Warehouse

    Safaie, Ammar; Wendzel, Aaron; Ge, Zhongfu; Nevers, Meredith; Whitman, Richard L.; Corsi, Steven R.; Phanikumar, Mantha S.

    2016-01-01

    Statistical and mechanistic models are popular tools for predicting the levels of indicator bacteria at recreational beaches. Researchers tend to use one class of model or the other, and it is difficult to generalize statements about their relative performance due to differences in how the models are developed, tested, and used. We describe a cooperative modeling approach for freshwater beaches impacted by point sources in which insights derived from mechanistic modeling were used to further improve the statistical models and vice versa. The statistical models provided a basis for assessing the mechanistic models which were further improved using probability distributions to generate high-resolution time series data at the source, long-term “tracer” transport modeling based on observed electrical conductivity, better assimilation of meteorological data, and the use of unstructured-grids to better resolve nearshore features. This approach resulted in improved models of comparable performance for both classes including a parsimonious statistical model suitable for real-time predictions based on an easily measurable environmental variable (turbidity). The modeling approach outlined here can be used at other sites impacted by point sources and has the potential to improve water quality predictions resulting in more accurate estimates of beach closures.

  2. MECHANISTIC INDICATORS OF CHILDHOOD ASTHMA (MICA)

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is interested in the interplay of environmental and genetic factors on the development and exacerbation of asthma. The Mechanistic Indicators of Childhood Asthma (MICA) study will use exposure measurements and markers of environmental ...

  3. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative alternative analytical methods are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By additionally staining all proteins with a fluorescent dye immediately after their transfer to the blot membrane, it is possible to visualise the antibody binding and the total protein profile simultaneously. This allows an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest possibilities for use that go far beyond this application.
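
    As a small illustration of the load-correction step described above, the sketch below divides each fluorescent antibody signal by the total-protein signal of its lane and rescales the result; the band and lane intensities are made-up numbers.

```python
import numpy as np

# Hypothetical per-lane measurements from a fluorescence scanner.
antibody_signal = np.array([1200.0, 980.0, 1510.0, 870.0])    # target band intensity
total_protein   = np.array([5.1e4, 4.2e4, 6.3e4, 3.6e4])      # whole-lane protein stain

# Correct for protein load, then rescale so the mean normalized value is 1.
normalized = antibody_signal / total_protein
normalized /= normalized.mean()
print("load-corrected relative abundance per lane:", np.round(normalized, 2))
```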

  4. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    PubMed

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  5. Bird Migration Under Climate Change - A Mechanistic Approach Using Remote Sensing

    NASA Technical Reports Server (NTRS)

    Smith, James A.; Blattner, Tim; Messmer, Peter

    2010-01-01

    The broad-scale reductions and shifts that may be expected under climate change in the availability and quality of stopover habitat for long-distance migrants are an area of increasing concern for conservation biologists. Researchers generally have taken two broad approaches to modeling migration behaviour in order to understand the impact of these changes on migratory bird populations: models based on causal processes and their response to environmental stimulation ("mechanistic models"), and models primarily based on observed animal distribution patterns and the correlation of these patterns with environmental variables ("data-driven" models). Investigators have applied the latter technique to forecast changes in migration patterns under environmental change, for example as might be expected under climate change, by projecting how the underlying environmental data layers upon which the relationships are built will change over time; the learned geostatistical correlations are then applied to the modified data layers. However, this is problematic. Even if the projections of how the underlying data layers will change are correct, it is not evident that the statistical relationships will remain the same, i.e., the organism may adapt its behaviour to the changing conditions. Mechanistic models that explicitly take into account the physical, biological, and behavioural responses of an organism, as well as the underlying changes in the landscape, offer an alternative that addresses these shortcomings. The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, enables the application of mechanistic models to predict how continental bird migration patterns may change in response to environmental change. In earlier work, we simulated the effects of wetland loss and inter-annual variability on the fitness of

  6. Managing mechanistic and organic structure in health care organizations.

    PubMed

    Olden, Peter C

    2012-01-01

    Managers at all levels in a health care organization must organize work to achieve the organization's mission and goals. This requires managers to decide the organization structure, which involves dividing the work among jobs and departments and then coordinating them all toward the common purpose. Organization structure, which is reflected in an organization chart, may range on a continuum from very mechanistic to very organic. Managers must decide how mechanistic versus how organic to make the entire organization and each of its departments. To do this, managers should carefully consider 5 factors for the organization and for each individual department: external environment, goals, work production, size, and culture. Some factors may push toward more mechanistic structure, whereas others may push in the opposite direction toward more organic structure. Practical advice can help managers at all levels design appropriate structure for their departments and organization.

  7. Cognitive science as an interface between rational and mechanistic explanation.

    PubMed

    Chater, Nick

    2014-04-01

    Cognitive science views thought as computation; and computation, by its very nature, can be understood in both rational and mechanistic terms. In rational terms, a computation solves some information processing problem (e.g., mapping sensory information into a description of the external world; parsing a sentence; selecting among a set of possible actions). In mechanistic terms, a computation corresponds to a causal chain of events in a physical device (in an engineering context, a silicon chip; in a biological context, the nervous system). The discipline is thus at the interface between two very different styles of explanation: as the papers in the current special issue well illustrate, it explores the interplay of rational and mechanistic forces. Copyright © 2014 Cognitive Science Society, Inc.

  8. Chemical kinetic mechanistic models to investigate cancer biology and impact cancer medicine.

    PubMed

    Stites, Edward C

    2013-04-01

    Traditional experimental biology has provided a mechanistic understanding of cancer in which the malignancy develops through the acquisition of mutations that disrupt cellular processes. Several drugs developed to target such mutations have now demonstrated clinical value. These advances are unequivocal testaments to the value of traditional cellular and molecular biology. However, several features of cancer may limit the pace of progress that can be made with established experimental approaches alone. The mutated genes (and resultant mutant proteins) function within large biochemical networks. Biochemical networks typically have a large number of component molecules and are characterized by a large number of quantitative properties. Responses to a stimulus or perturbation are typically nonlinear and can display qualitative changes that depend upon the specific values of variable system properties. Features such as these can complicate the interpretation of experimental data and the formulation of logical hypotheses that drive further research. Mathematical models based upon the molecular reactions that define these networks combined with computational studies have the potential to deal with these obstacles and to enable currently available information to be more completely utilized. Many of the pressing problems in cancer biology and cancer medicine may benefit from a mathematical treatment. As work in this area advances, one can envision a future where such models may meaningfully contribute to the clinical management of cancer patients.
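
    As a hedged illustration of the kind of chemical kinetic model described above, the sketch below integrates a small mass-action reaction network (reversible binding plus a catalytic step) with SciPy. The species, rate constants, and concentrations are placeholders rather than values from the cited work; the point is only to show how nonlinear, concentration-dependent responses arise from simple biochemical reaction networks.

```python
# Minimal sketch of a mass-action kinetic model of a small biochemical network.
# Species, reactions, and rate constants are hypothetical illustrations,
# not taken from the cited study.
import numpy as np
from scipy.integrate import solve_ivp

# Reactions: S + E -> SE (k1), SE -> S + E (k2), SE -> P + E (k3)
k1, k2, k3 = 1.0e6, 0.1, 0.05          # rate constants (1/M/s, 1/s, 1/s)

def rhs(t, y):
    S, E, SE, P = y
    v_bind   = k1 * S * E
    v_unbind = k2 * SE
    v_cat    = k3 * SE
    return [-v_bind + v_unbind,          # dS/dt
            -v_bind + v_unbind + v_cat,  # dE/dt
             v_bind - v_unbind - v_cat,  # dSE/dt
             v_cat]                      # dP/dt

y0 = [1e-6, 1e-8, 0.0, 0.0]            # initial concentrations in M
sol = solve_ivp(rhs, (0.0, 600.0), y0, method="LSODA")
print("product at t = 600 s:", sol.y[3, -1])
```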

  9. Enhancing value of clinical pharmacodynamics in oncology drug development: An alliance between quantitative pharmacology and translational science.

    PubMed

    Venkatakrishnan, K; Ecsedy, J A

    2017-01-01

    Clinical pharmacodynamic evaluation is a key component of the "pharmacologic audit trail" in oncology drug development. We posit that its value can and should be greatly enhanced via application of a robust quantitative pharmacology framework informed by biologically mechanistic considerations. Herein, we illustrate examples of intersectional blindspots across the disciplines of quantitative pharmacology and translational science and offer a roadmap aimed at enhancing the caliber of clinical pharmacodynamic research in the development of oncology therapeutics. © 2016 American Society for Clinical Pharmacology and Therapeutics.

  10. Development of Alabama traffic factors for use in mechanistic-empirical pavement design.

    DOT National Transportation Integrated Search

    2015-02-01

    The pavement engineering community is moving toward design practices that use mechanistic-empirical (M-E) approaches to the design and analysis of pavement structures. This effort is embodied in the Mechanistic-Empirical Pavement Design Guide (MEPD...

  11. Quantitative system drift compensates for altered maternal inputs to the gap gene network of the scuttle fly Megaselia abdita

    PubMed Central

    Wotton, Karl R; Jiménez-Guri, Eva; Crombach, Anton; Janssens, Hilde; Alcaine-Colet, Anna; Lemke, Steffen; Schmidt-Ott, Urs; Jaeger, Johannes

    2015-01-01

    The segmentation gene network in insects can produce equivalent phenotypic outputs despite differences in upstream regulatory inputs between species. We investigate the mechanistic basis of this phenomenon through a systems-level analysis of the gap gene network in the scuttle fly Megaselia abdita (Phoridae). It combines quantification of gene expression at high spatio-temporal resolution with systematic knock-downs by RNA interference (RNAi). Initiation and dynamics of gap gene expression differ markedly between M. abdita and Drosophila melanogaster, while the output of the system converges to equivalent patterns at the end of the blastoderm stage. Although the qualitative structure of the gap gene network is conserved, there are differences in the strength of regulatory interactions between species. We term such network rewiring ‘quantitative system drift’. It provides a mechanistic explanation for the developmental hourglass model in the dipteran lineage. Quantitative system drift is likely to be a widespread mechanism for developmental evolution. DOI: http://dx.doi.org/10.7554/eLife.04785.001 PMID:25560971

  12. Antiprotozoal Nitazoxanide Derivatives: Synthesis, Bioassays and QSAR Study Combined with Docking for Mechanistic Insight

    PubMed Central

    Scior, Thomas; Lozano-Aponte, Jorge; Ajmani, Subhash; Hernández-Montero, Eduardo; Chávez-Silva, Fabiola; Hernández-Núñez, Emanuel; Moo-Puc, Rosa; Fraguela-Collar, Andres; Navarrete-Vázquez, Gabriel

    2015-01-01

    In view of the serious health problems concerning infectious diseases in heavily populated areas, we followed the strategy of lead compound diversification to evaluate the nearby chemical space for new organic compounds. To this end, twenty derivatives of nitazoxanide (NTZ) were synthesized and tested for activity against Entamoeba histolytica parasites. To ensure drug-likeness and activity relatedness of the new compounds, the synthetic work was assisted by a quantitative structure-activity relationship (QSAR) study. We circumvented many of the inherent downsides, well known to QSAR practitioners, using workarounds proposed in a prior QSAR publication. To gain further mechanistic insight at the molecular level, ligand-enzyme docking simulations were carried out, since NTZ is known to inhibit the protozoal pyruvate ferredoxin oxidoreductase (PFOR) enzyme as its biomolecular target. PMID:25872791

  13. Model-Based Analysis of Biopharmaceutic Experiments To Improve Mechanistic Oral Absorption Modeling: An Integrated in Vitro in Vivo Extrapolation Perspective Using Ketoconazole as a Model Drug.

    PubMed

    Pathak, Shriram M; Ruff, Aaron; Kostewicz, Edmund S; Patel, Nikunjkumar; Turner, David B; Jamei, Masoud

    2017-12-04

    Mechanistic modeling of in vitro data generated from metabolic enzyme systems (viz., liver microsomes, hepatocytes, rCYP enzymes, etc.) facilitates in vitro-in vivo extrapolation (IVIV_E) of metabolic clearance which plays a key role in the successful prediction of clearance in vivo within physiologically-based pharmacokinetic (PBPK) modeling. A similar concept can be applied to solubility and dissolution experiments whereby mechanistic modeling can be used to estimate intrinsic parameters required for mechanistic oral absorption simulation in vivo. However, this approach has not widely been applied within an integrated workflow. We present a stepwise modeling approach where relevant biopharmaceutics parameters for ketoconazole (KTZ) are determined and/or confirmed from the modeling of in vitro experiments before being directly used within a PBPK model. Modeling was applied to various in vitro experiments, namely: (a) aqueous solubility profiles to determine intrinsic solubility, salt limiting solubility factors and to verify pKa; (b) biorelevant solubility measurements to estimate bile-micelle partition coefficients; (c) fasted state simulated gastric fluid (FaSSGF) dissolution for formulation disintegration profiling; and (d) transfer experiments to estimate supersaturation and precipitation parameters. These parameters were then used within a PBPK model to predict the dissolved and total (i.e., including the precipitated fraction) concentrations of KTZ in the duodenum of a virtual population and compared against observed clinical data. The developed model well characterized the intraluminal dissolution, supersaturation, and precipitation behavior of KTZ. The mean simulated AUC0-t of the total and dissolved concentrations of KTZ were comparable to (within 2-fold of) the corresponding observed profile. Moreover, the developed PBPK model of KTZ successfully described the impact of supersaturation and precipitation on the systemic plasma concentration profiles of

  14. Quantitative imaging of mammalian transcriptional dynamics: from single cells to whole embryos.

    PubMed

    Zhao, Ziqing W; White, Melanie D; Bissiere, Stephanie; Levi, Valeria; Plachta, Nicolas

    2016-12-23

    Probing dynamic processes occurring within the cell nucleus at the quantitative level has long been a challenge in mammalian biology. Advances in bio-imaging techniques over the past decade have enabled us to directly visualize nuclear processes in situ with unprecedented spatial and temporal resolution and single-molecule sensitivity. Here, using transcription as our primary focus, we survey recent imaging studies that specifically emphasize the quantitative understanding of nuclear dynamics in both time and space. These analyses not only inform on previously hidden physical parameters and mechanistic details, but also reveal a hierarchical organizational landscape for coordinating a wide range of transcriptional processes shared by mammalian systems of varying complexity, from single cells to whole embryos.

  15. Application of empirical and mechanistic-empirical pavement design procedures to Mn/ROAD concrete pavement test sections

    DOT National Transportation Integrated Search

    1997-05-01

    Current pavement design procedures are based principally on empirical approaches. The current trend toward developing more mechanistic-empirical type pavement design methods led Minnesota to develop the Minnesota Road Research Project (Mn/ROAD), a lo...

  16. Mechanistic modeling of reactive soil nitrogen emissions across agricultural management practices

    NASA Astrophysics Data System (ADS)

    Rasool, Q. Z.; Miller, D. J.; Bash, J. O.; Venterea, R. T.; Cooter, E. J.; Hastings, M. G.; Cohan, D. S.

    2017-12-01

    The global reactive nitrogen (N) budget has increased by a factor of 2-3 from pre-industrial levels. This increase is especially pronounced in highly N fertilized agricultural regions in summer. The reactive N emissions from soil to atmosphere can be in reduced (NH3) or oxidized (NO, HONO, N2O) forms, depending on complex biogeochemical transformations of soil N reservoirs. Air quality models like CMAQ typically neglect soil emissions of HONO and N2O. Previously, soil NO emissions estimated by models like CMAQ remained parametric and inconsistent with soil NH3 emissions. Thus, there is a need to represent more mechanistically and consistently the soil N processes that lead to reactive N emissions to the atmosphere. Our updated approach estimates soil NO, HONO and N2O emissions by incorporating detailed agricultural fertilizer inputs from EPIC, and CMAQ-modeled N deposition, into the soil N pool. EPIC addresses the nitrification, denitrification and volatilization rates along with soil N pools for agricultural soils. Suitable updates will also be made to account for factors, such as nitrite (NO2-) accumulation, that are not addressed in EPIC. The NO and N2O emissions from nitrification and denitrification are computed mechanistically using the N sub-model of DAYCENT. These mechanistic definitions use soil water content, temperature, NH4+ and NO3- concentrations, gas diffusivity and labile C availability as dependent parameters at various soil layers. Soil HONO emissions, found to be most probable under high NO2- availability, will be based on observed ratios of HONO to NO emissions under different soil moistures, pH values and soil types. The updated scheme will utilize field-specific soil properties and N inputs across differing manure management practices such as tillage. Comparison of the modeled soil NO emission rates from the new mechanistic and existing schemes against field measurements will be discussed. Our updated framework will help to predict the diurnal and daily variability
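
    The abstract above references the DAYCENT N sub-model; the sketch below is only a generic, hedged illustration of how a mechanistic scheme can scale a first-order nitrification rate by temperature and moisture response functions. The functional forms and constants are illustrative assumptions, not DAYCENT's actual equations.

```python
# Hedged, generic illustration of scaling a base nitrification rate by
# temperature and moisture response functions. The functional forms and
# constants are illustrative placeholders, not DAYCENT's actual equations.
import math

def nitrification_rate(nh4_gN_m2, soil_temp_c, wfps):
    """First-order nitrification of the NH4+ pool (g N m-2 day-1)."""
    k_base = 0.10                                    # 1/day at reference conditions (assumed)
    f_temp = math.exp(0.07 * (soil_temp_c - 20.0))   # simple Q10-like scalar (~Q10 of 2)
    f_moist = 4.0 * wfps * (1.0 - wfps)              # optimum near 50% water-filled pore space
    return k_base * f_temp * max(f_moist, 0.0) * nh4_gN_m2

print(nitrification_rate(nh4_gN_m2=3.0, soil_temp_c=25.0, wfps=0.55))
```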

  17. Biomechanics-based in silico medicine: the manifesto of a new science.

    PubMed

    Viceconti, Marco

    2015-01-21

    In this perspective article we discuss the role of contemporary biomechanics in the light of recent applications such as the development of the so-called Virtual Physiological Human technologies for physiology-based in silico medicine. In order to build Virtual Physiological Human (VPH) models, computer models that capture and integrate the complex systemic dynamics of living organisms across radically different space-time scales, we need to re-formulate a vast body of existing biology and physiology knowledge as quantitative hypotheses that can be expressed in mathematical terms. Once the predictive accuracy of these models is confirmed against controlled experiments and against clinical observations, we will have VPH models that can reliably predict certain quantitative changes in the health status of a given patient; more importantly, we will have a theory, in the true meaning this word has in the scientific method. In this scenario, biomechanics plays a very important role: it is one of the few areas of the life sciences where we attempt to build full mechanistic explanations based on quantitative observations, in other words, where we investigate living organisms as physical systems. This is, in our opinion, a Copernican revolution, around which the scope of biomechanics should be re-defined. Thus, we propose a new definition for our research domain: "Biomechanics is the study of living organisms as mechanistic systems". Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Mechanistic Understanding of Microbial Plugging for Improved Sweep Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steven Bryant; Larry Britton

    2008-09-30

    Microbial plugging has been proposed as an effective low cost method of permeability reduction. Yet there is a dearth of information on the fundamental processes of microbial growth in porous media, and there are no suitable data to model the process of microbial plugging as it relates to sweep efficiency. To optimize the field implementation, better mechanistic and volumetric understanding of biofilm growth within a porous medium is needed. In particular, the engineering design hinges upon a quantitative relationship between amount of nutrient consumption, amount of growth, and degree of permeability reduction. In this project experiments were conducted to obtain new data to elucidate this relationship. Experiments in heterogeneous (layered) beadpacks showed that microbes could grow preferentially in the high permeability layer. Ultimately this caused flow to be equally divided between high and low permeability layers, precisely the behavior needed for MEOR. Remarkably, classical models of microbial nutrient uptake in batch experiments do not explain the nutrient consumption by the same microbes in flow experiments. We propose a simple extension of classical kinetics to account for the self-limiting consumption of nutrient observed in our experiments, and we outline a modeling approach based on the architecture and behavior of biofilms. Such a model would account for the changing trend of nutrient consumption by bacteria with increasing biomass and the onset of biofilm formation. However, no existing model can explain the microbial preference for growth in high permeability regions, nor is there any obvious extension of the model for this observation. An attractive conjecture is that quorum sensing is involved in the heterogeneous bead packs.
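
    The report proposes extending classical kinetics to capture the self-limiting nutrient consumption observed in flow experiments, but the exact extension is not given in the abstract. The sketch below contrasts classical Monod uptake with one hypothetical self-limiting variant (a biomass-dependent damping term), purely for illustration.

```python
# Hedged sketch: classical Monod nutrient uptake vs. a hypothetical
# self-limiting variant in which uptake declines as biomass accumulates.
# The damping form (1 - X/X_max) is an assumption for illustration,
# not the extension actually proposed in the report.
from scipy.integrate import solve_ivp

mu_max, K_s, Y = 0.3, 0.5, 0.4     # 1/h, g/L, g biomass per g nutrient
X_max = 2.0                        # hypothetical biomass carrying capacity, g/L

def classical(t, y):
    S, X = y
    mu = mu_max * S / (K_s + S)
    return [-mu * X / Y, mu * X]

def self_limiting(t, y):
    S, X = y
    mu = mu_max * S / (K_s + S) * max(0.0, 1.0 - X / X_max)
    return [-mu * X / Y, mu * X]

y0 = [5.0, 0.05]                   # initial nutrient and biomass, g/L
for label, f in [("classical", classical), ("self-limiting", self_limiting)]:
    sol = solve_ivp(f, (0.0, 48.0), y0)
    print(label, "residual nutrient:", round(sol.y[0, -1], 3))
```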

  19. Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI

    DTIC Science & Technology

    2015-10-01

    Award Number W81XWH-13-2-0091: "Mechanistic Links Between PARP, NAD, and Brain Inflammation After TBI"; reporting period 25 Sep 2014 - 24 Sep 2015. The project evaluates the efficacy of veliparib and NAD as agents for suppressing inflammation and improving outcomes after traumatic brain injury. The animal models include

  20. The Newtonian Mechanistic Paradigm, Special Education, and Contours of Alternatives: An Overview.

    ERIC Educational Resources Information Center

    Heshusius, Lous

    1989-01-01

    The article examines theoretical reorientations in special education away from the Newtonian mechanistic paradigm toward an emerging holistic paradigm. Recent literature is critiqued for renaming theories as paradigms, thereby providing an illusion of change while leaving fundamental mechanistic assumptions in place. (Author/DB)

  1. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    PubMed Central

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) have been introduced during the last decade and have since fundamentally changed our understanding of knee OA pathology. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole organ assessment of knee OA and also discusses practical aspects of whole joint assessment. PMID:26632537

  2. In Silico Oncology: Quantification of the In Vivo Antitumor Efficacy of Cisplatin-Based Doublet Therapy in Non-Small Cell Lung Cancer (NSCLC) through a Multiscale Mechanistic Model

    PubMed Central

    Kolokotroni, Eleni; Dionysiou, Dimitra; Veith, Christian; Kim, Yoo-Jin; Franz, Astrid; Grgic, Aleksandar; Bohle, Rainer M.; Stamatakos, Georgios

    2016-01-01

    The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastasis and improve operability. In silico experiments can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant for reliable predictions is the a priori estimation of the drugs' cytotoxic efficacy on cancer cells for a given treatment. In the present work a mechanistic model of cancer response to treatment is applied for the estimation of a plausible value range of the cell killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer-related mechanisms of uncontrolled proliferation, population heterogeneity, hypoxia and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment. It takes into account the effect of tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied to the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling was employed to compensate for patient-specific uncertainties. In conclusion, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies. Correlation studies of such estimates with
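
    Latin hypercube sampling of the kind mentioned above can be reproduced with SciPy's quasi-Monte Carlo module; in the sketch below the parameter names and bounds are hypothetical placeholders, not the study's values.

```python
# Hedged sketch of Latin hypercube sampling over tumor-model inputs, of the
# kind used to compensate for patient-specific uncertainties. Parameter
# names and bounds are hypothetical placeholders, not the study's values.
from scipy.stats import qmc

params = ["volume_doubling_time_d", "growth_fraction", "cell_kill_fraction"]
lower = [30.0, 0.05, 0.1]
upper = [300.0, 0.60, 0.9]

sampler = qmc.LatinHypercube(d=len(params), seed=42)
unit_samples = sampler.random(n=200)                 # 200 points in [0, 1)^3
samples = qmc.scale(unit_samples, lower, upper)      # rescale to the bounds

print(dict(zip(params, samples[0])))                 # one sampled parameter set
```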

  3. Mechanistic model for catalytic recombination during aerobraking maneuvers

    NASA Technical Reports Server (NTRS)

    Willey, Ronald J.

    1989-01-01

    Several mechanistic models are developed to predict recombination coefficients for use in heat shield design for reusable surface insulation (RSI) on aerobraking vehicles such as space shuttles. The models are applied over a temperature range of 300 to 1800 K and a stagnation pressure range of 0 to 3,000 Pa. A four-parameter model in temperature was found to work best; however, several models (including those with atom concentrations at the surface) were also investigated. Mechanistic models developed with atom concentration terms may be applicable when sufficient data become available. The need for recombination experiments in the 300 to 1000 K and 1500 to 1850 K temperature ranges, with deliberate concentration variations, is demonstrated.
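
    The abstract does not give the functional form of the four-parameter temperature model. Purely as a hedged illustration, the sketch below fits a hypothetical double-Arrhenius expression for the recombination coefficient to synthetic data with SciPy.

```python
# Hedged sketch: fitting a four-parameter, temperature-only model for a
# catalytic recombination coefficient gamma(T). The double-Arrhenius form
# and the data below are hypothetical; the abstract does not give the model.
import numpy as np
from scipy.optimize import curve_fit

def gamma_model(T, a1, E1, a2, E2):
    return a1 * np.exp(-E1 / T) + a2 * np.exp(-E2 / T)

# Hypothetical measurements (temperature in K, recombination coefficient).
T_data = np.array([300.0, 600.0, 900.0, 1200.0, 1500.0, 1800.0])
g_data = np.array([2e-3, 6e-3, 2e-2, 5e-2, 9e-2, 1.4e-1])

popt, pcov = curve_fit(gamma_model, T_data, g_data,
                       p0=[1e-2, 500.0, 1.0, 3000.0], maxfev=10000)
print("fitted parameters:", popt)
```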

  4. Specialists without spirit: limitations of the mechanistic biomedical model.

    PubMed

    Hewa, S; Hetherington, R W

    1995-06-01

    This paper examines the origin and the development of the mechanistic model of the human body and health in terms of Max Weber's theory of rationalization. It is argued that the development of Western scientific medicine is a part of the broad process of rationalization that began in sixteenth century Europe as a result of the Reformation. The development of the mechanistic view of the human body in Western medicine is consistent with the ideas of calculability, predictability, and control, the major tenets of the process of rationalization as described by Weber. In recent years, however, the limitations of the mechanistic model have been the topic of many discussions. George Engel, a leading advocate of general systems theory, is among the foremost proponents of a new medical model which includes the general quality of life, a clean environment, and psychological or spiritual stability of life. The paper concludes with consideration of the potential of Engel's proposed new model in the context of the current state of rationalization in modern industrialized society.

  5. Validation of Greyscale-Based Quantitative Ultrasound in Manual Wheelchair Users

    PubMed Central

    Collinger, Jennifer L.; Fullerton, Bradley; Impink, Bradley G.; Koontz, Alicia M.; Boninger, Michael L.

    2010-01-01

    Objective The primary aim of this study is to establish the validity of greyscale-based quantitative ultrasound (QUS) measures of the biceps and supraspinatus tendons. Design Nine QUS measures of the biceps and supraspinatus tendons were computed from ultrasound images collected from sixty-seven manual wheelchair users. Shoulder pathology was measured using questionnaires, physical examination maneuvers, and a clinical ultrasound grading scale. Results Increased age, duration of wheelchair use, and body mass correlated with a darker, more homogeneous tendon appearance. Subjects with pain during physical examination tests for biceps tenderness and acromioclavicular joint tenderness exhibited significantly different supraspinatus QUS values. Even when controlling for tendon depth, QUS measures of the biceps tendon differed significantly between subjects with healthy tendons, mild tendinosis, and severe tendinosis. Clinical grading of supraspinatus tendon health was correlated with QUS measures of the supraspinatus tendon. Conclusions Quantitative ultrasound is a valid method to quantify tendinopathy and may allow for early detection of tendinosis. Manual wheelchair users are at a high risk for developing shoulder tendon pathology and may benefit from quantitative ultrasound-based research that focuses on identifying interventions designed to reduce this risk. PMID:20407304

  6. From the exposome to mechanistic understanding of chemical ...

    EPA Pesticide Factsheets

    BACKGROUND: Current definitions of the exposome expand beyond the initial idea to consider the totality of exposure and aim to relate to biological effects. While the exposome has been established for human health, its principles can be extended to include broader ecological issues. The assessment of exposure is tightly interlinked with hazard assessment. OBJECTIVES: We explore if mechanistic understanding of the causal links between exposure and adverse effects on human health and the environment can be improved by integrating the exposome approach with the adverse outcome pathway (AOP) concept - a framework to structure and organize the sequence of toxicological events from an initial molecular interaction of a chemical to an adverse outcome.METHODS: This review was informed by a Workshop organized by the Integrated Project EXPOSOME at the UFZ Helmholtz Centre for Environmental Research in Leipzig, Germany. DISCUSSION: The exposome encompasses all chemicals, including exogenous chemicals and endogenous compounds that are produced in response to external factors. By complementing the exposome research with the AOP concept, we can achieve a better mechanistic understanding, weigh the importance of various components of the exposome, and determine primary risk drivers. The ability to interpret multiple exposures and mixture effects at the mechanistic level requires a more holistic approach facilitated by the exposome concept.CONCLUSION: Incorporating the AOP conc

  7. The effect of environmental factors on the implementation of the Mechanistic-empirical pavement design guide (MEPDG).

    DOT National Transportation Integrated Search

    2011-07-01

    Current pavement design based on the AASHTO Design Guide uses an empirical approach from the results of the AASHO Road Test conducted in 1958. To address some of the limitations of the original design guide, AASHTO developed a new guide: Mechanistic ...

  8. Why did Jacques Monod make the choice of mechanistic determinism?

    PubMed

    Loison, Laurent

    2015-06-01

    The development of molecular biology placed in the foreground a mechanistic and deterministic conception of the functioning of macromolecules. In this article, I show that this conception was neither obvious nor necessary. Taking Jacques Monod as a case study, I detail the way he gradually moved away from a statistical understanding of determinism to finally support a mechanistic understanding. The reasons for the choice made by Monod at the beginning of the 1950s can be understood only in the light of the general theoretical schema supported by the concept of mechanistic determinism. This schema articulates three notions fundamental for Monod, namely the rigidity of the sequence of the genetic program, the intrinsic stability of macromolecules (DNA and proteins), and the specificity of molecular interactions. Copyright © 2015 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  9. Engineering properties of resin modified pavement (RMP) for mechanistic design

    NASA Astrophysics Data System (ADS)

    Anderton, Gary Lee

    1997-11-01

    The research study described in this report focuses on determining the engineering properties of the resin modified pavement (RMP) material relating to pavement performance, and then developing a rational mechanistic design procedure to replace the current empirical design procedure. A detailed description of RMP is provided, including a review of the available literature on this relatively new pavement technology. Field evaluations of four existing and two new RMP project sites were made to assess critical failure modes and to obtain pavement samples for subsequent laboratory testing. Various engineering properties of laboratory-produced and field-recovered samples of RMP were measured and analyzed. The engineering properties evaluated included those relating to the material's stiffness, strength, thermal properties, and traffic-related properties. Comparisons of these data to typical values for asphalt concrete and portland cement concrete were made to relate the physical nature of RMP to more common pavement surfacing materials. A mechanistic design procedure was developed to determine appropriate thickness profiles of RMP, using stiffness and fatigue properties determined by this study. The design procedure is based on the U.S. Army Corps of Engineers layered elastic method for airfield flexible pavements. The WESPAVE computer program was used to demonstrate the new design procedure for a hypothetical airfield apron design. The results of the study indicated that RMP is a relatively stiff, viscoelastic pavement surfacing material with many of its strength and stiffness properties falling between those of typical asphalt concrete and portland cement concrete. The RMP's thermal and traffic-related properties indicated favorable field performance. The layered elastic design approach appeared to be a reasonable and practical method for RMP mechanistic pavement design, and this design procedure was recommended for future use and development.

  10. Mechanistic species distribution modeling reveals a niche shift during invasion.

    PubMed

    Chapman, Daniel S; Scalone, Romain; Štefanić, Edita; Bullock, James M

    2017-06-01

    Niche shifts of nonnative plants can occur when they colonize novel climatic conditions. However, the mechanistic basis for niche shifts during invasion is poorly understood and has rarely been captured within species distribution models. We quantified the consequence of between-population variation in phenology for invasion of common ragweed (Ambrosia artemisiifolia L.) across Europe. Ragweed is of serious concern because of its harmful effects as a crop weed and because of its impact on public health as a major aeroallergen. We developed a forward mechanistic species distribution model based on responses of ragweed development rates to temperature and photoperiod. The model was parameterized and validated from the literature and by reanalyzing data from a reciprocal common garden experiment in which native and invasive populations were grown within and beyond the current invaded range. It could therefore accommodate between-population variation in the physiological requirements for flowering, and predict the potentially invaded ranges of individual populations. Northern-origin populations that were established outside the generally accepted climate envelope of the species had lower thermal requirements for bud development, suggesting local adaptation of phenology had occurred during the invasion. The model predicts that this will extend the potentially invaded range northward and increase the average suitability across Europe by 90% in the current climate and 20% in the future climate. Therefore, trait variation observed at the population scale can trigger a climatic niche shift at the biogeographic scale. For ragweed, earlier flowering phenology in established northern populations could allow the species to spread beyond its current invasive range, substantially increasing its risk to agriculture and public health. Mechanistic species distribution models offer the possibility to represent niche shifts by varying the traits and niche responses of individual
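
    A forward mechanistic phenology model of this kind typically accumulates thermal time and gates flowering on a photoperiod cue. The sketch below is a minimal, hedged illustration; the base temperature, degree-day requirement, day-length threshold and weather series are assumed values, not the parameters fitted in the study.

```python
# Hedged sketch of a forward phenology calculation: thermal time (degree-days
# above a base temperature) accumulates until a bud-development threshold is
# reached, with flowering additionally gated by a short-day photoperiod cue.
# All thresholds and the daily weather series are hypothetical.
import numpy as np

T_BASE = 5.0          # base temperature, deg C (assumed)
DD_REQUIRED = 700.0   # degree-days to complete bud development (assumed)
MAX_DAYLENGTH = 14.5  # flowering allowed once day length falls below this, h (assumed)

def flowering_day(daily_tmean, daily_daylength):
    """Return the first day index on which both cues are satisfied, or None."""
    dd = np.cumsum(np.maximum(daily_tmean - T_BASE, 0.0))
    for day, (accum, dl) in enumerate(zip(dd, daily_daylength)):
        if accum >= DD_REQUIRED and dl <= MAX_DAYLENGTH:
            return day
    return None

# Hypothetical growing season: 200 days of temperatures and day lengths.
days = np.arange(200)
tmean = 12.0 + 10.0 * np.sin(np.pi * days / 200.0)            # deg C
daylength = 12.0 + 4.5 * np.sin(np.pi * (days + 20) / 200.0)  # hours

print("predicted flowering day:", flowering_day(tmean, daylength))
```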

  11. Advances in Quantitative Proteomics of Microbes and Microbial Communities

    NASA Astrophysics Data System (ADS)

    Waldbauer, J.; Zhang, L.; Rizzo, A. I.

    2015-12-01

    Quantitative measurements of gene expression are key to developing a mechanistic, predictive understanding of how microbial metabolism drives many biogeochemical fluxes and responds to environmental change. High-throughput RNA-sequencing can afford a wealth of information about transcript-level expression patterns, but it is becoming clear that expression dynamics are often very different at the protein level where biochemistry actually occurs. These divergent dynamics between levels of biological organization necessitate quantitative proteomic measurements to address many biogeochemical questions. The protein-level expression changes that underlie shifts in the magnitude, or even the direction, of metabolic and biogeochemical fluxes can be quite subtle and test the limits of current quantitative proteomics techniques. Here we describe methodologies for high-precision, whole-proteome quantification that are applicable to both model organisms of biogeochemical interest that may not be genetically tractable, and to complex community samples from natural environments. Employing chemical derivatization of peptides with multiple isotopically-coded tags, this strategy is rapid and inexpensive, can be implemented on a wide range of mass spectrometric instrumentation, and is relatively insensitive to chromatographic variability. We demonstrate the utility of this quantitative proteomics approach in application to both isolates and natural communities of sulfur-metabolizing and photosynthetic microbes.

  12. A mechanistic investigation on copolymerization of ethylene with polar monomers using a cyclophane-based Pd(II) alpha-diimine catalyst.

    PubMed

    Popeney, Chris S; Guan, Zhibin

    2009-09-02

    A detailed mechanistic investigation of the copolymerization of ethylene and methyl acrylate (MA) by a Pd(II) cyclophane-based alpha-diimine catalyst is reported. Our previous observations of unusually high incorporations of acrylates in copolymerization using this catalyst (J. Am. Chem. Soc. 2007, 129, 10062) prompted us to conduct a full mechanistic study on ethylene/MA copolymerization, which indicates a dramatic departure from normal Curtin-Hammett kinetic behavior as observed in copolymerization using the normal Brookhart type of Pd(II) alpha-diimine catalysts. Further investigation reveals that this contrasting behavior originates from the axial blocking effect of the cyclophane ligand hindering olefin substitution and equilibration. In equilibrium studies of ethylene with nitriles, the cyclophane catalyst was found to more strongly favor the linearly binding nitrile ligands as compared to the standard acyclic Pd(II) alpha-diimine catalysts. Ethylene exchange rates in the complexes [(N-N)PdMe(C2H4)]+ (N-N = diimine) were measured by 2D EXSY NMR spectroscopy and found to be over 100 times slower in the cyclophane case. Measurement of the slow equilibration of ethylene, methyl acrylate, and 4-methoxystyrene in cyclophane-based Pd(II) olefin complexes by 1H NMR and fitting of the obtained kinetic plots allowed for the estimation of exchange rates and equilibrium constants of the olefins. After extrapolation to typical polymerization temperature, ΔG‡ = 20.6 and 16.4 kcal/mol for ethylene-methyl acrylate exchange in the forward (ethylene displacement by methyl acrylate) and reverse directions, respectively. These values are of similar magnitude to the previously determined migratory insertion barriers of ethylene (ΔG‡ = 18.9 kcal/mol) and methyl acrylate (ΔG‡ = 16.3 kcal/mol) under equivalent conditions, but contrast strongly to the rapid olefin exchange seen in the Brookhart acyclic catalyst. The
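
    The reported free-energy barriers can be translated into first-order rate constants with the Eyring equation; the sketch below does this at an assumed temperature of 298 K, which is an illustrative choice rather than the extrapolation temperature used in the study.

```python
# Converting free-energy barriers to first-order rate constants via the
# Eyring equation, k = (k_B*T/h) * exp(-dG / (R*T)).
# The temperature (298 K) is an assumed value for illustration only.
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J*s
R   = 8.314462618     # gas constant, J/(mol*K)
T   = 298.15          # K (assumed)

def eyring_rate(dG_kcal_per_mol):
    dG = dG_kcal_per_mol * 4184.0          # kcal/mol -> J/mol
    return (k_B * T / h) * math.exp(-dG / (R * T))

for dG in (20.6, 16.4, 18.9, 16.3):
    print(f"dG = {dG:4.1f} kcal/mol -> k = {eyring_rate(dG):.3e} 1/s")
```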

  13. A Mechanistic-Based Healing Model for Self-Healing Glass Seals Used in Solid Oxide Fuel Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Wei; Sun, Xin; Stephens, Elizabeth V.

    The usage of self-healing glass as hermetic seals is a recent advancement in sealing technology development for the planar solid oxide fuel cells (SOFCs). Because of its capability of restoring the mechanical properties at elevated temperatures, the self-healing glass seal is expected to provide high reliability in maintaining the long-term structural integrity and functionality of SOFCs. In order to accommodate the design and to evaluate the effectiveness of such engineering seals under various thermo-mechanical operating conditions, a computational modeling framework needs to be developed to accurately capture and predict the healing behavior of the glass material. In the present work, a mechanistic-based two-stage model was developed to study the stress- and temperature-dependent crack healing of the self-healing glass materials. The model was first calibrated by experimental measurements combined with kinetic Monte Carlo (kMC) simulation results and then implemented into the finite element analysis (FEA). The effects of various factors, i.e., stress, temperature, and crack morphology, on the healing behavior of the glass were investigated and discussed.

  14. Mechanistic failure mode investigation and resolution of parvovirus retentive filters.

    PubMed

    LaCasse, Daniel; Lute, Scott; Fiadeiro, Marcus; Basha, Jonida; Stork, Matthew; Brorson, Kurt; Godavarti, Ranga; Gallo, Chris

    2016-07-08

    Virus retentive filters are a key product safety measure for biopharmaceuticals. A simplistic perception is that they function solely by mechanical sieving, retaining particles on the basis of their hydrodynamic size. Recent observations have revealed a more nuanced picture, indicating that changes in viral particle retention can result from process pressure and/or flow interruptions. In this study, a mechanistic investigation was performed to help identify a potential mechanism leading to the reported reduced particle retention in small virus filters. Permeate flow rate or permeate driving force were varied and analyzed for their impact on particle retention in three commercially available small virus retentive filters. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:959-970, 2016. © 2016 American Institute of Chemical Engineers.

  15. A Unifying Mechanistic Model of Selective Attention in Spiking Neurons

    PubMed Central

    Bobier, Bruce; Stewart, Terrence C.; Eliasmith, Chris

    2014-01-01

    Visuospatial attention produces myriad effects on the activity and selectivity of cortical neurons. Spiking neuron models capable of reproducing a wide variety of these effects remain elusive. We present a model called the Attentional Routing Circuit (ARC) that provides a mechanistic description of selective attentional processing in cortex. The model is described mathematically and implemented at the level of individual spiking neurons, with the computations for performing selective attentional processing being mapped to specific neuron types and laminar circuitry. The model is used to simulate three studies of attention in macaque, and is shown to quantitatively match several observed forms of attentional modulation. Specifically, ARC demonstrates that with shifts of spatial attention, neurons may exhibit shifting and shrinking of receptive fields; that responses may increase without changes in selectivity for non-spatial features (i.e., response gain); and that the effect on contrast-response functions is better explained as a response-gain effect than as contrast gain. Unlike past models, ARC embodies a single mechanism that unifies the above forms of attentional modulation, is consistent with a wide array of available data, and makes several specific and quantifiable predictions. PMID:24921249

  16. Exploring Organic Mechanistic Puzzles with Molecular Modeling

    ERIC Educational Resources Information Center

    Horowitz, Gail; Schwartz, Gary

    2004-01-01

    Molecular modeling was used to reinforce more general skills such as deducing and drawing reaction mechanisms, analyzing reaction kinetics and thermodynamics, and drawing reaction coordinate energy diagrams. This modeling was done through the design of mechanistic puzzles involving reactions unfamiliar to the students.

  17. Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.

    PubMed

    Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei

    2017-09-01

    Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
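
    Digital PCR quantification, including microfluidic implementations, generally converts the fraction of positive partitions into a copy concentration via a Poisson correction; the sketch below shows that standard calculation with a hypothetical partition volume, and is not the authors' exact analysis pipeline.

```python
# Minimal sketch of the Poisson correction commonly used in digital PCR:
# the mean copies per partition is lambda = -ln(1 - p), where p is the
# fraction of positive partitions. The partition volume below is hypothetical.
import math

def copies_per_microliter(n_positive, n_total, partition_volume_nl):
    p = n_positive / n_total
    lam = -math.log(1.0 - p)                  # mean copies per partition
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000.0             # convert to copies per microliter

# Example: 312 of 770 partitions positive, 0.85 nL partitions (assumed values).
print(copies_per_microliter(312, 770, 0.85))
```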

  18. A preliminary study of mechanistic approach in pavement design to accommodate climate change effects

    NASA Astrophysics Data System (ADS)

    Harnaeni, S. R.; Pramesti, F. P.; Budiarto, A.; Setyawan, A.

    2018-03-01

    Road damage is caused by several factors, including climate change, overloading, and inappropriate materials and construction procedures. Meanwhile, climate change is a phenomenon which cannot be avoided. The effects observed include air temperature rise, sea level rise, rainfall changes, and the intensity of extreme weather phenomena. Previous studies have shown the impacts of climate change on road damage. Therefore, several measures to anticipate the damage should be considered during planning and construction in order to reduce the cost of road maintenance. There are three approaches generally applied in the design of flexible pavement thickness, namely the mechanistic approach, the mechanistic-empirical (ME) approach and the empirical approach. The advantages of applying the mechanistic or mechanistic-empirical (ME) approaches are their efficiency and reliability in the design of flexible pavement thickness, as well as their capacity to accommodate climate change, compared with the empirical approach. However, the design of flexible pavement thickness in Indonesia generally still applies the empirical approach. This preliminary study aims to emphasize the importance of shifting towards a mechanistic approach in the design of flexible pavement thickness.

  19. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure–activity relationship

    Treesearch

    Hui Wang; Mingyue Jiang; Shujun Li; Chung-Yun Hse; Chunde Jin; Fangli Sun; Zhuo Li

    2017-01-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure-activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and...

  20. The value of mechanistic biophysical information for systems-level understanding of complex biological processes such as cytokinesis.

    PubMed

    Pollard, Thomas D

    2014-12-02

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  1. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after the use of skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a 3 × 3 median filter is applied, and the locations of hairy pixels on the skin are detected according to the gray mean value and color information. Bilinear interpolation is then used to modify the gray values of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, namely the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which computes comprehensive parameters of skin condition. Experiments show that the method's evaluation of skin condition agrees both with evaluation methods based on biochemical indicators and with human visual assessment. The method overcomes the drawbacks of biochemical evaluation, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can also quantitatively evaluate subtle improvements in skin condition after the use of skin care products or beauty treatments.
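
    A minimal sketch of the GLCM texture computation described above is given below, using scikit-image; hair-pixel detection and inpainting are omitted, and the pixel distance and gray-level quantization are assumed choices rather than the paper's settings.

```python
# Hedged sketch of GLCM-based skin texture features: 3x3 median filtering,
# a gray-level co-occurrence matrix at 45-degree angular intervals, and the
# second moment (ASM), contrast, correlation and entropy. Hair-pixel
# detection/inpainting is omitted; distance=1 and 64 gray levels are assumed.
import numpy as np
from scipy.ndimage import median_filter
from skimage.feature import graycomatrix, graycoprops

def skin_texture_features(gray_image_uint8):
    smoothed = median_filter(gray_image_uint8, size=3)
    quantized = (smoothed // 4).astype(np.uint8)        # 256 -> 64 gray levels
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]   # 45-degree intervals
    glcm = graycomatrix(quantized, distances=[1], angles=angles,
                        levels=64, symmetric=True, normed=True)
    asm = graycoprops(glcm, "ASM").mean()
    contrast = graycoprops(glcm, "contrast").mean()
    correlation = graycoprops(glcm, "correlation").mean()
    p = glcm.astype(float)
    entropy = float(np.mean(-np.sum(p * np.log2(p + 1e-12), axis=(0, 1))))
    return {"second_moment": asm, "contrast": contrast,
            "correlation": correlation, "entropy": entropy}

# Example with a synthetic 128x128 image standing in for a skin patch.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
print(skin_texture_features(img))
```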

  2. Mechanistic Modelling of Drug-Induced Liver Injury: Investigating the Role of Innate Immune Responses.

    PubMed

    Shoda, Lisl Km; Battista, Christina; Siler, Scott Q; Pisetsky, David S; Watkins, Paul B; Howell, Brett A

    2017-01-01

    Drug-induced liver injury (DILI) remains an adverse event of significant concern for drug development and marketed drugs, and the field would benefit from better tools to identify liver liabilities early in development and/or to mitigate potential DILI risk in otherwise promising drugs. DILIsym software takes a quantitative systems toxicology approach to represent DILI in pre-clinical species and in humans for the mechanistic investigation of liver toxicity. In addition to multiple intrinsic mechanisms of hepatocyte toxicity (i.e., oxidative stress, bile acid accumulation, mitochondrial dysfunction), DILIsym includes the interaction between hepatocytes and cells of the innate immune response in the amplification of liver injury and in liver regeneration. The representation of innate immune responses, detailed here, consolidates much of the available data on the innate immune response in DILI within a single framework and affords the opportunity to systematically investigate the contribution of the innate response to DILI.

  3. An example problem illustrating the application of the national lime association mixture design and testing protocol (MDTP) to ascertain engineering properties of lime-treated subgrades for mechanistic pavement design/analysis.

    DOT National Transportation Integrated Search

    2001-09-01

    This document presents an example of mechanistic design and analysis using a mix design and testing protocol. More specifically, it addresses the structural properties of lime-treated subgrade, subbase, and base layers through mechanistic design ...

  4. Emergent Global Patterns of Ecosystem Structure and Function from a Mechanistic General Ecosystem Model

    PubMed Central

    Emmott, Stephen; Hutton, Jon; Lyutsarev, Vassily; Smith, Matthew J.; Scharlemann, Jörn P. W.; Purves, Drew W.

    2014-01-01

    Anthropogenic activities are causing widespread degradation of ecosystems worldwide, threatening the ecosystem services upon which all human life depends. Improved understanding of this degradation is urgently needed to improve avoidance and mitigation measures. One tool to assist these efforts is predictive models of ecosystem structure and function that are mechanistic: based on fundamental ecological principles. Here we present the first mechanistic General Ecosystem Model (GEM) of ecosystem structure and function that is both global and applies in all terrestrial and marine environments. Functional forms and parameter values were derived from the theoretical and empirical literature where possible. Simulations of the fate of all organisms with body masses between 10 µg and 150,000 kg (a range of 14 orders of magnitude) across the globe led to emergent properties at individual (e.g., growth rate), community (e.g., biomass turnover rates), ecosystem (e.g., trophic pyramids), and macroecological scales (e.g., global patterns of trophic structure) that are in general agreement with current data and theory. These properties emerged from our encoding of the biology of, and interactions among, individual organisms without any direct constraints on the properties themselves. Our results indicate that ecologists have gathered sufficient information to begin to build realistic, global, and mechanistic models of ecosystems, capable of predicting a diverse range of ecosystem properties and their response to human pressures. PMID:24756001

  5. Emergent global patterns of ecosystem structure and function from a mechanistic general ecosystem model.

    PubMed

    Harfoot, Michael B J; Newbold, Tim; Tittensor, Derek P; Emmott, Stephen; Hutton, Jon; Lyutsarev, Vassily; Smith, Matthew J; Scharlemann, Jörn P W; Purves, Drew W

    2014-04-01

    Anthropogenic activities are causing widespread degradation of ecosystems worldwide, threatening the ecosystem services upon which all human life depends. Improved understanding of this degradation is urgently needed to improve avoidance and mitigation measures. One tool to assist these efforts is predictive models of ecosystem structure and function that are mechanistic: based on fundamental ecological principles. Here we present the first mechanistic General Ecosystem Model (GEM) of ecosystem structure and function that is both global and applies in all terrestrial and marine environments. Functional forms and parameter values were derived from the theoretical and empirical literature where possible. Simulations of the fate of all organisms with body masses between 10 µg and 150,000 kg (a range of 14 orders of magnitude) across the globe led to emergent properties at individual (e.g., growth rate), community (e.g., biomass turnover rates), ecosystem (e.g., trophic pyramids), and macroecological scales (e.g., global patterns of trophic structure) that are in general agreement with current data and theory. These properties emerged from our encoding of the biology of, and interactions among, individual organisms without any direct constraints on the properties themselves. Our results indicate that ecologists have gathered sufficient information to begin to build realistic, global, and mechanistic models of ecosystems, capable of predicting a diverse range of ecosystem properties and their response to human pressures.

  6. Atopic Dermatitis According to GARP: New Mechanistic Insights in Disease Pathogenesis.

    PubMed

    Nousbeck, Janna; Irvine, Alan D

    2016-12-01

    In complex diseases such as atopic dermatitis, the journey from identification of strong risk loci to profound functional and mechanistic insights can take several years. Here, Manz et al. have elegantly deciphered the mechanistic pathways in the well-established 11q13.5 atopic dermatitis risk locus. Their genetic and functional insights emphasize a role for T regulatory cells in atopic dermatitis pathogenesis. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  7. AASHTO mechanistic-empirical pavement design guide parametric study.

    DOT National Transportation Integrated Search

    2012-03-01

    This study focuses on assessing the robustness of the AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG v 1.1) for rigid pavement design projects in Wisconsin. The primary tasks conducted in this study included performing sensitivity analys...

  8. Physiological constraints on organismal response to global warming: Mechanistic insights from clinally varying populations and implications for assessing endangerment.

    PubMed

    Bernardo, Joseph; Spotila, James R

    2006-03-22

    Recent syntheses indicate that global warming affects diverse biological processes, but also highlight the potential for some species to adapt behaviourally or evolutionarily to rapid climate change. Far less attention has addressed the alternative, that organisms lacking this ability may face extinction, a fate projected to befall one-quarter of global biodiversity. This conclusion is controversial, in part because there exist few mechanistic studies that show how climate change could precipitate extinction. We provide a concrete, mechanistic example of warming as a stressor of organisms that are closely adapted to cool climates from a comparative analysis of organismal tolerance among clinally varying populations along a natural thermal gradient. We found that two montane salamanders exhibit significant metabolic depression at temperatures within the natural thermal range experienced by low and middle elevation populations. Moreover, the magnitude of depression was inversely related to native elevation, suggesting that low elevation populations are already living near the limit of their physiological tolerances. If this finding generally applies to other montane specialists, the prognosis for biodiversity loss in typically diverse montane systems is sobering. We propose that indices of warming-induced stress tolerance may provide a critical new tool for quantitative assessments of endangerment due to anthropogenic climate change across diverse species.

  9. Quantitative detection of bovine and porcine gelatin difference using surface plasmon resonance based biosensor

    NASA Astrophysics Data System (ADS)

    Wardani, Devy P.; Arifin, Muhammad; Suharyadi, Edi; Abraha, Kamsul

    2015-05-01

    Gelatin is a biopolymer derived from collagen that is widely used in food and pharmaceutical products. Due to some religious restrictions and health issues regarding the consumption of gelatin extracted from certain species, it is necessary to establish a robust, reliable, sensitive and simple quantitative method to detect gelatin from different parent collagen species. To the best of our knowledge, there has not been an optical-sensor-based gelatin differentiation method that could detect gelatin from different species quantitatively. Surface plasmon resonance (SPR) based biosensing is known to be a sensitive, simple and label-free optical method for detecting biomaterials that is capable of quantitative detection. Therefore, we have utilized an SPR-based biosensor to detect the difference between bovine and porcine gelatin at various concentrations, from 0% to 10% (w/w). Here, we report the ability of the SPR-based biosensor to detect the difference between the two gelatins, its sensitivity toward changes in gelatin concentration, its reliability, and its limit of detection (LOD) and limit of quantification (LOQ). The sensor's LOD and LOQ for bovine gelatin concentration are 0.38% and 1.26% (w/w), while for porcine gelatin concentration they are 0.66% and 2.20% (w/w), respectively. The results show that the SPR-based biosensor is a promising tool for quantitatively detecting gelatin from different raw materials.
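
    LOD and LOQ figures of this kind are commonly derived from a calibration curve as 3.3·σ/slope and 10·σ/slope; the sketch below shows that standard calculation on hypothetical calibration data, not the values measured in the study.

```python
# Hedged sketch of a common way to estimate LOD and LOQ from an SPR
# calibration curve: LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope,
# with sigma taken from the regression residuals. The calibration data
# below are hypothetical, not the values measured in the study.
import numpy as np

conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])          # % (w/w) gelatin
shift = np.array([0.00, 0.11, 0.23, 0.33, 0.46, 0.55])    # SPR angle shift, deg

slope, intercept = np.polyfit(conc, shift, 1)
residuals = shift - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                              # ddof=2: two fitted parameters

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.2f} %(w/w), LOQ = {loq:.2f} %(w/w)")
```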

  10. Improved Mechanistic Understanding of Natural Gas Methane Emissions from Spatially Resolved Aircraft Measurements

    DOE PAGES

    Schwietzke, Stefan; Pétron, Gabrielle; Conley, Stephen; ...

    2017-06-05

    Divergence in recent oil and gas related methane emission estimates between aircraft studies (basin total for a midday window) and emissions inventories (annualized regional and national statistics) indicates the need for better understanding the experimental design, including temporal and spatial alignment and interpretation of results. Our aircraft-based methane emission estimates in a major U.S. shale gas basin resolved from west to east show (i) similar spatial distributions for 2 days, (ii) strong spatial correlations with reported NG production (R² = 0.75) and active gas well pad count (R² = 0.81), and (iii) 2× higher emissions in the western half (normalized by gas production) despite relatively homogeneous dry gas and well characteristics. Operator reported hourly activity data show that midday episodic emissions from manual liquid unloadings (a routine operation in this basin and elsewhere) could explain ~1/3 of the total emissions detected midday by the aircraft and ~2/3 of the west–east difference in emissions. The 22% emission difference between both days further emphasizes that episodic sources can substantially impact midday methane emissions and that aircraft may detect daily peak emissions rather than daily averages that are generally employed in emissions inventories. While the aircraft approach is valid, quantitative, and independent, this study sheds new light on the interpretation of previous basin scale aircraft studies, and provides an improved mechanistic understanding of oil and gas related methane emissions.

  11. Improved Mechanistic Understanding of Natural Gas Methane Emissions from Spatially Resolved Aircraft Measurements.

    PubMed

    Schwietzke, Stefan; Pétron, Gabrielle; Conley, Stephen; Pickering, Cody; Mielke-Maday, Ingrid; Dlugokencky, Edward J; Tans, Pieter P; Vaughn, Tim; Bell, Clay; Zimmerle, Daniel; Wolter, Sonja; King, Clark W; White, Allen B; Coleman, Timothy; Bianco, Laura; Schnell, Russell C

    2017-06-20

    Divergence in recent oil and gas related methane emission estimates between aircraft studies (basin total for a midday window) and emissions inventories (annualized regional and national statistics) indicates the need for better understanding the experimental design, including temporal and spatial alignment and interpretation of results. Our aircraft-based methane emission estimates in a major U.S. shale gas basin resolved from west to east show (i) similar spatial distributions for 2 days, (ii) strong spatial correlations with reported NG production (R² = 0.75) and active gas well pad count (R² = 0.81), and (iii) 2× higher emissions in the western half (normalized by gas production) despite relatively homogeneous dry gas and well characteristics. Operator reported hourly activity data show that midday episodic emissions from manual liquid unloadings (a routine operation in this basin and elsewhere) could explain ∼1/3 of the total emissions detected midday by the aircraft and ∼2/3 of the west-east difference in emissions. The 22% emission difference between both days further emphasizes that episodic sources can substantially impact midday methane emissions and that aircraft may detect daily peak emissions rather than daily averages that are generally employed in emissions inventories. While the aircraft approach is valid, quantitative, and independent, our study sheds new light on the interpretation of previous basin scale aircraft studies, and provides an improved mechanistic understanding of oil and gas related methane emissions.

  12. Improved Mechanistic Understanding of Natural Gas Methane Emissions from Spatially Resolved Aircraft Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwietzke, Stefan; Pétron, Gabrielle; Conley, Stephen

    Divergence in recent oil and gas related methane emission estimates between aircraft studies (basin total for a midday window) and emissions inventories (annualized regional and national statistics) indicates the need for better understanding the experimental design, including temporal and spatial alignment and interpretation of results. Our aircraft-based methane emission estimates in a major U.S. shale gas basin resolved from west to east show (i) similar spatial distributions for 2 days, (ii) strong spatial correlations with reported NG production (R² = 0.75) and active gas well pad count (R² = 0.81), and (iii) 2× higher emissions in the western half (normalized by gas production) despite relatively homogeneous dry gas and well characteristics. Operator reported hourly activity data show that midday episodic emissions from manual liquid unloadings (a routine operation in this basin and elsewhere) could explain ~1/3 of the total emissions detected midday by the aircraft and ~2/3 of the west–east difference in emissions. The 22% emission difference between both days further emphasizes that episodic sources can substantially impact midday methane emissions and that aircraft may detect daily peak emissions rather than daily averages that are generally employed in emissions inventories. While the aircraft approach is valid, quantitative, and independent, this study sheds new light on the interpretation of previous basin scale aircraft studies, and provides an improved mechanistic understanding of oil and gas related methane emissions.

  13. Mechanistic study of manganese-substituted glycerol dehydrogenase using a kinetic and thermodynamic analysis.

    PubMed

    Fang, Baishan; Niu, Jin; Ren, Hong; Guo, Yingxia; Wang, Shizhen

    2014-01-01

    Mechanistic insights regarding the activity enhancement of dehydrogenase by metal ion substitution were investigated by a simple method using a kinetic and thermodynamic analysis. By profiling the binding energy of both the substrate and product, the metal ion's role in catalysis enhancement was revealed. Glycerol dehydrogenase (GDH) from Klebsiella pneumoniae sp., which demonstrated an improvement in activity by the substitution of a zinc ion with a manganese ion, was used as a model for the mechanistic study of metal ion substitution. A kinetic model based on an ordered Bi-Bi mechanism was proposed considering the noncompetitive product inhibition of dihydroxyacetone (DHA) and the competitive product inhibition of NADH. By obtaining preliminary kinetic parameters of substrate and product inhibition, the number of estimated parameters was reduced from 10 to 4 for a nonlinear regression-based kinetic parameter estimation. The simulated values of time-concentration curves fit the experimental values well, with an average relative error of 11.5% and 12.7% for Mn-GDH and GDH, respectively. A comparison of the binding energy of the enzyme ternary complex for Mn-GDH and GDH derived from kinetic parameters indicated that metal ion substitution accelerated the release of dihydroxyacetone. The metal ion's role in catalysis enhancement was explicated.

  14. Computational modeling approaches to quantitative structure-binding kinetics relationships in drug discovery.

    PubMed

    De Benedetti, Pier G; Fanelli, Francesca

    2018-03-21

    Simple comparative correlation analyses and quantitative structure-kinetics relationship (QSKR) models highlight the interplay of kinetic rates and binding affinity as an essential feature in drug design and discovery. The choice of the molecular series, and their structural variations, used in QSKR modeling is fundamental to understanding the mechanistic implications of ligand and/or drug-target binding and/or unbinding processes. Here, we discuss the implications of linear correlations between kinetic rates and binding affinity constants and the relevance of the computational approaches to QSKR modeling. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. The physicochemical process of bacterial attachment to abiotic surfaces: Challenges for mechanistic studies, predictability and the development of control strategies.

    PubMed

    Wang, Yi; Lee, Sui Mae; Dykes, Gary

    2015-01-01

    Bacterial attachment to abiotic surfaces can be explained as a physicochemical process. Mechanisms of the process have been widely studied but are not yet well understood due to their complexity. Physicochemical processes can be influenced by various interactions and factors in attachment systems, including, but not limited to, hydrophobic interactions, electrostatic interactions and substratum surface roughness. Mechanistic models and control strategies for bacterial attachment to abiotic surfaces have been established based on the current understanding of the attachment process and the interactions involved. Due to a lack of process control and standardization in the methodologies used to study the mechanisms of bacterial attachment, however, various challenges are apparent in the development of models and control strategies. In this review, the physicochemical mechanisms, interactions and factors affecting the process of bacterial attachment to abiotic surfaces are described. Mechanistic models established based on these parameters are discussed in terms of their limitations. Currently employed methods to study these parameters and bacterial attachment are critically compared. The roles of these parameters in the development of control strategies for bacterial attachment are reviewed, and the challenges that arise in developing mechanistic models and control strategies are assessed.

  16. MECHANISTIC AND SOURCE UNDERSTANDING OF PCDD/F FORMATION

    EPA Science Inventory

    The paper discusses mechanistic and source understanding of polychlorinated dibenzo-p-dioxin and dibenzofuran (PCDD/F) formation. (NOTE: Considerable research effort has been expended over the last 15-plus years to understand how combustion sources result in formation of PCDDs/F...

  17. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time.

  18. Biomarker Discovery and Mechanistic Studies of Prostate Cancer Using Targeted Proteomic Approaches

    DTIC Science & Technology

    2010-07-01

    Report documentation excerpt (annual report, June 2010) for the project "Biomarker Discovery and Mechanistic Studies of Prostate Cancer Using Targeted Proteomic Approaches," supported under awards W81XWH-08-1-0430 and W81XWH-08-1-0431, NIH/NCRR COBRE grant 1P20RR020171, and NIH/NIDDK grant R01DK053525.

  19. Quantitative comparison of catalytic mechanisms and overall reactions in convergently evolved enzymes: implications for classification of enzyme function.

    PubMed

    Almonacid, Daniel E; Yera, Emmanuel R; Mitchell, John B O; Babbitt, Patricia C

    2010-03-12

    definitions of EC sub-subclasses for improved discrimination in their classification of enzyme reactions. The results also indicate that mechanistic convergence of reaction steps is widespread, suggesting that quantitative measurement of mechanistic similarity can inform approaches for functional annotation.

  20. Quantitative Comparison of Catalytic Mechanisms and Overall Reactions in Convergently Evolved Enzymes: Implications for Classification of Enzyme Function

    PubMed Central

    Almonacid, Daniel E.; Yera, Emmanuel R.; Mitchell, John B. O.; Babbitt, Patricia C.

    2010-01-01

    definitions of EC sub-subclasses for improved discrimination in their classification of enzyme reactions. The results also indicate that mechanistic convergence of reaction steps is widespread, suggesting that quantitative measurement of mechanistic similarity can inform approaches for functional annotation. PMID:20300652

  1. New Simulation Methods to Facilitate Achieving a Mechanistic Understanding of Basic Pharmacology Principles in the Classroom

    ERIC Educational Resources Information Center

    Grover, Anita; Lam, Tai Ning; Hunt, C. Anthony

    2008-01-01

    We present a simulation tool to aid the study of basic pharmacology principles. By taking advantage of the properties of agent-based modeling, the tool facilitates taking a mechanistic approach to learning basic concepts, in contrast to the traditional empirical methods. Pharmacodynamics is a particular aspect of pharmacology that can benefit from…

  2. Quantitative Residual Strain Analyses on Strain Hardened Nickel Based Alloy

    NASA Astrophysics Data System (ADS)

    Yonezawa, Toshio; Maeguchi, Takaharu; Goto, Toru; Juan, Hou

    Many papers have reported on the effects of strain hardening by cold rolling, grinding, welding, etc. on the stress corrosion cracking susceptibility of nickel based alloys and austenitic stainless steels for LWR pipings and components. However, the residual strain introduced by cold rolling, grinding, welding, etc. has not been quantitatively evaluated.

  3. Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models

    NASA Astrophysics Data System (ADS)

    Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea

    2014-05-01

    Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represent a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
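
    One family of calibration methods built on informal goodness-of-fit measures is approximate Bayesian computation (ABC) by rejection: parameter sets are drawn from priors, the epidemic model is simulated, and a draw is retained only if a chosen distance between simulated and observed case counts falls below a tolerance. The sketch below illustrates the idea with a toy SIR-type model standing in for the spatially explicit cholera model; the priors, distance measure and tolerance are assumptions made for the example.

      import numpy as np

      def simulate_cases(beta, gamma, n_days=100, n0=1e6, i0=10):
          """Toy deterministic SIR model returning daily new cases (a stand-in
          for the spatially explicit cholera model)."""
          s, i = n0 - i0, i0
          new_cases = []
          for _ in range(n_days):
              inf = beta * s * i / n0      # new infections per day
              rec = gamma * i              # recoveries per day
              s, i = s - inf, i + inf - rec
              new_cases.append(inf)
          return np.array(new_cases)

      rng = np.random.default_rng(0)
      observed = simulate_cases(0.35, 0.1) * rng.lognormal(0.0, 0.2, 100)  # noisy "data"

      def distance(sim, obs):
          # Informal goodness of fit: RMSE on log-transformed daily counts
          return np.sqrt(np.mean((np.log1p(sim) - np.log1p(obs)) ** 2))

      accepted = []
      for _ in range(20000):
          beta, gamma = rng.uniform(0.05, 1.0), rng.uniform(0.02, 0.5)  # priors
          if distance(simulate_cases(beta, gamma), observed) < 0.5:     # tolerance
              accepted.append((beta, gamma))

      post = np.array(accepted)
      if len(post):
          print(f"{len(post)} draws accepted; beta ~ {post[:, 0].mean():.2f}, "
                f"gamma ~ {post[:, 1].mean():.2f}")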

  4. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based, algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.

  6. Incorporation of lysosomal sequestration in the mechanistic model for prediction of tissue distribution of basic drugs.

    PubMed

    Assmus, Frauke; Houston, J Brian; Galetin, Aleksandra

    2017-11-15

    The prediction of tissue-to-plasma water partition coefficients (Kpu) from in vitro and in silico data using the tissue-composition based model (Rodgers & Rowland, J Pharm Sci. 2005, 94(6):1237-48.) is well established. However, distribution of basic drugs, in particular into lysosome-rich lung tissue, tends to be under-predicted by this approach. The aim of this study was to develop an extended mechanistic model for the prediction of Kpu which accounts for lysosomal sequestration and the contribution of different cell types in the tissue of interest. The extended model is based on compound-specific physicochemical properties and tissue composition data to describe drug ionization, distribution into tissue water and drug binding to neutral lipids, neutral phospholipids and acidic phospholipids in tissues, including lysosomes. Physiological data on the types of cells contributing to lung, kidney and liver, their lysosomal content and lysosomal pH were collated from the literature. The predictive power of the extended mechanistic model was evaluated using a dataset of 28 basic drugs (pKa ≥ 7.8, 17 β-blockers, 11 structurally diverse drugs) for which experimentally determined Kpu data in rat tissue have been reported. Accounting for the lysosomal sequestration in the extended mechanistic model improved the accuracy of Kpu predictions in lung compared to the original Rodgers model (56% drugs within 2-fold or 88% within 3-fold of observed values). Reduction in the extent of Kpu under-prediction was also evident in liver and kidney. However, consideration of lysosomal sequestration increased the occurrence of over-predictions, yielding overall comparable model performances for kidney and liver, with 68% and 54% of Kpu values within 2-fold error, respectively. High lysosomal concentration ratios relative to cytosol (>1000-fold) were predicted for the drugs investigated; the extent differed depending on the lysosomal pH and concentration of acidic phospholipids among
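
    The lysosomal trapping term in models of this kind is commonly built on pH partitioning of the ionizable base (Henderson-Hasselbalch): the neutral species equilibrates across the lysosomal membrane while the protonated species is retained. The sketch below illustrates only that trapping term, not the authors' full extended model; the pKa values, compartment pH values and lysosomal volume fraction are illustrative assumptions.

      import numpy as np

      def lysosome_to_cytosol_ratio(pka, ph_cytosol=7.2, ph_lysosome=4.8):
          """Steady-state concentration ratio (lysosome/cytosol) for a monoprotic
          base, assuming only the neutral species crosses the lysosomal membrane."""
          # total/neutral ratio in each compartment (Henderson-Hasselbalch)
          f_cyt = 1.0 + 10.0 ** (pka - ph_cytosol)
          f_lys = 1.0 + 10.0 ** (pka - ph_lysosome)
          return f_lys / f_cyt

      def kpu_lysosomal_increment(pka, lysosomal_volume_fraction=0.01):
          """Contribution of lysosomal sequestration to the unbound tissue:plasma
          partition coefficient, weighted by the lysosomal volume fraction."""
          return lysosomal_volume_fraction * lysosome_to_cytosol_ratio(pka)

      for pka in (8.0, 9.0, 10.0):
          print(f"pKa {pka}: lysosome/cytosol ratio ~ {lysosome_to_cytosol_ratio(pka):.0f}, "
                f"Kpu increment ~ {kpu_lysosomal_increment(pka):.1f}")

    This pH-partition term alone yields a few-hundred-fold enrichment for strong bases; additional binding to acidic phospholipids within lysosomes, as included in the extended model described above, would push the predicted ratios higher, which is consistent with the >1000-fold values reported.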

  7. Non-directed aromatic C–H amination: catalytic and mechanistic studies enabled by Pd catalyst and reagent design†

    PubMed Central

    Bandara, H. M. D.; Jin, D.; Mantell, M. A.; Field, K. D.; Wang, A.; Narayanan, R. P.; Deskins, N. A.; Emmert, M. H.

    2016-01-01

    This manuscript describes the systematic development of pyridine-type ligands, which promote the Pd catalyzed, non-directed amination of benzene in combination with novel, hydroxylamine-based electrophilic amination reagents. DFT calculations and mechanistic experiments provide insights into the factors influencing the arene C–H amination protocol. PMID:28066540

  8. In vivo quantitative analysis of Talin turnover in response to force

    PubMed Central

    Hákonardóttir, Guðlaug Katrín; López-Ceballos, Pablo; Herrera-Reyes, Alejandra Donají; Das, Raibatak; Coombs, Daniel; Tanentzapf, Guy

    2015-01-01

    Cell adhesion to the extracellular matrix (ECM) allows cells to form and maintain three-dimensional tissue architecture. Cell–ECM adhesions are stabilized upon exposure to mechanical force. In this study, we used quantitative imaging and mathematical modeling to gain mechanistic insight into how integrin-based adhesions respond to increased and decreased mechanical forces. A critical means of regulating integrin-based adhesion is provided by modulating the turnover of integrin and its adhesion complex (integrin adhesion complex [IAC]). The turnover of the IAC component Talin, a known mechanosensor, was analyzed using fluorescence recovery after photobleaching. Experiments were carried out in live, intact flies in genetic backgrounds that increased or decreased the force applied on sites of adhesion. This analysis showed that when force is elevated, the rate of assembly of new adhesions increases such that cell–ECM adhesion is stabilized. Moreover, under conditions of decreased force, the overall rate of turnover, but not the proportion of adhesion complex components undergoing turnover, increases. Using point mutations, we identify the key functional domains of Talin that mediate its response to force. Finally, by fitting a mathematical model to the data, we uncover the mechanisms that mediate the stabilization of ECM-based adhesion during development. PMID:26446844
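
    Turnover rates from FRAP experiments of this kind are typically obtained by fitting an exponential recovery curve to the post-bleach fluorescence. The sketch below shows such a standard single-exponential fit on synthetic data; it is a generic illustration, not the authors' specific turnover model.

      import numpy as np
      from scipy.optimize import curve_fit

      def frap_recovery(t, mobile_fraction, k):
          """Single-exponential FRAP recovery: F(t) = mobile_fraction * (1 - exp(-k*t))."""
          return mobile_fraction * (1.0 - np.exp(-k * t))

      # Synthetic post-bleach time course (fluorescence normalized to pre-bleach = 1)
      rng = np.random.default_rng(1)
      t = np.linspace(0, 600, 61)                                  # seconds
      data = frap_recovery(t, 0.7, 0.01) + rng.normal(0, 0.02, t.size)

      (mobile, k), _ = curve_fit(frap_recovery, t, data, p0=(0.5, 0.005))
      print(f"mobile fraction = {mobile:.2f}, rate k = {k:.4f} 1/s, "
            f"half-time = {np.log(2) / k:.0f} s")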

  9. This Mechanistic Step Is ''Productive'': Organic Chemistry Students' Backward-Oriented Reasoning

    ERIC Educational Resources Information Center

    Caspari, I.; Weinrich, M. L.; Sevian, H.; Graulich, N.

    2018-01-01

    If an organic chemistry student explains that she represents a mechanistic step because ''it's a productive part of the mechanism,'' what meaning could the professor teaching the class attribute to this statement, what is actually communicated, and what does it mean for the student? The professor might think that the explanation is based on…

  10. Calibrating the mechanistic-empirical pavement design guide for Kansas.

    DOT National Transportation Integrated Search

    2015-04-01

    The Kansas Department of Transportation (KDOT) is moving toward the implementation of the new American Association of State Highway and Transportation Officials (AASHTO) Mechanistic-Empirical Pavement Design Guide (MEPDG) for pavement design. The...

  11. Quantitative genetic bases of anthocyanin variation in grape (Vitis vinifera L. ssp. sativa) berry: a quantitative trait locus to quantitative trait nucleotide integrated study.

    PubMed

    Fournier-Level, Alexandre; Le Cunff, Loïc; Gomez, Camila; Doligez, Agnès; Ageorges, Agnès; Roux, Catherine; Bertrand, Yves; Souquet, Jean-Marc; Cheynier, Véronique; This, Patrice

    2009-11-01

    The combination of QTL mapping studies of synthetic lines and association mapping studies of natural diversity represents an opportunity to throw light on the genetically based variation of quantitative traits. With the positional information provided through quantitative trait locus (QTL) mapping, which often leads to wide intervals encompassing numerous genes, it is now feasible to directly target candidate genes that are likely to be responsible for the observed variation in completely sequenced genomes and to test their effects through association genetics. This approach was performed in grape, a newly sequenced genome, to decipher the genetic architecture of anthocyanin content. Grapes may be either white or colored, ranging from the lightest pink to the darkest purple tones according to the amount of anthocyanin accumulated in the berry skin, which is a crucial trait for both wine quality and human nutrition. Although the determinism of the white phenotype has been fully identified, the genetic bases of the quantitative variation of anthocyanin content in berry skin remain unclear. A single QTL responsible for up to 62% of the variation in the anthocyanin content was mapped on a Syrah x Grenache F(1) pseudo-testcross. Among the 68 unigenes identified in the grape genome within the QTL interval, a cluster of four Myb-type genes was selected on the basis of physiological evidence (VvMybA1, VvMybA2, VvMybA3, and VvMybA4). From a core collection of natural resources (141 individuals), 32 polymorphisms revealed significant association, and extended linkage disequilibrium was observed. Using a multivariate regression method, we demonstrated that five polymorphisms in VvMybA genes except VvMybA4 (one retrotransposon, three single nucleotide polymorphisms and one 2-bp insertion/deletion) accounted for 84% of the observed variation. All these polymorphisms led to either structural changes in the MYB proteins or differences in the VvMybAs promoters. We concluded that

  12. Impact of implementation choices on quantitative predictions of cell-based computational models

    NASA Astrophysics Data System (ADS)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
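
    The sensitivity to time-step size discussed here arises because vertex positions are typically advanced with an explicit (forward Euler) update of overdamped dynamics, while cell rearrangements are triggered by a non-physical edge-length threshold. The sketch below reduces both ingredients to a toy example with a linear restoring force; it is not a full vertex-model implementation, and all parameter values are placeholders.

      import numpy as np

      def step_vertices(positions, force_fn, dt, drag=1.0):
          """Forward-Euler update of overdamped vertex dynamics:
          x(t + dt) = x(t) + (dt / drag) * F(x(t))."""
          return positions + (dt / drag) * force_fn(positions)

      def needs_t1_transition(edge_length, threshold=0.01):
          """An edge shorter than the (non-physical) length threshold triggers a T1 swap."""
          return edge_length < threshold

      # Toy example: vertices relaxing toward the origin under a linear spring force.
      spring_force = lambda x: -0.5 * x
      start = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
      for dt in (0.01, 0.1, 1.0, 5.0):          # the largest step is unstable
          x = start.copy()
          for _ in range(100):
              x = step_vertices(x, spring_force, dt)
          print(f"dt = {dt}: max |x| after 100 steps = {np.abs(x).max():.3g}")

    With the largest step the update overshoots and diverges, illustrating how a purely numerical parameter can change the predicted behaviour.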

  13. Roughness Versus Charge Contributions to Representative Discrete Heterogeneity Underlying Mechanistic Prediction of Colloid Attachment, Detachment and Breakthrough-Elution Behavior Under Environmental Conditions.

    NASA Astrophysics Data System (ADS)

    Johnson, William; Farnsworth, Anna; Vanness, Kurt; Hilpert, Markus

    2017-04-01

    The key element of a mechanistic theory to predict colloid attachment in porous media under environmental conditions where colloid-collector repulsion exists (unfavorable conditions for attachment) is representation of the nano-scale surface heterogeneity (herein called discrete heterogeneity) that drives colloid attachment under unfavorable conditions. The observed modes of colloid attachment under unfavorable conditions emerge from simulations that incorporate discrete heterogeneity. Quantitative prediction of attachment (and detachment) requires capturing the sizes, spatial frequencies, and other properties of roughness asperities and charge heterodomains in discrete heterogeneity representations of different surfaces. The fact that a given discrete heterogeneity representation will interact differently with different-sized colloids as well as different ionic strengths for a given sized colloid allows backing out representative discrete heterogeneity via comparison of simulations to experiments performed across a range of colloid size, solution IS, and fluid velocity. This has been achieved on unfavorable smooth surfaces yielding quantitative prediction of attachment, and qualitative prediction of detachment in response to ionic strength or flow perturbations. Extending this treatment to rough surfaces, and representing the contributions of nanoscale roughness as well as charge heterogeneity is a focus of this talk. Another focus of this talk is upscaling the pore-scale simulations to produce contrasting breakthrough-elution behaviors at the continuum (column) scale that are observed, for example, for different-sized colloids, or same-sized colloids under different ionic strength conditions. The outcome of mechanistic pore scale simulations incorporating discrete heterogeneity and subsequent upscaling is that temporal processes such as blocking and ripening will emerge organically from these simulations, since these processes fundamentally stem from the

  14. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As the region-based shape features of a grayscale image, Zernike moments, which have inherent invariance properties, were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R²) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
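
    The workflow described — treating each 3D HPLC-DAD spectrum as a grayscale image, extracting Zernike moments as region-based shape features, and regressing concentration on those features — can be sketched as below. The example assumes the mahotas library for the moment computation and substitutes an ordinary least-squares fit for the stepwise moment selection; the image array and concentrations are placeholders, not the paper's data.

      import numpy as np
      import mahotas
      from sklearn.linear_model import LinearRegression

      def zernike_features(image, radius=32, degree=8):
          """Region-based shape descriptors of a grayscale image (here, a 3D
          HPLC-DAD spectrum laid out as retention time x wavelength)."""
          return mahotas.features.zernike_moments(image, radius, degree=degree)

      # Placeholder data: one 64 x 64 "spectrum image" per calibration sample
      rng = np.random.default_rng(2)
      images = rng.random((20, 64, 64))            # stand-ins for measured 3D spectra
      concentrations = rng.uniform(1.0, 10.0, 20)  # known analyte concentrations

      X = np.array([zernike_features(im) for im in images])
      model = LinearRegression().fit(X, concentrations)
      print(f"{X.shape[1]} Zernike features per spectrum; "
            f"first fitted coefficients: {model.coef_[:3].round(3)}")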

  15. Problems in mechanistic theoretical models for cell transformation by ionizing radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, A.; Holley, W.R.

    1991-10-01

    A mechanistic model based on yields of double strand breaks has been developed to determine the dose response curves for cell transformation frequencies. At its present stage the model is applicable to immortal cell lines and to various qualities (X-rays, Neon and Iron) of ionizing radiation. Presently, we have considered four types of processes which can lead to activation phenomena: (1) point mutation events on a regulatory segment of selected oncogenes, (2) inactivation of suppressor genes, through point mutation, (3) deletion of a suppressor gene by a single track, and (4) deletion of a suppressor gene by two tracks.

  16. Characterizing Students' Mechanistic Reasoning about London Dispersion Forces

    ERIC Educational Resources Information Center

    Becker, Nicole; Noyes, Keenan; Cooper, Melanie

    2016-01-01

    Characterizing how students construct causal mechanistic explanations for chemical phenomena can provide us with important insights into the ways that students develop understanding of chemistry concepts. Here, we present two qualitative studies of undergraduate general chemistry students' reasoning about the causes of London dispersion forces in…

  17. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
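
    Wavelength (frequency-point) selection with differential evolution can be framed as optimizing a real-valued inclusion vector that is thresholded into a binary mask, where each candidate mask is scored by the cross-validated error of a linear calibration model restricted to the selected points. The sketch below uses SciPy's differential_evolution on synthetic spectra; it illustrates the general scheme under these assumptions rather than reproducing the authors' algorithm.

      import numpy as np
      from scipy.optimize import differential_evolution
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      n_samples, n_points = 40, 30                 # spectra x frequency points
      spectra = rng.random((n_samples, n_points))
      conc = 2.0 * spectra[:, 10] + 1.0 * spectra[:, 25] + rng.normal(0, 0.05, n_samples)

      def objective(weights):
          mask = weights > 0.5                     # threshold into a binary selection
          if mask.sum() < 2:
              return 1e6                           # penalize degenerate selections
          rmse = -cross_val_score(LinearRegression(), spectra[:, mask], conc,
                                  cv=5, scoring="neg_root_mean_squared_error").mean()
          return rmse                              # DE minimizes the cross-validated error

      result = differential_evolution(objective, bounds=[(0.0, 1.0)] * n_points,
                                      maxiter=20, popsize=8, seed=3, polish=False)
      selected = np.flatnonzero(result.x > 0.5)
      print("selected frequency points:", selected, "| CV RMSE:", round(result.fun, 3))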

  18. Mechanistic-empirical pavement design guide calibration for pavement rehabilitation.

    DOT National Transportation Integrated Search

    2013-01-01

    The Oregon Department of Transportation (ODOT) is in the process of implementing the recently introduced AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) for new pavement sections. The majority of pavement work conducted by ODOT involve...

  19. Draft user's guide for UDOT mechanistic-empirical pavement design.

    DOT National Transportation Integrated Search

    2009-10-01

    Validation of the new AASHTO Mechanistic-Empirical Pavement Design Guide's (MEPDG) nationally calibrated pavement distress and smoothness prediction models when applied under Utah conditions, and local calibration of the new hot-mix asphalt (HMA) p...

  20. Mechanistic understanding of monosaccharide-air flow battery electrochemistry

    NASA Astrophysics Data System (ADS)

    Scott, Daniel M.; Tsang, Tsz Ho; Chetty, Leticia; Aloi, Sekotilani; Liaw, Bor Yann

    Recently, an inexpensive monosaccharide-air flow battery configuration has been demonstrated to utilize a strong base and a mediator redox dye to harness electrical power from the partial oxidation of glucose. Here the mechanistic understanding of glucose oxidation in this unique glucose-air power source is further explored by acid-base titration experiments, ¹³C NMR, and comparison of results from chemically different redox mediators (indigo carmine vs. methyl viologen) and sugars (fructose vs. glucose) via studies using electrochemical techniques. Titration results indicate that gluconic acid is the main product of the cell reaction, as supported by evidence in the ¹³C NMR spectra. Using indigo carmine as the mediator dye and fructose as the energy source, an abiotic cell configuration generates a power density of 1.66 mW cm⁻², which is greater than that produced from glucose under similar conditions (ca. 1.28 mW cm⁻²). A faster transition from fructose into the ene-diol intermediate than from glucose likely contributed to this difference in power density.

  1. Molecular Signaling Network Motifs Provide a Mechanistic Basis for Cellular Threshold Responses

    PubMed Central

    Bhattacharya, Sudin; Conolly, Rory B.; Clewell, Harvey J.; Kaminski, Norbert E.; Andersen, Melvin E.

    2014-01-01

    Background: Increasingly, there is a move toward using in vitro toxicity testing to assess human health risk due to chemical exposure. As with in vivo toxicity testing, an important question for in vitro results is whether there are thresholds for adverse cellular responses. Empirical evaluations may show consistency with thresholds, but the main evidence has to come from mechanistic considerations. Objectives: Cellular response behaviors depend on the molecular pathway and circuitry in the cell and the manner in which chemicals perturb these circuits. Understanding circuit structures that are inherently capable of resisting small perturbations and producing threshold responses is an important step towards mechanistically interpreting in vitro testing data. Methods: Here we have examined dose–response characteristics for several biochemical network motifs. These network motifs are basic building blocks of molecular circuits underpinning a variety of cellular functions, including adaptation, homeostasis, proliferation, differentiation, and apoptosis. For each motif, we present biological examples and models to illustrate how thresholds arise from specific network structures. Discussion and Conclusion: Integral feedback, feedforward, and transcritical bifurcation motifs can generate thresholds. Other motifs (e.g., proportional feedback and ultrasensitivity) produce responses where the slope in the low-dose region is small and stays close to the baseline. Feedforward control may lead to nonmonotonic or hormetic responses. We conclude that network motifs provide a basis for understanding thresholds for cellular responses. Computational pathway modeling of these motifs and their combinations occurring in molecular signaling networks will be a key element in new risk assessment approaches based on in vitro cellular assays. Citation: Zhang Q, Bhattacharya S, Conolly RB, Clewell HJ III, Kaminski NE, Andersen ME. 2014. Molecular signaling network motifs provide a
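
    The kind of threshold generated by a saturable control motif can be illustrated with a small ODE: the response species is produced in proportion to dose and removed by a near zero-order controller, so the steady state stays close to baseline until the dose exceeds the controller's capacity. The model below is a generic toy for illustration, not one of the specific pathway models discussed above, and its parameter values are arbitrary.

      import numpy as np
      from scipy.integrate import odeint

      def response_ode(r, t, dose, k_in=1.0, vmax=5.0, km=0.05, k_leak=0.1):
          """Toy motif: production proportional to dose, removal by a saturable
          (near zero-order) controller plus a weak first-order leak."""
          return k_in * dose - vmax * r / (km + r) - k_leak * r

      doses = np.linspace(0.0, 10.0, 21)
      steady = [odeint(response_ode, 0.01, np.linspace(0, 200, 400), args=(d,))[-1, 0]
                for d in doses]

      for d, r in zip(doses, steady):
          print(f"dose {d:4.1f} -> steady-state response {r:8.3f}")
      # Below dose ~ vmax/k_in = 5 the response stays near baseline; above it the
      # controller saturates and the response rises steeply (a threshold-like curve).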

  2. Magnetic Resonance-based Motion Correction for Quantitative PET in Simultaneous PET-MR Imaging.

    PubMed

    Rakvongthai, Yothin; El Fakhri, Georges

    2017-07-01

    Motion degrades image quality and quantitation of PET images, and is an obstacle to quantitative PET imaging. Simultaneous PET-MR offers a tool that can be used for correcting the motion in PET images by using anatomic information from MR imaging acquired concurrently. Motion correction can be performed by transforming a set of reconstructed PET images into the same frame or by incorporating the transformation into the system model and reconstructing the motion-corrected image. Several phantom and patient studies have validated that MR-based motion correction strategies have great promise for quantitative PET imaging in simultaneous PET-MR. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Fragment-based quantitative structure-activity relationship (FB-QSAR) for fragment-based drug design.

    PubMed

    Du, Qi-Shi; Huang, Ri-Bo; Wei, Yu-Tuo; Pang, Zong-Wen; Du, Li-Qin; Chou, Kuo-Chen

    2009-01-30

    In combination with fragment-based design, a new drug design method, the so-called "fragment-based quantitative structure-activity relationship" (FB-QSAR), is proposed. The essence of the new method is that the molecular framework in a family of drug candidates is divided into several fragments according to the substituents being investigated. The bioactivities of molecules are correlated with the physicochemical properties of the molecular fragments through two sets of coefficients in the linear free energy equations. One coefficient set is for the physicochemical properties and the other for the weight factors of the molecular fragments. Meanwhile, an iterative double least square (IDLS) technique is developed to solve the two sets of coefficients in a training data set alternately and iteratively. The IDLS technique is a feedback procedure with machine learning ability. The standard two-dimensional quantitative structure-activity relationship (2D-QSAR) is a special case of FB-QSAR in which the whole molecule is treated as one entity. The FB-QSAR approach can remarkably enhance the predictive power and provide more structural insights into rational drug design. As an example, the FB-QSAR is applied to build a predictive model of neuraminidase inhibitors for drug development against H5N1 influenza virus. (c) 2008 Wiley Periodicals, Inc.
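
    The iterative double least squares (IDLS) idea described above — alternately solving for the property coefficients and the fragment weight factors of a bilinear activity model — can be sketched as follows. The model form activity_i ≈ Σ_j w_j Σ_k c_k P_ijk follows the description in the abstract; the data dimensions, synthetic values and iteration count are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(4)
      n_mol, n_frag, n_prop = 50, 4, 3
      P = rng.random((n_mol, n_frag, n_prop))        # fragment physicochemical properties

      # Synthetic "true" coefficients used only to generate example bioactivities
      w_true, c_true = np.array([1.0, 0.5, 2.0, 1.5]), np.array([0.8, -0.3, 1.2])
      y = np.einsum("ijk,j,k->i", P, w_true, c_true) + rng.normal(0, 0.05, n_mol)

      # Iterative double least squares: alternate between the two coefficient sets
      c = np.ones(n_prop)
      for _ in range(50):
          A = np.einsum("ijk,k->ij", P, c)           # fragment scores given c
          w, *_ = np.linalg.lstsq(A, y, rcond=None)  # solve for fragment weight factors
          B = np.einsum("ijk,j->ik", P, w)           # property scores given w
          c, *_ = np.linalg.lstsq(B, y, rcond=None)  # solve for property coefficients

      pred = np.einsum("ijk,j,k->i", P, w, c)
      print("training RMSE:", round(float(np.sqrt(np.mean((pred - y) ** 2))), 4))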

  4. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
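
    A compact sketch of this workflow — resolving component contributions from mixture spectra with ICA, relating the resulting scores of calibration mixtures to known concentrations, and then quantifying an unknown without reference solutions — is given below using scikit-learn's FastICA. The synthetic Gaussian band spectra and the simple least-squares projection of the unknown are illustrative assumptions, not the authors' exact procedure.

      import numpy as np
      from sklearn.decomposition import FastICA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(5)
      wavelengths = np.linspace(250, 450, 200)
      # Two overlapping Gaussian bands standing in for pure analyte spectra
      pure = np.vstack([np.exp(-((wavelengths - 310) / 15) ** 2),
                        np.exp(-((wavelengths - 340) / 20) ** 2)])

      conc = rng.uniform(0.5, 5.0, size=(12, 2))       # known calibration concentrations
      mixtures = conc @ pure + rng.normal(0, 0.005, (12, 200))

      # ICA on the transposed matrix: wavelengths act as observations, so the
      # recovered sources approximate pure spectra and the mixing matrix holds scores.
      ica = FastICA(n_components=2, random_state=0)
      est_spectra = ica.fit_transform(mixtures.T)      # (n_wavelengths, 2)
      scores = ica.mixing_                             # (n_mixtures, 2)

      # Relate the ICA scores of the calibration mixtures to the known concentrations
      calib = LinearRegression().fit(scores, conc)

      # Quantify an "unknown" mixture without reference solutions
      unknown = np.array([2.0, 3.5]) @ pure
      design = np.column_stack([est_spectra, np.ones(wavelengths.size)])
      coef, *_ = np.linalg.lstsq(design, unknown, rcond=None)
      print("predicted concentrations:", calib.predict(coef[:2][None, :]).round(2))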

  5. Mindfulness & Self-Compassion Meditation for Combat Posttraumatic Stress Disorder: Randomized Controlled Trial and Mechanistic Study

    DTIC Science & Technology

    2013-10-01

    Project activities include work at the VA Ann Arbor PTSD clinic and a translational neuroimaging mechanistic study with pre- and post-intervention fMRI and neurocognitive testing, examining how mindfulness might be helpful both in terms of the psychological characteristics of change and in terms of neural mechanisms in the brain. A novel 16-week Mindfulness and Self-compassion group intervention, "Mindfulness-based Exposure Therapy" (MBET), was developed

  6. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by the measurement of the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urines from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis also in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  7. Biomarker Discovery and Mechanistic Studies of Prostate Cancer Using Targeted Proteomic Approaches

    DTIC Science & Technology

    2012-07-01

    Report documentation excerpt (final report, July 2012, covering 1 July 2008 – 30 June 2012) for the project "Biomarker Discovery and Mechanistic Studies of Prostate Cancer Using Targeted Proteomic Approaches," supported by Department of Defense Synergistic Idea Development Awards W81XWH-08-1-0430 (to H.Z.) and W81XWH-08-1-0431 (to N.K.), and an NIH/NCRR COBRE grant 1P20RR020171 (to

  8. Synthesising quantitative and qualitative research in evidence-based patient information.

    PubMed

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-03-01

    Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and

  9. Synthesising quantitative and qualitative research in evidence‐based patient information

    PubMed Central

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-01-01

    Background Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. Conclusions A

  10. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    PubMed

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies however primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic
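
    The notion of spectral variability used here — how fast the EEG's spectral content moves through a low-dimensional feature space — can be illustrated schematically: band powers are computed per epoch, projected onto principal components, and the mean step length between consecutive epochs is taken as a "velocity". The sketch below is a simplified reconstruction of that general idea on synthetic signals, not the authors' state space model.

      import numpy as np
      from scipy.signal import welch
      from sklearn.decomposition import PCA

      def band_powers(epoch, fs, bands=((1, 4), (4, 8), (8, 13), (13, 30))):
          """Power in standard EEG bands for one epoch of a single channel."""
          freqs, psd = welch(epoch, fs=fs, nperseg=2 * fs)
          return [psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

      def state_space_velocity(signal, fs, epoch_len=4):
          """Mean step length between consecutive epochs after projecting log band
          powers onto two principal components (a schematic 'state space')."""
          n = int(epoch_len * fs)
          epochs = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
          feats = np.log(np.array([band_powers(e, fs) for e in epochs]) + 1e-12)
          traj = PCA(n_components=2).fit_transform(feats)
          return np.linalg.norm(np.diff(traj, axis=0), axis=1).mean()

      # Synthetic example: a spectrally variable signal vs. a monotonous one
      fs = 200
      t = np.arange(0, 300, 1 / fs)
      rng = np.random.default_rng(6)
      freq = np.where((t // 30) % 2 == 0, 6.0, 12.0)   # frequency alternates every 30 s
      variable = np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(t.size)
      monotonous = np.sin(2 * np.pi * 10.0 * t) + 0.3 * rng.standard_normal(t.size)
      print("velocity (variable signal):  ", round(state_space_velocity(variable, fs), 3))
      print("velocity (monotonous signal):", round(state_space_velocity(monotonous, fs), 3))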

  11. Trichloroethylene: Mechanistic, epidemiologic and other supporting evidence of carcinogenic hazard.

    PubMed

    Rusyn, Ivan; Chiu, Weihsueh A; Lash, Lawrence H; Kromhout, Hans; Hansen, Johnni; Guyton, Kathryn Z

    2014-01-01

    The chlorinated solvent trichloroethylene (TCE) is a ubiquitous environmental pollutant. The carcinogenic hazard of TCE was the subject of a 2012 evaluation by a Working Group of the International Agency for Research on Cancer (IARC). Information on exposures, relevant data from epidemiologic studies, bioassays in experimental animals, and toxicity and mechanism of action studies was used to conclude that TCE is carcinogenic to humans (Group 1). This article summarizes the key evidence forming the scientific bases for the IARC classification. Exposure to TCE from environmental sources (including hazardous waste sites and contaminated water) is common throughout the world. While workplace use of TCE has been declining, occupational exposures remain of concern, especially in developing countries. The strongest human evidence is from studies of occupational TCE exposure and kidney cancer. Positive, although less consistent, associations were reported for liver cancer and non-Hodgkin lymphoma. TCE is carcinogenic at multiple sites in multiple species and strains of experimental animals. The mechanistic evidence includes extensive data on the toxicokinetics and genotoxicity of TCE and its metabolites. Together, available evidence provided a cohesive database supporting the human cancer hazard of TCE, particularly in the kidney. For other target sites of carcinogenicity, mechanistic and other data were found to be more limited. Important sources of susceptibility to TCE toxicity and carcinogenicity were also reviewed by the Working Group. In all, consideration of the multiple evidence streams presented herein informed the IARC conclusions regarding the carcinogenicity of TCE. © 2013.

  12. Trichloroethylene: Mechanistic, Epidemiologic and Other Supporting Evidence of Carcinogenic Hazard

    PubMed Central

    Rusyn, Ivan; Chiu, Weihsueh A.; Lash, Lawrence H.; Kromhout, Hans; Hansen, Johnni; Guyton, Kathryn Z.

    2013-01-01

    The chlorinated solvent trichloroethylene (TCE) is a ubiquitous environmental pollutant. The carcinogenic hazard of TCE was the subject of a 2012 evaluation by a Working Group of the International Agency for Research on Cancer (IARC). Information on exposures, relevant data from epidemiologic studies, bioassays in experimental animals, and toxicity and mechanism of action studies was used to conclude that TCE is carcinogenic to humans (Group 1). This article summarizes the key evidence forming the scientific bases for the IARC classification. Exposure to TCE from environmental sources (including from hazardous waste sites and contaminated water) is common throughout the world. While workplace use of TCE has been declining, occupational exposures remain of concern, especially in developing countries. Strongest human evidence is from studies of occupational TCE exposure and kidney cancer. Positive, although less consistent, associations were reported for liver cancer and non-Hodgkin's lymphoma. TCE is carcinogenic at multiple sites in multiple species and strains of experimental animals. The mechanistic evidence includes extensive data on the toxicokinetics and genotoxicity of TCE and its metabolites. Together, available evidence provided a cohesive database supporting the human cancer hazard of TCE, particularly in the kidney. For other target sites of carcinogenicity, mechanistic and other data were found to be more limited. Important sources of susceptibility to TCE toxicity and carcinogenicity were also reviewed by the Working Group. In all, consideration of the multiple evidence streams presented herein informed the IARC conclusions regarding the carcinogenicity of TCE. PMID:23973663

  13. Quantitative detection of melamine based on terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojing; Wang, Cuicui; Liu, Shangjian; Zuo, Jian; Zhou, Zihan; Zhang, Cunlin

    2018-01-01

    Melamine is an organic base and a trimer of cyanamide, with a 1,3,5-triazine skeleton. It is commonly used for the production of plastics, glue and flame retardants. Melamine combines with acid and related compounds to form melamine cyanurate and related crystal structures, which have been used as illegal adulterants of protein products, especially milk powder. This paper focuses on developing a practical method, based on THz-TDS, for quantitative detection of melamine in the fields of security inspection and nondestructive testing. Terahertz (THz) technology has promising applications in the detection and identification of materials because it combines spectroscopic specificity, good penetration and safety. Terahertz time-domain spectroscopy (THz-TDS) is a key technique for spectroscopic measurement of materials based on ultrafast femtosecond lasers. In this study, melamine and its mixtures with polyethylene powder at different concentrations were measured using transmission THz-TDS, and the refractive index spectra and absorption spectra of different melamine concentrations were obtained over 0.2-2.8 THz. The refractive index spectra show a clear decline with decreasing concentration, and the absorption spectrum exhibits two melamine peaks, at 1.98 THz and 2.28 THz. Based on the experimental results, the absorption coefficient and the concentration of melamine in the mixture are determined. Finally, methods for quantitative detection of materials in the fields of nondestructive testing and quality control based on THz-TDS have been studied.

  14. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in challenges and differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  15. Quantitative data standardization of X-ray based densitometry methods

    NASA Astrophysics Data System (ADS)

    Sergunova, K. A.; Petraikin, A. V.; Petrjajkin, F. A.; Akhmad, K. S.; Semenov, D. S.; Potrakhov, N. N.

    2018-02-01

    In the present work, the design of a special liquid phantom is proposed for assessing the accuracy of quantitative densitometric data. Dependencies between the measured bone mineral density (BMD) values and the known reference values are also presented for different X-ray based densitometry techniques. The linear relationships obtained make it possible to introduce correction factors that increase the accuracy of BMD measurement by QCT, DXA and DECT, and to use them for standardization and comparison of measurements.
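
    The correction-factor idea can be illustrated with a minimal sketch: fit a straight line between phantom reference BMD values and the values reported by one technique, then invert it to correct new readings. All numbers below are hypothetical placeholders, not the phantom values or coefficients of the cited work.

```python
import numpy as np

# Hypothetical phantom measurements: known BMD of phantom inserts (mg/cm^3)
# versus values reported by one densitometry technique (e.g. QCT).
true_bmd = np.array([50.0, 100.0, 200.0, 400.0, 800.0])      # assumed values
measured_bmd = np.array([46.0, 94.0, 193.0, 391.0, 779.0])   # assumed values

# Fit measured = a * true + b, then invert it to correct new measurements.
a, b = np.polyfit(true_bmd, measured_bmd, deg=1)

def correct(bmd_measured: float) -> float:
    """Apply the phantom-derived linear correction to a raw BMD reading."""
    return (bmd_measured - b) / a

print(round(correct(150.0), 1))
```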

  16. Application of Mechanistic Toxicology Data to Ecological Risk Assessments

    EPA Science Inventory

    The ongoing evolution of knowledge and tools in the areas of molecular biology, bioinformatics, and systems biology holds significant promise for reducing uncertainties associated with ecological risk assessment. As our understanding of the mechanistic basis of responses of organ...

  17. Implementation of mechanistic pavement design : field and laboratory implementation.

    DOT National Transportation Integrated Search

    2006-12-01

    One of the most important parameters needed for the 2002 Mechanistic Pavement Design Guide is the dynamic modulus (E*). The dynamic modulus (E*) describes the relationship between stress and strain for a linear viscoelastic material. The E* is the pr...

  18. A semi-mechanistic model of dead fine fuel moisture for Temperate and Mediterranean ecosystems

    NASA Astrophysics Data System (ADS)

    Resco de Dios, Víctor; Fellows, Aaron; Boer, Matthias; Bradstock, Ross; Nolan, Rachel; Goulden, Michel

    2014-05-01

    Fire is a major disturbance in terrestrial ecosystems globally. It has an enormous economic and social cost, and leads to fatalities in the worst cases. The moisture content of the vegetation (fuel moisture) is one of the main determinants of fire risk. Predicting the moisture content of dead, fine fuel (<2.5 cm in diameter) is particularly important, as this is often the most important component of the fuel complex for fire propagation. A variety of drought indices and empirical and mechanistic models have been proposed to model fuel moisture. A commonality across these different approaches is that they have been validated neither against large temporal datasets nor across broadly different vegetation types. Here, we present the results of a study performed at 6 locations in California, USA (5 sites) and New South Wales, Australia (1 site), where 10-hour fuel moisture content was measured continuously every 30 minutes for one full year at each site. We observed that drought indices did not accurately predict fuel moisture, and that empirical and mechanistic models both needed site-specific calibrations, which hinders their global application as indices of fuel moisture. We developed a novel, single-equation, semi-mechanistic model based on atmospheric vapor-pressure deficit. Across sites and years, the mean absolute error (MAE) of predicted fuel moisture was 4.7%. MAE dropped below 1% in the critical range of fuel moisture below 10%. The high simplicity, accuracy and precision of our model make it suitable for a wide range of applications, from operational purposes to global vegetation models.

  19. Sandwich-Cultured Hepatocytes for Mechanistic Understanding of Hepatic Disposition of Parent Drugs and Metabolites by Transporter-Enzyme Interplay.

    PubMed

    Matsunaga, Norikazu; Fukuchi, Yukina; Imawaka, Haruo; Tamai, Ikumi

    2018-05-01

    Functional interplay between transporters and drug-metabolizing enzymes is currently one of the hottest topics in the field of drug metabolism and pharmacokinetics. Uptake transporter-enzyme interplay is important to determine intrinsic hepatic clearance based on the extended clearance concept. Enzyme and efflux transporter interplay, which includes both sinusoidal (basolateral) and canalicular efflux transporters, determines the fate of metabolites formed in the liver. As sandwich-cultured hepatocytes (SCHs) maintain metabolic activities and form a canalicular network, the whole interplay between uptake and efflux transporters and drug-metabolizing enzymes can be investigated simultaneously. In this article, we review the utility and applicability of SCHs for mechanistic understanding of hepatic disposition of both parent drugs and metabolites. In addition, the utility of SCHs for mimicking species-specific disposition of parent drugs and metabolites in vivo is described. We also review application of SCHs for clinically relevant prediction of drug-drug interactions caused by drugs and metabolites. The usefulness of mathematical modeling of hepatic disposition of parent drugs and metabolites in SCHs is described to allow a quantitative understanding of an event in vitro and to develop a more advanced model to predict in vivo disposition. Copyright © 2018 by The American Society for Pharmacology and Experimental Therapeutics.

  20. Mechanistic model to predict colostrum intake based on deuterium oxide dilution technique data and impact of gestation and prefarrowing diets on piglet intake and sow yield of colostrum.

    PubMed

    Theil, P K; Flummer, C; Hurley, W L; Kristensen, N B; Labouriau, R L; Sørensen, M T

    2014-12-01

    The aims of the present study were to quantify colostrum intake (CI) of piglets using the D2O dilution technique, to develop a mechanistic model to predict CI, to compare these data with CI predicted by a previous empirical predictive model developed for bottle-fed piglets, and to study how composition of diets fed to gestating sows affected piglet CI, sow colostrum yield (CY), and colostrum composition. In total, 240 piglets from 40 litters were enriched with D2O. The CI measured by D2O from birth until 24 h after the birth of the first-born piglet was on average 443 g (SD 151). Based on measured CI, a mechanistic model to predict CI was developed using piglet characteristics (24-h weight gain [WG; g], BW at birth [BWB; kg], and duration of CI [D; min]): CI (g) = -106 + 2.26 WG + 200 BWB + 0.111 D - 1,414 WG/D + 0.0182 WG/BWB (R2 = 0.944). This model was used to predict the CI for all colostrum-suckling piglets within the 40 litters (n=500, mean=437 g, SD=153 g) and was compared with the CI predicted by a previous empirical predictive model (mean=305 g, SD=140 g). The previous empirical model underestimated the CI by 30% compared with that obtained by the new mechanistic model. The sows were fed 1 of 4 gestation diets (n=10 per diet) based on different fiber sources (low fiber [17%] or potato pulp, pectin residue, or sugarbeet pulp [32 to 40%]) from mating until d 108 of gestation. From d 108 of gestation until parturition, sows were fed 1 of 5 prefarrowing diets (n=8 per diet) varying in supplemented fat (3% animal fat, 8% coconut oil, 8% sunflower oil, 8% fish oil, or 4% fish oil+4% octanoic acid). Sows fed diets with pectin residue or sugarbeet pulp during gestation produced colostrum with lower protein, fat, DM, and energy concentrations and higher lactose concentrations, and their piglets had greater CI as compared with sows fed potato pulp or the low-fiber diet (P<0.05), and sows fed pectin residue had a greater CY than potato pulp-fed sows (P<0.05). Prefarrowing diets affected
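
    A minimal implementation of the quoted prediction equation is sketched below; it simply evaluates the regression as printed in the abstract, so units and coefficients follow that equation, and the example inputs are made up.

```python
def colostrum_intake(wg_g: float, bwb_kg: float, d_min: float) -> float:
    """Mechanistic colostrum-intake estimate (g) from the equation quoted in
    the abstract: WG = 24-h weight gain (g), BWB = birth weight (kg),
    D = duration of colostrum intake (min)."""
    return (-106.0
            + 2.26 * wg_g
            + 200.0 * bwb_kg
            + 0.111 * d_min
            - 1414.0 * wg_g / d_min
            + 0.0182 * wg_g / bwb_kg)

# Example: a 1.4 kg piglet gaining 100 g over 24 h of suckling (1440 min).
print(round(colostrum_intake(wg_g=100.0, bwb_kg=1.4, d_min=1440.0), 1))
```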

  1. Mechanistic modeling of pesticide exposure: The missing keystone of honey bee toxicology.

    PubMed

    Sponsler, Douglas B; Johnson, Reed M

    2017-04-01

    The role of pesticides in recent honey bee losses is controversial, partly because field studies often fail to detect effects predicted by laboratory studies. This dissonance highlights a critical gap in the field of honey bee toxicology: there exists little mechanistic understanding of the patterns and processes of exposure that link honey bees to pesticides in their environment. The authors submit that 2 key processes underlie honey bee pesticide exposure: 1) the acquisition of pesticide by foraging bees, and 2) the in-hive distribution of pesticide returned by foragers. The acquisition of pesticide by foraging bees must be understood as the spatiotemporal intersection between environmental contamination and honey bee foraging activity. This implies that exposure is distributional, not discrete, and that a subset of foragers may acquire harmful doses of pesticide while the mean colony exposure would appear safe. The in-hive distribution of pesticide is a complex process driven principally by food transfer interactions between colony members, and this process differs importantly between pollen and nectar. High priority should be placed on applying the extensive literature on honey bee biology to the development of more rigorously mechanistic models of honey bee pesticide exposure. In combination with mechanistic effects modeling, mechanistic exposure modeling has the potential to integrate the field of honey bee toxicology, advancing both risk assessment and basic research. Environ Toxicol Chem 2017;36:871-881. © 2016 SETAC.

  2. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics

    PubMed Central

    Lavallée-Adam, Mathieu

    2017-01-01

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334

  3. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

    Microscopy based on the transport of intensity equation provides quantitative phase distributions, which opens another perspective for cellular observation. However, it requires multi-focal image capture, and the mechanical or electrical scanning involved limits its real-time capability. Here, to overcome this restriction, real-time quantitative phase microscopy based on a single-shot transport of intensity equation (ssTIE) method is proposed. A programmed phase mask is designed to record multiple focal planes simultaneously without any scanning; thus, phase distributions can be retrieved quantitatively in real time. The proposed method can potentially be applied in various biological and medical settings, especially live cell imaging.
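
    For readers unfamiliar with TIE phase retrieval, the sketch below shows the standard FFT-based solution of the transport of intensity equation under the common uniform-intensity approximation. It is a generic illustration of the underlying equation, not the single-shot implementation of the cited paper, and all function and parameter names are assumptions.

```python
import numpy as np

def tie_phase(i_minus, i_focus, i_plus, dz, wavelength, pixel_size, reg=1e-9):
    """Minimal FFT-based solver for the transport of intensity equation (TIE)
    under the uniform-intensity approximation:
        laplacian(phi) = -(k / I0) * dI/dz,   with k = 2*pi / wavelength.
    i_minus / i_focus / i_plus are defocused and in-focus intensity images
    separated axially by dz (a generic sketch, not the paper's code)."""
    k = 2.0 * np.pi / wavelength
    didz = (i_plus - i_minus) / (2.0 * dz)           # axial intensity derivative
    rhs = -k * didz / np.mean(i_focus)               # Poisson right-hand side

    ny, nx = rhs.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    fxx, fyy = np.meshgrid(fx, fy)
    lap = -4.0 * np.pi ** 2 * (fxx ** 2 + fyy ** 2)  # Fourier symbol of the Laplacian

    phi_hat = np.fft.fft2(rhs) / (lap - reg)         # regularised inversion
    phi_hat[0, 0] = 0.0                              # undefined DC term set to zero
    return np.real(np.fft.ifft2(phi_hat))
```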

  4. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This

  5. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform

    NASA Astrophysics Data System (ADS)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-11-01

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  6. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    NASA Astrophysics Data System (ADS)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell size and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). These phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy (>90% in G1 and G2 phases, and >80% in S phase). We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
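
    A minimal sketch of the embedding step is given below: standardize the extracted biophysical phenotypes and project them into two dimensions with t-SNE. The feature matrix here is random placeholder data; in the cited work the 24 phenotypes are derived from the amplitude and phase images.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per cell, 24 biophysical phenotypes
# (values are random placeholders for illustration only).
rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 24))

# Standardise, then embed into 2-D with t-SNE to visualise cell-cycle progression.
scaled = StandardScaler().fit_transform(features)
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(scaled)
print(embedding.shape)  # (5000, 2)
```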

  7. Experiencing Teaching and Learning Quantitative Reasoning in a Project-Based Context

    ERIC Educational Resources Information Center

    Muir, Tracey; Beswick, Kim; Callingham, Rosemary; Jade, Katara

    2016-01-01

    This paper presents the findings of a small-scale study that investigated the issues and challenges of teaching and learning about quantitative reasoning (QR) within a project-based learning (PjBL) context. Students and teachers were surveyed and interviewed about their experiences of learning and teaching QR in that context in contrast to…

  8. Tear gas: an epidemiological and mechanistic reassessment

    PubMed Central

    Rothenberg, Craig; Achanta, Satyanarayana; Svendsen, Erik R.

    2016-01-01

    Deployments of tear gas and pepper spray have rapidly increased worldwide. Large amounts of tear gas have been used in densely populated cities, including Cairo, Istanbul, Rio de Janeiro, Manama (Bahrain), and Hong Kong. In the United States, tear gas was used extensively during recent riots in Ferguson, Missouri. Whereas tear gas deployment systems have rapidly improved—with aerial drone systems tested and requested by law enforcement—epidemiological and mechanistic research have lagged behind and have received little attention. Case studies and recent epidemiological studies revealed that tear gas agents can cause lung, cutaneous, and ocular injuries, with individuals affected by chronic morbidities at high risk for complications. Mechanistic studies identified the ion channels TRPV1 and TRPA1 as targets of capsaicin in pepper spray, and of the tear gas agents chloroacetophenone, CS, and CR. TRPV1 and TRPA1 localize to pain‐sensing peripheral sensory neurons and have been linked to acute and chronic pain, cough, asthma, lung injury, dermatitis, itch, and neurodegeneration. In animal models, transient receptor potential inhibitors show promising effects as potential countermeasures against tear gas injuries. On the basis of the available data, a reassessment of the health risks of tear gas exposures in the civilian population is advised, and development of new countermeasures is proposed. PMID:27391380

  9. Mechanistic Approach to Understanding the Toxicity of the Azole Fungicide Triadimefon to a Nontarget Aquatic Insect and Implications for Exposure Assessment

    EPA Science Inventory

    We utilized mechanistic and stereoselective in vitro metabolism assays and sublethal exposures of triadimefon to gain insight into the extent of carbonyl reduction and the toxic mode of action of triadimefon with black fly (Diptera: Simuliidae) larvae.

  10. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    PubMed

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. SYMPOSIUM SESSION PROPOSAL: INCORPORATION OF MODE OF ACTION INTO MECHANISTICALLY-BASED QUANTITATIVE MODELS

    EPA Science Inventory

    The biological processes by which environmental pollutants induce adverse health effects is most likely regulated by complex interactions dependent upon the route of exposure, dose, kinetics of distribution, and multiple cellular responses. To further complicate deciphering thes...

  12. Dronabinol and chronic pain: importance of mechanistic considerations.

    PubMed

    de Vries, Marjan; van Rijckevorsel, Dagmar C M; Wilder-Smith, Oliver H G; van Goor, Harry

    2014-08-01

    Although medicinal cannabis has been used for many centuries, the therapeutic potential of delta-9-tetrahydrocannabinol (Δ9-THC; international non-proprietary name = dronabinol) in current pain management remains unclear. Several pharmaceutical products with defined natural or synthesized Δ9-THC content have been developed, resulting in increasing numbers of clinical trials investigating the analgesic efficacy of dronabinol in various pain conditions. Different underlying pain mechanisms, including sensitization of nociceptive sensory pathways and alterations in cognitive and autonomic processing, might explain the varying analgesic effects of dronabinol in chronic pain states. The pharmacokinetics, pharmacodynamics and mechanisms of action of products with a defined dronabinol content are summarized. Additionally, randomized clinical trials investigating the analgesic efficacy of pharmaceutical cannabis based products are reviewed for the treatment of chronic nonmalignant pain. We suggest a mechanism-based approach beyond measurement of subjective pain relief to evaluate the therapeutic potential of dronabinol in chronic pain management. Development of objective mechanistic diagnostic biomarkers reflecting altered sensory and cognitive processing in the brain is essential to evaluate dronabinol induced analgesia, and to permit identification of responders and/or non-responders to dronabinol treatment.

  13. Quantitative and qualitative 5-aminolevulinic acid–induced protoporphyrin IX fluorescence in skull base meningiomas

    PubMed Central

    Bekelis, Kimon; Valdés, Pablo A.; Erkmen, Kadir; Leblond, Frederic; Kim, Anthony; Wilson, Brian C.; Harris, Brent T.; Paulsen, Keith D.; Roberts, David W.

    2011-01-01

    Object Complete resection of skull base meningiomas provides patients with the best chance for a cure; however, surgery is frequently difficult given the proximity of lesions to vital structures, such as cranial nerves, major vessels, and venous sinuses. Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative assessment of protoporphyrin IX (PpIX) fluorescence following the exogenous administration of 5-aminolevulinic acid (ALA) has demonstrated utility in malignant glioma resection but limited use in meningiomas. Here the authors demonstrate the use of ALA-induced PpIX fluorescence guidance in resecting a skull base meningioma and elaborate on the advantages and disadvantages provided by both quantitative and qualitative fluorescence methodologies in skull base meningioma resection. Methods A 52-year-old patient with a sphenoid wing WHO Grade I meningioma underwent tumor resection as part of an institutional review board–approved prospective study of fluorescence-guided resection. A surgical microscope modified for fluorescence imaging was used for the qualitative assessment of visible fluorescence, and an intraoperative probe for in situ fluorescence detection was utilized for quantitative measurements of PpIX. The authors assessed the detection capabilities of both the qualitative and quantitative fluorescence approaches. Results The patient harboring a sphenoid wing meningioma with intraorbital extension underwent radical resection of the tumor with both visibly and nonvisibly fluorescent regions. The patient underwent a complete resection without any complications. Some areas of the tumor demonstrated visible fluorescence. The quantitative probe detected neoplastic tissue better than the qualitative modified surgical microscope. The intraoperative probe was particularly useful in areas that did not reveal visible fluorescence, and tissue from these areas was confirmed as tumor following histopathological

  14. PCA-based groupwise image registration for quantitative MRI.

    PubMed

    Huizinga, W; Poot, D H J; Guyader, J-M; Klaassen, R; Coolen, B F; van Kranenburg, M; van Geuns, R J M; Uitterdijk, A; Polfliet, M; Vandemeulebroucke, J; Leemans, A; Niessen, W J; Klein, S

    2016-04-01

    Quantitative magnetic resonance imaging (qMRI) is a technique for estimating quantitative tissue properties, such as the T1 and T2 relaxation times, apparent diffusion coefficient (ADC), and various perfusion measures. This estimation is achieved by acquiring multiple images with different acquisition parameters (or at multiple time points after injection of a contrast agent) and by fitting a qMRI signal model to the image intensities. Image registration is often necessary to compensate for misalignments due to subject motion and/or geometric distortions caused by the acquisition. However, large differences in image appearance make accurate image registration challenging. In this work, we propose a groupwise image registration method for compensating misalignment in qMRI. The groupwise formulation of the method eliminates the requirement of choosing a reference image, thus avoiding a registration bias. The method minimizes a cost function that is based on principal component analysis (PCA), exploiting the fact that intensity changes in qMRI can be described by a low-dimensional signal model, but not requiring knowledge on the specific acquisition model. The method was evaluated on 4D CT data of the lungs, and both real and synthetic images of five different qMRI applications: T1 mapping in a porcine heart, combined T1 and T2 mapping in carotid arteries, ADC mapping in the abdomen, diffusion tensor mapping in the brain, and dynamic contrast-enhanced mapping in the abdomen. Each application is based on a different acquisition model. The method is compared to a mutual information-based pairwise registration method and four other state-of-the-art groupwise registration methods. Registration accuracy is evaluated in terms of the precision of the estimated qMRI parameters, overlap of segmented structures, distance between corresponding landmarks, and smoothness of the deformation. In all qMRI applications the proposed method performed better than or equally well as
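
    To make the idea of a PCA-based groupwise cost concrete, the sketch below computes one plausible form of such a metric: eigenvalues of the correlation matrix of intensities sampled at the same points across all images in the group, weighted by their rank so that well-aligned (low-dimensional) image sets score lower. This is an illustrative variant only, not necessarily the exact cost function of the cited method.

```python
import numpy as np

def pca_groupwise_cost(samples: np.ndarray) -> float:
    """One plausible PCA-based groupwise dissimilarity (illustrative sketch).
    `samples` is a (points x G) matrix of intensities sampled at identical
    spatial locations in all G images. When the images are well aligned, a
    low-dimensional signal model explains them and the trailing eigenvalues
    of their correlation matrix are small; rank-weighting the eigenvalues
    therefore rewards good alignment."""
    corr = np.corrcoef(samples, rowvar=False)           # G x G correlation matrix
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # eigenvalues, descending
    ranks = np.arange(1, eigvals.size + 1)
    return float(np.sum(ranks * eigvals))               # small when few components dominate

# Example with random (misaligned-looking) data from 10 images:
rng = np.random.default_rng(1)
print(pca_groupwise_cost(rng.normal(size=(2000, 10))))
```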

  15. Investigation of mechanistic deterioration modeling for bridge design and management.

    DOT National Transportation Integrated Search

    2017-04-01

    The ongoing deterioration of highway bridges in Colorado dictates that an effective method for allocating limited management resources be developed. In order to predict bridge deterioration in advance, mechanistic models that analyze the physical pro...

  16. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu

  17. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics.

    PubMed

    Lavallée-Adam, Mathieu; Yates, John R

    2016-03-24

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the Web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.

  18. A dynamic and mechanistic model of PCB bioaccumulation in the European hake ( Merluccius merluccius)

    NASA Astrophysics Data System (ADS)

    Bodiguel, Xavier; Maury, Olivier; Mellon-Duval, Capucine; Roupsard, François; Le Guellec, Anne-Marie; Loizeau, Véronique

    2009-08-01

    Bioaccumulation is difficult to document because responses differ among chemical compounds, with environmental conditions, and physiological processes characteristic of each species. We use a mechanistic model, based on the Dynamic Energy Budget (DEB) theory, to take into account this complexity and study factors impacting accumulation of organic pollutants in fish through ontogeny. The bioaccumulation model proposed is a comprehensive approach that relates evolution of hake PCB contamination to physiological information about the fish, such as diet, metabolism, reserve and reproduction status. The species studied is the European hake ( Merluccius merluccius, L. 1758). The model is applied to study the total concentration and the lipid normalised concentration of 4 PCB congeners in male and female hakes from the Gulf of Lions (NW Mediterranean sea) and the Bay of Biscay (NE Atlantic ocean). Outputs of the model compare consistently to measurements over the life span of fish. Simulation results clearly demonstrate the relative effects of food contamination, growth and reproduction on the PCB bioaccumulation in hake. The same species living in different habitats and exposed to different PCB prey concentrations exhibit marked difference in the body accumulation of PCBs. At the adult stage, female hakes have a lower PCB concentration compared to males for a given length. We successfully simulated these sex-specific PCB concentrations by considering two mechanisms: a higher energy allocation to growth for females and a transfer of PCBs from the female to its eggs when allocating lipids from reserve to eggs. Finally, by its mechanistic description of physiological processes, the model is relevant for other species and sets the stage for a mechanistic understanding of toxicity and ecological effects of organic contaminants in marine organisms.

  19. Mechanistic insights on the cycloisomerization of polyunsaturated precursors catalyzed by platinum and gold complexes.

    PubMed

    Soriano, Elena; Marco-Contelles, José

    2009-08-18

    Organometallic chemistry provides powerful tools for the stereocontrolled synthesis of heterocycles and carbocycles. The electrophilic transition metals Pt(II) and Au(I, III) are efficient catalysts in these transitions and promote a variety of organic transformations of unsaturated precursors. These reactions produce functionalized cyclic and acyclic scaffolds for the synthesis of natural and non-natural products efficiently, under mild conditions, and with excellent chemoselectivity. Because these transformations are strongly substrate-dependent, they are versatile and may yield diverse molecular scaffolds. Therefore, synthetic chemists need a mechanistic interpretation to optimize this reaction process and design a new generation of catalysts. However, so far, no intermediate species has been isolated or characterized, so the formulated mechanistic hypotheses have been primarily based on labeling studies or trapping reactions. Recently, theoretical DFT studies have become a useful tool in our research, giving us insights into the key intermediates and into a variety of plausible reaction pathways. In this Account, we present a comprehensive mechanistic overview of transformations promoted by Pt and Au in a non-nucleophilic medium based on quantum-mechanical studies. The calculations are consistent with the experimental observations and provide fundamental insights into the versatility of these reaction processes. The reactivity of these metals results from their peculiar Lewis acid properties: the alkynophilic character of these soft metals and the pi-acid activation of unsaturated groups promotes the intra- or intermolecular attack of a nucleophile. 1,n-Enynes (n = 3-8) are particularly important precursors, and their transformation may yield a variety of cycloadducts depending on the molecular structure. However, the calculations suggest that these different cyclizations would have closely related reaction mechanisms, and we propose a unified mechanistic

  20. A comprehensive mechanistic model for upward two-phase flow in wellbores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sylvester, N.D.; Sarica, C.; Shoham, O.

    1994-05-01

    A comprehensive model is formulated to predict the flow behavior for upward two-phase flow. This model is composed of a model for flow-pattern prediction and a set of independent mechanistic models for predicting such flow characteristics as holdup and pressure drop in bubble, slug, and annular flow. The comprehensive model is evaluated by using a well data bank made up of 1,712 well cases covering a wide variety of field data. Model performance is also compared with six commonly used empirical correlations and the Hasan-Kabir mechanistic model. Overall model performance is in good agreement with the data. In comparison with other methods, the comprehensive model performed the best.

  1. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
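
    The FOSM step mentioned above combines a sensitivity vector with the input covariance to approximate the output variance; a minimal sketch with made-up numbers is given below (the actual study obtains sensitivities directly from MODFLOW-2000).

```python
import numpy as np

def fosm_output_variance(sensitivity: np.ndarray, input_cov: np.ndarray) -> float:
    """First-Order Second Moment (FOSM) propagation: the variance of a scalar
    model output (e.g. piezometric head at one location) is approximated from
    its sensitivity vector to the inputs and the input covariance:
        var(h) ~= s^T C s."""
    return float(sensitivity @ input_cov @ sensitivity)

# Hypothetical example: head sensitivity to three hydraulic-conductivity zones.
s = np.array([0.8, -0.3, 0.1])                 # dh/dK_i (assumed values)
C = np.array([[0.04, 0.01, 0.00],              # covariance of the K estimates (assumed)
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])
print(fosm_output_variance(s, C))
```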

  2. MECHANISTIC DOSIMETRY MODELS OF NANOMATERIAL DEPOSITION IN THE RESPIRATORY TRACT

    EPA Science Inventory

    Accurate health risk assessments of inhalation exposure to nanomaterials will require dosimetry models that account for interspecies differences in dose delivered to the respiratory tract. Mechanistic models offer the advantage to interspecies extrapolation that physicochemica...

  3. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field of view (FoV) scanning, often by mechanical translation, which not only slows down measurement but also introduces mechanical errors that degrade both the resolution and the accuracy of the retrieved information. To achieve fast, highly accurate quantitative imaging, a digital micro-mirror device (DMD) is adopted in PIE to perform large-FoV scanning by coding the on/off states of the mirrors, with no mechanical motion. Measurements on biological samples as well as a USAF resolution target demonstrate high-resolution quantitative imaging with the proposed system. Given its fast and accurate imaging capability, the DMD-based PIE technique offers a potential solution for medical observation and measurement.
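
    For context, the sketch below shows one standard PIE/ePIE-style object update for a single scan position, the kind of iterative engine that the DMD-coded illuminations feed. It is a generic description of the algorithm family under assumed variable names, not the authors' implementation.

```python
import numpy as np

def pie_object_update(obj, probe, measured_amplitude, alpha=1.0):
    """One standard PIE/ePIE-style object update for a single scan position
    (a generic sketch of the algorithm family, not the authors' exact code).
    obj and probe are complex arrays of equal shape for the illuminated patch;
    measured_amplitude is the square root of the recorded diffraction pattern."""
    exit_wave = probe * obj
    far_field = np.fft.fft2(exit_wave)
    # Replace the modulus with the measurement, keep the phase.
    far_field = measured_amplitude * np.exp(1j * np.angle(far_field))
    revised_exit = np.fft.ifft2(far_field)
    # Gradient-style update of the object, weighted by the probe.
    update = np.conj(probe) / (np.max(np.abs(probe)) ** 2 + 1e-12)
    return obj + alpha * update * (revised_exit - exit_wave)
```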

  4. Mechanistic Studies at the Interface Between Organometallic Chemistry and Homogeneous Catalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casey, Charles P

    Mechanistic Studies at the Interface Between Organometallic Chemistry and Homogeneous Catalysis Charles P. Casey, Principal Investigator Department of Chemistry, University of Wisconsin - Madison, Madison, Wisconsin 53706 Phone 608-262-0584 FAX: 608-262-7144 Email: casey@chem.wisc.edu http://www.chem.wisc.edu/main/people/faculty/casey.html Executive Summary. Our goal was to learn the intimate mechanistic details of reactions involved in homogeneous catalysis and to use the insight we gain to develop new and improved catalysts. Our work centered on the hydrogenation of polar functional groups such as aldehydes and ketones and on hydroformylation. Specifically, we concentrated on catalysts capable of simultaneously transferring hydride from a metal center and a proton from an acidic oxygen or nitrogen center to an aldehyde or ketone. An economical iron based catalyst was developed and patented. Better understanding of fundamental organometallic reactions and catalytic processes enabled design of energy and material efficient chemical processes. Our work contributed to the development of catalysts for the selective and mild hydrogenation of ketones and aldehydes; this will provide a modern green alternative to reductions by LiAlH4 and NaBH4, which require extensive work-up procedures and produce waste streams. (C5R4OH)Ru(CO)2H Hydrogenation Catalysts. Youval Shvo described a remarkable catalytic system in which the key intermediate (C5R4OH)Ru(CO)2H (1) has an electronically coupled acidic OH unit and a hydridic RuH unit. Our efforts centered on understanding and improving upon this important catalyst for reduction of aldehydes and ketones. Our mechanistic studies established that the reduction of aldehydes by 1 to produce alcohols and a diruthenium bridging hydride species occurs much more rapidly than regeneration of the ruthenium hydride from the diruthenium bridging hydride species. Our mechanistic studies require simultaneous transfer of hydride from

  5. Identification of key characteristics of male reproductive toxicants as an approach for screening and sorting mechanistic evidence.

    EPA Science Inventory

    The application of systematic review practices in human health assessment includes integration of multi-disciplinary evidence from epidemiological, experimental, and mechanistic studies. Although mode of action analysis relies on the evaluation of mechanistic and toxicological ou...

  6. FRET-based genetically-encoded sensors for quantitative monitoring of metabolites.

    PubMed

    Mohsin, Mohd; Ahmad, Altaf; Iqbal, Muhammad

    2015-10-01

    Neighboring cells in the same tissue can exist in different states of dynamic activity. After genomics, proteomics and metabolomics, fluxomics is now equally important for generating accurate quantitative information on the cellular and sub-cellular dynamics of ions and metabolites, which is critical for a functional understanding of organisms. Various spectrometry techniques are used for monitoring ions and metabolites, although their temporal and spatial resolutions are limited. Discovery of the fluorescent proteins and their variants has revolutionized cell biology. Therefore, novel tools and methods need to be deployed in specific cells and targeted to sub-cellular compartments in order to quantify the target-molecule dynamics directly. We require tools that can measure cellular activities and protein dynamics with sub-cellular resolution. Biosensors based on fluorescence resonance energy transfer (FRET) are genetically encoded and hence can specifically target sub-cellular organelles by fusion to proteins or targeting sequences. Over the last decade, FRET-based genetically encoded sensors for molecules involved in energy production, reactive oxygen species and secondary messengers have helped to unravel key aspects of cellular physiology. This review, describing the design and principles of sensors, presents a database of sensors for different analytes/processes, and illustrates examples of application in quantitative live cell imaging.

  7. Smartphone based visual and quantitative assays on upconversional paper sensor.

    PubMed

    Mei, Qingsong; Jing, Huarong; Li, You; Yisibashaer, Wuerzha; Chen, Jian; Nan Li, Bing; Zhang, Yong

    2016-01-15

    The integration of smartphones with paper sensors has recently gained increasing attention because it enables rapid, quantitative analysis. However, smartphone-based upconversion paper sensors have been restricted by the lack of effective methods to acquire luminescence signals from the test paper. Herein, by virtue of 3D printing technology, we built an auxiliary reusable device, which assembles a 980 nm mini-laser, an optical filter and a mini-cavity, for digitally imaging the luminescence variations on test paper and quantitatively analyzing the pesticide thiram with a smartphone. In detail, copper-ion-decorated NaYF4:Yb/Tm upconversion nanoparticles were fixed onto filter paper to form the test paper, and their blue luminescence was quenched upon addition of thiram through a luminescence resonance energy transfer mechanism. These variations were monitored by the smartphone camera, and the blue-channel intensities of the captured color images were calculated to quantify thiram through a self-written Android program installed on the smartphone, offering a reliable and accurate detection limit of 0.1 μM. This work provides an initial demonstration of integrating upconversion nanosensors with smartphone digital imaging for point-of-care analysis on a paper-based platform. Copyright © 2015 Elsevier B.V. All rights reserved.
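
    The blue-channel quantification step can be illustrated with a short sketch: average the blue channel over a region of interest of the photographed test paper and invert a linear calibration against known thiram concentrations. The calibration numbers below are assumed for illustration; the cited work uses its own Android implementation and calibration.

```python
import numpy as np

def mean_blue_intensity(rgb_image: np.ndarray, roi) -> float:
    """Average blue-channel intensity inside a region of interest of a
    smartphone photograph of the test paper (image as an H x W x 3 array)."""
    r0, r1, c0, c1 = roi
    return float(rgb_image[r0:r1, c0:c1, 2].mean())

# Hypothetical calibration: blue intensity drops as thiram quenches the
# upconversion luminescence (values assumed, not taken from the paper).
thiram_uM = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
blue_signal = np.array([212.0, 188.0, 166.0, 128.0, 66.0])
slope, intercept = np.polyfit(thiram_uM, blue_signal, deg=1)

def estimate_thiram(blue_value: float) -> float:
    """Invert the linear calibration to estimate thiram concentration (uM)."""
    return (blue_value - intercept) / slope

print(round(estimate_thiram(150.0), 2))
```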

  8. Electrochemical processes and mechanistic aspects of field-effect sensors for biomolecules

    PubMed Central

    Huang, Weiguo; Diallo, Abdou Karim; Dailey, Jennifer L.; Besar, Kalpana

    2017-01-01

    Electronic biosensing is a leading technology for determining concentrations of biomolecules. In some cases, the presence of an analyte molecule induces a measured change in current flow, while in other cases, a new potential difference is established. In the particular case of a field effect biosensor, the potential difference is monitored as a change in conductance elsewhere in the device, such as across a film of an underlying semiconductor. Often, the mechanisms that lead to these responses are not specifically determined. Because improved understanding of these mechanisms will lead to improved performance, it is important to highlight those studies where various mechanistic possibilities are investigated. This review explores a range of possible mechanistic contributions to field-effect biosensor signals. First, we define the field-effect biosensor and the chemical interactions that lead to the field effect, followed by a section on theoretical and mechanistic background. We then discuss materials used in field-effect biosensors and approaches to improving signals from field-effect biosensors. We specifically cover the biomolecule interactions that produce local electric fields, structures and processes at interfaces between bioanalyte solutions and electronic materials, semiconductors used in biochemical sensors, dielectric layers used in top-gated sensors, and mechanisms for converting the surface voltage change to higher signal/noise outputs in circuits. PMID:29238595

  9. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable in proteotypic peptide-based quantitative proteomics strategies owing to differences in peptide measurability. To improve quantification accuracy, we previously proposed an "empirical rule for linearly correlated peptide selection (ERLPS)". However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions had not been conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS is highly reproducible and transferable within appropriate loading amounts, and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Quantitative structure-property relationship (correlation analysis) of phosphonic acid-based chelates in design of MRI contrast agent.

    PubMed

    Tiwari, Anjani K; Ojha, Himanshu; Kaul, Ankur; Dutta, Anupama; Srivastava, Pooja; Shukla, Gauri; Srivastava, Rakesh; Mishra, Anil K

    2009-07-01

    Nuclear magnetic resonance imaging is a very useful tool in modern medical diagnostics, especially when gadolinium(III)-based contrast agents are administered to the patient with the aim of increasing the image contrast between normal and diseased tissues. Using soft modelling techniques such as quantitative structure-activity/structure-property relationship analysis, after a suitable description of their molecular structure, we have studied a series of phosphonic acids for designing new MRI contrast agents. Quantitative structure-property relationship studies with multiple linear regression analysis were applied to find correlations between calculated molecular descriptors of the phosphonic acid-based chelating agents and their stability constants. The final quantitative structure-property relationship models were: Model 1 (phosphonic acid series): log K(ML) = 5.00243(+/-0.7102) - 0.0263(+/-0.540) MR; n = 12, |r| = 0.942, s = 0.183, F = 99.165. Model 2 (phosphonic acid series): log K(ML) = 5.06280(+/-0.3418) - 0.0252(+/-0.198) MR; n = 12, |r| = 0.956, s = 0.186, F = 99.256.
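
    Reading the two regressions as log K(ML) = intercept - slope x MR, they can be evaluated directly; a small sketch is given below with an arbitrary molar refractivity value chosen only for illustration.

```python
def log_kml_model1(mr: float) -> float:
    """Model 1 from the abstract: log K(ML) = 5.00243 - 0.0263 * MR."""
    return 5.00243 - 0.0263 * mr

def log_kml_model2(mr: float) -> float:
    """Model 2 from the abstract: log K(ML) = 5.06280 - 0.0252 * MR."""
    return 5.06280 - 0.0252 * mr

# Example: molar refractivity of a hypothetical phosphonic acid chelate.
print(round(log_kml_model1(40.0), 3), round(log_kml_model2(40.0), 3))
```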

  11. Portable paper-based device for quantitative colorimetric assays relying on light reflectance principle.

    PubMed

    Li, Bowei; Fu, Longwen; Zhang, Wei; Feng, Weiwei; Chen, Lingxin

    2014-04-01

    This paper presents a novel paper-based analytical device that reads colorimetric paper assays through their light reflectance. The device is portable, low cost (<20 dollars), and lightweight (only 176 g), making it suitable for point-of-care health care and on-site detection. Based on the light reflectance principle, the signal can be obtained directly, stably, and in a user-friendly manner. We demonstrated the utility and broad applicability of this technique with measurements of different biological and pollution target samples (BSA, glucose, Fe, and nitrite). Moreover, real samples of Fe(II) and nitrite in local tap water were successfully analyzed, and compared with the standard UV absorption method, the quantitative results showed good performance, reproducibility, and reliability. This device provides quantitative information very conveniently and shows great potential in resource-limited analysis, medical diagnostics, and on-site environmental detection. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Thermodynamics-based models of transcriptional regulation with gene sequence.

    PubMed

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity, which is computed using statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
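
    A toy version of such a model is sketched below: promoter occupancy is computed from a Boltzmann-weighted two-state description of transcription-factor binding and drives a continuous-time mRNA balance equation. The single-site promoter and all parameter values are illustrative assumptions, not the published model.

```python
import numpy as np

def p_bound(tf_conc: float, K: float) -> float:
    """Toy thermodynamic occupancy of a single activator site: the Boltzmann
    weight of the bound state relative to bound + unbound (equivalent to a
    simple binding isotherm with affinity 1/K)."""
    w = tf_conc / K
    return w / (1.0 + w)

def simulate_mrna(tf_conc, K=50.0, k_max=2.0, gamma=0.1, dt=0.1, steps=600):
    """Continuous-time transcription dynamics driven by promoter occupancy,
    integrated with forward Euler: dm/dt = k_max * P_bound - gamma * m.
    Parameter values are assumptions chosen only for illustration."""
    m = 0.0
    trace = []
    for _ in range(steps):
        m += dt * (k_max * p_bound(tf_conc, K) - gamma * m)
        trace.append(m)
    return np.array(trace)

print(round(simulate_mrna(tf_conc=100.0)[-1], 3))  # approaches k_max * P_bound / gamma
```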

  13. Target and Tissue Selectivity Prediction by Integrated Mechanistic Pharmacokinetic-Target Binding and Quantitative Structure Activity Modeling.

    PubMed

    Vlot, Anna H C; de Witte, Wilhelmus E A; Danhof, Meindert; van der Graaf, Piet H; van Westen, Gerard J P; de Lange, Elizabeth C M

    2017-12-04

    Selectivity is an important attribute of effective and safe drugs, and prediction of in vivo target and tissue selectivity would likely improve drug development success rates. However, a lack of understanding of the underlying (pharmacological) mechanisms and a lack of directly applicable predictive methods complicate the prediction of selectivity. We explore the value of combining physiologically based pharmacokinetic (PBPK) modeling with quantitative structure-activity relationship (QSAR) modeling to predict the influence of the target dissociation constant (KD) and the target dissociation rate constant on target and tissue selectivity. The KD values of CB1 ligands in the ChEMBL database are predicted by QSAR random forest (RF) modeling for the CB1 receptor and known off-targets (TRPV1, mGlu5, 5-HT1a). Of these CB1 ligands, rimonabant, CP-55940, and Δ8-tetrahydrocannabinol, one of the active ingredients of cannabis, were selected for simulations of target occupancy for CB1, TRPV1, mGlu5, and 5-HT1a in three brain regions, to illustrate the principles of the combined PBPK-QSAR modeling. Our combined PBPK and target binding modeling demonstrated that the optimal values of the KD and koff for target and tissue selectivity were dependent on target concentration and tissue distribution kinetics. Interestingly, if the target concentration is high and the perfusion of the target site is low, the optimal KD value is often not the lowest KD value, suggesting that optimization towards high drug-target affinity can decrease the benefit-risk ratio. The presented integrative structure-pharmacokinetic-pharmacodynamic modeling provides an improved understanding of tissue and target selectivity.
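
    The occupancy simulations referred to above rest on standard binding kinetics; the sketch below integrates the one-target, constant-free-concentration case and checks it against the equilibrium occupancy C / (C + KD). Parameter values are arbitrary, and the paper's full PBPK distribution model is not reproduced.

```python
def occupancy_timecourse(conc, r_total, kon, koff, dt=0.01, steps=10000):
    """Target-occupancy kinetics for a constant free drug concentration
    (a generic sketch, not the published PBPK model):
        d[RL]/dt = kon * C * (Rtot - [RL]) - koff * [RL]
    Equilibrium occupancy tends to C / (C + KD), with KD = koff / kon."""
    rl = 0.0
    for _ in range(steps):
        rl += dt * (kon * conc * (r_total - rl) - koff * rl)
    return rl / r_total

# Assumed units for illustration: conc in nM, kon in 1/(nM*h), koff in 1/h.
kd = 0.1 / 1.0                                     # KD = koff / kon
print(round(occupancy_timecourse(conc=1.0, r_total=10.0, kon=1.0, koff=0.1), 3))
print(round(1.0 / (1.0 + kd), 3))                  # equilibrium check: C / (C + KD)
```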

  14. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform.

    PubMed

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-12-14

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  15. Rearrangements of Allylic Sulfinates to Sulfones: A Mechanistic Study

    ERIC Educational Resources Information Center

    Ball, David B.; Mollard, Paul; Voigtritter, Karl R.; Ball, Jenelle L.

    2010-01-01

    Most current organic chemistry textbooks are organized by functional groups and those of us who teach organic chemistry use functional-group organization in our courses but ask students to learn organic chemistry from a mechanistic approach. To enrich and extend the chemical understanding and knowledge of pericyclic-type reactions for chemistry…

  16. Analytical techniques for mechanistic characterization of EUV photoresists

    NASA Astrophysics Data System (ADS)

    Grzeskowiak, Steven; Narasimhan, Amrit; Murphy, Michael; Ackerman, Christian; Kaminsky, Jake; Brainard, Robert L.; Denbeaux, Greg

    2017-03-01

    Extreme ultraviolet (EUV, 13.5 nm) lithography is the prospective technology for high volume manufacturing by the microelectronics industry. Significant strides towards achieving adequate EUV source power and availability have been made recently, but a limited rate of improvement in photoresist performance still delays the implementation of EUV. Many fundamental questions remain to be answered about the exposure mechanisms of even the relatively well understood chemically amplified EUV photoresists. Moreover, several groups around the world are developing revolutionary metal-based resists whose EUV exposure mechanisms are even less understood. Here, we describe several evaluation techniques to help elucidate mechanistic details of EUV exposure mechanisms of chemically amplified and metal-based resists. EUV absorption coefficients are determined experimentally by measuring the transmission through a resist coated on a silicon nitride membrane. Photochemistry can be evaluated by monitoring small outgassing reaction products to provide insight into photoacid generator or metal-based resist reactivity. Spectroscopic techniques such as thin-film Fourier transform infrared (FTIR) spectroscopy can measure the chemical state of a photoresist system pre- and post-EUV exposure. Additionally, electrolysis can be used to study the interaction between photoresist components and low energy electrons. Collectively, these techniques improve our current understanding of photomechanisms for several EUV photoresist systems, which is needed to develop new, better performing materials needed for high volume manufacturing.
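
    The absorption measurement described above reduces to a Beer-Lambert calculation; the following sketch (illustrative numbers only, not data from the study) shows the arithmetic.

```python
import numpy as np

def absorption_coefficient(i_resist_on_membrane, i_membrane_only, thickness_nm):
    """Estimate a resist's EUV absorption coefficient from a transmission measurement.
    T = I(resist + membrane) / I(membrane); alpha = -ln(T) / d  (Beer-Lambert form)."""
    transmission = i_resist_on_membrane / i_membrane_only
    thickness_cm = thickness_nm * 1e-7
    return -np.log(transmission) / thickness_cm       # in 1/cm

# Illustrative numbers only: a 60 nm film transmitting 97% of the incident EUV.
print(f"{absorption_coefficient(0.97, 1.00, 60.0):.0f} cm^-1")
```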

  17. Mechanistic Insights into the Efficacy of Sodium Bicarbonate Supplementation to Improve Athletic Performance.

    PubMed

    Siegler, Jason C; Marshall, Paul W M; Bishop, David; Shaw, Greg; Green, Simon

    2016-12-01

    A large proportion of empirical research and reviews investigating the ergogenic potential of sodium bicarbonate (NaHCO3) supplementation have focused predominately on performance outcomes and only speculate about underlying mechanisms responsible for any benefit. The aim of this review was to critically evaluate the influence of NaHCO3 supplementation on mechanisms associated with skeletal muscle fatigue as it translates directly to exercise performance. Mechanistic links between skeletal muscle fatigue, proton accumulation (or metabolic acidosis) and NaHCO3 supplementation have been identified to provide a more targeted, evidence-based approach to direct future research, as well as provide practitioners with a contemporary perspective on the potential applications and limitations of this supplement. The mechanisms identified have been broadly categorised under the sections 'Whole-body Metabolism', 'Muscle Physiology' and 'Motor Pathways', and when possible, the performance outcomes of these studies contextualized within an integrative framework of whole-body exercise where other factors such as task demand (e.g. large vs. small muscle groups), cardio-pulmonary and neural control mechanisms may outweigh any localised influence of NaHCO3. Finally, the 'Performance Applications' section provides further interpretation for the practitioner founded on the mechanistic evidence provided in this review and other relevant, applied NaHCO3 performance-related studies.

  18. Secondary dispersal driven by overland flow in drylands: Review and mechanistic model development.

    PubMed

    Thompson, Sally E; Assouline, Shmuel; Chen, Li; Trahktenbrot, Ana; Svoray, Tal; Katul, Gabriel G

    2014-01-01

    Seed dispersal alters gene flow, reproduction, migration and ultimately spatial organization of dryland ecosystems. Because many seeds in drylands lack adaptations for long-distance dispersal, seed transport by secondary processes such as tumbling in the wind or mobilization in overland flow plays a dominant role in determining where seeds ultimately germinate. Here, recent developments in modeling runoff generation in spatially complex dryland ecosystems are reviewed with the aim of proposing improvements to mechanistic modeling of seed dispersal processes. The objective is to develop a physically-based yet operational framework for determining seed dispersal due to surface runoff, a process that has gained recent experimental attention. A Buoyant OBject Coupled Eulerian-Lagrangian Closure model (BOB-CELC) is proposed to represent seed movement in shallow surface flows. The BOB-CELC is then employed to investigate the sensitivity of seed transport to landscape and storm properties and to the spatial configuration of vegetation patches interspersed within bare earth. The potential to simplify seed transport outcomes by considering the limiting behavior of multiple runoff events is briefly considered, as is the potential for developing highly mechanistic, spatially explicit models that link seed transport, vegetation structure and water movement across multiple generations of dryland plants.
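
    The abstract does not give the BOB-CELC equations, so the following is only a toy Lagrangian transport sketch, with invented velocity, dispersion, and settling parameters, to illustrate the general idea of tracking buoyant seeds carried by shallow overland flow until they deposit.

```python
import numpy as np

rng = np.random.default_rng(0)

def advect_seeds(n_seeds, flow_velocity, settle_prob, dt, n_steps):
    """Toy Lagrangian transport of buoyant seeds in shallow overland flow:
    each active seed drifts with the flow plus random dispersion until it settles."""
    x = np.zeros(n_seeds)                                    # downslope position (m)
    active = np.ones(n_seeds, dtype=bool)
    for _ in range(n_steps):
        dispersion = rng.normal(0.0, 0.02, size=n_seeds)     # arbitrary dispersion scale (m)
        x[active] += flow_velocity * dt + dispersion[active]
        active &= rng.random(n_seeds) > settle_prob          # settled seeds stay settled
    return x

final_x = advect_seeds(1000, flow_velocity=0.05, settle_prob=0.02, dt=1.0, n_steps=300)
print(f"mean travel distance {final_x.mean():.1f} m, spread {final_x.std():.1f} m")
```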

  19. Simulating the Risk of Liver Fluke Infection using a Mechanistic Hydro-epidemiological Model

    NASA Astrophysics Data System (ADS)

    Beltrame, Ludovica; Dunne, Toby; Rose, Hannah; Walker, Josephine; Morgan, Eric; Vickerman, Peter; Wagener, Thorsten

    2016-04-01

    Liver Fluke (Fasciola hepatica) is a common parasite found in livestock and responsible for considerable economic losses throughout the world. Risk of infection is strongly influenced by climatic and hydrological conditions, which characterise the host environment for parasite development and transmission. Despite ongoing control efforts, increases in fluke outbreaks have been reported in recent years in the UK and have often been attributed to climate change. Currently used fluke risk models are based on empirical relationships between historical climate and incidence data. However, hydro-climate conditions are becoming increasingly non-stationary due to climate change and direct anthropogenic impacts such as land use change, making empirical models unsuitable for simulating future risk. In this study we introduce a mechanistic hydro-epidemiological model for Liver Fluke, which explicitly simulates habitat suitability for disease development in space and time, representing the parasite life cycle in connection with key environmental conditions. The model is used to assess patterns of Liver Fluke risk for two catchments in the UK under current and potential future climate conditions. Comparisons are made with a widely used empirical model employing different datasets, including data from regional veterinary laboratories. Results suggest that mechanistic models can achieve adequate predictive ability and support adaptive fluke control strategies under climate change scenarios.

  20. Considerations for potency equivalent calculations in the Ah receptor-based CALUX bioassay: normalization of superinduction results for improved sample potency estimation.

    PubMed

    Baston, David S; Denison, Michael S

    2011-02-15

    The chemically activated luciferase expression (CALUX) system is a mechanistically based recombinant luciferase reporter gene cell bioassay used in combination with chemical extraction and clean-up methods for the detection and relative quantitation of 2,3,7,8-tetrachlorodibenzo-p-dioxin and related dioxin-like halogenated aromatic hydrocarbons in a wide variety of sample matrices. While sample extracts containing complex mixtures of chemicals can produce a variety of distinct concentration-dependent luciferase induction responses in CALUX cells, these effects are produced through a common mechanism of action (i.e. the Ah receptor (AhR)) allowing normalization of results and sample potency determination. Here we describe the diversity in CALUX response to PCDD/Fs from sediment and soil extracts and not only report the occurrence of superinduction of the CALUX bioassay, but we describe a mechanistically based approach for normalization of superinduction data that results in a more accurate estimation of the relative potency of such sample extracts. Copyright © 2010 Elsevier B.V. All rights reserved.
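
    One hedged illustration of the normalization idea, not the authors' published procedure, is sketched below: a superinducing extract is capped at the maximum response of the TCDD standard before Hill-curve fitting, and the relative potency is taken as the ratio of EC50 values. All data values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, slope):
    """Four-parameter Hill curve for luciferase induction vs. concentration."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** slope)

def relative_potency(conc, resp_sample, resp_tcdd):
    """Cap a superinducing sample at the TCDD maximum before curve fitting, then
    report the EC50 ratio as the sample's relative potency (illustrative approach only)."""
    p_tcdd, _ = curve_fit(hill, conc, resp_tcdd,
                          p0=[1.0, resp_tcdd.max(), np.median(conc), 1.0], bounds=(0, np.inf))
    capped = np.minimum(resp_sample, p_tcdd[1])      # normalize superinduction to the standard's maximum
    p_smp, _ = curve_fit(hill, conc, capped,
                         p0=[1.0, p_tcdd[1], np.median(conc), 1.0], bounds=(0, np.inf))
    return p_tcdd[2] / p_smp[2]

# Hypothetical dilution series (arbitrary units); the extract superinduces above the TCDD plateau.
conc = np.logspace(-2, 2, 9)
tcdd = hill(conc, 5, 100, 1.0, 1.2)
extract = hill(conc, 5, 130, 0.25, 1.2)
print(relative_potency(conc, extract, tcdd))         # > 1: extract is more potent than the standard
```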

  1. Considerations for potency equivalent calculations in the Ah receptor-based CALUX bioassay: Normalization of superinduction results for improved sample potency estimation

    PubMed Central

    Baston, David S.; Denison, Michael S.

    2011-01-01

    The chemically activated luciferase expression (CALUX) system is a mechanistically based recombinant luciferase reporter gene cell bioassay used in combination with chemical extraction and clean-up methods for the detection and relative quantitation of 2,3,7,8-tetrachlorodibenzo-p-dioxin and related dioxin-like halogenated aromatic hydrocarbons in a wide variety of sample matrices. While sample extracts containing complex mixtures of chemicals can produce a variety of distinct concentration-dependent luciferase induction responses in CALUX cells, these effects are produced through a common mechanism of action (i.e. the Ah receptor (AhR)) allowing normalization of results and sample potency determination. Here we describe the diversity in CALUX response to PCDD/Fs from sediment and soil extracts and not only report the occurrence of superinduction of the CALUX bioassay, but we describe a mechanistically based approach for normalization of superinduction data that results in a more accurate estimation of the relative potency of such sample extracts. PMID:21238730

  2. A new charge-tagged proline-based organocatalyst for mechanistic studies using electrospray mass spectrometry

    PubMed Central

    Willms, J Alexander; Beel, Rita; Schmidt, Martin L; Mundt, Christian

    2014-01-01

    Summary A new 4-hydroxy-L-proline derivative with a charged 1-ethylpyridinium-4-phenoxy substituent has been synthesized with the aim of facilitating mechanistic studies of proline-catalyzed reactions by ESI mass spectrometry. The charged residue ensures a strongly enhanced ESI response compared to neutral unmodified proline. The connection by a rigid linker fixes the position of the charge tag far away from the catalytic center in order to avoid unwanted interactions. The use of a charged catalyst leads to significantly enhanced ESI signal abundances for every catalyst-derived species which are the ones of highest interest present in a reacting solution. The new charged proline catalyst has been tested in the direct asymmetric inverse aldol reaction between aldehydes and diethyl ketomalonate. Two intermediates in accordance with the List–Houk mechanism for enamine catalysis have been detected and characterized by gas-phase fragmentation. In addition, their temporal evolution has been followed using a microreactor continuous-flow technique. PMID:25246962

  3. Structural, mechanistic and functional insight into gliotoxin bis-thiomethylation in Aspergillus fumigatus

    PubMed Central

    Dolan, Stephen K.; Bock, Tobias; Hering, Vanessa; Owens, Rebecca A.; Jones, Gary W.

    2017-01-01

    Gliotoxin is an epipolythiodioxopiperazine (ETP) class toxin, contains a disulfide bridge that mediates its toxic effects via redox cycling and is produced by the opportunistic fungal pathogen Aspergillus fumigatus. Self-resistance against gliotoxin is effected by the gliotoxin oxidase GliT, and attenuation of gliotoxin biosynthesis is catalysed by gliotoxin S-methyltransferase GtmA. Here we describe the X-ray crystal structures of GtmA-apo (1.66 Å), GtmA complexed to S-adenosylhomocysteine (1.33 Å) and GtmA complexed to S-adenosylmethionine (2.28 Å), providing mechanistic insights into this important biotransformation. We further reveal that simultaneous elimination of the ability of A. fumigatus to dissipate highly reactive dithiol gliotoxin, via deletion of GliT and GtmA, results in the most significant hypersensitivity to exogenous gliotoxin observed to date. Indeed, quantitative proteomic analysis of ΔgliT::ΔgtmA reveals an uncontrolled over-activation of the gli-cluster upon gliotoxin exposure. The data presented herein reveal, for the first time, the extreme risk associated with intracellular dithiol gliotoxin biosynthesis—in the absence of an efficient dismutation capacity. Significantly, a previously concealed protective role for GtmA and functionality of ETP bis-thiomethylation as an ancestral protection strategy against dithiol compounds is now evident. PMID:28179499

  4. Structural, mechanistic and functional insight into gliotoxin bis-thiomethylation in Aspergillus fumigatus.

    PubMed

    Dolan, Stephen K; Bock, Tobias; Hering, Vanessa; Owens, Rebecca A; Jones, Gary W; Blankenfeldt, Wulf; Doyle, Sean

    2017-02-01

    Gliotoxin is an epipolythiodioxopiperazine (ETP) class toxin, contains a disulfide bridge that mediates its toxic effects via redox cycling and is produced by the opportunistic fungal pathogen Aspergillus fumigatus. Self-resistance against gliotoxin is effected by the gliotoxin oxidase GliT, and attenuation of gliotoxin biosynthesis is catalysed by gliotoxin S-methyltransferase GtmA. Here we describe the X-ray crystal structures of GtmA-apo (1.66 Å), GtmA complexed to S-adenosylhomocysteine (1.33 Å) and GtmA complexed to S-adenosylmethionine (2.28 Å), providing mechanistic insights into this important biotransformation. We further reveal that simultaneous elimination of the ability of A. fumigatus to dissipate highly reactive dithiol gliotoxin, via deletion of GliT and GtmA, results in the most significant hypersensitivity to exogenous gliotoxin observed to date. Indeed, quantitative proteomic analysis of ΔgliT::ΔgtmA reveals an uncontrolled over-activation of the gli-cluster upon gliotoxin exposure. The data presented herein reveal, for the first time, the extreme risk associated with intracellular dithiol gliotoxin biosynthesis in the absence of an efficient dismutation capacity. Significantly, a previously concealed protective role for GtmA and functionality of ETP bis-thiomethylation as an ancestral protection strategy against dithiol compounds is now evident. © 2017 The Authors.

  5. REDUCING UNCERTAINTY IN RISK ASSESSMENT USING MECHANISTIC DATA: ENHANCING THE U.S. EPA DEVELOPMENTAL NEUROTOXICITY TESTING GUIDELINES

    EPA Science Inventory

    SUMMARY: Mechanistic data should provide the Agency with a more accurate basis to estimate risk than do the Agency’s default assumptions (10x uncertainty factors, etc.), thereby improving risk assessment decisions. NTD is providing mechanistic data for toxicant effects on two maj...

  6. Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.

    PubMed

    Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin

    2018-01-08

    We demonstrate a simple method for quantitative phase imaging of tiny transparent objects such as living cells based on the transport of intensity equation. The experiments are performed using an inverted bright field microscope upgraded with a flipping imaging module, which enables the simultaneous creation of two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by the measurement of a microlens array and human osteoblastic cells in culture, indicating its potential for dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
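
    For reference, this phase retrieval rests on the standard transport of intensity equation, with the two unequally defocused images providing a finite-difference estimate of the axial intensity derivative (k is the wavenumber, I the intensity, and φ the phase):

```latex
% Transport of intensity equation (standard form); the two defocused images give
% a finite-difference estimate of dI/dz on the left-hand side.
\[
  -k \, \frac{\partial I(\mathbf{r}_\perp, z)}{\partial z}
  \;=\;
  \nabla_\perp \cdot \left[ I(\mathbf{r}_\perp, z)\, \nabla_\perp \varphi(\mathbf{r}_\perp) \right],
  \qquad k = \frac{2\pi}{\lambda}.
\]
```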

  7. Understanding the influence of biofilm accumulation on the hydraulic properties of soils: a mechanistic approach based on experimental data

    NASA Astrophysics Data System (ADS)

    Carles Brangarí, Albert; Sanchez-Vila, Xavier; Freixa, Anna; Romaní, Anna M.; Fernàndez-Garcia, Daniel

    2017-04-01

    The distribution, amount, and characteristics of biofilms and their components govern the capacity of soils to conduct water, to transport solutes, and to support the reactions occurring within them. Therefore, unraveling the relationship between microbial dynamics and the hydraulic properties of soils is of concern for the management of natural systems and many technological applications. However, the complexity of both the microbial communities and the geochemical processes they drive means that the phenomenon of bioclogging remains poorly understood. This highlights the need for a better understanding of the microbial components, such as live and dead bacteria and extracellular polymeric substances (EPS), as well as of their spatial distribution. This work tries to shed some light on these issues, providing experimental data and a new mechanistic model that predicts the variably saturated hydraulic properties of bio-amended soils based on these data. We first present a long-term laboratory infiltration experiment that aims at studying the temporal variation of selected biogeochemical parameters along the infiltration path. The setup consists of a 120-cm-high soil tank instrumented with an array of sensors plus soil and liquid samplers. Sensors continuously measured a wide range of parameters, such as volumetric water content, electrical conductivity, temperature, water pressure, soil suction, dissolved oxygen, and pH. Samples were kept for chemical and biological analyses. Results indicate that: i) biofilm is present at all depths, denoting the potential for deep bioclogging; ii) the redox conditions profile shows different stages, indicating that the community was adapted to changing redox conditions; iii) bacterial activity, richness and diversity also exhibit zonation with depth; and iv) the hydraulic properties of the soil experienced significant changes as biofilm proliferated. Based on experimental evidence, we propose a tool to predict changes in the

  8. Does Mechanistic Thinking Improve Student Success in Organic Chemistry?

    ERIC Educational Resources Information Center

    Grove, Nathaniel P.; Cooper, Melanie M.; Cox, Elizabeth L.

    2012-01-01

    The use of the curved-arrow notation to depict electron flow during mechanistic processes is one of the most important representational conventions in the organic chemistry curriculum. Our previous research documented a disturbing trend: when asked to predict the products of a series of reactions, many students do not spontaneously engage in…

  9. Liquid crystal-based biosensor with backscattering interferometry: A quantitative approach.

    PubMed

    Khan, Mashooq; Park, Soo-Young

    2017-01-15

    We developed a new technology that uses backscattering interferometry (BSI) to quantitatively measure nematic liquid crystal (NLC)-based biosensors, which usually rely on texture reading for on/off signals. The LC-based BSI comprised an octadecyltrichlorosilane (OTS)-coated square capillary filled with 4-cyano-4'-pentylbiphenyl (5CB, a nematic LC at room temperature). The LC/water interface in the capillary was functionalized by a coating of poly(acrylicacid-b-4-cyanobiphenyl-4'-oxyundecylacrylate) (PAA-b-LCP) and immobilized with the enzymes glucose oxidase (GOx) and horseradish peroxidase (HRP) through covalent linkage to the PAA chains (5CB PAA-GOx:HRP) for glucose detection. Laser irradiation of the LC near the LC/water interface resulted in backscattered fringes with high contrast. The change in the spatial position of the fringes (because of the change in the orientation of the LC caused by the GOx:HRP enzymatic reaction of glucose) altered the output voltage of the photodetector when its active area was aligned with the edge of one of the fringes. The change in the intensity at the photodetector allowed the detection limit of the instrument to be as low as 0.008 mM with a linear range of 0.02-9 mM in a short response time (~60 s). This LC-based BSI technique allows for quantitative, sensitive, selective, reproducible, easily obtainable, and interference-free detection in a large linear dynamic range and for practical applications with human serum. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. A CZT-based blood counter for quantitative molecular imaging.

    PubMed

    Espagnet, Romain; Frezza, Andrea; Martin, Jean-Pierre; Hamel, Louis-André; Lechippey, Laëtitia; Beauregard, Jean-Mathieu; Després, Philippe

    2017-12-01

    Robust quantitative analysis in positron emission tomography (PET) and in single-photon emission computed tomography (SPECT) typically requires the time-activity curve as an input function for the pharmacokinetic modeling of tracer uptake. For this purpose, a new automated tool for the determination of blood activity as a function of time is presented. The device, compact enough to be used on the patient bed, relies on a peristaltic pump for continuous blood withdrawal at user-defined rates. Gamma detection is based on a 20 × 20 × 15 mm3 cadmium zinc telluride (CZT) detector, read by custom-made electronics and a field-programmable gate array-based signal processing unit. A graphical user interface (GUI) allows users to select parameters and easily perform acquisitions. This paper presents the overall design of the device as well as the results related to the detector performance in terms of stability, sensitivity and energy resolution. Results from a patient study are also reported. The device achieved a sensitivity of 7.1 cps/(kBq/mL) and a minimum detectable activity of 2.5 kBq/mL for 18F. The gamma counter also demonstrated an excellent stability with a deviation in count rates inferior to 0.05% over 6 h. An energy resolution of 8% was achieved at 662 keV. The patient study was conclusive and demonstrated that the compact gamma blood counter developed has the sensitivity and the stability required to conduct quantitative molecular imaging studies in PET and SPECT.
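
    Converting the counter's reading into a blood activity concentration is a one-line calculation using the sensitivity reported above; the sketch below uses that published figure for 18F, while the count rates in the example are invented.

```python
def activity_concentration(count_rate_cps, background_cps, sensitivity_cps_per_kbq_ml=7.1):
    """Convert a background-corrected count rate to blood activity concentration (kBq/mL),
    using the detector sensitivity reported for 18F (7.1 cps per kBq/mL)."""
    net = max(count_rate_cps - background_cps, 0.0)
    return net / sensitivity_cps_per_kbq_ml

# Illustrative reading: 150 cps measured, 8 cps background -> ~20 kBq/mL
print(activity_concentration(150.0, 8.0))
```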

  11. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies a problem-based learning method to quantitative analytical chemistry, the so-called "Analytical Chemistry II" course, with particular emphasis on essential oil analysis. The learning outcomes of this course include understanding of the lectures, the skills of applying course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course improves students' knowledge, skills, abilities and attitudes. Students are not only skilled at solving problems in analytical chemistry, especially essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also able to work with computer programs and to understand materials and problems in English.

  12. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    NASA Astrophysics Data System (ADS)

    Walters, Charles David

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008) related to quantitative reasoning. However, this may prove challenging, as prior to entering the classroom, PSTs often have few opportunities to develop MKT by examining and reflecting on students' thinking. Videos offer one avenue through which such opportunities are possible. In this study, I report on the design of a mini-course for PSTs that featured a series of videos created as part of a proof-of-concept NSF-funded project. These MathTalk videos highlight the ways in which the quantitative reasoning of two high school students developed over time. Using a mixed approach to grounded theory, I analyzed pre- and postinterviews using an extant coding scheme based on the Silverman and Thompson (2008) framework for the development of MKT. This analysis revealed a shift in participants' affect as well as three distinct shifts in their MKT around quantitative reasoning with distances, including shifts in: (a) quantitative reasoning; (b) point of view (decentering); and (c) orientation toward problem solving. Using the four-part focusing framework (Lobato, Hohensee, & Rhodehamel, 2013), I analyzed classroom data to account for how participants' noticing was linked with the shifts in MKT. Notably, their increased noticing of aspects of MKT around quantitative reasoning with distances, which features prominently in the MathTalk videos, seemed to contribute to the emergence of the shifts in MKT. Results from this study link elements of the learning environment to the development of specific facets of MKT around quantitative reasoning with distances. These connections suggest that vicarious experiences with two students' quantitative

  13. A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format

    ERIC Educational Resources Information Center

    Moore, Vivianne E.

    2013-01-01

    This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of health record. The central problem investigated in this research study related to the format of health record used and consumer satisfaction with care provided and effect on communication…

  14. The evolution of honey bee dance communication: a mechanistic perspective.

    PubMed

    Barron, Andrew B; Plath, Jenny Aino

    2017-12-01

    Honey bee dance has been intensively studied as a communication system, and yet we still know very little about the neurobiological mechanisms supporting how dances are produced and interpreted. Here, we discuss how new information on the functions of the central complex (CX) of the insect brain might shed some light on possible neural mechanisms of dance behaviour. We summarise the features of dance communication across the species of the genus Apis. We then propose that neural mechanisms of orientation and spatial processing found to be supported by the CX may function in dance communication also, and that this mechanistic link could explain some specific features of the dance form. This is purely a hypothesis, but in proposing this hypothesis, and how it might be investigated, we hope to stimulate new mechanistic analyses of dance communication. © 2017. Published by The Company of Biologists Ltd.

  15. Smoking cessation alters intestinal microbiota: insights from quantitative investigations on human fecal samples using FISH.

    PubMed

    Biedermann, Luc; Brülisauer, Karin; Zeitz, Jonas; Frei, Pascal; Scharl, Michael; Vavricka, Stephan R; Fried, Michael; Loessner, Martin J; Rogler, Gerhard; Schuppler, Markus

    2014-09-01

    There has been a dramatic increase in investigations on the potential mechanistic role of the intestinal microbiota in various diseases and factors modulating intestinal microbial composition. We recently reported on intestinal microbial shifts after smoking cessation in humans. In this study, we aimed to conduct further microbial analyses and verify our previous results obtained by pyrosequencing using a direct quantitative microbial approach. Stool samples of healthy smoking human subjects undergoing controlled smoking cessation during a 9-week observational period were analyzed and compared with 2 control groups, ongoing smoking and nonsmoking subjects. Fluorescence in situ hybridization was applied to quantify specific bacterial groups. Intestinal microbiota composition was substantially altered after smoking cessation as characterized by an increase in key representatives from the phyla of Firmicutes (Clostridium coccoides, Eubacterium rectale, and Clostridium leptum subgroup) and Actinobacteria (HGC bacteria and Bifidobacteria) as well as a decrease in Bacteroidetes (Prevotella spp. and Bacteroides spp.) and Proteobacteria (β- and γ-subgroup of Proteobacteria). As determined by fluorescence in situ hybridization, an independent direct quantitative microbial approach, we could confirm that intestinal microbiota composition in humans is influenced by smoking. The characteristics of observed microbial shifts suggest a potential mechanistic association to alterations in body weight subsequent to smoking cessation. More importantly, regarding previously described microbial hallmarks of dysbiosis in inflammatory bowel diseases, a variety of observed microbial alterations after smoking cessation deserve further consideration in view of the divergent effect of smoking on the clinical course of Crohn's disease and ulcerative colitis.

  16. Testicular Dysgenesis Syndrome and the Estrogen Hypothesis: A Quantitative Meta-Analysis

    PubMed Central

    Martin, Olwenn V.; Shialis, Tassos; Lester, John N.; Scrimshaw, Mark D.; Boobis, Alan R.; Voulvoulis, Nikolaos

    2008-01-01

    Background Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. Objectives We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-α–mediated mode of action was specifically explored. Results We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. Conclusions The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population. PMID:18288311

  17. Testicular dysgenesis syndrome and the estrogen hypothesis: a quantitative meta-analysis.

    PubMed

    Martin, Olwenn V; Shialis, Tassos; Lester, John N; Scrimshaw, Mark D; Boobis, Alan R; Voulvoulis, Nikolaos

    2008-02-01

    Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-alpha-mediated mode of action was specifically explored. We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population.

  18. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system to determine the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. The role for its clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification applied by 5 different raters to an experimental MR spectroscopic quantitative fat measurement in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). By dichotomizing the scale the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of
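
    For readers unfamiliar with the statistic, the comparison reported above amounts to a Spearman rank correlation between ordinal grades and continuous fat/water ratios; a minimal sketch with hypothetical values follows.

```python
from scipy.stats import spearmanr

# Hypothetical example values: Goutallier grades (0-4) from one rater and
# the MR-spectroscopic fat/water ratio for the same supraspinatus muscles.
grades = [0, 1, 1, 2, 2, 3, 3, 4, 4, 2]
fat_water_ratio = [0.05, 0.08, 0.15, 0.12, 0.30, 0.28, 0.45, 0.50, 0.65, 0.10]

rho, p_value = spearmanr(grades, fat_water_ratio)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```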

  19. Existing pavement input information for the mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2009-02-01

    The objective of this study is to systematically evaluate the Iowa Department of Transportation's (DOT's) existing Pavement Management Information System (PMIS) with respect to the input information required for Mechanistic-Empirical Pavement Des...

  20. Calibrating the mechanistic-empirical pavement design guide for Kansas : [technical summary].

    DOT National Transportation Integrated Search

    2015-04-01

    The Kansas Department of Transportation (KDOT) is moving toward the implementation : of the new American Association of State Highway and Transportation Officials : (AASHTO) Mechanistic-Empirical Pavement Design Guide (MEPDG) for pavement : design. T...

  1. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.
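
    The quantitative read-out step implies a calibration curve relating reader signal to concentration. The sketch below is one plausible way to do this with a four-parameter logistic fit; it is not the authors' algorithm, and the calibration points are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: in a competitive ICA, signal falls as analyte rises."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical calibration: spiked concentrations (ug/kg) vs. test/control line intensity ratio.
conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.5, 5.0])
ratio = np.array([0.95, 0.90, 0.75, 0.55, 0.35, 0.15, 0.08])

params, _ = curve_fit(four_pl, conc, ratio, p0=[1.0, 1.0, 0.5, 0.05], bounds=(0, np.inf))
a, b, c, d = params

def concentration_from_ratio(r):
    """Invert the fitted 4PL to estimate mycotoxin concentration from a reader signal."""
    return c * ((a - d) / (r - d) - 1.0) ** (1.0 / b)

print(concentration_from_ratio(0.5))
```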

  2. Hidden Hydride Transfer as a Decisive Mechanistic Step in the Reactions of the Unligated Gold Carbide [AuC]+ with Methane under Ambient Conditions.

    PubMed

    Li, Jilai; Zhou, Shaodong; Schlangen, Maria; Weiske, Thomas; Schwarz, Helmut

    2016-10-10

    The reactivity of the cationic gold carbide [AuC]+ (bearing an electrophilic carbon atom) towards methane has been studied using Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS). The product pairs generated, that is, Au+/C2H4, [Au(C2H2)]+/H2, and [C2H3]+/AuH, point to the breaking and making of C-H, C-C, and H-H bonds under single-collision conditions. The mechanisms of these rather efficient reactions have been elucidated by high-level quantum-chemical calculations. As a major result, based on molecular orbital and NBO-based charge analysis, an unprecedented hydride transfer from methane to the carbon atom of [AuC]+ has been identified as a key step. Also, the origin of this novel mechanistic scenario has been addressed. The mechanistic insights derived from this study may provide guidance for the rational design of carbon-based catalysts. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Generative Mechanistic Explanation Building in Undergraduate Molecular and Cellular Biology

    ERIC Educational Resources Information Center

    Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.

    2017-01-01

    When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among…

  4. DMD-based quantitative phase microscopy and optical diffraction tomography

    NASA Astrophysics Data System (ADS)

    Zhou, Renjie

    2018-02-01

    Digital micromirror devices (DMDs), which offer high speed and a high degree of freedom in steering illumination light, have been increasingly applied to optical microscopy systems in recent years. Lately, we introduced DMDs into digital holography to enable new imaging modalities and break existing imaging limitations. In this paper, we first present our progress in using DMDs to demonstrate laser-illumination Fourier ptychographic microscopy (FPM) with shot-noise-limited detection. After that, we present a novel common-path quantitative phase microscopy (QPM) system based on a DMD. Building on those early developments, a DMD-based high-speed optical diffraction tomography (ODT) system has recently been demonstrated, and the results are also presented. This ODT system is able to achieve video-rate 3D refractive-index imaging, which can potentially enable observations of high-speed 3D structural changes in samples.

  5. A sampling framework for incorporating quantitative mass spectrometry data in protein interaction analysis.

    PubMed

    Tucker, George; Loh, Po-Ru; Berger, Bonnie

    2013-10-04

    Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely
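
    A minimal sketch of the general idea, not the published model, is given below: spectral counts are treated as Poisson-noisy, resampled into an ensemble of alternative binary outcomes, passed through any existing binary PPI scorer, and the resulting confidences are averaged. The scorer and counts shown are placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

def ensemble_ppi_scores(spectral_counts, binary_scorer, threshold=2, n_samples=200):
    """Propagate spectral-count uncertainty into a binary PPI method (illustrative only).
    spectral_counts: dict mapping (bait, prey) -> observed count.
    binary_scorer:   any function taking a set of (bait, prey) pairs and returning
                     a dict of interaction confidences (placeholder for an existing method)."""
    pairs = list(spectral_counts)
    totals = {p: 0.0 for p in pairs}
    for _ in range(n_samples):
        # Resample each count under Poisson noise, then threshold to a binary call.
        resampled = {p: rng.poisson(spectral_counts[p]) for p in pairs}
        observed = {p for p in pairs if resampled[p] >= threshold}
        scores = binary_scorer(observed)
        for p in pairs:
            totals[p] += scores.get(p, 0.0)
    return {p: totals[p] / n_samples for p in pairs}

# Trivial stand-in scorer: score 1.0 for any pair called observed in that resample.
demo = ensemble_ppi_scores({("A", "X"): 1, ("A", "Y"): 12}, lambda obs: {p: 1.0 for p in obs})
print(demo)   # the low-count pair ends up with an intermediate averaged score
```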

  6. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  7. A versatile quantitation platform based on platinum nanoparticles incorporated volumetric bar-chart chip for highly sensitive assays.

    PubMed

    Wang, Yuzhen; Zhu, Guixian; Qi, Wenjin; Li, Ying; Song, Yujun

    2016-11-15

    The platinum nanoparticle-incorporated volumetric bar-chart chip (PtNPs-V-Chip) can be used for point-of-care tests, providing a quantitative, visualized readout without any assistance from instruments, data processing, or graphic plotting. To improve the sensitivity of the PtNPs-V-Chip, hybridization chain reaction was employed in this quantitation platform for highly sensitive assays that can detect as little as 16 pM Ebola virus DNA, 0.01 ng/mL carcinoembryonic antigen (CEA), and as few as 10 HER2-expressing cancer cells. Based on this amplification strategy, a 100-fold decrease in the detection limit was achieved for DNA by increasing the number of platinum nanoparticle catalysts per captured analyte. This quantitation platform can also distinguish a single-base mismatch in DNA hybridization and observe the concentration threshold of CEA. The new strategy lays the foundation for this quantitation platform to be applied in forensic analysis, biothreat detection, clinical diagnostics and drug screening. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Is the time right for quantitative public health guidelines on sitting? A narrative review of sedentary behaviour research paradigms and findings.

    PubMed

    Stamatakis, Emmanuel; Ekelund, Ulf; Ding, Ding; Hamer, Mark; Bauman, Adrian E; Lee, I-Min

    2018-06-10

    Sedentary behaviour (SB) has been proposed as an 'independent' risk factor for chronic disease, attracting much research and media attention. Many countries have included generic, non-quantitative reductions in SB in their public health guidelines, and calls for quantitative SB targets are increasing. The aim of this narrative review is to critically evaluate key evidence areas relating to the development of guidance on sitting for adults. We carried out a non-systematic narrative evidence synthesis across seven key areas: (1) definition of SB, (2) independence of sitting from physical activity, (3) use of television viewing as a proxy of sitting, (4) interpretation of SB evidence, (5) evidence on 'sedentary breaks', (6) evidence on objectively measured SB and mortality, and (7) dose response of sitting and mortality/cardiovascular disease. Despite research progress, we still know little about the independent detrimental health effects of sitting, and the possibility that sitting is mostly the inverse of physical activity remains. Unresolved issues include an unclear definition, inconsistencies between mechanistic and epidemiological studies, over-reliance on surrogate outcomes, a very weak epidemiological evidence base to support the inclusion of 'sedentary breaks' in guidelines, reliance on self-reported sitting measures, and misinterpretation of data whereby methodologically inconsistent associations are claimed to be strong evidence. In conclusion, public health guidance requires a consistent evidence base but this is lacking for SB. The development of quantitative SB guidance, using an underdeveloped evidence base, is premature; any further recommendations for sedentary behaviour require development of the evidence base and refinement of the research paradigms used in the field. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise

  9. Redox-based epigenetic status in drug addiction: a potential contributor to gene priming and a mechanistic rationale for metabolic intervention

    PubMed Central

    Trivedi, Malav S.; Deth, Richard

    2015-01-01

    Alcohol and other drugs of abuse, including psychostimulants and opioids, can induce epigenetic changes: a contributing factor for drug addiction, tolerance, and associated withdrawal symptoms. DNA methylation is a major epigenetic mechanism and it is one of more than 200 methylation reactions supported by methyl donor S-adenosylmethionine (SAM). Levels of SAM are controlled by cellular redox status via the folate and vitamin B12-dependent enzyme methionine synthase (MS). For example, under oxidative conditions MS is inhibited, diverting its substrate homocysteine (HCY) to the trans sulfuration pathway. Alcohol, dopamine, and morphine, can alter intracellular levels of glutathione (GSH)-based cellular redox status, subsequently affecting SAM levels and DNA methylation status. Here, existing evidence is presented in a coherent manner to propose a novel hypothesis implicating the involvement of redox-based epigenetic changes in drug addiction. Further, we discuss how a “gene priming” phenomenon can contribute to the maintenance of redox and methylation status homeostasis under various stimuli including drugs of abuse. Additionally, a new mechanistic rationale for the use of metabolic interventions/redox-replenishers as symptomatic treatment of alcohol and other drug addiction and associated withdrawal symptoms is also provided. Hence, the current review article strengthens the hypothesis that neuronal metabolism has a critical bidirectional coupling with epigenetic changes in drug addiction exemplified by the link between redox-based metabolic changes and resultant epigenetic consequences under the effect of drugs of abuse. PMID:25657617

  10. Redox-based epigenetic status in drug addiction: a potential contributor to gene priming and a mechanistic rationale for metabolic intervention.

    PubMed

    Trivedi, Malav S; Deth, Richard

    2014-01-01

    Alcohol and other drugs of abuse, including psychostimulants and opioids, can induce epigenetic changes: a contributing factor for drug addiction, tolerance, and associated withdrawal symptoms. DNA methylation is a major epigenetic mechanism and it is one of more than 200 methylation reactions supported by methyl donor S-adenosylmethionine (SAM). Levels of SAM are controlled by cellular redox status via the folate and vitamin B12-dependent enzyme methionine synthase (MS). For example, under oxidative conditions MS is inhibited, diverting its substrate homocysteine (HCY) to the trans sulfuration pathway. Alcohol, dopamine, and morphine, can alter intracellular levels of glutathione (GSH)-based cellular redox status, subsequently affecting SAM levels and DNA methylation status. Here, existing evidence is presented in a coherent manner to propose a novel hypothesis implicating the involvement of redox-based epigenetic changes in drug addiction. Further, we discuss how a "gene priming" phenomenon can contribute to the maintenance of redox and methylation status homeostasis under various stimuli including drugs of abuse. Additionally, a new mechanistic rationale for the use of metabolic interventions/redox-replenishers as symptomatic treatment of alcohol and other drug addiction and associated withdrawal symptoms is also provided. Hence, the current review article strengthens the hypothesis that neuronal metabolism has a critical bidirectional coupling with epigenetic changes in drug addiction exemplified by the link between redox-based metabolic changes and resultant epigenetic consequences under the effect of drugs of abuse.

  11. USE OF MECHANISTIC DATA TO HELP DEFINE DOSE-RESPONSE CURVES

    EPA Science Inventory

    Use of Mechanistic Data to Help Define Dose-Response Curves

    The cancer risk assessment process described by the U.S. EPA necessitates a description of the dose-response curve for tumors in humans at low (environmental) exposures. This description can either be a default l...

  12. Dose-response relationships and extrapolation in toxicology - Mechanistic and statistical considerations

    EPA Science Inventory

    Controversy on toxicological dose-response relationships and low-dose extrapolation of respective risks is often the consequence of misleading data presentation, lack of differentiation between types of response variables, and diverging mechanistic interpretation. In this chapter...

  13. Implementation of the AASHTO mechanistic-empirical pavement design guide for Colorado.

    DOT National Transportation Integrated Search

    2000-01-01

    The objective of this project was to integrate the American Association of State Highway and Transportation Officials (AASHTO) Mechanistic-Empirical Pavement Design Guide, Interim Edition: A Manual of Practice and its accompanying software into the d...

  14. An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.

    PubMed

    Nicklas, Janice A; Buel, Eric

    2005-09-01

    The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR-based method offers the possibility of a faster and more quantitative assay. Alu sequences are primate-specific and are found in many copies in the human genome, making them an excellent target or marker for human DNA. This paper describes the development of a real-time Alu sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on time, and automated quantitation, as well as a large dynamic range (128 ng/microL to 0.5 pg/microL).
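
    The quantitation behind such an assay is standard-curve arithmetic: Ct values are regressed against the log of the standard concentrations and unknowns are interpolated. The sketch below uses hypothetical Ct values within the dynamic range quoted above.

```python
import numpy as np

# Hypothetical standard curve: human genomic DNA standards (ng/uL) and measured Ct values.
std_conc = np.array([128, 12.8, 1.28, 0.128, 0.0128])
std_ct = np.array([14.1, 17.5, 20.9, 24.3, 27.8])

slope, intercept = np.polyfit(np.log10(std_conc), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0            # ~1.0 corresponds to 100% amplification efficiency

def quantify(ct):
    """Interpolate an unknown sample's concentration (ng/uL) from its Ct value."""
    return 10 ** ((ct - intercept) / slope)

print(f"efficiency ~ {efficiency:.2f}; unknown at Ct 22.0 -> {quantify(22.0):.3f} ng/uL")
```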

  15. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  16. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background Inflammatory mediators can serve as biomarkers for the monitoring of disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the level of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR® detection system (fluorescence based) rather than chemiluminescence, and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity, down to 17 pg/ml, and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily to the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797

  17. "Standards"-Based Mathematics Curricula and the Promotion of Quantitative Literacy in Elementary School

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2015-01-01

    Background: Prior research has shown that students taught using "Standards"-based mathematics curricula tend to outperform students on measures of mathematics achievement. However, little research has focused particularly on the promotion of student quantitative literacy (QLT). In this study, the potential influence of the…

  18. Mechanistic Investigations into the Application of Sulfoxides in Carbohydrate Synthesis

    PubMed Central

    Brabham, Robin

    2016-01-01

    Abstract The utility of sulfoxides in a diverse range of transformations in the field of carbohydrate chemistry has seen rapid growth since the first introduction of a sulfoxide as a glycosyl donor in 1989. Sulfoxides have since developed into more than just anomeric leaving groups, and today have multiple roles in glycosylation reactions. These include as activators for thioglycosides, hemiacetals, and glycals, and as precursors to glycosyl triflates, which are essential for stereoselective β‐mannoside synthesis, and to bicyclic sulfonium ions that facilitate the stereoselective synthesis of α‐glycosides. In this review we highlight the mechanistic investigations undertaken in this area, often outlining strategies employed to differentiate between multiple proposed reaction pathways, and how the conclusions of these investigations have informed, and continue to inform, the development of more efficient transformations in sulfoxide‐based carbohydrate synthesis. PMID:26744250

  19. Acrylamide formation in food: a mechanistic perspective.

    PubMed

    Yaylayan, Varoujan A; Stadler, Richard H

    2005-01-01

    The earliest reports on the origin of acrylamide in food confirmed asparagine as the main amino acid responsible for its formation. Available evidence suggests that sugars and other carbonyl compounds play a specific role in the decarboxylation of asparagine, a necessary step in the generation of acrylamide. It has been proposed that the Schiff base intermediate formed between asparagine and the sugar provides a low-energy alternative to decarboxylation from the intact Amadori product, through generation and decomposition of an oxazolidin-5-one intermediate, leading to the formation of a relatively stable azomethine ylide. Literature data indicate the propensity of such protonated ylides to undergo an irreversible 1,2-prototropic shift and produce, in this case, decarboxylated Schiff bases which can easily rearrange into the corresponding Amadori products. Decarboxylated Amadori products can either undergo the well-known beta-elimination process initiated by the sugar moiety to produce 3-aminopropanamide and 1-deoxyglucosone, or undergo 1,2-elimination initiated by the amino acid moiety to directly generate acrylamide. On the other hand, the decarboxylated Schiff base intermediate can either hydrolyze and release 3-aminopropanamide or similarly undergo amino acid-initiated 1,2-elimination to directly form acrylamide. Other thermolytic pathways to acrylamide, considered marginal at this stage, via the Strecker aldehyde, acrolein, and acrylic acid, are also addressed. Despite significant progress in the understanding of the mechanistic aspects of acrylamide formation, concrete evidence for the role of the different proposed intermediates in foods is still lacking.

  20. The quantitative architecture of centromeric chromatin

    PubMed Central

    Bodor, Dani L; Mata, João F; Sergeev, Mikhail; David, Ana Filipa; Salimian, Kevan J; Panchenko, Tanya; Cleveland, Don W; Black, Ben E; Shah, Jagesh V; Jansen, Lars ET

    2014-01-01

    The centromere, responsible for chromosome segregation during mitosis, is epigenetically defined by CENP-A containing chromatin. The amount of centromeric CENP-A has direct implications for both the architecture and epigenetic inheritance of centromeres. Using complementary strategies, we determined that typical human centromeres contain ∼400 molecules of CENP-A, which is controlled by a mass-action mechanism. This number, despite representing only ∼4% of all centromeric nucleosomes, corresponds to a ∼50-fold enrichment relative to the genome-wide average. In addition, although pre-assembled CENP-A is randomly segregated during cell division, this amount of CENP-A is sufficient to prevent stochastic loss of centromere function and identity. Finally, we produced a statistical map of CENP-A occupancy at a human neocentromere and identified nucleosome positions that feature CENP-A in a majority of cells. In summary, we present a quantitative view of the centromere that provides a mechanistic framework for both robust epigenetic inheritance of centromeres and the paucity of neocentromere formation. DOI: http://dx.doi.org/10.7554/eLife.02137.001 PMID:25027692

  1. A Quantitative ADME-based Tool for Exploring Human ...

    EPA Pesticide Factsheets

    Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically-defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “population,” or “professional” time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways. It

  2. Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models

    USGS Publications Warehouse

    Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.

    2011-01-01

    We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) a simulation study and (b) a lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
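
    As a rough illustration of the general idea behind such emulators (not the authors' implementation), the sketch below runs a toy mechanistic model over a parameter design, takes an SVD of the stacked outputs, and fits a first-order (linear) statistical model from parameters to the leading singular-vector coefficients; the toy model, parameter ranges, and number of retained modes are all illustrative assumptions.

```python
# Minimal sketch of a first-order emulator: run the mechanistic model over a
# design of parameter settings, take an SVD of the stacked outputs, and fit a
# simple statistical model mapping parameters -> singular-vector coefficients.
# All names and the toy "mechanistic_model" are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def mechanistic_model(theta, t):
    # Stand-in for an expensive simulator: damped oscillation with two parameters.
    return np.exp(-theta[0] * t) * np.sin(theta[1] * t)

t = np.linspace(0.0, 10.0, 200)
design = rng.uniform([0.1, 0.5], [1.0, 3.0], size=(50, 2))   # 50 training runs
Y = np.array([mechanistic_model(th, t) for th in design])     # (50, 200) outputs

# SVD of the centered output ensemble; keep the leading k modes.
U, s, Vt = np.linalg.svd(Y - Y.mean(axis=0), full_matrices=False)
k = 3
coeffs = U[:, :k] * s[:k]          # per-run coefficients on the leading modes

# First-order (here: linear in parameters) statistical model for each coefficient.
X = np.column_stack([np.ones(len(design)), design])
beta, *_ = np.linalg.lstsq(X, coeffs, rcond=None)

def emulate(theta):
    c = np.concatenate([[1.0], theta]) @ beta       # predicted mode coefficients
    return Y.mean(axis=0) + c @ Vt[:k]              # reconstructed model output

approx = emulate(np.array([0.4, 2.0]))              # fast surrogate prediction
```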

  3. Coupling machine learning with mechanistic models to study runoff production and river flow at the hillslope scale

    NASA Astrophysics Data System (ADS)

    Marçais, J.; Gupta, H. V.; De Dreuzy, J. R.; Troch, P. A. A.

    2016-12-01

    Geomorphological structure and geological heterogeneity of hillslopes are major controls on runoff responses. The diversity of hillslopes (morphological shapes and geological structures) on the one hand, and the highly nonlinear runoff response on the other, make it difficult to transpose what has been learned at one specific hillslope to another. Making reliable predictions of runoff generation or river flow for a given hillslope is therefore a challenge. Classic model calibration (based on inverse-problem techniques) must be repeated for each specific hillslope and requires data for calibration; when applied to thousands of cases, this is not always feasible. Here we propose a novel modeling framework that couples process-based models with a data-based approach. First, we develop a mechanistic model, based on the hillslope-storage Boussinesq equations (Troch et al. 2003), able to simulate nonlinear runoff responses to rainfall at the hillslope scale. Second, we set up a model database representing thousands of uncalibrated simulations. These simulations explore different hillslope shapes (real ones obtained by analyzing a 5 m digital elevation model of Brittany, and synthetic ones), different hillslope geological structures (i.e., different parametrizations) and different hydrologic forcing terms (i.e., different infiltration chronicles). We then train a machine learning model on this physically based database. The machine learning model's performance is assessed in a classic validation phase (testing it on new hillslopes and comparing the machine learning outputs with the mechanistic outputs). Finally, we use the machine learning model to learn which hillslope properties control runoff. This methodology will be further tested by combining synthetic datasets with real ones.
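
    A minimal sketch of this coupling, under assumed names and a placeholder simulator (the real work uses hillslope-storage Boussinesq simulations), is shown below: generate a library of uncalibrated runs, train a machine learning regressor on it, validate on held-out cases, and inspect feature importances to see which hillslope properties control the simulated runoff.

```python
# Sketch of the coupling idea: build a library of (hillslope descriptors, forcing)
# -> simulated discharge runs with a mechanistic model, then train a machine
# learning model on that library and validate it on held-out hillslopes.
# The "hsb_simulation" placeholder and feature names are assumptions, not the
# authors' code.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)

def hsb_simulation(slope, width_exponent, conductivity, rain_mm):
    # Placeholder for a hillslope-storage Boussinesq run: peak outflow (arbitrary units).
    return rain_mm * conductivity * (1 + slope) / (1 + np.exp(-width_exponent))

n = 2000
features = np.column_stack([
    rng.uniform(0.01, 0.5, n),    # slope
    rng.uniform(-1.0, 1.0, n),    # hillslope width exponent (plan shape)
    rng.uniform(0.1, 10.0, n),    # hydraulic conductivity
    rng.uniform(1.0, 100.0, n),   # rainfall forcing
])
target = np.array([hsb_simulation(*row) for row in features])

X_train, X_test, y_train, y_test = train_test_split(features, target, random_state=0)
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

print("held-out R^2:", r2_score(y_test, surrogate.predict(X_test)))
print("feature importances:", surrogate.feature_importances_)  # which properties control runoff
```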

  4. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the suitability of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD) and the specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR could achieve detection results with lower RSD compared with singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicate that this method is suitable for the routine detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.
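
    The core arithmetic of duplex digital PCR quantitation can be sketched as follows: droplet counts are converted to mean copies per partition with a Poisson correction, and the GMO content is the ratio of transgene to reference-gene copies. The counts and partition number below are illustrative, not values from the study.

```python
# Sketch of absolute quantitation from a duplex digital PCR run: convert
# positive/negative partition counts to copies per partition with a Poisson
# correction, then express GMO content as the ratio of transgene copies to
# reference-gene copies. All numbers are illustrative.
import math

def copies_per_partition(n_positive, n_total):
    # Poisson correction: lambda = -ln(fraction of negative partitions)
    return -math.log(1.0 - n_positive / n_total)

def gmo_ratio(pos_transgene, pos_reference, n_total):
    lam_t = copies_per_partition(pos_transgene, n_total)
    lam_r = copies_per_partition(pos_reference, n_total)
    return lam_t / lam_r

# Example: 15,000 partitions, 820 transgene-positive, 9,100 reference-positive.
print(f"GMO content ~ {100 * gmo_ratio(820, 9100, 15000):.2f} %")
```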

  5. Mathematical Description and Mechanistic Reasoning: A Pathway toward STEM Integration

    ERIC Educational Resources Information Center

    Weinberg, Paul J.

    2017-01-01

    Because reasoning about mechanism is critical to disciplined inquiry in science, technology, engineering, and mathematics (STEM) domains, this study focuses on ways to support the development of this form of reasoning. This study attends to how mechanistic reasoning is constituted through mathematical description. This study draws upon Smith's…

  6. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  7. Mechanistic Basis of Cocrystal Dissolution Advantage.

    PubMed

    Cao, Fengjuan; Amidon, Gordon L; Rodríguez-Hornedo, Naír; Amidon, Gregory E

    2018-01-01

    Current interest in cocrystal development resides in the advantages that the cocrystal may have in solubility and dissolution compared with the parent drug. This work provides a mechanistic analysis and comparison of the dissolution behavior of carbamazepine (CBZ) and its 2 cocrystals, carbamazepine-saccharin (CBZ-SAC) and carbamazepine-salicylic acid (CBZ-SLC) under the influence of pH and micellar solubilization. A simple mathematical equation is derived based on the mass transport analyses to describe the dissolution advantage of cocrystals. The dissolution advantage is the ratio of the cocrystal flux to drug flux and is defined as the solubility advantage (cocrystal to drug solubility ratio) times the diffusivity advantage (cocrystal to drug diffusivity ratio). In this work, the effective diffusivity of CBZ in the presence of surfactant was determined to be different and less than those of the cocrystals. The higher effective diffusivity of drug from the dissolved cocrystals, the diffusivity advantage, can impart a dissolution advantage to cocrystals with lower solubility than the parent drug while still maintaining thermodynamic stability. Dissolution conditions where cocrystals can display both thermodynamic stability and a dissolution advantage can be obtained from the mass transport models, and this information is useful for both cocrystal selection and formulation development. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
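
    The quoted relationship reduces to a simple product of ratios, sketched below with illustrative numbers (not measurements from the paper): a cocrystal with somewhat lower solubility than the parent drug can still show a dissolution advantage if its effective diffusivity is sufficiently higher.

```python
# The mass-transport result quoted above reduces to a product of two ratios:
# flux_cocrystal / flux_drug = (S_cocrystal / S_drug) * (D_cocrystal / D_drug).
# The numerical values below are illustrative, not data from the paper.
def dissolution_advantage(solubility_ratio, diffusivity_ratio):
    return solubility_ratio * diffusivity_ratio

# A cocrystal with 0.8x the drug's solubility but 1.5x its effective diffusivity
# still dissolves faster than the parent drug:
print(dissolution_advantage(0.8, 1.5))   # 1.2 > 1 -> dissolution advantage
```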

  8. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    PubMed

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  9. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure-activity relationship.

    PubMed

    Wang, Hui; Jiang, Mingyue; Li, Shujun; Hse, Chung-Yun; Jin, Chunde; Sun, Fangli; Li, Zhuo

    2017-09-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents for fungal infections. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure-activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and Penicillium citrinum (P. citrinum) were analysed. The QSAR models (R2 = 0.9346 for A. niger, R2 = 0.9590 for P. citrinum) were constructed and validated. The models indicated that the molecular polarity and the Max atomic orbital electronic population had a significant effect on antifungal activity. Based on the best QSAR models, two new compounds were designed and synthesized. Antifungal activity tests proved that both of them have great bioactivity against the selected fungi.

  10. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure–activity relationship

    PubMed Central

    Wang, Hui; Jiang, Mingyue; Hse, Chung-Yun; Jin, Chunde; Sun, Fangli; Li, Zhuo

    2017-01-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents for fungal infections. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure–activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and Penicillium citrinum (P. citrinum) were analysed. The QSAR models (R2 = 0.9346 for A. niger, R2 = 0.9590 for P. citrinum) were constructed and validated. The models indicated that the molecular polarity and the Max atomic orbital electronic population had a significant effect on antifungal activity. Based on the best QSAR models, two new compounds were designed and synthesized. Antifungal activity tests proved that both of them have great bioactivity against the selected fungi. PMID:28989758
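
    A generic sketch of how such a QSAR model is fitted and then used to screen candidate structures is shown below; the descriptor values, activities, and the choice of a plain linear regression are illustrative stand-ins for the descriptors and model form actually used in the study.

```python
# Minimal sketch of a QSAR workflow: regress measured antifungal activity on
# molecular descriptors, then predict activity for designed compounds before
# synthesis. Descriptor values and activities below are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Rows: training compounds; columns: [molecular polarity, max AO electronic population]
X = np.array([[1.2, 0.31], [1.8, 0.42], [2.1, 0.38], [2.6, 0.55], [3.0, 0.60]])
y = np.array([0.41, 0.58, 0.55, 0.79, 0.88])   # e.g. log(1/MIC) against A. niger

qsar = LinearRegression().fit(X, y)
print("R^2 on training data:", qsar.score(X, y))

# Screen two hypothetical new CAAS structures before synthesis:
candidates = np.array([[2.8, 0.58], [1.5, 0.33]])
print("predicted activities:", qsar.predict(candidates))
```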

  11. Contrast-enhanced spectral mammography based on a photon-counting detector: quantitative accuracy and radiation dose

    NASA Astrophysics Data System (ADS)

    Lee, Seungwan; Kang, Sooncheol; Eom, Jisoo

    2017-03-01

    Contrast-enhanced mammography has been used to demonstrate functional information about a breast tumor by injecting contrast agents. However, a conventional technique with a single exposure degrades the efficiency of tumor detection due to structure overlapping. Dual-energy techniques with energy-integrating detectors (EIDs) also cause an increase of radiation dose and an inaccuracy of material decomposition due to the limitations of EIDs. On the other hand, spectral mammography with photon-counting detectors (PCDs) is able to resolve the issues induced by the conventional technique and EIDs using their energy-discrimination capabilities. In this study, contrast-enhanced spectral mammography based on a PCD was implemented by using a polychromatic dual-energy model, and the proposed technique was compared with the dual-energy technique with an EID in terms of quantitative accuracy and radiation dose. The results showed that the proposed technique improved the quantitative accuracy as well as reduced radiation dose compared with the dual-energy technique with an EID. The quantitative accuracy of the contrast-enhanced spectral mammography based on a PCD was slightly improved as a function of radiation dose. Therefore, contrast-enhanced spectral mammography based on a PCD is able to provide useful information for detecting breast tumors and improving diagnostic accuracy.

  12. Kinetic and mechanistic reactivity. Isoprene impact on ozone levels in an urban area near Tijuca Forest, Rio de Janeiro.

    PubMed

    da Silva, Cleyton Martins; da Silva, Luane Lima; Corrêa, Sergio Machado; Arbilla, Graciela

    2016-12-01

    Volatile organic compounds (VOCs) play a central role in atmospheric chemistry. In this work, the kinetic and mechanistic reactivities of VOCs are analyzed, and the contribution of the organic compounds emitted by anthropogenic and natural sources is estimated. VOCs react with hydroxyl radicals and other photochemical oxidants, such as ozone and nitrate radicals; through various reaction paths these reactions convert NO to NO2, whose photolysis forms oxygen atoms that generate ozone. The kinetic reactivity was evaluated based on the reaction coefficients for hydroxyl radicals with VOCs. The mechanistic reactivity was estimated using a detailed mechanism and the incremental reactivity scale proposed by Carter. Different scenarios were proposed and discussed, and a minimum set of compounds that can describe the tropospheric reactivity in the studied area was determined. The role of isoprene was analyzed in terms of its contribution to ozone formation.

  13. [Quantitative classification-based occupational health management for electroplating enterprises in Baoan District of Shenzhen, China].

    PubMed

    Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua

    2014-04-01

    To improve occupational health management levels in electroplating enterprises with quantitative classification measures, and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table. These electroplating enterprises were classified into grades A, B, and C based on the scores. Among the 40 electroplating enterprises, 11 (27.5%) had scores of >85 points (grade A), 23 (57.5%) had scores of 60-85 points (grade B), and 6 (15.0%) had scores of <60 points (grade C). Quantitative classification-based management of electroplating enterprises is a valuable approach; it is helpful for supervision and management by the health department and provides an effective method for the self-management of enterprises.

  14. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generated different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfactory results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
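
    The scoring and ensemble ideas described above can be sketched roughly as follows; the weights, feature blocks, and ridge-regression base learners are illustrative assumptions rather than the authors' implementation.

```python
# Sketch of the scoring idea: a drug's side-effect profile (a 0/1 vector) is
# collapsed into a quantitative score by a weighted sum, and predictions from
# feature-specific models are combined by average scoring. Weights, feature
# blocks, and models here are illustrative stand-ins.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
n_drugs, n_effects = 200, 50

profiles = rng.integers(0, 2, size=(n_drugs, n_effects))     # side-effect presence/absence
weights = rng.random(n_effects)                               # empirical severity weights
scores = profiles @ weights                                   # quantitative score per drug

# Three drug-related feature blocks (chemical substructures, targets, indications).
features = {name: rng.random((n_drugs, 64)) for name in ["substructure", "target", "indication"]}

# Average-scoring ensemble: fit one model per feature block, average the predictions.
models = {name: Ridge(alpha=1.0).fit(X, scores) for name, X in features.items()}

def predict_score(per_block_features):
    preds = [models[name].predict(x.reshape(1, -1))[0] for name, x in per_block_features.items()]
    return float(np.mean(preds))

new_drug = {name: rng.random(64) for name in features}
print("predicted quantitative side-effect score:", predict_score(new_drug))
```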

  15. Precise Quantitation of MicroRNA in a Single Cell with Droplet Digital PCR Based on Ligation Reaction.

    PubMed

    Tian, Hui; Sun, Yuanyuan; Liu, Chenghui; Duan, Xinrui; Tang, Wei; Li, Zhengping

    2016-12-06

    MicroRNA (miRNA) analysis in a single cell is extremely important because it allows deep understanding of the exact correlation between the miRNAs and cell functions. Herein, we wish to report a highly sensitive and precisely quantitative assay for miRNA detection based on ligation-based droplet digital polymerase chain reaction (ddPCR), which permits the quantitation of miRNA in a single cell. In this ligation-based ddPCR assay, two target-specific oligonucleotide probes can be simply designed to be complementary to the half-sequence of the target miRNA, respectively, which avoids the sophisticated design of reverse transcription and provides high specificity to discriminate a single-base difference among miRNAs with simple operations. After the miRNA-templated ligation, the ddPCR partitions individual ligated products into a water-in-oil droplet and digitally counts the fluorescence-positive and negative droplets after PCR amplification for quantification of the target molecules, which possesses the power of precise quantitation and robustness to variation in PCR efficiency. By integrating the advantages of the precise quantification of ddPCR and the simplicity of the ligation-based PCR, the proposed method can sensitively measure let-7a miRNA with a detection limit of 20 aM (12 copies per microliter), and even a single-base difference can be discriminated in let-7 family members. More importantly, due to its high selectivity and sensitivity, the proposed method can achieve precise quantitation of miRNAs in single-cell lysate. Therefore, the ligation-based ddPCR assay may serve as a useful tool to exactly reveal the miRNAs' actions in a single cell, which is of great importance for the study of miRNAs' biofunction as well as for the related biomedical studies.

  16. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe.

    PubMed

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-09

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and in early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. Till now, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonant energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may lead to relatively large uncertainty in the results. Due to efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing has been achieved, featuring a response of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach shall facilitate research in pHi-related areas and the development of intracellular drug delivery systems.

  17. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and in early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. Till now, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonant energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may lead to relatively large uncertainty in the results. Due to efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing has been achieved, featuring a response of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach shall facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
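
    The self-ratiometric readout described in the two records above amounts to dividing the pH-sensitive 475 nm band by the pH-insensitive 645 nm reference band and mapping the ratio to pH through a calibration curve; the calibration numbers in the sketch below are invented for illustration, although the slope is chosen near the reported ~3.56 per pH unit.

```python
# Sketch of self-ratiometric readout: the pH-sensitive channel (FITC-modulated
# 475 nm upconversion band) is divided by the pH-insensitive reference band at
# 645 nm, and a calibration curve maps that ratio to pH. Calibration values
# below are hypothetical.
import numpy as np

# Hypothetical calibration measurements: (pH, I_475 / I_645)
cal_pH    = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cal_ratio = np.array([1.1, 4.6, 8.3, 11.8, 15.4])

slope, intercept = np.polyfit(cal_pH, cal_ratio, 1)   # ~3.56 per pH unit in the paper

def estimate_pH(i_475, i_645):
    ratio = i_475 / i_645          # self-ratiometric signal, insensitive to probe amount
    return (ratio - intercept) / slope

print(estimate_pH(i_475=9.0, i_645=1.0))   # ratio 9.0 -> roughly pH 5.2 with this calibration
```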

  18. A web-based quantitative signal detection system on adverse drug reaction in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan

    2009-07-01

    To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using the Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection, and to generate reports. The information component (IC), an internationally recognized measure of disproportionality for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations including known signals were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were flagged for the first time in the second quarter of 2007. The system can be used for the detection of significant associations from the Guangdong drug-monitoring database and could serve, for the first time in China, as an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs.
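
    The disproportionality measure at the heart of such systems can be sketched in its basic observed/expected form as below; production BCPNN implementations add Bayesian shrinkage to stabilize estimates for rare pairs, which is omitted here, and the counts are illustrative.

```python
# Sketch of the information component: compare how often a drug-ADR pair is
# reported with how often it would be expected under independence. This is the
# basic observed/expected form, without the Bayesian shrinkage used in
# production BCPNN implementations.
import math

def information_component(n_drug_adr, n_drug, n_adr, n_total):
    observed = n_drug_adr / n_total
    expected = (n_drug / n_total) * (n_adr / n_total)
    return math.log2(observed / expected)

# Example counts from a spontaneous-reporting database (illustrative numbers):
ic = information_component(n_drug_adr=40, n_drug=1200, n_adr=900, n_total=250_000)
print(f"IC = {ic:.2f}")   # IC > 0 suggests the pair is reported more often than expected
```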

  19. A quantitative link between face discrimination deficits and neuronal selectivity for faces in autism☆

    PubMed Central

    Jiang, Xiong; Bollich, Angela; Cox, Patrick; Hyder, Eric; James, Joette; Gowani, Saqib Ali; Hadjikhani, Nouchine; Blanz, Volker; Manoach, Dara S.; Barton, Jason J.S.; Gaillard, William D.; Riesenhuber, Maximilian

    2013-01-01

    Individuals with Autism Spectrum Disorder (ASD) appear to show a general face discrimination deficit across a range of tasks including social–emotional judgments as well as identification and discrimination. However, functional magnetic resonance imaging (fMRI) studies probing the neural bases of these behavioral differences have produced conflicting results: while some studies have reported reduced or no activity to faces in ASD in the Fusiform Face Area (FFA), a key region in human face processing, others have suggested more typical activation levels, possibly reflecting limitations of conventional fMRI techniques to characterize neuron-level processing. Here, we test the hypotheses that face discrimination abilities are highly heterogeneous in ASD and are mediated by FFA neurons, with differences in face discrimination abilities being quantitatively linked to variations in the estimated selectivity of face neurons in the FFA. Behavioral results revealed a wide distribution of face discrimination performance in ASD, ranging from typical performance to chance level performance. Despite this heterogeneity in perceptual abilities, individual face discrimination performance was well predicted by neural selectivity to faces in the FFA, estimated via both a novel analysis of local voxel-wise correlations, and the more commonly used fMRI rapid adaptation technique. Thus, face processing in ASD appears to rely on the FFA as in typical individuals, differing quantitatively but not qualitatively. These results for the first time mechanistically link variations in the ASD phenotype to specific differences in the typical face processing circuit, identifying promising targets for interventions. PMID:24179786

  20. Quantitation of DNA adducts by stable isotope dilution mass spectrometry

    PubMed Central

    Tretyakova, Natalia; Goggin, Melissa; Janis, Gregory

    2012-01-01

    Exposure to endogenous and exogenous chemicals can lead to the formation of structurally modified DNA bases (DNA adducts). If not repaired, these nucleobase lesions can cause polymerase errors during DNA replication, leading to heritable mutations potentially contributing to the development of cancer. Due to their critical role in cancer initiation, DNA adducts represent mechanism-based biomarkers of carcinogen exposure, and their quantitation is particularly useful for cancer risk assessment. DNA adducts are also valuable in mechanistic studies linking tumorigenic effects of environmental and industrial carcinogens to specific electrophilic species generated from their metabolism. While multiple experimental methodologies have been developed for DNA adduct analysis in biological samples – including immunoassay, HPLC, and 32P-postlabeling – isotope dilution high performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) generally has superior selectivity, sensitivity, accuracy, and reproducibility. As typical DNA adduct concentrations in biological samples are between 0.01 and 10 adducts per 10(8) normal nucleotides, ultrasensitive HPLC-ESI-MS/MS methodologies are required for their analysis. Recent developments in analytical separations and biological mass spectrometry – especially nanoflow HPLC, nanospray ionization MS, chip-MS, and high resolution MS – have pushed the limits of analytical HPLC-ESI-MS/MS methodologies for DNA adducts, allowing researchers to accurately measure their concentrations in biological samples from patients treated with DNA alkylating drugs and in populations exposed to carcinogens from urban air, drinking water, cooked food, alcohol, and cigarette smoke. PMID:22827593
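
    The quantitation step of isotope dilution is arithmetically simple: the analyte amount is read from the analyte-to-labeled-standard peak-area ratio and the known spiked amount. The sketch below assumes a response factor of 1 and uses invented numbers.

```python
# Sketch of stable isotope dilution quantitation: a known amount of an
# isotope-labeled internal standard is spiked into the sample, and the adduct
# amount is read from the analyte/standard peak-area ratio (assuming equal MS
# response, i.e. a response factor of 1). Numbers are illustrative.
def adduct_amount(area_analyte, area_labeled_std, amount_std_fmol, response_factor=1.0):
    return (area_analyte / area_labeled_std) * amount_std_fmol / response_factor

# Hypothetical example: 50 fmol of labeled standard spiked in; peak-area ratio 0.12.
amount = adduct_amount(area_analyte=1.2e5, area_labeled_std=1.0e6, amount_std_fmol=50.0)
print(f"{amount:.1f} fmol adduct in the digested DNA aliquot")
```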

  1. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371
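
    A toy sketch of the coarse-to-fine, fixed-point flavor of such a scheme is given below; the one-dimensional Beer-Lambert-style fluence model, grid sizes, and update rule are illustrative stand-ins for the actual optical forward model and inversion used in the paper.

```python
# Toy sketch of a coarse-to-fine fixed-point reconstruction: at each resolution,
# iterate mu <- p_measured / fluence(mu), then prolong the estimate to the next
# finer grid and refine. The 1D exponential fluence model is a stand-in for a
# real optical forward model.
import numpy as np

def fluence(mu, dx):
    # Simple Beer-Lambert style fluence decaying with accumulated absorption.
    return np.exp(-np.cumsum(mu) * dx)

def reconstruct(p_measured, n_levels=3, iters=20):
    mu = None
    for level in range(n_levels):
        size = p_measured.size // (2 ** (n_levels - 1 - level))
        p = p_measured[:: p_measured.size // size][:size]     # coarse sampling of the data
        dx = 1.0 / size
        if mu is None:
            mu = np.full(size, 0.5)                            # initial guess on coarsest grid
        else:
            mu = np.repeat(mu, 2)[:size]                       # prolong to finer grid
        for _ in range(iters):
            mu = p / np.maximum(fluence(mu, dx), 1e-12)        # fixed-point update
    return mu

# Synthetic ground truth and noiseless data on the finest grid:
n = 128
x = np.linspace(0, 1, n)
mu_true = 1.0 + 0.5 * (x > 0.5)
p_meas = mu_true * fluence(mu_true, 1.0 / n)

mu_rec = reconstruct(p_meas)
print("max abs error:", np.abs(mu_rec - mu_true).max())
```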

  2. Mechanistic explanation, cognitive systems demarcation, and extended cognition.

    PubMed

    van Eck, Dingmar; Looren de Jong, Huib

    2016-10-01

    Approaches to the Internalism-Externalism controversy in the philosophy of mind often involve both (broadly) metaphysical and explanatory considerations. Whereas originally most emphasis seems to have been placed on metaphysical concerns, recently the explanation angle is getting more attention. Explanatory considerations promise to offer more neutral grounds for cognitive systems demarcation than (broadly) metaphysical ones. However, it has been argued that explanation-based approaches are incapable of determining the plausibility of internalist-based conceptions of cognition vis-à-vis externalist ones. On this perspective, improved metaphysics is the route along which to solve the Internalist-Externalist stalemate. In this paper we challenge this claim. Although we agree that explanation-orientated approaches have indeed so far failed to deliver solid means for cognitive system demarcation, we elaborate a more promising explanation-oriented framework to address this issue. We argue that the mutual manipulability account of constitutive relevance in mechanisms, extended with the criterion of 'fat-handedness', is capable of plausibly addressing the cognitive systems demarcation problem, and thus able to decide on the explanatory traction of Internalist vs. Externalist conceptions, on a case-by-case basis. Our analysis also highlights why some other recent mechanistic takes on the problem of cognitive systems demarcation have been unsuccessful. We illustrate our claims with a case on gestures and learning. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Network-based discovery through mechanistic systems biology. Implications for applications--SMEs and drug discovery: where the action is.

    PubMed

    Benson, Neil

    2015-08-01

    Phase II attrition remains the most important challenge for drug discovery. Tackling the problem requires improved understanding of the complexity of disease biology. Systems biology approaches to this problem can, in principle, deliver this. This article reviews the reports of the application of mechanistic systems models to drug discovery questions and discusses the added value. Although we are on the journey to the virtual human, the length, path and rate of learning from this remain an open question. Success will be dependent on the will to invest and make the most of the insight generated along the way. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. CHEMICAL MUTAGENESIS AND CARCINOGENESIS: INCORPORATION OF MECHANISTIC DATA INTO RISK ASSESSMENT

    EPA Science Inventory

    The current understanding of cancer as a genetic disease, requiring a specific set of genomic alterations for a normal cell to form a metastatic tumor, has provided the oppor...

  5. Quantitative fluorescence nanoscopy for cancer biomedicine

    NASA Astrophysics Data System (ADS)

    Huang, Tao; Nickerson, Andrew; Peters, Alec; Nan, Xiaolin

    2015-08-01

    Cancer is a major health threat worldwide. Options for targeted cancer therapy, however, are often limited, in a large part due to our incomplete understanding of how key processes including oncogenesis and drug response are mediated at the molecular level. New imaging techniques for visualizing biomolecules and their interactions at the nanometer and single molecule scales, collectively named fluorescence nanoscopy, hold the promise to transform biomedical research by providing direct mechanistic insight into cellular processes. We discuss the principles of quantitative single-molecule localization microscopy (SMLM), a subset of fluorescence nanoscopy, and their applications to cancer biomedicine. In particular, we will examine oncogenesis and drug resistance mediated by mutant Ras, which is associated with ~1/3 of all human cancers but has remained an intractable drug target. At ~20 nm spatial and single-molecule stoichiometric resolutions, SMLM clearly showed that mutant Ras must form dimers to activate its effector pathways and drive oncogenesis. SMLM further showed that the Raf kinase, one of the most important effectors of Ras, also forms dimers upon activation by Ras. Moreover, treatment of cells expressing wild type Raf with Raf inhibitors induces Raf dimer formation in a manner dependent on Ras dimerization. Together, these data suggest that Ras dimers mediate oncogenesis and drug resistance in tumors with hyperactive Ras and can potentially be targeted for cancer therapy. We also discuss recent advances in SMLM that enable simultaneous imaging of multiple biomolecules and their interactions at the nanoscale. Our work demonstrates the power of quantitative SMLM in cancer biomedicine.

  6. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired by Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). Firstly, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods based on geometric features and morphological features were proposed. The paper puts forward a retinal abnormality grading decision-making method, which was used in the actual analysis and evaluation of multiple OCT images. The detailed analysis process is shown for four retinal OCT images with different degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on the 150 test images, the analysis of retinal status showed a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. This paper focuses on a retinal status automatic analysis method based on feature extraction and quantitative grading in OCT images. The proposed method can obtain the parameters and features that are associated with retinal morphology. Quantitative analysis and evaluation of these features are combined with the reference model to realize abnormality judgment for the target image and provide a reference for disease diagnosis.

  7. Quantitative breast tissue characterization using grating-based x-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Willner, M.; Herzen, J.; Grandl, S.; Auweter, S.; Mayr, D.; Hipp, A.; Chabior, M.; Sarapata, A.; Achterhold, K.; Zanette, I.; Weitkamp, T.; Sztrókay, A.; Hellerhoff, K.; Reiser, M.; Pfeiffer, F.

    2014-04-01

    X-ray phase-contrast imaging has received growing interest in recent years due to its high capability in visualizing soft tissue. Breast imaging became the focus of particular attention as it is considered the most promising candidate for a first clinical application of this contrast modality. In this study, we investigate quantitative breast tissue characterization using grating-based phase-contrast computed tomography (CT) at conventional polychromatic x-ray sources. Different breast specimens have been scanned at a laboratory phase-contrast imaging setup and were correlated to histopathology. Ascertained tumor types include phylloides tumor, fibroadenoma and infiltrating lobular carcinoma. Identified tissue types comprising adipose, fibroglandular and tumor tissue have been analyzed in terms of phase-contrast Hounsfield units and are compared to high-quality, high-resolution data obtained with monochromatic synchrotron radiation, as well as calculated values based on tabulated tissue properties. The results give a good impression of the method’s prospects and limitations for potential tumor detection and the associated demands on such a phase-contrast breast CT system. Furthermore, the evaluated quantitative tissue values serve as a reference for simulations and the design of dedicated phantoms for phase-contrast mammography.

  8. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    PubMed

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  9. New Mechanistic Models of Long Term Evolution of Microstructure and Mechanical Properties of Nickel Based Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruzic, Jamie J.; Evans, T. Matthew; Greaney, P. Alex

    The report describes the development of a discrete element method (DEM) based modeling approach to quantitatively predict deformation and failure of typical nickel based superalloys. A series of experimental data, including microstructure and mechanical property characterization at 600°C, was collected for a relatively simple, model solid solution Ni-20Cr alloy (Nimonic 75) to determine inputs for the model and provide data for model validation. Nimonic 75 was considered ideal for this study because it is a certified tensile and creep reference material. A series of new DEM modeling approaches were developed to capture the complexity of metal deformation, including cubic elastic anisotropy and plastic deformation both with and without strain hardening. Our model approaches were implemented into a commercially available DEM code, PFC3D, that is commonly used by engineers. It is envisioned that once further developed, this new DEM modeling approach can be adapted to a wide range of engineering applications.

  10. Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).

    PubMed

    Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan

    2015-01-01

    Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, the i-QTaG method using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of (13)C6/(12)C6-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, this method was applied to a direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R(2) > 0.99) with the amount of the bovine fetuin glycoprotein. The ratios of relative intensity between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.

  11. Systematic assessment of survey scan and MS2-based abundance strategies for label-free quantitative proteomics using high-resolution MS data.

    PubMed

    Tu, Chengjian; Li, Jun; Sheng, Quanhu; Zhang, Ming; Qu, Jun

    2014-04-04

    While survey-scan-based label-free methods have shown no compelling benefit over fragment ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey scan-based (ion current, IC) and MS2-based abundance features including spectral-count (SpC) and MS2 total-ion-current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) study with seven different biological data sets revealed only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed IC provided much higher quantitative precision and lower missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R(2) > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts but not SpC; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positives/false-negatives than both SpC and MS2-TIC. Therefore, IC achieved an overall superior performance compared with the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery.

  12. Systematic Assessment of Survey Scan and MS2-Based Abundance Strategies for Label-Free Quantitative Proteomics Using High-Resolution MS Data

    PubMed Central

    2015-01-01

    While survey-scan-based label-free methods have shown no compelling benefit over fragment ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey scan-based (ion current, IC) and MS2-based abundance features including spectral-count (SpC) and MS2 total-ion-current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) study with seven different biological data sets revealed only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed IC provided much higher quantitative precision and lower missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R2 > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts but not SpC; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positives/false-negatives than both SpC and MS2-TIC. Therefore, IC achieved an overall superior performance compared with the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery. PMID:24635752
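
    The three abundance features compared in the two records above can be computed from a table of peptide-spectrum matches roughly as sketched below; the column names and toy values are assumptions about how such search results might be organized, not any specific tool's output.

```python
# Sketch of the three label-free abundance features, computed per protein from
# a toy table of peptide-spectrum matches (PSMs). Column names are illustrative.
import pandas as pd

psms = pd.DataFrame({
    "protein":      ["P1", "P1", "P1", "P2", "P2"],
    "ms2_tic":      [2.1e6, 1.8e6, 9.5e5, 3.3e6, 2.7e6],   # total ion current of each MS2 scan
    "ms1_xic_area": [5.4e8, 4.9e8, 2.2e8, 8.8e8, 7.1e8],   # survey-scan extracted ion current area
})

per_protein = psms.groupby("protein").agg(
    spectral_count=("ms2_tic", "size"),      # SpC: number of identified MS2 spectra
    ms2_tic=("ms2_tic", "sum"),              # MS2-TIC: summed fragment ion current
    ion_current=("ms1_xic_area", "sum"),     # IC: summed precursor (survey scan) ion current
)
print(per_protein)
```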

  13. ReactionPredictor: prediction of complex chemical reactions at the mechanistic level using machine learning.

    PubMed

    Kayala, Matthew A; Baldi, Pierre

    2012-10-22

    Proposing reasonable mechanisms and predicting the course of chemical reactions is important to the practice of organic chemistry. Approaches to reaction prediction have historically used obfuscating representations and manually encoded patterns or rules. Here we present ReactionPredictor, a machine learning approach to reaction prediction that models elementary, mechanistic reactions as interactions between approximate molecular orbitals (MOs). A training data set of productive reactions known to occur at reasonable rates and yields and verified by inclusion in the literature or textbooks is derived from an existing rule-based system and expanded upon with manual curation from graduate level textbooks. Using this training data set of complex polar, hypervalent, radical, and pericyclic reactions, a two-stage machine learning prediction framework is trained and validated. In the first stage, filtering models trained at the level of individual MOs are used to reduce the space of possible reactions to consider. In the second stage, ranking models over the filtered space of possible reactions are used to order the reactions such that the productive reactions are the top ranked. The resulting model, ReactionPredictor, perfectly ranks polar reactions 78.1% of the time and recovers all productive reactions 95.7% of the time when allowing for small numbers of errors. Pericyclic and radical reactions are perfectly ranked 85.8% and 77.0% of the time, respectively, rising to >93% recovery for both reaction types with a small number of allowed errors. Decisions about which of the polar, pericyclic, or radical reaction type ranking models to use can be made with >99% accuracy. Finally, for multistep reaction pathways, we implement the first mechanistic pathway predictor using constrained tree-search to discover a set of reasonable mechanistic steps from given reactants to given products. Webserver implementations of both the single step and pathway versions of Reaction

  14. Fast charging technique for high power LiFePO4 batteries: A mechanistic analysis of aging

    NASA Astrophysics Data System (ADS)

    Anseán, D.; Dubarry, M.; Devie, A.; Liaw, B. Y.; García, V. M.; Viera, J. C.; González, M.

    2016-07-01

    One of the major issues hampering the acceptance of electric vehicles (EVs) is the anxiety associated with long charging times. Hence, the ability to fast-charge lithium-ion battery (LIB) systems is gaining notable interest. However, fast charging is not tolerated by all LIB chemistries because it affects battery functionality and accelerates aging processes. Here, we investigate the long-term effects of multistage fast charging on a commercial high-power LiFePO4-based cell and compare it to another cell tested under standard charging. Coupling incremental capacity (IC) and IC peak area analysis with mechanistic model simulations (the 'Alawa' toolbox with harvested half-cell data), we quantify the degradation modes that cause aging of the tested cells. The results show that the proposed fast charging technique caused aging effects similar to those of standard charging. The degradation is caused by a linear loss of lithium inventory, coupled with a lesser degree of linear loss of active material on the negative electrode. This study validates fast charging as a feasible means of operation for this particular LIB chemistry and cell architecture. It also illustrates the benefits of a mechanistic approach to understanding degradation in commercial cells.
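
    A minimal sketch of the incremental capacity (IC) computation referred to above: dQ/dV is derived numerically from a charge curve, and the area under an IC peak approximates the capacity exchanged in that electrochemical feature. The voltage curve below is a synthetic stand-in, not data from the tested cells.

        import numpy as np

        # Synthetic charge curve: capacity Q (Ah) versus cell voltage V (V).
        Q = np.linspace(0.0, 1.1, 500)                       # Ah
        V = 3.2 + 0.25 * np.tanh(6 * (Q - 0.55)) + 0.02 * Q  # smooth stand-in for a LiFePO4 plateau

        # Incremental capacity: dQ/dV as a function of voltage.
        dQdV = np.gradient(Q, V)

        # The area under an IC peak between two voltage bounds approximates the capacity
        # exchanged in that feature; tracking it over cycling quantifies degradation modes.
        lo, hi = 3.15, 3.45
        mask = (V >= lo) & (V <= hi)
        peak_area = np.trapz(dQdV[mask], V[mask])            # ~ capacity (Ah) under the IC peak
        print(f"IC peak area {peak_area:.3f} Ah between {lo} V and {hi} V")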

  15. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by combined use of multi-component simultaneous quantitative analysis and bioassay. The rhubarb dispensing granules were used as the model drug for demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of the 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; the blood activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and purgative biopotency and blood activating biopotency. The results of multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.
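
    The reported content-biopotency associations are rank correlations; a minimal sketch of such an analysis with scipy.stats.spearmanr is shown below, using invented batch values in place of the measured anthraquinone contents and biopotencies.

        import numpy as np
        from scipy.stats import spearmanr

        # Hypothetical data for 10 batches: total conjugated anthraquinone glycoside
        # content (mg/g) and purgative biopotency (relative units).
        glycoside_content    = np.array([12.1, 9.8, 15.3, 11.0, 8.7, 14.2, 10.5, 13.8, 9.1, 12.9])
        purgative_biopotency = np.array([0.92, 0.71, 1.18, 0.85, 0.64, 1.05, 0.78, 1.10, 0.69, 0.97])

        rho, p_value = spearmanr(glycoside_content, purgative_biopotency)
        print(f"Spearman rho = {rho:.2f}, P = {p_value:.4f}")   # significant if P < 0.01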

  16. Considerations in deriving quantitative cancer criteria for inorganic arsenic exposure via inhalation.

    PubMed

    Lewis, Ari S; Beyer, Leslie A; Zu, Ke

    2015-01-01

    The inhalation unit risk (IUR) that currently exists in the United States Environmental Protection Agency's (US EPA's) Integrated Risk Information System was developed in 1984 based on studies examining the relationship between respiratory cancer and arsenic exposure in copper smelters from two US locations: the copper smelter in Anaconda, Montana, and the American Smelting and Refining Company (ASARCO) smelter in Tacoma, Washington. Since US EPA last conducted its assessment, additional data have become available from epidemiology and mechanistic studies. In addition, the California Air Resources Board, Texas Commission on Environmental Quality, and Dutch Expert Committee on Occupational Safety have all conducted new risk assessments. All three analyses, which calculated IURs based on respiratory/lung cancer mortality, generated IURs that are lower (i.e., less restrictive) than the current US EPA value of 4.3×10⁻³ (μg/m³)⁻¹. The IURs developed by these agencies, which vary more than 20-fold, are based on somewhat different studies and use different methodologies to address uncertainties in the underlying datasets. Despite these differences, all were developed based on a cumulative exposure metric assuming a low-dose linear dose-response relationship. In this paper, we contrast and compare the analyses conducted by these agencies and critically evaluate strengths and limitations inherent in the data and methodologies used to develop quantitative risk estimates. In addition, we consider how these data could be best used to assess risk at much lower levels of arsenic in air, such as those experienced by the general public. Given that the mode of action for arsenic supports a threshold effect, and epidemiological evidence suggests that the arsenic concentration in air is a reliable predictor of lung/respiratory cancer risk, we developed a quantitative cancer risk analysis using a nonlinear threshold model. Applying a nonlinear model to occupational data, we
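
    A hedged sketch of the kind of nonlinear threshold ("hockey-stick") exposure-response fit described above, using scipy.optimize.curve_fit on invented occupational data; it is not the model or data used by the cited agencies or authors.

        import numpy as np
        from scipy.optimize import curve_fit

        def threshold_model(x, threshold, slope):
            """Relative risk: flat at 1.0 below the exposure threshold, linear above it."""
            return 1.0 + slope * np.maximum(x - threshold, 0.0)

        # Hypothetical occupational data: cumulative arsenic exposure (mg/m^3-year)
        # versus observed lung/respiratory cancer relative risk.
        exposure = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 40.0])
        rel_risk = np.array([1.0, 1.0, 1.1, 1.0, 1.4, 2.1, 3.0, 5.2])

        (thr, slope), _ = curve_fit(threshold_model, exposure, rel_risk, p0=[1.0, 0.1])
        print(f"estimated threshold ~ {thr:.2f} mg/m^3-year, slope ~ {slope:.3f} per unit exposure")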

  17. Validation of pavement performance curves for the mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2009-02-01

    The objective of this research is to determine whether the nationally calibrated performance models used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) provide a reasonable prediction of actual field performance, and if the desired accu...

  18. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women.

    PubMed

    Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M

    2012-07-01

    The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.

  19. Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.

    PubMed

    Omer, Travis; Intes, Xavier; Hahn, Juergen

    2015-01-01

    Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
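
    A simplified sketch of greedy D-optimal time-point selection for a bi-exponential FLIM-FRET decay: candidate gates are added one at a time to maximize the determinant of the (regularized) information matrix built from numerically estimated parameter sensitivities. The parameter values and the 90-gate grid are assumptions, and the greedy search is a stand-in for the full optimal-design procedure.

        import numpy as np

        # Bi-exponential FLIM-FRET decay: A*exp(-t/tau_q) + (1-A)*exp(-t/tau_u),
        # where A is the quenched (FRET-interacting) donor fraction.
        def decay(t, A, tau_q, tau_u):
            return A * np.exp(-t / tau_q) + (1 - A) * np.exp(-t / tau_u)

        def sensitivity_matrix(t, A, tau_q, tau_u, eps=1e-6):
            """Numerical Jacobian of the decay with respect to (A, tau_q, tau_u)."""
            p = np.array([A, tau_q, tau_u])
            J = np.empty((t.size, p.size))
            for k in range(p.size):
                dp = np.zeros_like(p)
                dp[k] = eps * max(abs(p[k]), 1.0)
                J[:, k] = (decay(t, *(p + dp)) - decay(t, *(p - dp))) / (2 * dp[k])
            return J

        t_full = np.linspace(0.05, 9.0, 90)       # the full 90-gate acquisition (ns), assumed
        A, tau_q, tau_u = 0.4, 0.6, 2.5           # hypothetical parameter values (ns)

        # Greedy D-optimal design: repeatedly add the gate that most increases
        # det(J^T J + ridge) of the reduced design until 10 gates are selected.
        selected = []
        ridge = 1e-9 * np.eye(3)
        for _ in range(10):
            best_gain, best_idx = -np.inf, None
            for i in range(t_full.size):
                if i in selected:
                    continue
                J = sensitivity_matrix(t_full[selected + [i]], A, tau_q, tau_u)
                gain = np.linalg.det(J.T @ J + ridge)
                if gain > best_gain:
                    best_gain, best_idx = gain, i
            selected.append(best_idx)

        print(np.sort(t_full[selected]))          # the 10 selected time points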

  20. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients receiving or not receiving bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset involving CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression, and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that, using all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), whereas there was no significant association for either PFS or OS in the group of patients not receiving maintenance bevacizumab. Therefore, this study demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
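
    One of the three models named above is Cox proportional hazards regression; a minimal sketch using the third-party lifelines package (an assumption, not named in the paper) on invented adiposity features and progression-free survival data illustrates how such an association analysis can be set up.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter   # assumes the lifelines package is installed

        rng = np.random.default_rng(5)
        n = 59
        # Hypothetical adiposity features and treatment indicator.
        df = pd.DataFrame({
            "vfa_cm2": rng.normal(140, 40, n),       # visceral fat area
            "sfa_cm2": rng.normal(220, 60, n),       # subcutaneous fat area
            "bevacizumab": rng.integers(0, 2, n),    # maintenance bevacizumab (0/1)
        })
        # Hypothetical progression-free survival (months) and event indicator.
        df["pfs_months"] = rng.exponential(14, n).round(1)
        df["progressed"] = rng.integers(0, 2, n)

        cph = CoxPHFitter()
        cph.fit(df, duration_col="pfs_months", event_col="progressed")
        print(cph.summary[["coef", "exp(coef)", "p"]])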

  1. A Quantitative Corpus-Based Approach to English Spatial Particles: Conceptual Symmetry and Its Pedagogical Implications

    ERIC Educational Resources Information Center

    Chen, Alvin Cheng-Hsien

    2014-01-01

    The present study aims to investigate how conceptual symmetry plays a role in the use of spatial particles in English and to further examine its pedagogical implications via a corpus-based evaluation of the course books in senior high schools in Taiwan. More specifically, we adopt a quantitative corpus-based approach to investigate whether bipolar…

  2. Gold Nanoparticle Labeling Based ICP-MS Detection/Measurement of Bacteria, and Their Quantitative Photothermal Destruction

    PubMed Central

    Lin, Yunfeng

    2015-01-01

    Bacteria such as Salmonella and E. coli present a great challenge in public health care in today's society. Protection of public safety against bacterial contamination and rapid diagnosis of infection require simple and fast assays for the detection and elimination of bacterial pathogens. Utilizing Salmonella DT104 as an example bacterial strain for our investigation, we report a rapid and sensitive assay for the qualitative and quantitative detection of bacteria by using antibody affinity binding, popcorn-shaped gold nanoparticle (GNPOPs) labeling, surface-enhanced Raman spectroscopy (SERS), and inductively coupled plasma mass spectrometry (ICP-MS) detection. For qualitative analysis, our assay can detect Salmonella within 10 min by Raman spectroscopy; for quantitative analysis, our assay has the ability to measure as few as 100 Salmonella DT104 in a 1 mL sample (100 CFU/mL) within 40 min. Based on the quantitative detection, we investigated the quantitative destruction of Salmonella DT104 and the assay's photothermal efficiency, in order to reduce the amount of GNPOPs in the assay and ultimately eliminate any potential side effects/toxicity to the surrounding cells in vivo. The results suggest that our assay may serve as a promising candidate for the qualitative and quantitative detection and elimination of a variety of bacterial pathogens. PMID:26417447

  3. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji

    2018-04-01

    We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  4. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  5. Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.

    PubMed

    Lee, Won Hee; Bullmore, Ed; Frangou, Sophia

    2017-02-01

    There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states and highlight the sensitive dependence of the solutions obtained in simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
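
    A minimal sketch of the simulation step described above: Kuramoto phase oscillators coupled through a structural connectivity matrix, with a simulated functional connectivity matrix obtained from the correlations of the oscillator signals. The 8-node random network and all constants are illustrative assumptions, not the 66-region empirical connectome.

        import numpy as np

        rng = np.random.default_rng(1)

        n, dt, steps = 8, 1e-3, 20000            # small hypothetical network, not the 66-region connectome
        C = rng.random((n, n))
        C = (C + C.T) / 2.0
        np.fill_diagonal(C, 0.0)                 # symmetric structural coupling matrix (stand-in)
        omega = 2 * np.pi * rng.normal(10.0, 1.0, n)   # natural frequencies (rad/s), ~10 Hz
        K = 5.0                                  # global coupling strength (illustrative)

        theta = rng.uniform(0, 2 * np.pi, n)
        history = np.empty((steps, n))
        for t in range(steps):
            # Euler step of the Kuramoto equation:
            # dtheta_i/dt = omega_i + K * sum_j C_ij * sin(theta_j - theta_i)
            coupling = (C * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
            theta = theta + dt * (omega + K * coupling)
            history[t] = theta

        # Simulated "functional connectivity": correlations between the oscillators' signals.
        signals = np.sin(history)
        fc_sim = np.corrcoef(signals.T)
        print(fc_sim.round(2))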

  6. Characterization of unbound materials (soils/aggregates) for mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2009-02-01

    The resilient modulus (MR) input parameters in the Mechanistic-Empirical Pavement Design Guide (MEPDG) program have a significant effect on the projected pavement performance. The MEPDG program uses three different levels of inputs depending on the d...

  7. Proposed key characteristics of male reproductive toxicants as a method for organizing and screening mechanistic evidence for non-cancer outcomes.

    EPA Science Inventory

    The adoption of systematic review practices for risk assessment includes integration of evidence obtained from experimental, epidemiological, and mechanistic studies. Although mechanistic evidence plays an important role in mode of action analysis, the process of sorting and anal...

  8. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  9. Comparison of mechanistic transport cycle models of ABC exporters.

    PubMed

    Szöllősi, Dániel; Rose-Sperling, Dania; Hellmich, Ute A; Stockner, Thomas

    2018-04-01

    ABC (ATP binding cassette) transporters, ubiquitous in all kingdoms of life, carry out essential substrate transport reactions across cell membranes. Their transmembrane domains bind and translocate substrates and are connected to a pair of nucleotide binding domains, which bind and hydrolyze ATP to energize import or export of substrates. Over four decades of investigations into ABC transporters have revealed numerous details from atomic-level structural insights to their functional and physiological roles. Despite all these advances, a comprehensive understanding of the mechanistic principles of ABC transporter function remains elusive. The human multidrug resistance transporter ABCB1, also referred to as P-glycoprotein (P-gp), is one of the most intensively studied ABC exporters. Using ABCB1 as the reference point, we aim to compare the dominating mechanistic models of substrate transport and ATP hydrolysis for ABC exporters and to highlight the experimental and computational evidence in their support. In particular, we point out in silico studies that enhance and complement available biochemical data. "This article is part of a Special Issue entitled: Beyond the Structure-Function Horizon of Membrane Proteins edited by Ute Hellmich, Rupak Doshi and Benjamin McIlwain." Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  10. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  11. Mechanistic elucidation of the antitumor properties of withaferin A in breast cancer

    PubMed Central

    Nagalingam, Arumugam; Kuppusamy, Panjamurthy; Singh, Shivendra V.; Sharma, Dipali; Saxena, Neeraj K.

    2014-01-01

    Withaferin A (WFA) is a steroidal lactone with antitumor effects manifested at multiple levels which are mechanistically obscure. Using a phospho-kinase screening array, we discovered that WFA activated phosphorylation of the S6 kinase RSK in breast cancer cells. Pursuing this observation, we defined activation of ERK-RSK and Elk1-CHOP kinase pathways in upregulating transcription of the death receptor DR5. Through this route, WFA acted as an effective DR5 activator capable of potentiating the biological effects of celecoxib, etoposide and TRAIL. Accordingly, WFA treatment inhibited breast tumor formation in xenograft and MMTV-neu mouse models in a manner associated with activation of the ERK/RSK axis, DR5 upregulation and elevated nuclear accumulation of Elk1 and CHOP. Together, our results offer mechanistic insight into how WFA inhibits breast tumor growth. PMID:24732433

  12. Assessment of glycemic response to an oral glucokinase activator in a proof of concept study: application of a semi-mechanistic, integrated glucose-insulin-glucagon model.

    PubMed

    Schneck, Karen B; Zhang, Xin; Bauer, Robert; Karlsson, Mats O; Sinha, Vikram P

    2013-02-01

    A proof of concept study was conducted to investigate the safety and tolerability of a novel oral glucokinase activator, LY2599506, during multiple dose administration to healthy volunteers and subjects with Type 2 diabetes mellitus (T2DM). To analyze the study data, a previously established semi-mechanistic integrated glucose-insulin model was extended to include characterization of glucagon dynamics. The model captured endogenous glucose and insulin dynamics, including the amplifying effects of glucose on insulin production and of insulin on glucose elimination, as well as the inhibitory influence of glucose and insulin on hepatic glucose production. The hepatic glucose production in the model was increased by glucagon and glucagon production was inhibited by elevated glucose concentrations. The contribution of exogenous factors to glycemic response, such as ingestion of carbohydrates in meals, was also included in the model. The effect of LY2599506 on glucose homeostasis in subjects with T2DM was investigated by linking a one-compartment, pharmacokinetic model to the semi-mechanistic, integrated glucose-insulin-glucagon system. Drug effects were included on pancreatic insulin secretion and hepatic glucose production. The relationships between LY2599506, glucose, insulin, and glucagon concentrations were described quantitatively and consequently, the improved understanding of the drug-response system could be used to support further clinical study planning during drug development, such as dose selection.
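
    The published semi-mechanistic glucose-insulin-glucagon model is not reproduced here; the following toy ordinary-differential-equation sketch only illustrates the kind of feedback structure described (glucose-stimulated insulin secretion, insulin-dependent glucose elimination, insulin-suppressed hepatic glucose production) together with a hypothetical drug effect that scales insulin secretion.

        import numpy as np
        from scipy.integrate import solve_ivp

        def glucose_insulin(t, y, drug_effect):
            """Toy two-state feedback sketch (not the published model).
            G: plasma glucose (mmol/L); I: plasma insulin (arbitrary units)."""
            G, I = y
            hepatic_production = 4.0 / (1.0 + 0.3 * I)        # insulin suppresses hepatic glucose output
            secretion = drug_effect * 0.5 * G                  # glucose-stimulated insulin secretion
            dG = hepatic_production - 0.1 * G - 0.05 * I * G   # insulin-dependent glucose elimination
            dI = secretion - 0.4 * I                           # first-order insulin clearance
            return [dG, dI]

        y0 = [9.0, 1.0]                                        # hypothetical hyperglycemic starting state
        t_eval = np.linspace(0.0, 24.0, 200)                   # hours

        placebo = solve_ivp(glucose_insulin, (0, 24), y0, t_eval=t_eval, args=(1.0,))
        treated = solve_ivp(glucose_insulin, (0, 24), y0, t_eval=t_eval, args=(1.6,))  # drug boosts secretion
        print(f"24 h glucose: placebo ~{placebo.y[0, -1]:.1f} mmol/L, treated ~{treated.y[0, -1]:.1f} mmol/L")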

  13. Millifluidics for Chemical Synthesis and Time-resolved Mechanistic Studies

    PubMed Central

    Krishna, Katla Sai; Biswas, Sanchita; Navin, Chelliah V.; Yamane, Dawit G.; Miller, Jeffrey T.; Kumar, Challa S.S.R.

    2013-01-01

    Procedures utilizing millifluidic devices for chemical synthesis and time-resolved mechanistic studies are described by taking three examples. In the first, synthesis of ultra-small copper nanoclusters is described. The second example provides their utility for investigating time resolved kinetics of chemical reactions by analyzing gold nanoparticle formation using in situ X-ray absorption spectroscopy. The final example demonstrates continuous flow catalysis of reactions inside millifluidic channel coated with nanostructured catalyst. PMID:24327099

  14. Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method.

    PubMed

    Meng, Xin; Huang, Huachuan; Yan, Keding; Tian, Xiaolin; Yu, Wei; Cui, Haoyang; Kong, Yan; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-12-20

    In order to realize high contrast imaging with portable devices for potential mobile healthcare, we demonstrate a hand-held smartphone based quantitative phase microscope using the transport of intensity equation method. With a cost-effective illumination source and compact microscope system, multi-focal images of samples can be captured by the smartphone's camera via manual focusing. Phase retrieval is performed using a self-developed Android application, which calculates sample phases from multi-plane intensities via solving the Poisson equation. We test the portable microscope using a random phase plate with known phases, and to further demonstrate its performance, a red blood cell smear, a Pap smear and monocot root and broad bean epidermis sections are also successfully imaged. Considering its advantages as an accurate, high-contrast, cost-effective and field-portable device, the smartphone based hand-held quantitative phase microscope is a promising tool which can be adopted in the future in remote healthcare and medical diagnosis.
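
    A minimal sketch of transport-of-intensity phase retrieval under the uniform-intensity approximation, solving the resulting Poisson equation with FFTs; the defocused images below are synthetic stand-ins, and the wavelength, pixel size, and defocus distance are assumptions rather than the device's actual parameters.

        import numpy as np

        def tie_phase(I_under, I_over, dz, wavelength, pixel_size):
            """Transport-of-intensity phase retrieval, uniform-intensity approximation:
            k * dI/dz = -I0 * laplacian(phi), solved for phi with an FFT Poisson solver."""
            k = 2.0 * np.pi / wavelength
            I0 = 0.5 * (I_under + I_over).mean()
            dIdz = (I_over - I_under) / (2.0 * dz)           # axial intensity derivative (central difference)

            ny, nx = dIdz.shape
            fx = 2.0 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
            fy = 2.0 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
            k2 = fx[None, :] ** 2 + fy[:, None] ** 2
            k2[0, 0] = 1.0                                    # placeholder; the DC term is zeroed below

            rhs_hat = np.fft.fft2(-(k / I0) * dIdz)           # Fourier transform of laplacian(phi)
            rhs_hat[0, 0] = 0.0                               # the mean phase is undetermined; set it to zero
            phi = np.real(np.fft.ifft2(rhs_hat / (-k2)))      # invert the Laplacian
            return phi

        # Hypothetical under-/over-focused frames (e.g., captured at -dz and +dz).
        rng = np.random.default_rng(2)
        I_minus = 1.0 + 0.01 * rng.standard_normal((256, 256))
        I_plus  = 1.0 + 0.01 * rng.standard_normal((256, 256))
        phase = tie_phase(I_minus, I_plus, dz=2e-6, wavelength=532e-9, pixel_size=1.1e-6)
        print(phase.shape, float(phase.std()))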

  15. Quantitative dispersion microscopy

    PubMed Central

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234

  16. Refining Intervention Targets in Family-Based Research: Lessons From Quantitative Behavioral Genetics

    PubMed Central

    Leve, Leslie D.; Harold, Gordon T.; Ge, Xiaojia; Neiderhiser, Jenae M.; Patterson, Gerald

    2010-01-01

    The results from a large body of family-based research studies indicate that modifying the environment (specifically dimensions of the social environment) through intervention is an effective mechanism for achieving positive outcomes. Parallel to this work is a growing body of evidence from genetically informed studies indicating that social environmental factors are central to enhancing or offsetting genetic influences. Increased precision in the understanding of the role of the social environment in offsetting genetic risk might provide new information about environmental mechanisms that could be applied to prevention science. However, at present, the multifaceted conceptualization of the environment in prevention science is mismatched with the more limited measurement of the environment in many genetically informed studies. A framework for translating quantitative behavioral genetic research to inform the development of preventive interventions is presented in this article. The measurement of environmental indices amenable to modification is discussed within the context of quantitative behavioral genetic studies. In particular, emphasis is placed on the necessary elements that lead to benefits in prevention science, specifically the development of evidence-based interventions. An example from an ongoing prospective adoption study is provided to illustrate the potential of this translational process to inform the selection of preventive intervention targets. PMID:21188273

  17. A mechanistic assessment of nutrient flushing at the catchment scale

    Treesearch

    Willem J. van Verseveld; Jeffrey J. McDonnell; Kate Lajtha

    2008-01-01

    This paper mechanistically assesses the flushing mechanism of DOC, DON, and DIN at the hillslope and catchment scales during two storm events, in a small catchment (WS10), H.J. Andrews Experimental Forest in the western Cascade Mountains of Oregon. Using a combination of natural tracer and hydrometric data, and end-member mixing analysis, we were able to describe the...

  18. Identifying the genes underlying quantitative traits: a rationale for the QTN programme.

    PubMed

    Lee, Young Wha; Gould, Billie A; Stinchcombe, John R

    2014-01-01

    The goal of identifying the genes or even nucleotides underlying quantitative and adaptive traits has been characterized as the 'QTN programme' and has recently come under severe criticism. Part of the reason for this criticism is that much of the QTN programme has asserted that finding the genes and nucleotides for adaptive and quantitative traits is a fundamental goal, without explaining why it is such a hallowed goal. Here we outline motivations for the QTN programme that offer general insight, regardless of whether QTNs are of large or small effect, and that aid our understanding of the mechanistic dynamics of adaptive evolution. We focus on five areas: (i) vertical integration of insight across different levels of biological organization, (ii) genetic parallelism and the role of pleiotropy in shaping evolutionary dynamics, (iii) understanding the forces maintaining genetic variation in populations, (iv) distinguishing between adaptation from standing variation and new mutation, and (v) the role of genomic architecture in facilitating adaptation. We argue that rather than abandoning the QTN programme, we should refocus our efforts on topics where molecular data will be the most effective for testing hypotheses about phenotypic evolution.

  19. Identifying the genes underlying quantitative traits: a rationale for the QTN programme

    PubMed Central

    Lee, Young Wha; Gould, Billie A.; Stinchcombe, John R.

    2014-01-01

    The goal of identifying the genes or even nucleotides underlying quantitative and adaptive traits has been characterized as the ‘QTN programme’ and has recently come under severe criticism. Part of the reason for this criticism is that much of the QTN programme has asserted that finding the genes and nucleotides for adaptive and quantitative traits is a fundamental goal, without explaining why it is such a hallowed goal. Here we outline motivations for the QTN programme that offer general insight, regardless of whether QTNs are of large or small effect, and that aid our understanding of the mechanistic dynamics of adaptive evolution. We focus on five areas: (i) vertical integration of insight across different levels of biological organization, (ii) genetic parallelism and the role of pleiotropy in shaping evolutionary dynamics, (iii) understanding the forces maintaining genetic variation in populations, (iv) distinguishing between adaptation from standing variation and new mutation, and (v) the role of genomic architecture in facilitating adaptation. We argue that rather than abandoning the QTN programme, we should refocus our efforts on topics where molecular data will be the most effective for testing hypotheses about phenotypic evolution. PMID:24790125

  20. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  1. PCR-free quantitative detection of genetically modified organism from raw materials – A novel electrochemiluminescence-based bio-barcode method

    PubMed Central

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R.

    2018-01-01

    Bio-barcode assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio-barcode assay requires lengthy experimental procedures including the preparation and release of barcode DNA probes from the target-nanoparticle complex, and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio-barcode assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris(2,2'-bipyridyl)ruthenium (TBR)-labeled barcode DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products. PMID:18386909

  2. The use of mechanistic descriptions of algal growth and zooplankton grazing in an estuarine eutrophication model

    NASA Astrophysics Data System (ADS)

    Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.

    2003-03-01

    A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.

  3. VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations

    PubMed Central

    Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Mortimer, Peter S.

    2017-01-01

    BACKGROUND. Lack of investigatory and diagnostic tools has been a major contributing factor to the failure to mechanistically understand lymphedema and other lymphatic disorders in order to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. METHODS. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy–based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy, lymphedematous, and lymphangiomatous skin. RESULTS. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and a hyperplasia (4.36-fold lymphatic vessel volume increase) in the lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). CONCLUSION. VIPAR is a volume-based tissue reconstruction data extraction and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular systems or skin. FUNDING. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003. PMID

  4. VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations.

    PubMed

    Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Ostergaard, Pia; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Jiang, Xiaoyi; Mortimer, Peter S; Kiefer, Friedemann

    2017-08-17

    Lack of investigatory and diagnostic tools has been a major contributing factor to the failure to mechanistically understand lymphedema and other lymphatic disorders in order to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy-based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy, lymphedematous, and lymphangiomatous skin. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and a hyperplasia (4.36-fold lymphatic vessel volume increase) in the lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). VIPAR is a volume-based tissue reconstruction data extraction and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular systems or skin. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003.

  5. Benchmarking B-Cell Epitope Prediction with Quantitative Dose-Response Data on Antipeptide Antibodies: Towards Novel Pharmaceutical Product Development

    PubMed Central

    Caoili, Salvador Eugenio C.

    2014-01-01

    B-cell epitope prediction can enable novel pharmaceutical product development. However, a mechanistically framed consensus has yet to emerge on benchmarking such prediction, thus presenting an opportunity to establish standards of practice that circumvent epistemic inconsistencies of casting the epitope prediction task as a binary-classification problem. As an alternative to conventional dichotomous qualitative benchmark data, quantitative dose-response data on antibody-mediated biological effects are more meaningful from an information-theoretic perspective in the sense that such effects may be expressed as probabilities (e.g., of functional inhibition by antibody) for which the Shannon information entropy (SIE) can be evaluated as a measure of informativeness. Accordingly, half-maximal biological effects (e.g., at median inhibitory concentrations of antibody) correspond to maximally informative data while undetectable and maximal biological effects correspond to minimally informative data. This applies to benchmarking B-cell epitope prediction for the design of peptide-based immunogens that elicit antipeptide antibodies with functionally relevant cross-reactivity. Presently, the Immune Epitope Database (IEDB) contains relatively few quantitative dose-response data on such cross-reactivity. Only a small fraction of these IEDB data is maximally informative, and many more of them are minimally informative (i.e., with zero SIE). Nevertheless, the numerous qualitative data in IEDB suggest how to overcome the paucity of informative benchmark data. PMID:24949474
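
    The informativeness measure described above is the Shannon information entropy of the probability of an antibody-mediated effect; a short sketch shows that half-maximal effects (p = 0.5) are maximally informative, while undetectable or maximal effects carry zero entropy.

        import numpy as np

        def shannon_entropy(p):
            """Shannon information entropy (bits) of a binary outcome with probability p,
            e.g., the probability of functional inhibition by an antipeptide antibody."""
            p = np.asarray(p, dtype=float)
            h = np.zeros_like(p)
            mask = (p > 0) & (p < 1)
            h[mask] = -(p[mask] * np.log2(p[mask]) + (1 - p[mask]) * np.log2(1 - p[mask]))
            return h

        effects = np.array([0.0, 0.1, 0.5, 0.9, 1.0])   # fractional biological effect at some antibody dose
        print(shannon_entropy(effects))                  # [0.0, 0.47, 1.0, 0.47, 0.0]; half-maximal is most informative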

  6. Fidelity in Animal Modeling: Prerequisite for a Mechanistic Research Front Relevant to the Inflammatory Incompetence of Acute Pediatric Malnutrition.

    PubMed

    Woodward, Bill

    2016-04-11

    Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition.

  7. Effective Heart Disease Detection Based on Quantitative Computerized Traditional Chinese Medicine Using Representation Based Classifiers.

    PubMed

    Shu, Ting; Zhang, Bob; Tang, Yuan Yan

    2017-01-01

    At present, heart disease is the number one cause of death worldwide. Traditionally, heart disease is commonly detected using blood tests, electrocardiogram, cardiac computerized tomography scan, cardiac magnetic resonance imaging, and so on. However, these traditional diagnostic methods are time consuming and/or invasive. In this paper, we propose an effective noninvasive computerized method based on facial images to quantitatively detect heart disease. Specifically, facial key block color features are extracted from facial images and analyzed using the Probabilistic Collaborative Representation Based Classifier. The idea of facial key block color analysis is founded in Traditional Chinese Medicine. A new dataset consisting of 581 heart disease samples and 581 healthy samples was used to evaluate the proposed method. In order to optimize the Probabilistic Collaborative Representation Based Classifier, an analysis of its parameters was performed. According to the experimental results, the proposed method obtains the highest accuracy compared with other classifiers and is shown to be effective at heart disease detection.
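
    The probabilistic variant of the classifier used in the paper is not reproduced here; the sketch below shows the underlying collaborative-representation idea (ridge-regularized coding of a query over all training samples, classification by class-wise reconstruction residual) on invented key-block color features.

        import numpy as np

        def crc_predict(X_train, y_train, x_query, lam=0.1):
            """Basic collaborative representation classification (not the probabilistic
            variant used in the paper): code the query over all training samples with a
            ridge penalty, then assign the class with the smallest reconstruction residual."""
            D = X_train.T                                         # dictionary: features x samples
            A = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ x_query)
            residuals = {}
            for c in np.unique(y_train):
                mask = (y_train == c)
                residuals[c] = np.linalg.norm(x_query - D[:, mask] @ A[mask])
            return min(residuals, key=residuals.get)

        rng = np.random.default_rng(3)
        # Hypothetical key-block color features: class 0 = healthy, class 1 = heart disease.
        X0 = rng.normal(0.0, 1.0, (40, 12))
        X1 = rng.normal(1.5, 1.0, (40, 12))
        X_train = np.vstack([X0, X1])
        y_train = np.array([0] * 40 + [1] * 40)

        query = rng.normal(1.5, 1.0, 12)                          # an unseen "disease-like" sample
        print(crc_predict(X_train, y_train, query))               # expected to print 1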

  8. Global QBO in circulation and ozone. Part 2: A simple mechanistic model

    NASA Technical Reports Server (NTRS)

    Tung, K. K.; Yang, H.

    1994-01-01

    Although the phenomenon of equatorial quasi-biennial oscillation is relatively well understood, the problem of how the equatorially confined quasi-biennial oscillation (QBO) wave forcing can induce a signal in the extratropics of comparable or larger magnitude remains unsolved. A simple mechanistic model is constructed to provide a quantitative test of the hypothesis that the phenomenon of extratropical QBO is mainly caused by an anomalous seasonal circulation induced by an anomalous Eliassen-Palm (E-P) flux divergence. The anomaly in E-P flux divergence may be caused in turn by the relative poleward and downward shift of the region of irreversible mixing (breaking) of the extratropical planetary waves during the easterly phase of the equatorial QBO as compared to its westerly phase. The hemispheric nature of the anomaly wave forcing in solstice seasons (viz., no wave breaking in the summer hemisphere) induces a global circulation anomaly that projects predominantly into the first few zonal Hough modes of Plumb. Such a global QBO circulation pattern, although difficult to measure directly, is reflected in the distribution of stratospheric tracers transported by it. Our model produces a global pattern of QBO anomaly in column ozone that appears to account for much of the unfiltered interannual variability in the column ozone observed by the total ozone mapping spectrometer (TOMS) instrument aboard the Nimbus satellite. Furthermore, the model produces the characteristic spectrum of the observation with peaks at periods of 20 and 30 months.

  9. Global QBO in circulation and ozone. Part 2: A simple mechanistic model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tung, K.K.; Yang, H.

    1994-10-01

    Although the phenomenon of equatorial quasi-biennial oscillation is relatively well understood, the problem of how the equatorially confined quasi-biennial oscillation (QBO) wave forcing can induce a signal in the extratropics of comparable or larger magnitude remains unsolved. A simple mechanistic model is constructed to provide a quantitative test of the hypothesis that the phenomenon of extratropical QBO is mainly caused by an anomalous seasonal circulation induced by an anomalous Eliassen-Palm (E-P) flux divergence. The anomaly in E-P flux divergence may be caused in turn by the relative poleward and downward shift of the region of irreversible mixing (breaking) of the extratropical planetary waves during the easterly phase of the equatorial QBO as compared to its westerly phase. The hemispheric nature of the anomaly wave forcing in solstice seasons (viz., no wave breaking in the summer hemisphere) induces a global circulation anomaly that projects predominantly into the first few zonal Hough modes of Plumb. Such a global QBO circulation pattern, although difficult to measure directly, is reflected in the distribution of stratospheric tracers transported by it. Our model produces a global pattern of QBO anomaly in column ozone that appears to account for much of the unfiltered interannual variability in the column ozone observed by the total ozone mapping spectrometer (TOMS) instrument aboard the Nimbus satellite. Furthermore, the model produces the characteristic spectrum of the observation with peaks at periods of 20 and 30 months.

  10. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models.

    PubMed

    Allen, R J; Rieger, T R; Musante, C J

    2016-03-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed "virtual patients." In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations.
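
    A minimal sketch of the generate-then-select idea on a toy model: candidate virtual patients are sampled from plausible parameter ranges, and each is accepted with a probability proportional to the ratio of the observed density to the candidate density of its simulated output, so the selected virtual population approximates the clinical distribution without weighting. The model, parameter ranges, and target distribution are invented, and the selection rule is a simplified stand-in for the published algorithm.

        import numpy as np
        from scipy.stats import gaussian_kde, norm

        rng = np.random.default_rng(4)

        # 1) Generate candidate virtual patients by sampling physiological parameters.
        n_candidates = 5000
        k_prod  = rng.uniform(0.5, 2.0, n_candidates)    # hypothetical production rate
        k_clear = rng.uniform(0.05, 0.2, n_candidates)   # hypothetical clearance rate
        outputs = k_prod / k_clear                        # stand-in model output (baseline biomarker)

        # 2) Observed clinical distribution of the same baseline output (hypothetical).
        target = norm(loc=12.0, scale=2.0)

        # 3) Select (rather than weight) virtual patients: accept each candidate with
        #    probability proportional to target density / candidate density at its output.
        candidate_density = gaussian_kde(outputs)(outputs)
        accept_prob = target.pdf(outputs) / candidate_density
        accept_prob /= accept_prob.max()
        selected = outputs[rng.random(n_candidates) < accept_prob]

        print(f"{selected.size} virtual patients selected; "
              f"mean {selected.mean():.1f}, sd {selected.std():.1f} (target: 12.0, 2.0)")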

  11. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models

    PubMed Central

    Rieger, TR; Musante, CJ

    2016-01-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed “virtual patients.” In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations. PMID:27069777

  12. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  13. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE PAGES

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; ...

    2018-04-01

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  14. Generating Linear Equations Based on Quantitative Reasoning

    ERIC Educational Resources Information Center

    Lee, Mi Yeon

    2017-01-01

    The Common Core's Standards for Mathematical Practice encourage teachers to develop their students' ability to reason abstractly and quantitatively by helping students make sense of quantities and their relationships within problem situations. The seventh-grade content standards include objectives pertaining to developing linear equations in…

  15. The coefficient of restitution of pressurized balls: a mechanistic model

    NASA Astrophysics Data System (ADS)

    Georgallas, Alex; Landry, Gaëtan

    2016-01-01

    Pressurized, inflated balls used in professional sports are regulated so that their behaviour upon impact can be anticipated and allow the game to have its distinctive character. However, the dynamics governing the impacts of such balls, even on stationary hard surfaces, can be extremely complex. The energy transformations, which arise from the compression of the gas within the ball and from the shear forces associated with the deformation of the wall, are examined in this paper. We develop a simple mechanistic model of the dependence of the coefficient of restitution, e, upon both the gauge pressure, P_G, of the gas and the shear modulus, G, of the wall. The model is validated using the results from a simple series of experiments using three different sports balls. The fits to the data are extremely good for P_G > 25 kPa and consistent values are obtained for the value of G for the wall material. As far as the authors can tell, this simple, mechanistic model of the pressure dependence of the coefficient of restitution is the first in the literature. Keywords: coefficient of restitution, dynamics, inflated balls, pressure, impact model.
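
    Because the abstract does not reproduce the model itself, the sketch below only shows how a measured e(P_G) data set might be fitted with scipy to an assumed saturating form; the functional form, initial guesses and placeholder data are illustrative, not the authors' equation.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical saturating dependence of e on gauge pressure; illustrative only.
      def e_model(p_gauge, e_inf, a, b):
          return e_inf - a / (p_gauge + b)

      # Replace with measured gauge pressures (kPa) and coefficients of restitution.
      p_gauge = np.array([30.0, 45.0, 60.0, 75.0, 90.0])   # placeholder values
      e_meas  = np.array([0.70, 0.74, 0.77, 0.78, 0.79])   # placeholder values

      popt, pcov = curve_fit(e_model, p_gauge, e_meas, p0=(0.85, 5.0, 10.0))
      e_inf, a, b = popt
      print(f"fitted: e_inf={e_inf:.3f}, a={a:.2f}, b={b:.1f} kPa")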

  16. Benchmarking successional progress in a quantitative food web.

    PubMed

    Boit, Alice; Gaedke, Ursula

    2014-01-01

    Central to ecology and ecosystem management, succession theory aims to mechanistically explain and predict the assembly and development of ecological communities. Yet processes at lower hierarchical levels, e.g. at the species and functional group level, are rarely mechanistically linked to the under-investigated system-level processes which drive changes in ecosystem properties and functioning and are comparable across ecosystems. As a model system for secondary succession, seasonal plankton succession during the growing season is readily observable and largely driven autogenically. We used a long-term dataset from large, deep Lake Constance comprising biomasses, auto- and heterotrophic production, food quality, functional diversity, and mass-balanced food webs of the energy and nutrient flows between functional guilds of plankton and partly fish. Extracting population- and system-level indices from this dataset, we tested current hypotheses about the directionality of successional progress which are rooted in ecosystem theory, the metabolic theory of ecology, quantitative food web theory, thermodynamics, and information theory. Our results indicate that successional progress in Lake Constance is quantifiable, passing through predictable stages. Mean body mass, functional diversity, predator-prey weight ratios, trophic positions, system residence times of carbon and nutrients, and the complexity of the energy flow patterns increased during succession. In contrast, both the mass-specific metabolic activity and the system export decreased, while the succession rate exhibited a bimodal pattern. The weighted connectance introduced here represents a suitable index for assessing the evenness and interconnectedness of energy flows during succession. Diverging from earlier predictions, ascendency and eco-exergy did not increase during succession. Linking aspects of functional diversity to metabolic theory and food web complexity, we reconcile previously disjoint bodies of

  17. Benchmarking Successional Progress in a Quantitative Food Web

    PubMed Central

    Boit, Alice; Gaedke, Ursula

    2014-01-01

    Central to ecology and ecosystem management, succession theory aims to mechanistically explain and predict the assembly and development of ecological communities. Yet processes at lower hierarchical levels, e.g. at the species and functional group level, are rarely mechanistically linked to the under-investigated system-level processes which drive changes in ecosystem properties and functioning and are comparable across ecosystems. As a model system for secondary succession, seasonal plankton succession during the growing season is readily observable and largely driven autogenically. We used a long-term dataset from large, deep Lake Constance comprising biomasses, auto- and heterotrophic production, food quality, functional diversity, and mass-balanced food webs of the energy and nutrient flows between functional guilds of plankton and partly fish. Extracting population- and system-level indices from this dataset, we tested current hypotheses about the directionality of successional progress which are rooted in ecosystem theory, the metabolic theory of ecology, quantitative food web theory, thermodynamics, and information theory. Our results indicate that successional progress in Lake Constance is quantifiable, passing through predictable stages. Mean body mass, functional diversity, predator-prey weight ratios, trophic positions, system residence times of carbon and nutrients, and the complexity of the energy flow patterns increased during succession. In contrast, both the mass-specific metabolic activity and the system export decreased, while the succession rate exhibited a bimodal pattern. The weighted connectance introduced here represents a suitable index for assessing the evenness and interconnectedness of energy flows during succession. Diverging from earlier predictions, ascendency and eco-exergy did not increase during succession. Linking aspects of functional diversity to metabolic theory and food web complexity, we reconcile previously disjoint bodies of

  18. Digital micromirror device-based common-path quantitative phase imaging.

    PubMed

    Zheng, Cheng; Zhou, Renjie; Kuang, Cuifang; Zhao, Guangyuan; Yaqoob, Zahid; So, Peter T C

    2017-04-01

    We propose a novel common-path quantitative phase imaging (QPI) method based on a digital micromirror device (DMD). The DMD is placed in a plane conjugate to the objective back-aperture plane for the purpose of generating two plane waves that illuminate the sample. A pinhole is used in the detection arm to filter one of the beams after sample to create a reference beam. Additionally, a transmission-type liquid crystal device, placed at the objective back-aperture plane, eliminates the specular reflection noise arising from all the "off" state DMD micromirrors, which is common in all DMD-based illuminations. We have demonstrated high sensitivity QPI, which has a measured spatial and temporal noise of 4.92 nm and 2.16 nm, respectively. Experiments with calibrated polystyrene beads illustrate the desired phase measurement accuracy. In addition, we have measured the dynamic height maps of red blood cell membrane fluctuations, showing the efficacy of the proposed system for live cell imaging. Most importantly, the DMD grants the system convenience in varying the interference fringe period on the camera to easily satisfy the pixel sampling conditions. This feature also alleviates the pinhole alignment complexity. We envision that the proposed DMD-based common-path QPI system will allow for system miniaturization and automation for a broader adaption.

  19. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
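
    A generic illustration of inertial tremor quantification is sketched below: a gyroscope (or accelerometer) trace is band-pass filtered around typical parkinsonian tremor frequencies, and the band-limited RMS amplitude and dominant frequency are reported. The filter band, sampling rate and synthetic test signal are assumptions; the record's least-square-estimation models and handling of motor fluctuations and dyskinesia are not reproduced.

      import numpy as np
      from scipy.signal import butter, filtfilt, welch

      def tremor_metrics(signal, fs=100.0, band=(3.0, 7.0)):
          # Band-limited RMS amplitude and dominant frequency of one sensor axis.
          b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
          filtered = filtfilt(b, a, signal)
          rms = np.sqrt(np.mean(filtered ** 2))
          freqs, psd = welch(filtered, fs=fs, nperseg=min(len(filtered), 512))
          return rms, freqs[np.argmax(psd)]

      # Example on a synthetic 5 Hz oscillation plus noise (illustrative only).
      fs = 100.0
      t = np.arange(0, 20, 1 / fs)
      trace = 0.5 * np.sin(2 * np.pi * 5.0 * t) + 0.05 * np.random.randn(t.size)
      print(tremor_metrics(trace, fs))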

  20. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  1. Students' Interpretations of Mechanistic Language in Organic Chemistry before Learning Reactions

    ERIC Educational Resources Information Center

    Galloway, Kelli R.; Stoyanovich, Carlee; Flynn, Alison B.

    2017-01-01

    Research on mechanistic thinking in organic chemistry has shown that students attribute little meaning to the electron-pushing (i.e., curved arrow) formalism. At the University of Ottawa, a new curriculum has been developed in which students are taught the electron-pushing formalism prior to instruction on specific reactions--this formalism is…

  2. Perspectives on the Application of Mechanistic Information in Chemical Hazard and Dose-Response Assessments

    EPA Science Inventory

    This overview summarizes several EPA Assessment publications reviewing approaches for applying mechanistic information in human health risk assessment and exploring opportunities for progress in this area.

  3. Characterization of material properties for mechanistic-empirical pavement design in Wyoming : final report.

    DOT National Transportation Integrated Search

    2016-12-01

    The Wyoming Department of Transportation (WYDOT) recently transitioned from the empirical AASHTO Design for Design of Pavement Structures to the Mechanistic Empirical Pavement Design Guide (MEPDG) as their standard pavement design procedure. A compre...

  4. Modeling Bird Migration under Climate Change: A Mechanistic Approach

    NASA Technical Reports Server (NTRS)

    Smith, James A.

    2009-01-01

    How will migrating birds respond to changes in the environment under climate change? What are the implications for migratory success under the various accelerated climate change scenarios as forecast by the Intergovernmental Panel on Climate Change? How will reductions or increased variability in the number or quality of wetland stop-over sites affect migratory bird species? The answers to these questions have important ramifications for conservation biology and wildlife management. Here, we describe the use of continental scale simulation modeling to explore how spatio-temporal changes along migratory flyways affect en-route migration success. We use an individual-based, biophysical, mechanistic bird migration model to simulate the movement of shorebirds in North America as a tool to study how such factors as drought and wetland loss may impact migratory success and modify migration patterns. Our model is driven by remote sensing and climate data and incorporates important landscape variables. The energy budget components of the model include resting, foraging, and flight, but presently predation is ignored. Results/Conclusions: We illustrate our model by studying the spring migration of sandpipers through the Great Plains to their Arctic breeding grounds. Why many species of shorebirds have shown significant declines remains a puzzle. Shorebirds are sensitive to stop-over quality and spacing because of their need for frequent refueling stops and their opportunistic feeding patterns. We predict bird "hydrographs," that is, stop-over frequency as a function of latitude, that are in agreement with the literature. Mean stop-over durations predicted from our model for nominal cases are also consistent with the limited, but available data. For the shorebird species simulated, our model predicts that shorebirds exhibit significant plasticity and are able to shift their migration patterns in response to changing drought conditions. However, the question remains as to whether this

  5. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni2+, Zn2+, Cu2+, PO43-, bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds, rather than the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step) as well as the introduction of sample liquid wicking areas make it possible to increase analyte transport efficiency. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu2+) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  6. Mechanistic studies related to the safety of Li/SOCl2 cells

    NASA Technical Reports Server (NTRS)

    Carter, B. J.; Williams, R. M.; Tsay, F. D.; Rodriguez, A.; Kim, S.; Evans, M. M.; Frank, H.

    1985-01-01

    Mechanistic studies of the reactions in Li-SOCl2 cells have been undertaken to improve understanding of the safety problems of these cells. The electrochemical reduction of 1.5M LiAlCl4/SOCl2 has been investigated using gas chromatography, electron spin resonance spectroscopy, and infrared spectroscopy. Cl2 and S2Cl2 have been identified as intermediates in the reduction of SOCl2, along with a radical species (g_xx = 2.004, g_yy = 2.016, g_zz = 2.008) and the proposed triplet ground-state dimer of this radical. SO2 and sulfur have been identified as products. Based upon these findings, a mechanism for the electrochemical reduction of 1.5M LiAlCl4/SOCl2 has been proposed, and its implications for safety of Li-SOCl2 cells during discharge to +0.5 V at 25-30 °C are discussed.

  7. Mechanistic origin of dragon-kings in a population of competing agents

    NASA Astrophysics Data System (ADS)

    Johnson, N.; Tivnan, B.

    2012-05-01

    We analyze the mechanistic origins of the extreme behaviors that arise in an idealized model of a population of competing agents, such as traders in a market. These extreme behaviors exhibit the defining characteristics of `dragon-kings'. Our model comprises heterogeneous agents who repeatedly compete for some limited resource, making binary choices based on the strategies that they have in their possession. It generalizes the well-known Minority Game by allowing agents whose strategies have not made accurate recent predictions, to step out of the competition until their strategies improve. This generates a complex dynamical interplay between the number V of active agents (mimicking market volume) and the imbalance D between the decisions made (mimicking excess demand). The wide spectrum of extreme behaviors which emerge, helps to explain why no unique relationship has been identified between the price and volume during real market crashes and rallies.
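
    The model described above generalizes the Minority Game by letting agents whose strategies have recently performed poorly step out of the round. A compact sketch of that mechanism follows; the population size, memory length, look-back window and participation threshold are illustrative choices, not the parameters of the study.

      import numpy as np

      rng = np.random.default_rng(1)
      N, m, S = 101, 3, 2          # agents, memory length, strategies per agent
      T, r, threshold = 2000, 20, 0.5   # rounds, look-back window, activity threshold

      P = 2 ** m
      strategies = rng.integers(0, 2, size=(N, S, P))        # history index -> {0, 1}
      scores = np.zeros((N, S))                               # cumulative virtual points
      recent = [[[] for _ in range(S)] for _ in range(N)]     # last r outcomes per strategy
      history = rng.integers(0, 2, size=m)

      volume, imbalance = [], []
      for t in range(T):
          mu = int("".join(map(str, history)), 2)             # encode recent history
          best = scores.argmax(axis=1)                        # each agent's best strategy
          # an agent plays only if its best strategy has been right often enough lately
          rate = np.array([np.mean(recent[i][best[i]]) if recent[i][best[i]] else 1.0
                           for i in range(N)])
          active = rate >= threshold
          choices = strategies[np.arange(N), best, mu]
          n_up = int(np.sum(choices[active]))
          V = int(active.sum())                               # market volume
          minority = 0 if n_up > V - n_up else 1              # minority side wins
          volume.append(V)
          imbalance.append(2 * n_up - V)                      # excess demand D
          # update virtual scores of every strategy, whether its owner played or not
          for i in range(N):
              for s in range(S):
                  correct = int(strategies[i, s, mu] == minority)
                  scores[i, s] += 1 if correct else -1
                  recent[i][s] = (recent[i][s] + [correct])[-r:]
          history = np.append(history[1:], minority)

      print("mean volume:", np.mean(volume), "max |D|:", np.max(np.abs(imbalance)))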

  8. Photoisomerization of ruthenium(ii) aquo complexes: mechanistic insights and application development.

    PubMed

    Hirahara, Masanari; Yagi, Masayuki

    2017-03-21

    Ruthenium(ii) complexes with polypyridyl ligands have been extensively studied as promising functional molecules due to their unique photochemical and photophysical properties as well as redox properties. In this context, we report the photoisomerization of distal-[Ru(tpy)(pynp)OH 2 ] 2+ (d-1) (tpy = 2,2';6',2''-terpyridine, pynp = 2-(2-pyridyl)-1,8-naphthyridine) to proximal-[Ru(tpy)(pynp)OH 2 ] 2+ (p-1), which has not been previously characterized for polypyridyl ruthenium(ii) aquo complexes. Herein, we review recent progress made by our group on the mechanistic insights and application developments related to the photoisomerization of polypyridyl ruthenium(ii) aquo complexes. We report a new strategic synthesis of dinuclear ruthenium(ii) complexes that can act as an active water oxidation catalyst, as well as the development of unique visible-light-responsive giant vesicles, both of which were achieved based on photoisomerization.

  9. Mechanistic Explanations for Restricted Evolutionary Paths That Emerge from Gene Regulatory Networks

    PubMed Central

    Cotterell, James; Sharpe, James

    2013-01-01

    The extent and the nature of the constraints to evolutionary trajectories are central issues in biology. Constraints can be the result of systems dynamics causing a non-linear mapping between genotype and phenotype. How prevalent are these developmental constraints and what is their mechanistic basis? Although this has been extensively explored at the level of epistatic interactions between nucleotides within a gene, or amino acids within a protein, selection acts at the level of the whole organism, and therefore epistasis between disparate genes in the genome is expected due to their functional interactions within gene regulatory networks (GRNs) which are responsible for many aspects of organismal phenotype. Here we explore epistasis within GRNs capable of performing a common developmental function – converting a continuous morphogen input into discrete spatial domains. By exploring the full complement of GRN wiring designs that are able to perform this function, we analyzed all possible mutational routes between functional GRNs. Through this study we demonstrate that mechanistic constraints are common for GRNs that perform even a simple function. We demonstrate a common mechanistic cause for such a constraint involving complementation between counter-balanced gene-gene interactions. Furthermore we show how such constraints can be bypassed by means of “permissive” mutations that buffer changes in a direct route between two GRN topologies that would normally be unviable. We show that such bypasses are common and thus we suggest that unlike what was observed in protein sequence-function relationships, the “tape of life” is less reproducible when one considers higher levels of biological organization. PMID:23613807

  10. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  11. A traffic data plan for mechanistic-empirical pavement designs (2002 pavement design guide).

    DOT National Transportation Integrated Search

    2003-01-01

    The Virginia Department of Transportation (VDOT) is preparing to implement the mechanistic-empirical pavement design methodology being developed under the National Cooperative Research Program's Project 1-37A, commonly referred to as the 2002 Pavemen...

  12. From Source to Sink: Mechanistic Reasoning Using the Electron-Pushing Formalism

    ERIC Educational Resources Information Center

    Bhattacharyya, Gautam

    2013-01-01

    Since the introduction of Morrison and Boyd's textbook in organic chemistry over 50 years ago, reaction mechanisms and mechanistic reasoning using the electron-pushing formalism (EPF) have become a mainstay of organic chemistry courses. In recent years there have even been several papers in this Journal and others detailing research on how…

  13. Pathophysiology of white-nose syndrome in bats: a mechanistic model linking wing damage to mortality

    USGS Publications Warehouse

    Warnecke, Lisa; Turner, James M.; Bollinger, Trent K.; Misra, Vikram; Cryan, Paul M.; Blehert, David S.; Wibbelt, Gudrun; Willis, Craig K.R.

    2013-01-01

    White-nose syndrome is devastating North American bat populations but we lack basic information on disease mechanisms. Altered blood physiology owing to epidermal invasion by the fungal pathogen Geomyces destructans (Gd) has been hypothesized as a cause of disrupted torpor patterns of affected hibernating bats, leading to mortality. Here, we present data on blood electrolyte concentration, haematology and acid–base balance of hibernating little brown bats, Myotis lucifugus, following experimental inoculation with Gd. Compared with controls, infected bats showed electrolyte depletion (i.e. lower plasma sodium), changes in haematology (i.e. increased haematocrit and decreased glucose) and disrupted acid–base balance (i.e. lower CO2 partial pressure and bicarbonate). These findings indicate hypotonic dehydration, hypovolaemia and metabolic acidosis. We propose a mechanistic model linking tissue damage to altered homeostasis and morbidity/mortality.

  14. Dynamic and accurate assessment of acetaminophen-induced hepatotoxicity by integrated photoacoustic imaging and mechanistic biomarkers in vivo.

    PubMed

    Brillant, Nathalie; Elmasry, Mohamed; Burton, Neal C; Rodriguez, Josep Monne; Sharkey, Jack W; Fenwick, Stephen; Poptani, Harish; Kitteringham, Neil R; Goldring, Christopher E; Kipar, Anja; Park, B Kevin; Antoine, Daniel J

    2017-10-01

    The prediction and understanding of acetaminophen (APAP)-induced liver injury (APAP-ILI) and the response to therapeutic interventions is complex. This is due in part to sensitivity and specificity limitations of currently used assessment techniques. Here we sought to determine the utility of integrating translational non-invasive photoacoustic imaging of liver function with mechanistic circulating biomarkers of hepatotoxicity and histological assessment to facilitate a more accurate and precise characterization of APAP-ILI and the efficacy of therapeutic intervention. Perturbation of liver function and cellular viability was assessed in C57BL/6J male mice by indocyanine green (ICG) clearance (Multispectral Optoacoustic Tomography, MSOT) and by measurement of mechanistic (miR-122, HMGB1) and established (ALT, bilirubin) circulating biomarkers in response to acetaminophen and its treatment with acetylcysteine (NAC) in vivo. We utilised a 60% partial hepatectomy model, a situation of defined loss of hepatic functional mass, as a comparator for the acetaminophen-induced changes. Integration of these mechanistic markers correlated with histological features of APAP hepatotoxicity in a time-dependent manner. They accurately reflected the onset of and recovery from hepatotoxicity compared with traditional biomarkers and also reported the efficacy of NAC with high sensitivity. ICG clearance kinetics correlated with histological scores for acute liver damage for APAP (i.e. 3 h timepoint; r=0.90, P<0.0001) and elevations in both of the mechanistic biomarkers, miR-122 (e.g. 6 h timepoint; r=0.70, P=0.005) and HMGB1 (e.g. 6 h timepoint; r=0.56, P=0.04). For the first time we report the utility of this non-invasive longitudinal imaging approach to provide direct visualisation of liver function coupled with mechanistic biomarkers, in the same animal, allowing the investigation of the toxicological and pharmacological aspects of APAP-ILI and hepatic regeneration. Copyright © 2017

  15. Magnetophoresis for enhancing transdermal drug delivery: Mechanistic studies and patch design

    PubMed Central

    Murthy, S. Narasimha; Sammeta, Srinivasa M.; Bower, C.

    2017-01-01

    Magnetophoresis is a method of enhancement of drug permeation across the biological barriers by application of magnetic field. The present study investigated the mechanistic aspects of magnetophoretic transdermal drug delivery and also assessed the feasibility of designing a magnetophoretic transdermal patch system for the delivery of lidocaine. In vitro drug permeation studies were carried out across the porcine epidermis at different magnetic field strengths. The magnetophoretic drug permeation “flux enhancement factor” was found to increase with the applied magnetic field strength. The mechanistic studies revealed that the magnetic field applied in this study did not modulate permeability of the stratum corneum barrier. The predominant mechanism responsible for magnetically mediated drug permeation enhancement was found to be “magnetokinesis”. The octanol/water partition coefficient of drugs was also found to increase when exposed to the magnetic field. A reservoir type transdermal patch system with a magnetic backing was designed for in vivo studies. The dermal bioavailability (AUC0–6 h) from the magnetophoretic patch system in vivo, in rats was significantly higher than the similarly designed nonmagnetic control patch. PMID:20728484

  16. Quantitative analyses of tartaric acid based on terahertz time domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Cao, Binghua; Fan, Mengbao

    2010-10-01

    Terahertz radiation occupies the region of the electromagnetic spectrum between microwaves and the infrared. Quantitative analysis based on terahertz spectroscopy is important for the application of terahertz techniques, but how best to realize it is still under study. L-tartaric acid is widely used as an acidulant in beverages and other foods, such as soft drinks, wine, candy, bread and some colloidal sweetmeats. In this paper, terahertz time-domain spectroscopy is applied to quantify tartaric acid. Two methods are employed to process the terahertz spectra of samples with different tartaric acid contents. The first is linear regression combined with correlation analysis; the second is partial least squares (PLS), in which the absorption spectra in the 0.8-1.4 THz region are used to quantify the tartaric acid. To compare the performance of the two approaches, their relative errors are analyzed. For this experiment, the first method performs better than the second, but it is suitable only for materials with obvious terahertz absorption peaks, whereas the second is more appropriate for materials without such peaks.
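
    A minimal sketch of the PLS route in this record, using scikit-learn: the spectra matrix and concentration vector below are random placeholders standing in for measured absorption spectra in the 0.8-1.4 THz window and known tartaric acid contents, and the number of latent components is an arbitrary choice.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      # Placeholder data: one absorption spectrum per row, one concentration per sample.
      n_samples, n_freqs = 30, 120
      X = np.random.rand(n_samples, n_freqs)
      y = 0.5 + np.random.rand(n_samples)

      pls = PLSRegression(n_components=5)
      y_pred = cross_val_predict(pls, X, y, cv=5).ravel()
      rel_err = np.abs(y_pred - y) / np.abs(y)
      print(f"mean relative error: {rel_err.mean():.2%}")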

  17. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical property of a liquid. In this study, we propose a solution identification and concentration quantitation method based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations compose a data library for each kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of a test solution, a Spearman correlation test and principal components analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, based on which we can calculate the concentration of the test solution. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of concentration quantitation.
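
    The two-stage pipeline (discriminate the solution type from the eight COT characteristic values, then map a reduced representative parameter to concentration with a cubic spline) can be sketched as below. The training arrays are random placeholders and the scikit-learn/scipy choices are illustrative; the record's Spearman-test feature filtering is omitted.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.decomposition import PCA
      from scipy.interpolate import CubicSpline

      # Placeholder training data: 8 COT characteristic values per droplet record,
      # a solution label, and a known concentration. All values are illustrative.
      X_train = np.random.rand(40, 8)
      labels = np.repeat(["methanol", "ethanol", "n-propanol", "saline"], 10)
      conc = np.tile(np.linspace(1, 10, 10), 4)

      # Stage 1: discriminate the solution type from the COT features.
      lda = LinearDiscriminantAnalysis().fit(X_train, labels)

      # Stage 2: for one solution type, reduce the features to a single representative
      # parameter and map it to concentration with a cubic spline.
      mask = labels == "ethanol"
      pca = PCA(n_components=1).fit(X_train[mask])
      param = pca.transform(X_train[mask]).ravel()
      order = np.argsort(param)
      spline = CubicSpline(param[order], conc[mask][order])

      x_test = np.random.rand(1, 8)
      print("identified as:", lda.predict(x_test)[0])
      print("estimated concentration:", float(spline(pca.transform(x_test).ravel()[0])))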

  18. Plasmonic Metasurfaces Based on Nanopin-Cavity Resonator for Quantitative Colorimetric Ricin Sensing.

    PubMed

    Fan, Jiao-Rong; Zhu, Jia; Wu, Wen-Gang; Huang, Yun

    2017-01-01

    Given its toxic potential as a bioweapon, rapid visual recognition and sensing of ricin has been of considerable interest, yet it remains a challenging task to date. In this study, a gold nanopin-based colorimetric sensor is developed realizing a multicolor variation for ricin qualitative recognition and analysis. It is revealed that such plasmonic metasurfaces based on a nanopin-cavity resonator exhibit reflective color appearance, due to the excitation of standing-wave resonances of narrow bandwidth in the visible region. This clear color variation is a consequence of the reflective color mixing defined by different resonant wavelengths. In addition, the colored metasurfaces exhibit a sharp color difference over a narrow refractive-index range, which makes them especially well suited for sensing applications. Therefore, this antibody-functionalized nanopin-cavity biosensor features high sensitivity and fast response, allowing for visual quantitative ricin detection within the range of 10-120 ng mL⁻¹ (0.15 × 10⁻⁹ to 1.8 × 10⁻⁹ M), a limit of detection of 10 ng mL⁻¹, and a typical measurement time of less than 10 min. The on-chip integration of such nanopin metasurfaces to portable colorimetric microfluidic devices may be envisaged for quantitative studies of a variety of biochemical molecules. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS. In-depth analysis shows that the background spectrum and the characteristic lines follow approximately the same trend as the temperature changes, so signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for changes in spectral line intensity caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method could improve the stability and the accuracy of quantitative LIBS analysis; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. Data fitting based on the signal-to-background ratio (S/B) is less susceptible to matrix elements and background spectral variations, and provides a data-processing reference for real-time online quantitative LIBS analysis.
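
    A minimal sketch of the S/B-plus-SVM regression idea using scikit-learn; the signal-to-background ratios and concentrations below are placeholders, and the kernel and hyperparameters are arbitrary illustrative choices rather than the values used in the study.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Placeholder data: signal-to-background ratios of a heavy-metal emission line
      # versus known concentrations (mg/L). Values are illustrative only.
      sb_ratio = np.array([[0.8], [1.4], [2.1], [2.9], [3.5], [4.2], [5.0]])
      conc = np.array([5.0, 10.0, 20.0, 35.0, 50.0, 70.0, 100.0])

      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=0.5))
      model.fit(sb_ratio, conc)

      pred = model.predict(sb_ratio)
      rel_err = np.abs(pred - conc) / conc
      print(f"average relative error on training points: {rel_err.mean():.1%}")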

  20. Preparative and mechanistic studies toward the rational development of catalytic, enantioselective selenoetherification reactions.

    PubMed

    Denmark, Scott E; Kalyani, Dipannita; Collins, William R

    2010-11-10

    A systematic investigation into the Lewis base catalyzed, asymmetric, intramolecular selenoetherification of olefins is described. A critical challenge for the development of this process was the identification and suppression of racemization pathways available to arylseleniranium ion intermediates. This report details a thorough study of the influences of the steric and electronic modulation of the arylselenenyl group on the configurational stability of enantioenriched seleniranium ions. These studies show that the 2-nitrophenyl group attached to the selenium atom significantly attenuates the racemization of seleniranium ions. A variety of achiral Lewis bases catalyze the intramolecular selenoetherification of alkenes using N-(2-nitrophenylselenenyl)succinimide as the electrophile along with a Brønsted acid. Preliminary mechanistic studies suggest the intermediacy of ionic Lewis base-selenium(II) adducts. Most importantly, a broad survey of chiral Lewis bases revealed that 1,1'-binaphthalene-2,2'-diamine (BINAM)-derived thiophosphoramides catalyze the cyclization of unsaturated alcohols in the presence of N-(2-nitrophenylselenenyl)succinimide and methanesulfonic acid. A variety of cyclic seleno ethers were produced in good chemical yields and in moderate to good enantioselectivities, which constitutes the first catalytic, enantioselective selenofunctionalization of unactivated olefins.

  1. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  2. ['Anatomia actuosa et apta'. The mechanist 'proto'-physiology of B.S. Albinus].

    PubMed

    van der Korst, J K

    1993-01-01

    Already during his tenure as professor of anatomy and surgery (1721-1746), and before he became a professor of physiology and medicine at the University of Leiden, Bernard Siegfried Albinus held private lecture courses on physiology. In these lectures he pleaded for separating physiology from theoretical medicine, within which it still customarily resided in the medical curriculum of the first half of the eighteenth century. According to Albinus, physiology was a science in its own right and should be based solely on the careful observation of forms and structures of the human body. From the 'fabrica', the function ('aptitudo') could be derived by careful reasoning. As shown by a set of lecture notes which recently came to light, Albinus initially adhered to a strictly mechanistic explanatory model, which was almost completely based on the physiological concepts of Herman Boerhaave. However, in contrast to the latter, he even rejected the involvement of chemical processes in digestion. Although his lectures were highly acclaimed as demonstrations of minute anatomy, Albinus met with little or no direct response in regard to his concept of physiology.

  3. Quantitative evaluation of specific vulnerability to nitrate for groundwater resource protection based on process-based simulation model.

    PubMed

    Huan, Huan; Wang, Jinsheng; Zhai, Yuanzheng; Xi, Beidou; Li, Juan; Li, Mingxiao

    2016-04-15

    It has been proved that groundwater vulnerability assessment is an effective tool for groundwater protection. Nowadays, quantitative assessment methods for specific vulnerability are scarce due to limited understanding of the complicated contaminant fate and transport processes in the groundwater system. In this paper, a process-based simulation model for specific vulnerability to nitrate, using a 1D flow and solute transport model in the unsaturated vadose zone, is presented for groundwater resource protection. For this case study in Jilin City of northeast China, rate constants of denitrification and nitrification as well as adsorption constants of ammonium and nitrate in the vadose zone were acquired by laboratory experiments. The transfer time to the groundwater table, t50, was taken as the specific vulnerability indicator. Finally, overall vulnerability was assessed by establishing the relationship between groundwater net recharge, layer thickness and t50. The results suggested that the most vulnerable regions of Jilin City were mainly distributed in the floodplain of the Songhua River and Mangniu River. The least vulnerable areas mostly appear in the second terrace and the back of the first terrace. The overall area of low, relatively low and moderate vulnerability accounted for 76% of the study area, suggesting a relatively low possibility of suffering nitrate contamination. In addition, the sensitivity analysis showed that the most sensitive factors of specific vulnerability in the vadose zone included the groundwater net recharge rate, physical properties of the soil medium and the rate constant of nitrate denitrification. By validating the suitability of the process-based simulation model for specific vulnerability assessment and comparing it with an index-based method using a group of integrated indicators, more realistic and accurate specific vulnerability mapping could be acquired with the process-based simulation model. In addition, the advantages, disadvantages, constraint conditions and

  4. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  5. Renilla luciferase-based quantitation of Potato virus A infection initiated with Agrobacterium infiltration of N. benthamiana leaves.

    PubMed

    Eskelin, K; Suntio, T; Hyvärinen, S; Hafren, A; Mäkinen, K

    2010-03-01

    A quantitation method based on the sensitive detection of Renilla luciferase (Rluc) activity was developed and optimized for Potato virus A (PVA; genus Potyvirus) gene expression. This system is based on infections initiated by Agrobacterium infiltration and subsequent detection of the translation of PVA::Rluc RNA, which is enhanced by viral replication, first within the cells infected initially and later by translation and replication within new cells after spread of the virus. Firefly luciferase (Fluc) was used as an internal control to normalize the Rluc activity. An approximately 10-fold difference in the Rluc/Fluc activity ratio between a movement-deficient and a replication-deficient mutant was observed starting from 48 h post Agrobacterium infiltration (h.p.i.). The Rluc activity derived from wild type (wt) PVA increased significantly between 48 and 72 h.p.i. and the Rluc/Fluc activity deviated clearly from that of the mutant viruses. Quantitation of the Rluc and Fluc mRNAs by semi-quantitative RT-PCR indicated that increases and decreases in the Renilla reniformis luciferase (rluc) mRNA levels coincided with changes in Rluc activity. However, a subtle increase in the mRNA level led to pronounced changes in Rluc activity. PVA CP accumulation was quantitated by enzyme-linked immunosorbent assay. The increase in Rluc activity correlated closely with virus accumulation. Copyright (c) 2009 Elsevier B.V. All rights reserved.
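
    The normalization step is simple enough to show directly; the readings below are hypothetical arbitrary-unit values, purely to illustrate how Fluc is used as an internal control for the Rluc reporter signal.

      import numpy as np

      # Hypothetical raw luminescence readings (arbitrary units) for three leaf samples:
      # Rluc reports PVA::Rluc expression, Fluc is the co-delivered internal control.
      rluc = np.array([1.2e5, 3.4e5, 8.9e5])
      fluc = np.array([2.0e4, 2.3e4, 1.9e4])

      ratio = rluc / fluc                 # normalized reporter activity per sample
      fold_vs_first = ratio / ratio[0]    # e.g. fold change relative to the first timepoint
      print(ratio, fold_vs_first)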

  6. Mechanistic modelling of cancer: some reflections from software engineering and philosophy of science.

    PubMed

    Cañete-Valdeón, José M; Wieringa, Roel; Smallbone, Kieran

    2012-12-01

    There is a growing interest in mathematical mechanistic modelling as a promising strategy for understanding tumour progression. This approach is accompanied by a methodological change of making research, in which models help to actively generate hypotheses instead of waiting for general principles to become apparent once sufficient data are accumulated. This paper applies recent research from philosophy of science to uncover three important problems of mechanistic modelling which may compromise its mainstream application, namely: the dilemma of formal and informal descriptions, the need to express degrees of confidence and the need of an argumentation framework. We report experience and research on similar problems from software engineering and provide evidence that the solutions adopted there can be transferred to the biological domain. We hope this paper can provoke new opportunities for further and profitable interdisciplinary research in the field.

  7. When relationships estimated in the past cannot be used to predict the future: using mechanistic models to predict landscape ecological dynamics in a changing world

    Treesearch

    Eric J. Gustafson

    2013-01-01

    Researchers and natural resource managers need predictions of how multiple global changes (e.g., climate change, rising levels of air pollutants, exotic invasions) will affect landscape composition and ecosystem function. Ecological predictive models used for this purpose are constructed using either a mechanistic (process-based) or a phenomenological (empirical)...

  8. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5 % of the true values.
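
    For contrast with the Monte-Carlo inversion described above, the sketch below shows the simplest alternative: linear spectral unmixing of fluence-corrected absorption coefficients at three wavelengths into oxy- and deoxyhaemoglobin contributions, from which sO2 follows as a ratio. The extinction coefficients are rounded literature-style values and the absorption vector is a placeholder; this is not the authors' model-based scheme.

      import numpy as np

      wavelengths = [750, 800, 850]                 # nm
      eps_hb  = np.array([1405.0, 761.0, 691.0])    # deoxyhaemoglobin (approximate)
      eps_hbo = np.array([518.0, 816.0, 1058.0])    # oxyhaemoglobin (approximate)

      mu_a = np.array([0.95, 0.80, 0.88])           # fluence-corrected absorption (placeholder)

      # Least-squares unmixing: mu_a ≈ eps_hb * c_hb + eps_hbo * c_hbo
      E = np.column_stack([eps_hb, eps_hbo])
      c_hb, c_hbo = np.linalg.lstsq(E, mu_a, rcond=None)[0]
      so2 = c_hbo / (c_hb + c_hbo)
      print(f"estimated sO2 = {so2:.2f}")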

  9. Allelic-based gene-gene interaction associated with quantitative traits.

    PubMed

    Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M

    2009-05-01

    Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level which contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to test for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB) and the Na, K, 2Cl cotransporter gene (CLC12A1) that contributes to variation in diastolic blood pressure.
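
    A toy version of the association step is sketched below: each unrelated subject is scored by an allelic combination at two unlinked SNPs (here simulated), and the quantitative trait is regressed on that score. The scoring rule and effect size are illustrative, not those derived in the paper.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 500

      # Genotypes coded as minor-allele counts (0/1/2) at two unlinked SNPs (simulated).
      g1 = rng.binomial(2, 0.3, n)
      g2 = rng.binomial(2, 0.4, n)

      # One simple allelic-combination score: expected number of chromosome pairs carrying
      # the minor allele at both loci under random pairing (illustrative rule only).
      allelic_score = g1 * g2 / 2.0

      trait = 0.3 * allelic_score + rng.normal(0, 1, n)   # simulated quantitative trait
      fit = sm.OLS(trait, sm.add_constant(allelic_score)).fit()
      print(fit.summary().tables[1])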

  10. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports necessity in strengthening and revising theory with empirical data.
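
    The core move of Testing Theory-based Quantitative Predictions, comparing an observed effect size and its confidence interval against a numeric prediction instead of a null hypothesis, can be sketched as follows. The construct scores are simulated and the predicted effect size of 0.5 is an arbitrary stand-in for a theory-derived value.

      import numpy as np

      def check_effect_size_prediction(group_a, group_b, predicted_d):
          # Observed Cohen's d with an approximate 95% CI, and whether a theory-based
          # predicted value falls inside it (generic illustration, not the paper's
          # TTM constructs or predicted values).
          na, nb = len(group_a), len(group_b)
          pooled_var = ((na - 1) * np.var(group_a, ddof=1) +
                        (nb - 1) * np.var(group_b, ddof=1)) / (na + nb - 2)
          d = (np.mean(group_a) - np.mean(group_b)) / np.sqrt(pooled_var)
          se = np.sqrt((na + nb) / (na * nb) + d ** 2 / (2 * (na + nb)))
          lo, hi = d - 1.96 * se, d + 1.96 * se
          return d, (lo, hi), lo <= predicted_d <= hi

      rng = np.random.default_rng(3)
      precontemplation = rng.normal(0.0, 1.0, 200)    # simulated construct scores
      preparation = rng.normal(0.6, 1.0, 200)
      print(check_effect_size_prediction(preparation, precontemplation, predicted_d=0.5))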

  11. Digital micromirror device-based common-path quantitative phase imaging

    PubMed Central

    Zheng, Cheng; Zhou, Renjie; Kuang, Cuifang; Zhao, Guangyuan; Yaqoob, Zahid; So, Peter T. C.

    2017-01-01

    We propose a novel common-path quantitative phase imaging (QPI) method based on a digital micromirror device (DMD). The DMD is placed in a plane conjugate to the objective back-aperture plane for the purpose of generating two plane waves that illuminate the sample. A pinhole is used in the detection arm to filter one of the beams after sample to create a reference beam. Additionally, a transmission-type liquid crystal device, placed at the objective back-aperture plane, eliminates the specular reflection noise arising from all the “off” state DMD micromirrors, which is common in all DMD-based illuminations. We have demonstrated high sensitivity QPI, which has a measured spatial and temporal noise of 4.92 nm and 2.16 nm, respectively. Experiments with calibrated polystyrene beads illustrate the desired phase measurement accuracy. In addition, we have measured the dynamic height maps of red blood cell membrane fluctuations, showing the efficacy of the proposed system for live cell imaging. Most importantly, the DMD grants the system convenience in varying the interference fringe period on the camera to easily satisfy the pixel sampling conditions. This feature also alleviates the pinhole alignment complexity. We envision that the proposed DMD-based common-path QPI system will allow for system miniaturization and automation for a broader adaption. PMID:28362789

  12. Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.

    PubMed

    Chahrour, Osama; Cobice, Diego; Malone, John

    2015-09-10

    Mass spectrometry-based proteomics has evolved as a promising technology over the last decade and is undergoing dramatic development in a number of different areas, such as mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies are possible, involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA), each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS is targeting the fast-growing field of proteomics-related research, allowing absolute protein quantification using suitable element-based tags. This document describes the different stable isotope labelling methods, which incorporate metabolic labelling in live cells, ICP-MS based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Principles of quantitation of viral loads using nucleic acid sequence-based amplification in combination with homogeneous detection using molecular beacons.

    PubMed

    Weusten, Jos J A M; Carpay, Wim M; Oosterlaken, Tom A M; van Zuijlen, Martien C A; van de Wiel, Paul A

    2002-03-15

    For quantitative NASBA-based viral load assays using homogeneous detection with molecular beacons, such as the NucliSens EasyQ HIV-1 assay, a quantitation algorithm is required. During the amplification process there is a constant growth in the concentration of amplicons to which the beacon can bind while generating a fluorescence signal. The overall fluorescence curve contains kinetic information on both amplicon formation and beacon binding, but only the former is relevant for quantitation. In the current paper, mathematical modeling of the relevant processes is used to develop an equation describing the fluorescence curve as a function of the amplification time and the relevant kinetic parameters. This equation allows reconstruction of RNA formation, which is characterized by an exponential increase in concentrations as long as the primer concentrations are not rate limiting and by linear growth over time after the primer pool is depleted. During the linear growth phase, the actual quantitation is based on assessing the amplicon formation rate from the viral RNA relative to that from a fixed amount of calibrator RNA. The quantitation procedure has been successfully applied in the NucliSens EasyQ HIV-1 assay.
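
    The growth-and-binding behaviour described above can be caricatured in a few lines: amplicons grow exponentially until an assumed primer-depletion time and linearly thereafter, the beacon signal follows with saturating binding, and quantitation compares the late (linear-phase) signal rate of a sample with that of a co-amplified calibrator. Every rate constant, time and concentration below is an illustrative placeholder, not the published NucliSens EasyQ quantitation equation.

      import numpy as np

      def amplicon(t, r0, k, t_dep):
          # Exponential growth (rate k) from initial target r0 until primer depletion
          # at t_dep, then linear growth at the rate reached at t_dep.
          t = np.asarray(t, dtype=float)
          exp_phase = r0 * np.exp(k * np.minimum(t, t_dep))
          lin_phase = np.where(t > t_dep, r0 * k * np.exp(k * t_dep) * (t - t_dep), 0.0)
          return exp_phase + lin_phase

      def fluorescence(t, r0, k, t_dep, k_bind, beacon_total):
          # Beacon signal lags amplicon formation with saturating first-order binding.
          amp = amplicon(t, r0, k, t_dep)
          return beacon_total * (1.0 - np.exp(-k_bind * amp / beacon_total))

      t = np.linspace(0, 90, 200)   # minutes
      signal_sample = fluorescence(t, r0=1e3, k=0.25, t_dep=30, k_bind=0.8, beacon_total=1e7)
      signal_calib  = fluorescence(t, r0=1e4, k=0.25, t_dep=25, k_bind=0.8, beacon_total=1e7)
      # Quantitation idea: compare late-phase signal formation rates, sample vs calibrator.
      rate_ratio = np.gradient(signal_sample, t)[-1] / np.gradient(signal_calib, t)[-1]
      print(f"late-phase signal rate ratio (sample/calibrator): {rate_ratio:.3f}")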

  14. Competency-Based Education: A Quantitative Study of the U.S. Air Force Noncommissioned Officer Academy

    ERIC Educational Resources Information Center

    Houser, Bonnie L.

    2017-01-01

    There are relatively few empirical studies that examine whether using a competency-based education (CBE) approach results in increased student learning or achievement when compared to traditional education approaches. This study uses a quantitative research methodology, a nonexperimental comparative descriptive research design, and a two-group…

  15. Modeling of Mn/Road test sections with the CRREL mechanistic pavement design procedure

    DOT National Transportation Integrated Search

    1996-09-01

    The U.S. Army Cold Regions Research and Engineering Laboratory is developing a mechanistic pavement design procedure for use in seasonal frost areas. The procedure was used to predict pavement performance of some test sections under construction at t...

  16. Parabolic quantitative structure-activity relationships and photodynamic therapy: application of a three-compartment model with clearance to the in vivo quantitative structure-activity relationships of a congeneric series of pyropheophorbide derivatives used as photosensitizers for photodynamic therapy.

    PubMed

    Potter, W R; Henderson, B W; Bellnier, D A; Pandey, R K; Vaughan, L A; Weishaupt, K R; Dougherty, T J

    1999-11-01

    An open three-compartment pharmacokinetic model was applied to the in vivo quantitative structure-activity relationship (QSAR) data of a homologous series of pyropheophorbide photosensitizers for photodynamic therapy (PDT). The physical model was a lipid compartment sandwiched between two identical aqueous compartments. The first compartment was assumed to clear irreversibly at a rate K0. The measured octanol-water partition coefficients, P(i) (where i is the number of carbons in the alkyl chain), and the clearance rate K0 determined the clearance kinetics of the drugs. Solving the coupled differential equations of the three-compartment model produced clearance kinetics for each of the sensitizers in each of the compartments. The third compartment was found to contain the target of PDT. This series of compounds is quite lipophilic. Therefore these drugs are found mainly in the second compartment. The drug level in the third compartment represents a small fraction of the tissue level and is thus not accessible to direct measurement by extraction. The second compartment of the model accurately predicted the clearance from the serum of mice of the hexyl ether of pyropheophorbide a, one member of this series of compounds. The diffusion and clearance rate constants were those found by fitting the pharmacokinetics of the third compartment to the QSAR data. This result validated the magnitude and mechanistic significance of the rate constants used to model the QSAR data. The PDT response-to-dose theory was applied to the kinetic behavior of the target compartment drug concentration. This produced a pharmacokinetic-based function connecting PDT response to dose as a function of time postinjection. This mechanistic dose-response function was fitted to published, single time point QSAR data for the pheophorbides. As a result, the PDT target threshold dose, together with the predicted QSAR as a function of time postinjection, was found.
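
    A minimal numerical sketch of the open three-compartment arrangement described above (an irreversibly clearing aqueous compartment, a lipid compartment, and a second aqueous compartment containing the PDT target) is given below in Python. The rate constants, and the way the octanol-water partition coefficient P scales aqueous-to-lipid transfer, are illustrative assumptions, not the fitted values from the study.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters (not the fitted values from the paper)
        K0 = 0.05        # 1/h, irreversible clearance from compartment 1
        k_aq_lip = 0.01  # 1/h, base aqueous -> lipid transfer rate
        k_lip_aq = 0.5   # 1/h, lipid -> aqueous transfer rate
        P = 100.0        # octanol-water partition coefficient of the drug

        def three_compartment(t, y):
            c1, c2, c3 = y  # aqueous (clearing) | lipid | aqueous (target-containing)
            in12 = k_aq_lip * P * c1    # partitioning into lipid scales with P
            in32 = k_aq_lip * P * c3
            out21 = k_lip_aq * c2       # lipid -> first aqueous compartment
            out23 = k_lip_aq * c2       # lipid -> second aqueous compartment
            dc1 = -K0 * c1 - in12 + out21
            dc2 = in12 + in32 - out21 - out23
            dc3 = -in32 + out23
            return [dc1, dc2, dc3]

        sol = solve_ivp(three_compartment, (0.0, 48.0), [1.0, 0.0, 0.0],
                        t_eval=np.linspace(0.0, 48.0, 200))
        idx_24h = np.searchsorted(sol.t, 24.0)
        print("Relative drug level in the target (third) compartment at 24 h:",
              round(sol.y[2][idx_24h], 4))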

  17. A Quantitative Study of Teacher Readiness to Teach School-Based HIV/AIDS Education in Kenyan Primary Schools

    ERIC Educational Resources Information Center

    Lang'at, Edwin K.

    2014-01-01

    Purpose and Method of Study: The purpose of this study was to investigate teachers' self-perceived readiness to teach school-based HIV/AIDS Awareness and Prevention education in Kenyan primary schools based on their knowledge, attitudes and instructional confidence. This research utilized a non-experimental quantitative approach with a…

  18. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph

    PubMed Central

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation. PMID:26909045
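
    For readers unfamiliar with the visibility-network technique mentioned above, the following Python sketch builds a natural visibility graph from a heart-rate series: two samples are connected when the straight line between them stays above all intermediate samples. The mean degree is one simple quantitative parameter that could then be compared between conditions; the series used here is synthetic, not the PhysioNet data.

        import numpy as np

        def natural_visibility_graph(y):
            """Return the edge set of the natural visibility graph of series y.
            Nodes i and j are linked if every intermediate sample k satisfies
            y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)."""
            n = len(y)
            edges = set()
            for i in range(n - 1):
                for j in range(i + 1, n):
                    visible = all(
                        y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                        for k in range(i + 1, j)
                    )
                    if visible:
                        edges.add((i, j))
            return edges

        rng = np.random.default_rng(0)
        heart_rate = 70 + 5 * rng.standard_normal(300)   # synthetic beats/minute series
        edges = natural_visibility_graph(heart_rate)
        mean_degree = 2 * len(edges) / len(heart_rate)
        print(f"Edges: {len(edges)}, mean degree: {mean_degree:.2f}")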

  19. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph.

    PubMed

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation.

  20. Modeling of batch sorber system: kinetic, mechanistic, and thermodynamic modeling

    NASA Astrophysics Data System (ADS)

    Mishra, Vishal

    2017-10-01

    The present investigation has dealt with the biosorption of copper and zinc ions on the surface of egg-shell particles in the liquid phase. Various rate models were evaluated to elucidate the kinetics of copper and zinc biosorption, and the results indicated that the pseudo-second-order model was more appropriate than the pseudo-first-order model. The curve of the initial sorption rate versus the initial concentration of copper and zinc ions also complemented the results of the pseudo-second-order model. Models used for the mechanistic modeling were the intra-particle model of pore diffusion and Bangham's model of film diffusion. The results of the mechanistic modeling, together with the values of pore and film diffusivities, indicated that the preferential mode of the biosorption of copper and zinc ions on the surface of egg-shell particles in the liquid phase was film diffusion. The results of the intra-particle model showed that the biosorption of the copper and zinc ions was not dominated by pore diffusion, which was due to macro-pores with open-void spaces present on the surface of egg-shell particles. The thermodynamic modeling confirmed that the sorption of copper and zinc was spontaneous and exothermic, with increased randomness at the solid-liquid interface.
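
    As a hedged illustration of the kinetic screening described above, the sketch below fits the linearised pseudo-second-order model, t/q_t = 1/(k2*qe^2) + t/qe, to hypothetical uptake data and recovers the equilibrium capacity qe and rate constant k2. The data points are invented for demonstration and do not come from the study.

        import numpy as np

        # Hypothetical contact times (min) and uptake q_t (mg/g) for one metal ion
        t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
        qt = np.array([8.1, 12.6, 17.3, 21.0, 22.4, 23.3, 23.7])

        # Linearised pseudo-second-order form: t/qt = 1/(k2*qe**2) + t/qe
        y = t / qt
        slope, intercept = np.polyfit(t, y, 1)
        qe = 1.0 / slope              # equilibrium uptake (mg/g)
        k2 = slope ** 2 / intercept   # rate constant (g/(mg*min)), since intercept = 1/(k2*qe**2)

        print(f"qe = {qe:.2f} mg/g, k2 = {k2:.4f} g/(mg*min)")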

  1. Refined pipe theory for mechanistic modeling of wood development.

    PubMed

    Deckmyn, Gaby; Evans, Sam P; Randle, Tim J

    2006-06-01

    We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).

  2. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
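
    Below is a minimal sketch of how local velocity vectors could be measured between two consecutive frames of a dynamic bone image sequence using dense optical flow from OpenCV, followed by block averaging to form a coarse velocity map. The frame files and frame interval are hypothetical, and this is an illustrative substitute for, not a reproduction of, the authors' measurement method.

        import cv2
        import numpy as np

        # Two consecutive frames of the dynamic bone image sequence (hypothetical files)
        prev_frame = cv2.imread("bone_frame_000.png", cv2.IMREAD_GRAYSCALE)
        next_frame = cv2.imread("bone_frame_001.png", cv2.IMREAD_GRAYSCALE)

        # Dense optical flow: one displacement vector (dx, dy) per pixel
        flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)

        frame_interval_s = 0.1               # assumed acquisition interval
        velocity = flow / frame_interval_s   # pixels per second

        # Average the vectors over local blocks to form a coarse velocity map
        block = 32
        h, w = prev_frame.shape
        vec_map = velocity[:h - h % block, :w - w % block].reshape(
            h // block, block, w // block, block, 2).mean(axis=(1, 3))
        print("Velocity map shape (rows, cols, dx/dy):", vec_map.shape)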

  3. A computer system to be used with laser-based endoscopy for quantitative diagnosis of early gastric cancer.

    PubMed

    Miyaki, Rie; Yoshida, Shigeto; Tanaka, Shinji; Kominami, Yoko; Sanomura, Yoji; Matsuo, Taiji; Oka, Shiro; Raytchev, Bisser; Tamaki, Toru; Koide, Tetsushi; Kaneda, Kazufumi; Yoshihara, Masaharu; Chayama, Kazuaki

    2015-02-01

    To evaluate the usefulness of a newly devised computer system for use with laser-based endoscopy in differentiating between early gastric cancer, reddened lesions, and surrounding tissue. Narrow-band imaging based on laser light illumination has come into recent use. We devised a support vector machine (SVM)-based analysis system to be used with the newly devised endoscopy system to quantitatively identify gastric cancer on images obtained by magnifying endoscopy with blue-laser imaging (BLI). We evaluated the usefulness of the computer system in combination with the new endoscopy system. We evaluated the system as applied to 100 consecutive early gastric cancers in 95 patients examined by BLI magnification at Hiroshima University Hospital. We produced a set of images from the 100 early gastric cancers; 40 flat or slightly depressed, small, reddened lesions; and surrounding tissues, and we attempted to identify gastric cancer, reddened lesions, and surrounding tissue quantitatively. The average SVM output value was 0.846 ± 0.220 for cancerous lesions, 0.381 ± 0.349 for reddened lesions, and 0.219 ± 0.277 for surrounding tissue, with the SVM output value for cancerous lesions being significantly greater than that for reddened lesions or surrounding tissue. The average SVM output value for differentiated-type cancer was 0.840 ± 0.207 and for undifferentiated-type cancer was 0.865 ± 0.259. Although further development is needed, we conclude that our computer-based analysis system used with BLI will identify gastric cancers quantitatively.
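
    The classification step described above can be sketched as follows: an SVM is trained on feature vectors derived from magnified endoscopic images, and a probability-like output in [0, 1] is produced for each lesion, analogous to the "SVM output value" reported. Feature extraction is outside the scope of this sketch, and the synthetic features and labels below merely stand in for the real image data.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        # Synthetic texture/colour feature vectors: cancer (1) vs. non-cancer (0)
        X = np.vstack([rng.normal(1.0, 1.0, (60, 8)), rng.normal(-1.0, 1.0, (60, 8))])
        y = np.array([1] * 60 + [0] * 60)

        # SVM with probability outputs, comparable to an "SVM output value"
        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
        model.fit(X, y)

        new_lesion = rng.normal(0.8, 1.0, (1, 8))   # features of an unseen lesion
        svm_output = model.predict_proba(new_lesion)[0, 1]
        print(f"SVM output value (probability of cancer): {svm_output:.3f}")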

  4. Quantitative structure-toxicity relationship (QSTR) studies on the organophosphate insecticides.

    PubMed

    Can, Alper

    2014-11-04

    Organophosphate insecticides are the most commonly used pesticides in the world. In this study, quantitative structure-toxicity relationship (QSTR) models were derived for estimating the acute oral toxicity of organophosphate insecticides to male rats. The 20 chemicals of the training set and the seven compounds of the external test set were described by means of molecular descriptors. Descriptors for lipophilicity, polarity and molecular geometry, as well as quantum chemical descriptors for energy, were calculated. Model development to predict toxicity of organophosphate insecticides in different matrices was carried out using multiple linear regression. The model was validated internally and externally. In the present study, a QSTR model was used for the first time to understand the inherent relationships between the organophosphate insecticide molecules and their toxicity behavior. Such studies provide mechanistic insight about structure-toxicity relationships and help in the design of less toxic insecticides. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
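
    A hedged sketch of the modelling workflow follows: molecular descriptors for a training set are regressed against measured acute oral toxicity with multiple linear regression, and the model is then checked on an external test set. Descriptor calculation (lipophilicity, polarity, quantum chemical terms) is assumed to have been done elsewhere; all numbers below are synthetic.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(2)
        # Synthetic descriptor matrices (e.g. logP, polarity, HOMO/LUMO energies)
        X_train = rng.normal(size=(20, 4))
        X_test = rng.normal(size=(7, 4))
        true_coef = np.array([0.8, -0.5, 0.3, 0.1])
        y_train = X_train @ true_coef + rng.normal(0, 0.1, 20)   # synthetic log(1/LD50)
        y_test = X_test @ true_coef + rng.normal(0, 0.1, 7)

        qstr = LinearRegression().fit(X_train, y_train)
        print("Internal R^2:", round(qstr.score(X_train, y_train), 3))
        print("External R^2:", round(r2_score(y_test, qstr.predict(X_test)), 3))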

  5. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  6. Microstructural study of the nickel-base alloy WAZ-20 using qualitative and quantitative electron optical techniques

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1973-01-01

    The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma-prime, a coarse gamma-gamma prime eutectic, a gamma-fine gamma prime matrix, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscope, light metallography, and replica electron microscope methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.

  7. Quantitative identification of chemical compounds by dual-soliton based coherent anti-Stokes Raman scattering spectroscopy

    NASA Astrophysics Data System (ADS)

    Chen, Kun; Wu, Tao; Li, Yan; Wei, Haoyun

    2017-12-01

    Coherent anti-Stokes Raman scattering (CARS) is a powerful nonlinear spectroscopy technique that is rapidly gaining recognition for the identification of different molecules. Unfortunately, molecular concentration information is generally not immediately accessible from the raw CARS signal due to the nonresonant background. In addition, mainstream biomedical applications of CARS are currently hampered by a complex and bulky excitation setup. Here, we establish a dual-soliton Stokes based CARS spectroscopy scheme capable of quantifying the molecular concentration in a sample, using a single fiber laser. This dual-soliton CARS scheme takes advantage of a differential configuration to achieve efficient suppression of the nonresonant background and therefore allows extraction of quantitative composition information. In addition, our all-fiber-based excitation source can probe most of the fingerprint region (1100-1800 cm-1) with a spectral resolution of 15 cm-1 under the spectral focusing mechanism, where considerably more information is contained throughout an entire spectrum than at just a single frequency within that spectrum. Systematic studies of the scope of application and several fundamental aspects are discussed. Quantitative capability is further experimentally demonstrated through the determination of oleic acid concentration based on the linear dependence of the signal on different Raman vibration bands.

  8. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China*

    PubMed Central

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-01-01

    Ecological compensation is becoming one of the key, multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value with which to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences of the ecological compensation were significant among all the counties or districts. This model fills a gap in the quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicting interests among the economic, social, and ecological sectors. PMID:19353749

  9. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China.

    PubMed

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-04-01

    Ecological compensation is becoming one of the key, multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value with which to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences of the ecological compensation were significant among all the counties or districts. This model fills a gap in the quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicting interests among the economic, social, and ecological sectors.

  10. A precision oncology approach to the pharmacological targeting of mechanistic dependencies in neuroendocrine tumors. | Office of Cancer Genomics

    Cancer.gov

    We introduce and validate a new precision oncology framework for the systematic prioritization of drugs targeting mechanistic tumor dependencies in individual patients. Compounds are prioritized on the basis of their ability to invert the concerted activity of master regulator proteins that mechanistically regulate tumor cell state, as assessed from systematic drug perturbation assays. We validated the approach on a cohort of 212 gastroenteropancreatic neuroendocrine tumors (GEP-NETs), a rare malignancy originating in the pancreas and gastrointestinal tract.

  11. Ultrasound and Microbubble Guided Drug Delivery: Mechanistic Understanding and Clinical Implications

    PubMed Central

    Wang, Tzu-Yin; Wilson, Katheryne E.; Machtaler, Steven; Willmann, Jürgen K.

    2014-01-01

    Ultrasound mediated drug delivery using microbubbles is a safe and noninvasive approach for spatially localized drug administration. This approach can create temporary and reversible openings on cellular membranes and vessel walls (a process called “sonoporation”), allowing for enhanced transport of therapeutic agents across these natural barriers. It is generally believed that the sonoporation process is highly associated with the energetic cavitation activities (volumetric expansion, contraction, fragmentation, and collapse) of the microbubble. However, a thorough understanding of the process was unavailable until recently. Important progress on the mechanistic understanding of sonoporation and the corresponding physiological responses in vitro and in vivo has been made. Specifically, recent research shed light on the cavitation process of microbubbles and fluid motion during insonation of ultrasound, on the spatio-temporal interactions between microbubbles and cells or vessel walls, as well as on the temporal course of the subsequent biological effects. These findings have significant clinical implications on the development of optimal treatment strategies for effective drug delivery. In this article, current progress in the mechanistic understanding of ultrasound and microbubble mediated drug delivery and its implications for clinical translation is discussed. PMID:24372231

  12. Quantitative comparison between full-spectrum and filter-based imaging in hyperspectral fluorescence microscopy

    PubMed Central

    GAO, L.; HAGEN, N.; TKACZYK, T.S.

    2012-01-01

    We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and accuracy of measured fluorophores’ emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the use of the proposed full-spectrum imaging technique may result in a threefold improvement in signal dynamic range compared to what can be achieved with filter-based imaging. PMID:22356127
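
    The spectral-unmixing filtering mentioned above can be illustrated with a non-negative least-squares fit: given reference emission spectra of the fluorophores, a measured spectrum at each pixel is decomposed into fluorophore abundances. The Gaussian reference spectra and noise level below are synthetic placeholders, not the instrument's calibration data.

        import numpy as np
        from scipy.optimize import nnls

        wavelengths = np.linspace(450, 700, 64)   # nm, detection channels

        def gaussian(center, width):
            return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

        # Synthetic reference emission spectra of three fluorophores (columns of A)
        A = np.column_stack([gaussian(520, 15), gaussian(580, 20), gaussian(640, 18)])

        true_abundances = np.array([0.7, 0.2, 0.5])
        measured = A @ true_abundances + 0.01 * np.random.default_rng(3).standard_normal(64)

        abundances, residual = nnls(A, measured)   # non-negative unmixing for one pixel
        print("Recovered abundances:", np.round(abundances, 3))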

  13. COLLABORATION ON NHEERL EPIDEMIOLOGY STUDIES

    EPA Science Inventory

    This task will continue ORD's efforts to develop a biologically plausible, quantitative health risk model for particulate matter (PM) based on epidemiological, toxicological, and mechanistic studies using matched exposure assessments. The NERL, in collaboration with the NHEERL, ...

  14. Spatiotemporal microbiota dynamics from quantitative in vitro and in silico models of the gut

    NASA Astrophysics Data System (ADS)

    Hwa, Terence

    The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth behaviors, which ultimately dictate the gut microbiota composition. Combining measurements of bacterial growth physiology with analysis of published data on human physiology into a quantitative modeling framework, we show how hydrodynamic forces in the colon, in concert with other physiological factors, determine the abundances of the major bacterial phyla in the gut. Our model quantitatively explains the observed variation of microbiota composition among healthy adults, and predicts colonic water absorption (manifested as stool consistency) and nutrient intake to be two key factors determining this composition. The model further reveals that both factors, which have been identified in recent correlative studies, exert their effects through the same mechanism: changes in colonic pH that differentially affect the growth of different bacteria. Our findings show that a predictive and mechanistic understanding of microbial ecology in the human gut is possible, and offer the hope for the rational design of intervention strategies to actively control the microbiota. This work is supported by the Bill and Melinda Gates Foundation.

  15. Quantitative insights for the design of substrate-based SIRT1 inhibitors.

    PubMed

    Kokkonen, Piia; Mellini, Paolo; Nyrhilä, Olli; Rahnasto-Rilla, Minna; Suuronen, Tiina; Kiviranta, Päivi; Huhtiniemi, Tero; Poso, Antti; Jarho, Elina; Lahtela-Kakkonen, Maija

    2014-08-01

    Sirtuin 1 (SIRT1) is the most studied human sirtuin and it catalyzes the deacetylation reaction of acetylated lysine residues of its target proteins, for example histones. It is a promising drug target in the treatment of age-related diseases, such as neurodegenerative diseases and cancer. In this study, a series of known substrate-based sirtuin inhibitors was analyzed with comparative molecular field analysis (CoMFA), which is a three-dimensional quantitative structure-activity relationship (3D-QSAR) technique. The CoMFA model was validated both internally and externally, producing a concordance correlation coefficient (CCC) of 0.88, a mean r²m value of 0.66 and a Q²F3 of 0.89. Based on the CoMFA interaction contours, 13 new potential inhibitors with high predicted activity were designed, and the activities were verified by in vitro measurements. This work proposes an effective approach for the design and activity prediction of new potential substrate-based SIRT1 inhibitors. Copyright © 2014 Elsevier B.V. All rights reserved.
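
    For reference, the concordance correlation coefficient used above to validate the CoMFA model can be computed from observed and predicted activities as in the short Python sketch below; the activity values are invented for illustration.

        import numpy as np

        def concordance_correlation(observed, predicted):
            """Lin's concordance correlation coefficient between two vectors."""
            obs, pred = np.asarray(observed, float), np.asarray(predicted, float)
            cov = np.mean((obs - obs.mean()) * (pred - pred.mean()))
            return 2 * cov / (obs.var() + pred.var() + (obs.mean() - pred.mean()) ** 2)

        observed = [5.2, 6.1, 4.8, 7.0, 6.5, 5.9]    # e.g. measured activities (invented)
        predicted = [5.4, 6.0, 5.1, 6.8, 6.2, 6.1]   # CoMFA-predicted activities (invented)
        print(f"CCC = {concordance_correlation(observed, predicted):.3f}")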

  16. Corrigendum: Free Will and Punishment: A Mechanistic View of Human Nature Reduces Retribution.

    PubMed

    2018-02-01

    Original article: Shariff, A. F., Greene, J. D., Karremans, J. C., Luguri, J. B., Clark, C. J., Schooler, J. W., . . . Vohs, K. D. (2014). Free will and punishment: A mechanistic view of human nature reduces retribution. Psychological Science, 25, 1563-1570. doi:10.1177/0956797614534693.

  17. Improved characterization of truck traffic volumes and axle loads for mechanistic-empirical pavement design.

    DOT National Transportation Integrated Search

    2012-12-01

    The recently developed mechanistic-empirical pavement design guide (MEPDG) requires a multitude of traffic : inputs to be defined for the design of pavement structures, including the initial two-way annual average daily truck : traffic (AADTT), direc...

  18. Layer moduli of Nebraska pavements for the new Mechanistic-Empirical Pavement Design Guide (MEPDG).

    DOT National Transportation Integrated Search

    2010-12-01

    As a step-wise implementation effort of the Mechanistic-Empirical Pavement Design Guide (MEPDG) for the design : and analysis of Nebraska flexible pavement systems, this research developed a database of layer moduli dynamic : modulus, creep compl...

  19. A climate-driven mechanistic population model of Aedes albopictus with diapause.

    PubMed

    Jia, Pengfei; Lu, Liang; Chen, Xiang; Chen, Jin; Guo, Li; Yu, Xiao; Liu, Qiyong

    2016-03-24

    The mosquito Aedes albopictus is a competent vector for the transmission of many blood-borne pathogens. An important factor that affects the mosquitoes' development and spread is climate, such as temperature, precipitation and photoperiod. Existing climate-driven mechanistic models overlook the seasonal pattern of diapause, the survival strategy whereby mosquito eggs lie dormant and unable to hatch under extreme weather. With respect to diapause, several issues remain unaddressed, including identifying the time when diapause eggs are laid and hatched under different climatic conditions, demarcating the thresholds of diapause and non-diapause periods, and considering the mortality rate of diapause eggs. Here we propose a generic climate-driven mechanistic population model of Ae. albopictus applicable to most Ae. albopictus-colonized areas. The new model improves on previous work by incorporating diapause behaviors, with many modifications to the stage-specific mechanisms of the mosquitoes' life-cycle. The monthly Container Index (CI) of Ae. albopictus collected in two Chinese cities, Guangzhou and Shanghai, is used for model validation. The simulation results of the proposed model are validated against entomological field data, with Pearson correlation coefficients (r²) of 0.84 in Guangzhou and 0.90 in Shanghai. In addition, by consolidating the effect of diapause-related adjustments and temperature-related parameters in the model, the improvement over the basic model is significant. The model highlights the importance of considering diapause in simulating the Ae. albopictus population. It also corroborates that temperature and photoperiod are significant in affecting the population dynamics of the mosquito. By refining the relationship between the Ae. albopictus population and climatic factors, the model serves to establish a mechanistic relation to the growth and decline of the species. Understanding this relationship in a better way

  20. A globotetraosylceramide (Gb₄) receptor-based ELISA for quantitative detection of Shiga toxin 2e.

    PubMed

    Togashi, Katsuhiro; Sasaki, Shiho; Sato, Wataru

    2015-08-01

    Currently, no simple assays are available for routine quantitative detection of Escherichia coli-produced Shiga toxin 2e (Stx2e) that causes porcine edema disease. Here, we present a novel quantitative detection method for Stx2e based on the measurement of Stx2e binding to the specific globotetraosylceramide (Gb4) receptor by ELISA (Gb4-ELISA). No cross-reactivity was found with the other Shiga toxins Stx1 and Stx2, indicating high specificity. When the recombinant Stx2e B subunit (Stx2eB) was used, the absorbance measured by Gb4-ELISA increased linearly with Stx2eB concentration in the range of 20-2,500 ng/ml. The Gb4-ELISA method can be easily performed, suggesting that it would be a useful diagnostic tool for porcine edema disease.

  1. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  2. Synthesizing Quantitative Evidence for Evidence-based Nursing: Systematic Review.

    PubMed

    Oh, Eui Geum

    2016-06-01

    As evidence-based practice has become an important issue in healthcare settings, the educational needs for knowledge and skills for the generation and utilization of healthcare evidence are increasing. Systematic review (SR), a way of generating evidence, is a synthesis of primary scientific evidence, which summarizes the best evidence on a specific clinical question using a transparent, a priori, protocol-driven approach. SR methodology requires a critical appraisal of primary studies, data extraction in a reliable and repeatable way, and examination of the validity of the results. SRs are considered the highest form of evidence in the hierarchy, as they involve a systematic search, identification, and summarization of the available evidence to answer a focused clinical question, with particular attention to the methodological quality of studies or the credibility of opinion and text. The purpose of this paper is to provide an overview of the fundamental knowledge, principles and processes in SR. The focus of this paper is on SR especially for the synthesis of quantitative data from primary research studies that examine the effectiveness of healthcare interventions. To advance evidence-based nursing care in various healthcare settings, the best available scientific evidence is an essential component. This paper includes some examples to promote understanding. Copyright © 2016. Published by Elsevier B.V.

  3. Physiologically induced color-pattern changes in butterfly wings: mechanistic and evolutionary implications.

    PubMed

    Otaki, Joji M

    2008-07-01

    A mechanistic understanding of butterfly wing color-pattern determination can be facilitated by experimental pattern changes. Here I review physiologically induced color-pattern changes in nymphalid butterflies and their mechanistic and evolutionary implications. One type of color-pattern change can be elicited by changes in the size and position of elements throughout the wing, as suggested by the nymphalid groundplan. These changes of pattern elements are bi-directional, with dislocation toward or away from eyespot foci, and bi-sided, occurring on both the proximal and distal sides of the foci. The peripheral elements are dislocated even in the eyespot-less compartments. Anterior spots are more severely modified, suggesting the existence of an anterior-posterior gradient. In one species, eyespots are transformed into white spots with remnant-like orange scales, and such patterns emerge even at the eyespot-less "imaginary" foci. A series of these color-pattern modifications probably reveals "snap-shots" of a dynamic morphogenic signal due to heterochronic uncoupling between the signaling and reception steps. The conventional gradient model can be revised to account for these observed color-pattern changes.

  4. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
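
    As a simplified sketch of how sensitivity and specificity can be pooled from the extracted 2x2 counts of several studies, the Python snippet below sums counts across studies and computes the pooled indices. Real meta-analyses would typically use hierarchical or bivariate models rather than this naive pooling, and the counts are invented.

        import numpy as np

        # Invented per-study 2x2 counts: (TP, FP, TN, FN)
        studies = np.array([
            [45, 8, 60, 7],
            [30, 5, 40, 6],
            [55, 12, 70, 9],
        ])

        tp, fp, tn, fn = studies.sum(axis=0)
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        print(f"Pooled sensitivity: {sensitivity:.2f}, pooled specificity: {specificity:.2f}")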

  5. Unexpected Solubility Enhancement of Drug Bases in the Presence of a Dimethylaminoethyl Methacrylate Copolymer.

    PubMed

    Saal, Wiebke; Ross, Alfred; Wyttenbach, Nicole; Alsenz, Jochem; Kuentz, Martin

    2018-01-02

    The methacrylate copolymer Eudragit EPO (EPO) has previously been shown to greatly enhance the solubilization of acidic drugs via ionic interactions and by multiple hydrophobic contacts with polymeric side chains. The latter type of interaction could also play a role in the solubilization of compounds other than acids. The aim of this study was therefore to investigate the solubility of six poorly soluble bases in the presence and absence of EPO by quantitative ultrapressure liquid chromatography with concomitant X-ray powder diffraction analysis of the solid state. For a better mechanistic understanding, spectra and diffusion data were obtained by 1H nuclear magnetic resonance (NMR) spectroscopy. Unexpectedly high solubility enhancement (up to 360-fold) was observed in the presence of EPO despite the fact that bases and polymer were both carrying positive charges. This exceptional and unexpected solubilization was not due to a change in the crystalline solid state. NMR spectra and measured diffusion coefficients indicated strong drug-polymer interactions in the bulk solution, and diffusion data suggested conformational changes of the polymer in solution. Such conformational changes may have increased the accessibility and extent of hydrophobic contacts, thereby leading to increased overall molecular interactions. These initially surprising solubilization results demonstrate that excipient selection should not be based solely on simple considerations of, for example, opposite charges of drug and excipient, but requires a more refined molecular view. Different solution NMR techniques are especially promising tools to gain such mechanistic insights.

  6. Nonspecific immunomodulators for recurrent respiratory tract infections, wheezing and asthma in children: a systematic review of mechanistic and clinical evidence.

    PubMed

    Esposito, Susanna; Soto-Martinez, Manuel E; Feleszko, Wojciech; Jones, Marcus H; Shen, Kun-Ling; Schaad, Urs B

    2018-06-01

    To provide an overview of the mechanistic and clinical evidence for the use of nonspecific immunomodulators in paediatric respiratory tract infection (RTI) and wheezing/asthma prophylaxis. Nonspecific immunomodulators have a long history of empirical use for the prevention of RTIs in vulnerable populations, such as children. The past decade has seen an increase in both the number and quality of studies providing mechanistic and clinical evidence for the prophylactic potential of nonspecific immunomodulators against both respiratory infections and wheezing/asthma in the paediatric population. Orally administered immunomodulators result in the mounting of innate and adaptive immune responses to infection in the respiratory mucosa and anti-inflammatory effects in proinflammatory environments. Clinical data reflect these mechanistic effects in reductions in the recurrence of respiratory infections and wheezing events in high-risk paediatric populations. A new generation of clinical studies is currently underway with the power to position the nonspecific bacterial lysate immunomodulator OM-85 as a potential antiasthma prophylactic. An established mechanistic and clinical role for prophylaxis against paediatric respiratory infections by nonspecific immunomodulators exists. Clinical trials underway promise to provide high-quality data to establish whether a similar role exists in wheezing/asthma prevention.

  7. Brain Injury Lesion Imaging Using Preconditioned Quantitative Susceptibility Mapping without Skull Stripping.

    PubMed

    Soman, S; Liu, Z; Kim, G; Nemec, U; Holdsworth, S J; Main, K; Lee, B; Kolakowsky-Hayner, S; Selim, M; Furst, A J; Massaband, P; Yesavage, J; Adamson, M M; Spincemallie, P; Moseley, M; Wang, Y

    2018-04-01

    Identifying cerebral microhemorrhage burden can aid in the diagnosis and management of traumatic brain injury, stroke, hypertension, and cerebral amyloid angiopathy. MR imaging susceptibility-based methods are more sensitive than CT for detecting cerebral microhemorrhage, but methods other than quantitative susceptibility mapping provide results that vary with field strength and TE, require additional phase maps to distinguish blood from calcification, and depict cerebral microhemorrhages as bloom artifacts. Quantitative susceptibility mapping provides universal quantification of tissue magnetic properties without these constraints but traditionally requires a mask generated by skull-stripping, which can pose challenges at tissue interfaces. We evaluated the preconditioned quantitative susceptibility mapping MR imaging method, which does not require skull-stripping, for improved depiction of brain parenchyma and pathology. Fifty-six subjects underwent brain MR imaging with a 3D multiecho gradient recalled echo acquisition. Mask-based quantitative susceptibility mapping images were created using a commonly used mask-based quantitative susceptibility mapping method, and preconditioned quantitative susceptibility images were made using precondition-based total field inversion. All images were reviewed by a neuroradiologist and a radiology resident. Ten subjects (18%), all with traumatic brain injury, demonstrated blood products on 3D gradient recalled echo imaging. All lesions were visible on preconditioned quantitative susceptibility mapping, while 6 were not visible on mask-based quantitative susceptibility mapping. Thirty-one subjects (55%) demonstrated brain parenchyma and/or lesions that were visible on preconditioned quantitative susceptibility mapping but not on mask-based quantitative susceptibility mapping. Six subjects (11%) demonstrated pons artifacts on preconditioned quantitative susceptibility mapping and mask-based quantitative susceptibility mapping

  8. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  9. Characterization of truck traffic in Michigan for the new mechanistic empirical pavement design guide.

    DOT National Transportation Integrated Search

    2009-12-01

    The purpose of this study is to characterize traffic inputs in support of the new Mechanistic- : Empirical Pavement Design Guide (M-E PDG) for the state of Michigan. These traffic : characteristics include monthly distribution factors (MDF), hourly d...

  10. Targeted, Site-specific quantitation of N- and O-glycopeptides using 18O-labeling and product ion based mass spectrometry.

    PubMed

    Srikanth, Jandhyam; Agalyadevi, Rathinasamy; Babu, Ponnusamy

    2017-02-01

    The site-specific quantitation of N- and O-glycosylation is vital to understanding the function(s) of different glycans expressed at a given site of a protein under physiological and disease conditions. The most commonly used precursor ion intensity-based quantification method is less accurate, and other, label-based methods are expensive and require enrichment of glycopeptides. Here, we used glycopeptide product (y and Y0) ions and 18O-labeling of the C-terminal carboxyl group as a strategy to obtain quantitative information about fold-change and relative abundance of most of the glycoforms attached to the glycopeptides. As a proof of concept, the accuracy and robustness of this targeted, relative quantification LC-MS method were demonstrated using Rituximab. Furthermore, the N-glycopeptide quantification results were compared with a biosimilar of Rituximab and validated with quantitative data obtained from the 2-AB-UHPLC-FL method. We further demonstrated the intensity fold-change and relative abundance of 46 unique N- and O-glycopeptides and aglycopeptides from innovator and biosimilar samples of Etanercept using both normal-MS and product ion-based quantitation. The results showed a very similar site-specific expression of N- and O-glycopeptides between the samples but with subtle differences. Interestingly, we have also been able to quantify the macro-heterogeneity of all N- and O-glycopeptides of Etanercept. In addition to applications in biotherapeutics, the developed method can also be used for site-specific quantitation of N- and O-glycopeptides and aglycopeptides of glycoproteins with known glycosylation patterns.

  11. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    PubMed

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool and recent methodological and technological developments have also made possible the extraction of quantitative data of protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI-namely the MIAPE Quant guidelines, which have developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata, required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and

  12. Single Fluorescence Channel-based Multiplex Detection of Avian Influenza Virus by Quantitative PCR with Intercalating Dye

    PubMed Central

    Ahberg, Christian D.; Manz, Andreas; Neuzil, Pavel

    2015-01-01

    Since its invention in 1985, the polymerase chain reaction (PCR) has become a well-established method for amplification and detection of segments of double-stranded DNA. Incorporation of fluorogenic probes or DNA intercalating dyes (such as SYBR Green) into the PCR mixture allowed real-time reaction monitoring and extraction of quantitative information (qPCR). Probes with different excitation spectra enable multiplex qPCR of several DNA segments using multi-channel optical detection systems. Here we show multiplex qPCR using an economical EvaGreen-based system with single optical channel detection. Previously reported non-quantitative multiplex real-time PCR techniques based on intercalating dyes relied on melting curve analysis (MCA) performed once the PCR is completed. The technique presented in this paper is both qualitative and quantitative as it provides information about the presence of multiple DNA strands as well as the number of starting copies in the tested sample. Besides providing an important internal control, multiplex qPCR also allows detection of the concentrations of more than one DNA strand within the same sample. Detection of the avian influenza virus H7N9 by PCR is a well-established method. Multiplex qPCR greatly enhances its specificity as it is capable of distinguishing both haemagglutinin (HA) and neuraminidase (NA) genes as well as their ratio. PMID:26088868
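
    A minimal sketch of the quantitative side of qPCR is given below: quantification cycles (Cq) measured for a dilution series of known copy numbers define a standard curve, from which the starting copy number of an unknown sample is read back. The Cq values are synthetic and the approach is generic, not specific to the EvaGreen assay above.

        import numpy as np

        # Standard curve: known starting copies and their measured Cq values (synthetic)
        copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
        cq = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

        slope, intercept = np.polyfit(np.log10(copies), cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0   # ideal amplification gives ~1.0

        cq_unknown = 24.9
        copies_unknown = 10 ** ((cq_unknown - intercept) / slope)
        print(f"Efficiency: {efficiency:.2f}, estimated input: {copies_unknown:.0f} copies")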

  13. A GIS-based Quantitative Approach for the Search of Clandestine Graves, Italy.

    PubMed

    Somma, Roberta; Cascio, Maria; Silvestro, Massimiliano; Torre, Eliana

    2018-05-01

    Previous research on the RAG color-coded prioritization systems for the discovery of clandestine graves has not considered all the factors influencing the burial site choice within a GIS project. The goal of this technical note was to discuss a GIS-based quantitative approach for the search of clandestine graves. The method is based on cross-referenced RAG maps with cumulative suitability factors to host a burial, leading to the editing of different search scenarios for ground searches showing high-(Red), medium-(Amber), and low-(Green) priority areas. The application of this procedure allowed several outcomes to be determined: If the concealment occurs at night, then the "search scenario without the visibility" will be the most effective one; if the concealment occurs in daylight, then the "search scenario with the DSM-based visibility" will be most appropriate; the different search scenarios may be cross-referenced with offender's confessions and eyewitnesses' testimonies to verify the veracity of their statements. © 2017 American Academy of Forensic Sciences.

  14. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions across different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and previous methods developed for other network types cannot be applied to them. Therefore, we aimed to propose quantitative comparison methods for coexpression networks and to find common biological mechanisms between Huntington's disease and brain aging using the new methods. We proposed two similarity measures for quantitative comparison of coexpression networks. Then, we performed experiments using known coexpression networks. We showed the validity of the two measures and determined threshold values for similar coexpression network pairs from these experiments. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and found similar Huntington's disease-aging coexpression module pairs. We identified similar Huntington's disease-aging coexpression module pairs and found that these modules are related to brain development, cell death, and immune response. This suggests that up-regulated cell signalling related to cell death and the immune/inflammation response may be the common molecular mechanisms in the pathophysiology of HD and normal brain aging in the frontal cortex.
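
    One simple instance of a quantitative coexpression-network similarity measure is the Jaccard index over edge sets, sketched below in Python. The actual measures proposed in the study are not reproduced here, and the toy modules and gene names are purely illustrative.

        def edge_jaccard(edges_a, edges_b):
            """Jaccard similarity between two undirected edge sets."""
            norm = lambda edges: {tuple(sorted(e)) for e in edges}
            a, b = norm(edges_a), norm(edges_b)
            return len(a & b) / len(a | b) if (a | b) else 1.0

        # Toy coexpression modules: edges between gene symbols (illustrative only)
        hd_module = [("HTT", "BDNF"), ("BDNF", "CREB1"), ("CASP3", "HTT")]
        aging_module = [("BDNF", "CREB1"), ("CASP3", "HTT"), ("TP53", "CASP3")]

        print(f"Edge Jaccard similarity: {edge_jaccard(hd_module, aging_module):.2f}")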

  15. A Mechanistic Study of Arsenic (III) Rejection by Reverse Osmosis and Nanofiltration Membranes

    ERIC Educational Resources Information Center

    Suzuki, Tasuma

    2009-01-01

    Reverse osmosis/nanofiltration (RO/NF) membranes are capable to provide an effective barrier for a wide range of contaminants (including disinfection by-products precursors) in a single treatment step. However, solute rejection mechanisms by RO/NF membranes are not well understood. The lack of mechanistic information arises from experimental…

  16. Mechanistic and clinical insights at the scleroderma-cancer interface

    PubMed Central

    Shah, Ami A.; Casciola-Rosen, Livia

    2017-01-01

    Emerging data suggest tantalizing links between cancer and systemic inflammatory rheumatic syndromes. In scleroderma, patients may have an increased risk of cancer secondary to chronic inflammation and damage from the disease, malignant transformation promoted by immunosuppressive therapies, a shared susceptibility to both cancer and autoimmunity, or a common inciting exposure. However, it is increasingly recognized that a subset of patients develop cancer around the time that scleroderma clinically manifests, raising the question of cancer-induced autoimmunity. In this review, we discuss data suggesting a mechanistic link between cancer and the development of scleroderma, and the clinical implications of these findings. PMID:29264402

  17. Agonistic TAM-163 antibody targeting tyrosine kinase receptor-B: applying mechanistic modeling to enable preclinical to clinical translation and guide clinical trial design.

    PubMed

    Vugmeyster, Yulia; Rohde, Cynthia; Perreault, Mylene; Gimeno, Ruth E; Singh, Pratap

    2013-01-01

    TAM-163, an agonist monoclonal antibody targeting tyrosine receptor kinase-B (TrkB), is currently being investigated as a potential body weight modulatory agent in humans. To support the selection of the dose range for the first-in-human (FIH) trial of TAM-163, we conducted a mechanistic analysis of the pharmacokinetic (PK) and pharmacodynamic (PD) data (e.g., body weight gain) obtained in lean cynomolgus and obese rhesus monkeys following single doses ranging from 0.3 to 60 mg/kg. A target-mediated drug disposition (TMDD) model was used to describe the observed nonlinear PK and an Emax approach was used to describe the observed dose-dependent PD effect. The TMDD model development was supported by the experimental determination of the binding affinity constant (9.4 nM) and the internalization rate of the drug-target complex (2.08 h(-1)). These mechanistic analyses enabled linking of exposure, target (TrkB) coverage, and pharmacological activity (e.g., PD) in monkeys, and indicated that ≥38% target coverage (time-averaged) was required to achieve significant body weight gain in monkeys. Based on the scaling of the TMDD model from monkeys to humans, and assuming a similar relationship between target coverage and pharmacological activity in monkeys and humans, subcutaneous (SC) doses of 1 and 15 mg/kg in humans were projected to be the minimally and the fully pharmacologically active doses, respectively. Based on the minimal anticipated biological effect level (MABEL) approach for starting dose selection, the dose of 0.05 mg/kg (3 mg for a 60 kg human) SC was recommended as the starting dose for FIH trials, because at this dose level <10% target coverage was projected at Cmax (and at all other time points). This study illustrates a rational mechanistic approach for the selection of the FIH dose range for a therapeutic protein with a complex mode of action.
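
    Below is a hedged sketch of a full target-mediated drug disposition model of the kind described, written as coupled ODEs for free drug, free target (TrkB) and drug-target complex. The binding affinity (KD = 9.4 nM) and complex internalization rate (2.08 h^-1) are taken from the abstract; all other parameter values, the dose, and the target-turnover terms are illustrative placeholders, not the fitted values.

        import numpy as np
        from scipy.integrate import solve_ivp

        # From the abstract: KD = 9.4 nM, internalization rate of the complex = 2.08 1/h
        koff = 0.5             # 1/h, assumed dissociation rate
        kon = koff / 9.4       # 1/(nM*h), chosen so that koff/kon = KD = 9.4 nM
        kint = 2.08            # 1/h, internalization of the drug-target complex
        kel = 0.01             # 1/h, linear elimination of free drug (assumed)
        ksyn, kdeg = 1.0, 0.1  # nM/h and 1/h, target turnover (assumed)

        def tmdd(t, y):
            c, r, rc = y                     # free drug, free TrkB, complex (nM)
            bind = kon * c * r - koff * rc   # net binding flux
            return [-kel * c - bind,
                    ksyn - kdeg * r - bind,
                    bind - kint * rc]

        y0 = [100.0, ksyn / kdeg, 0.0]       # dose expressed as initial free-drug concentration
        sol = solve_ivp(tmdd, (0.0, 240.0), y0, t_eval=np.linspace(0.0, 240.0, 400))

        coverage = sol.y[2] / (sol.y[1] + sol.y[2])   # fraction of target bound over time
        print(f"Time-average target coverage: {coverage.mean():.2%}")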

  18. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has significant reference value for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivity of the CTSA program had increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members to build a robust academic home for Clinical and Translational Science and to attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  19. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. Quantitative evaluation of the efficiency and performance of the CTSA program provides a significant reference for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included helping its members build a robust academic home for Clinical and Translational Science and attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  20. Ecological Forecasting in Chesapeake Bay: Using a Mechanistic-Empirical Modelling Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, C. W.; Hood, Raleigh R.; Long, Wen

    The Chesapeake Bay Ecological Prediction System (CBEPS) automatically generates daily nowcasts and three-day forecasts of several environmental variables, such as sea-surface temperature and salinity, the concentrations of chlorophyll, nitrate, and dissolved oxygen, and the likelihood of encountering several noxious species, including harmful algal blooms and water-borne pathogens, for the purpose of monitoring the Bay's ecosystem. While the physical and biogeochemical variables are forecast mechanistically using the Regional Ocean Modeling System configured for the Chesapeake Bay, the species predictions are generated using a novel mechanistic-empirical approach, whereby real-time output from the coupled physical-biogeochemical model drives multivariate empirical habitat models of the target species. The predictions, in the form of digital images, are available via the World Wide Web to interested groups to guide recreational, management, and research activities. Though full validation of the integrated forecasts for all species is still a work in progress, we argue that the mechanistic-empirical approach can be used to generate a wide variety of short-term ecological forecasts, and that it can be applied in any marine system where sufficient data exist to develop empirical habitat models. This paper provides an overview of this system, its predictions, and the approach taken.
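
    The coupling described above (mechanistic model output feeding empirical habitat models) can be illustrated with a toy sketch. Everything in it, from the predictor fields to the logistic habitat model, is synthetic and stands in for CBEPS' actual ROMS output and species models.

    ```python
    # Toy illustration of mechanistic-empirical coupling: gridded fields from a
    # mechanistic model (random stand-ins for temperature, salinity, chlorophyll)
    # drive an empirical habitat model (logistic regression) that outputs a
    # likelihood-of-encounter map. All data and coefficients are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Historical observations used to fit the empirical habitat model
    X_obs = rng.normal(size=(200, 3))            # [temp, salinity, chl], standardized
    y_obs = (X_obs[:, 0] + 0.5 * X_obs[:, 2] + rng.normal(0, 0.5, 200) > 0).astype(int)
    habitat_model = LogisticRegression().fit(X_obs, y_obs)

    # "Forecast" fields from the mechanistic model on a 50 x 50 grid
    grid = rng.normal(size=(50 * 50, 3))
    likelihood_map = habitat_model.predict_proba(grid)[:, 1].reshape(50, 50)
    print("mean encounter likelihood:", likelihood_map.mean().round(3))
    ```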

  1. Evolutionary and mechanistic drivers of laterality: A review and new synthesis.

    PubMed

    Wiper, Mallory L

    2017-11-01

    Laterality, best understood as asymmetries of bilateral structures or biases in behaviour, has been demonstrated in species from all major vertebrate classes, and in many invertebrates, showing a large degree of evolutionary conservation across vertebrate groups. Despite the establishment of this phenomenon in so many species, however, the evolutionary and mechanistic study of laterality is uneven, with numerous areas in this field requiring greater attention. Here, I present a partial review of how far the study of laterality has come, outlining previous pioneering work. I then discuss the hypothesized costs and benefits of a lateralized brain and the suggested path of the evolution of laterality for populations and individuals. I propose an expansion of laterality research into areas that have been touched upon in the past but require stronger evidence, from which the field will greatly benefit. Namely, I suggest a continuation of the phylogenetic approach to investigating laterality, to better understand its evolutionary path, and a further focus on mechanistic drivers, with special attention to genetic and environmental effects. Putting together the puzzle of laterality using as many pieces as possible will provide a stronger understanding of this field, allowing us to continue to expand the field in novel ways.

  2. Multifunctional sample preparation kit and on-chip quantitative nucleic acid sequence-based amplification tests for microbial detection.

    PubMed

    Zhao, Xinyan; Dong, Tao

    2012-10-16

    This study reports a quantitative nucleic acid sequence-based amplification (Q-NASBA) microfluidic platform composed of a membrane-based sampling module, a sample preparation cassette, and a 24-channel Q-NASBA chip for environmental investigations of aquatic microorganisms. The low-cost, highly efficient sampling module connects seamlessly with the subsequent sample preparation and quantitative detection steps and is designed for the collection of microbial communities from aquatic environments. Eight kinds of commercial membrane filters are evaluated using Saccharomyces cerevisiae, Escherichia coli, and Staphylococcus aureus as model microorganisms. After the microorganisms are concentrated on the membrane filters, the retentate can be easily conserved in a transport medium (TM) buffer and sent to a remote laboratory. A Q-NASBA-oriented sample preparation cassette is specifically designed to extract DNA/RNA molecules directly from the captured cells on the membranes. Sequentially, the extract is analyzed within Q-NASBA chips that are compatible with common microplate readers in laboratories. In particular, a novel analytical algorithmic method is developed for simple but robust on-chip Q-NASBA assays. The reported multifunctional microfluidic system can detect a few microorganisms quantitatively and simultaneously. Further research should be conducted to simplify and standardize ecological investigations of aquatic environments.

  3. Mechanistic variables can enhance predictive models of endotherm distributions: the American pika under current, past, and future climates.

    PubMed

    Mathewson, Paul D; Moyer-Horner, Lucas; Beever, Erik A; Briscoe, Natalie J; Kearney, Michael; Yahn, Jeremiah M; Porter, Warren P

    2017-03-01

    How climate constrains species' distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8-19% less habitat loss in response to annual temperature increases of ~3-5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect

  4. Mechanistic variables can enhance predictive models of endotherm distributions: The American pika under current, past, and future climates

    USGS Publications Warehouse

    Mathewson, Paul; Moyer-Horner, Lucas; Beever, Erik; Briscoe, Natalie; Kearney, Michael T.; Yahn, Jeremiah; Porter, Warren P.

    2017-01-01

    How climate constrains species’ distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8–19% less habitat loss in response to annual temperature increases of ~3–5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect

  5. Asphalt materials characterization in support of implementation of the proposed mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2007-01-01

    The proposed Mechanistic-Empirical Pavement Design Guide (MEPDG) procedure is an improved methodology for pavement design and evaluation of paving materials. Since this new procedure depends heavily on the characterization of the fundamental engineer...

  6. Gold nanoparticle-based RT-PCR and real-time quantitative RT-PCR assays for detection of Japanese encephalitis virus

    NASA Astrophysics Data System (ADS)

    Huang, Su-Hua; Yang, Tsuey-Ching; Tsai, Ming-Hong; Tsai, I.-Shou; Lu, Huang-Chih; Chuang, Pei-Hsin; Wan, Lei; Lin, Ying-Ju; Lai, Chih-Ho; Lin, Cheng-Wen

    2008-10-01

    Virus isolation and antibody detection are routinely used for diagnosis of Japanese encephalitis virus (JEV) infection, but the low level of transient viremia in some JE patients makes JEV isolation from clinical and surveillance samples very difficult. We describe the use of gold nanoparticle-based RT-PCR and real-time quantitative RT-PCR assays for detection of JEV from its RNA genome. We tested the effect of gold nanoparticles on four different PCR systems, including conventional PCR, reverse-transcription PCR (RT-PCR), and SYBR green real-time PCR and RT-PCR assays for diagnosis in the acute phase of JEV infection. Gold nanoparticles increased the amplification yield of the PCR product and shortened the PCR time compared to the conventional reaction. In addition, nanogold-based real-time RT-PCR showed a linear relationship between Ct and template amount using ten-fold dilutions of JEV. The nanogold-based RT-PCR and real-time quantitative RT-PCR assays were able to detect low levels (1-10 000 copies) of the JEV RNA genomes extracted from culture medium or whole blood, providing early diagnostic tools for the detection of low-level viremia in the acute-phase infection. The assays described here were simple, sensitive, and rapid approaches for detection and quantitation of JEV in tissue cultured samples as well as clinical samples.
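
    The linear Ct-versus-log(copies) relationship reported above is the basis of absolute quantitation by standard curve; the sketch below shows that calculation with made-up Ct values, not data from this assay.

    ```python
    # Sketch of absolute quantitation from a real-time RT-PCR standard curve.
    # Ct values are invented for illustration; slopes, intercepts, and
    # efficiencies will differ for any real JEV assay.
    import numpy as np

    # Ten-fold dilution series of a quantified standard (copies per reaction)
    copies = np.array([1e1, 1e2, 1e3, 1e4])
    ct     = np.array([33.1, 29.8, 26.4, 23.0])        # hypothetical Ct values

    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0            # amplification efficiency

    def copies_from_ct(ct_sample):
        """Interpolate the copy number of an unknown sample from its Ct."""
        return 10 ** ((ct_sample - intercept) / slope)

    print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, "
          f"unknown (Ct=28.0) ~ {copies_from_ct(28.0):.0f} copies")
    ```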

  7. Dose selection based on physiologically based pharmacokinetic (PBPK) approaches.

    PubMed

    Jones, Hannah M; Mayawala, Kapil; Poulin, Patrick

    2013-04-01

    Physiologically based pharmacokinetic (PBPK) models are built using differential equations to describe the physiology/anatomy of different biological systems. Readily available in vitro and in vivo preclinical data can be incorporated into these models to not only estimate pharmacokinetic (PK) parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. They provide a mechanistic framework to understand and extrapolate PK and dose across in vitro and in vivo systems and across different species, populations and disease states. Using small molecule and large molecule examples from the literature and our own company, we have shown how PBPK techniques can be utilised for human PK and dose prediction. Such approaches have the potential to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising; however, some limitations need to be addressed to realise its application and utility more broadly.
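
    For readers unfamiliar with the structure of such models, the sketch below sets up a minimal flow-limited PBPK system of mass-balance ODEs. The flows, volumes, partition coefficients, and clearance are generic rat-like placeholders, not parameters for any real compound.

    ```python
    # Minimal flow-limited PBPK sketch: a few perfused tissue compartments
    # described by mass-balance ODEs. All values below are rough placeholders.
    from scipy.integrate import solve_ivp

    Q = {"liver": 0.9, "kidney": 0.55, "rest": 2.0}                    # blood flows (L/h), assumed
    V = {"blood": 0.02, "liver": 0.01, "kidney": 0.004, "rest": 0.2}   # volumes (L), assumed
    Kp = {"liver": 4.0, "kidney": 3.0, "rest": 1.5}                    # tissue:plasma partition, assumed
    CLint = 0.5                                                        # hepatic intrinsic clearance (L/h), assumed

    def pbpk(t, y):
        Cb, Cl, Ck, Cr = y                                # blood, liver, kidney, rest (mg/L)
        venous = sum(Q[name] * C / Kp[name] for name, C in
                     (("liver", Cl), ("kidney", Ck), ("rest", Cr)))
        dCb = (venous - sum(Q.values()) * Cb) / V["blood"]
        dCl = (Q["liver"] * (Cb - Cl / Kp["liver"]) - CLint * Cl / Kp["liver"]) / V["liver"]
        dCk = (Q["kidney"] * (Cb - Ck / Kp["kidney"])) / V["kidney"]
        dCr = (Q["rest"] * (Cb - Cr / Kp["rest"])) / V["rest"]
        return [dCb, dCl, dCk, dCr]

    dose_mg, t_end = 1.0, 24.0                            # IV bolus into blood, assumed
    sol = solve_ivp(pbpk, (0, t_end), [dose_mg / V["blood"], 0, 0, 0], max_step=0.01)
    print("blood concentration at 24 h:", round(sol.y[0, -1], 3), "mg/L")
    ```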

  8. Quantitative secondary electron detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    Quantitative secondary electron detection (QSED) using an array of solid-state-device (SSD) electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. The described methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid state detectors: the array senses the secondary electrons with a plurality of solid state detectors and counts them with a time-to-digital converter circuit operating in counter mode.

  9. A probe-based quantitative PCR assay for detecting Tetracapsuloides bryosalmonae in fish tissue and environmental DNA water samples

    USGS Publications Warehouse

    Hutchins, Patrick; Sepulveda, Adam; Martin, Renee; Hopper, Lacey

    2017-01-01

    A probe-based quantitative real-time PCR assay was developed to detect Tetracapsuloides bryosalmonae, which causes proliferative kidney disease in salmonid fish, in kidney tissue and environmental DNA (eDNA) water samples. The limits of detection and quantification were 7 and 100 DNA copies, respectively, for calibration standards, and T. bryosalmonae was reliably detected down to 100 copies in tissue and eDNA samples. The assay presented here is a highly sensitive and quantitative tool for detecting T. bryosalmonae, with potential applications for tissue diagnostics and environmental detection.

  10. A Cycloaromatization Protocol for Synthesis of Polysubstituted Phenol Derivatives: Method Development and Mechanistic Studies

    PubMed Central

    Spencer, William T.

    2012-01-01

    The scope of the cycloaromatization of propargylic ethers was explored using operationally simple air- and moisture-insensitive conditions. Highly substituted phenol derivatives were obtained in high yields. Mechanistic experiments indicate that the reaction occurs by an electrocyclization followed by 1,3-proton transfer. PMID:22891882

  11. Quantitative analysis and predictive engineering of self-rolling of nanomembranes under anisotropic mismatch strain

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Song, Pengfei; Meng, Fanchao; Li, Xiao; Liu, Xinyu; Song, Jun

    2017-12-01

    This work presents a quantitative modeling framework for investigating the self-rolling of nanomembranes under different degrees of lattice mismatch strain anisotropy. The effect of transverse mismatch strain on the roll-up direction and curvature has been systematically studied employing both analytical modeling and numerical simulations. The bidirectional nature of the self-rolling of nanomembranes and the critical role of transverse strain in affecting the rolling behaviors have been demonstrated. Two fabrication strategies, i.e., third-layer deposition and corner geometry engineering, have been proposed to predictively manipulate the bidirectional rolling competition of strained nanomembranes, so as to achieve controlled, unidirectional roll-up. For the corner-engineering strategy in particular, microfabrication experiments have been performed to showcase its practical application and effectiveness. Our study offers new mechanistic knowledge towards the understanding and predictive engineering of the self-rolling of nanomembranes with improved roll-up yield.
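
    As a back-of-the-envelope companion to this modeling framework, the classical Timoshenko bimetal-strip formula gives a first estimate of roll-up curvature from a mismatch strain. The paper's treatment of anisotropic and transverse strain goes well beyond this, and every number in the sketch is illustrative.

    ```python
    # Rough roll-up radius estimate for a strained bilayer using the classical
    # Timoshenko curvature formula (not the paper's model). Thicknesses, moduli,
    # and the mismatch strain are illustrative placeholders.
    def bilayer_curvature(eps_mismatch, t1, t2, E1, E2):
        """Timoshenko curvature (1/m) of a bilayer with mismatch strain eps."""
        m, n, h = t1 / t2, E1 / E2, t1 + t2
        num = 6.0 * eps_mismatch * (1.0 + m) ** 2
        den = h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
        return num / den

    kappa = bilayer_curvature(eps_mismatch=0.01,     # 1% lattice mismatch, assumed
                              t1=20e-9, t2=20e-9,    # 20 nm layers, assumed
                              E1=80e9, E2=80e9)      # equal moduli, assumed
    print(f"estimated roll-up radius: {1.0 / kappa * 1e6:.2f} micrometres")
    ```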

  12. Preliminary research on quantitative methods of water resources carrying capacity based on water resources balance sheet

    NASA Astrophysics Data System (ADS)

    Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai

    2018-06-01

    Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, the major problems in the study of water resources carrying capacity can be summarized as follows: the definition of water resources carrying capacity is not yet unified; quantification methods built on these inconsistent definitions are poor in operability; the current quantitative research methods do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationships among water resources, the economy and society, and the ecological environment. Therefore, it is necessary to develop a better quantitative evaluation method to determine regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity (that is, through the compilation of the water resources balance sheet) to assess regional water resources depletion and water environment degradation (as well as regional water resources stock assets and liabilities), quantify the pressure that socioeconomic activities place on the environment, and discuss the quantitative calculation methods and technical route of water resources carrying capacity that are able to embody the substance of sustainable development.

  13. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect isotopically labeled peptides simultaneously in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing variation caused by separate experiments. However, few approaches are available for assessing protein ratios, and none of the existing algorithms pays considerable attention to proteins having only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summary using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers through real SILAC experimental data. The developed approach is applicable regardless of how many peptide hits a protein has, rescuing many proteins that would otherwise be removed. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
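
    A minimal sketch of the classification idea, fitting a two-component Gaussian mixture to synthetic peptide log-ratios, is shown below. It is not the authors' implementation and omits the Bayesian, K-means, and PSO variants discussed in the abstract.

    ```python
    # Illustrative two-component Gaussian mixture on peptide log-ratios.
    # The data are synthetic: most peptides unchanged, a minority up-regulated.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    log_ratios = np.concatenate([rng.normal(0.0, 0.3, 900),
                                 rng.normal(1.5, 0.4, 100)]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(log_ratios)
    changed = int(np.argmax(gmm.means_))                 # component with the larger mean
    posterior_changed = gmm.predict_proba(log_ratios)[:, changed]

    print("component means:", gmm.means_.ravel().round(2))
    print("peptides called changed (posterior > 0.9):",
          int((posterior_changed > 0.9).sum()))
    ```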

  14. The Adoption Process of Ricefield-Based Fish Seed Production in Northwest Bangladesh: An Understanding through Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul

    2010-01-01

    Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…

  15. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  16. Regulating the chromatin landscape: structural and mechanistic perspectives.

    PubMed

    Bartholomew, Blaine

    2014-01-01

    A large family of chromatin remodelers that noncovalently modify chromatin is crucial in cell development and differentiation. They are often the targets of cancer, neurological disorders, and other human diseases. These complexes alter nucleosome positioning, higher-order chromatin structure, and nuclear organization. They also assemble chromatin, exchange out histone variants, and disassemble chromatin at defined locations. We review aspects of the structural organization of these complexes, the functional properties of their protein domains, and variation between complexes. We also address the mechanistic details of these complexes in mobilizing nucleosomes and altering chromatin structure. A better understanding of these issues will be vital for further analyses of subunits of these chromatin remodelers, which are being identified as targets in human diseases by NGS (next-generation sequencing).

  17. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model

    PubMed Central

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed new open-source software, “Kongoh”, for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information from peak heights in the DNA profile and considers the effects of artifacts and allelic drop-out. Using this software, the likelihoods of 1–4 persons’ contributions are calculated, and the optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI’s contribution for true contributors and non-contributors, using 2–4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI’s contribution, even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy in estimating the number of contributors than another software package based on the quantitative continuous model. Therefore, Kongoh is useful for accurately interpreting DNA evidence such as mixtures and small amounts or degraded DNA samples. PMID:29149210

  18. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    PubMed

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed new open-source software, "Kongoh", for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information from peak heights in the DNA profile and considers the effects of artifacts and allelic drop-out. Using this software, the likelihoods of 1-4 persons' contributions are calculated, and the optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution for true contributors and non-contributors, using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution, even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy in estimating the number of contributors than another software package based on the quantitative continuous model. Therefore, Kongoh is useful for accurately interpreting DNA evidence such as mixtures and small amounts or degraded DNA samples.
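
    To make the likelihood-ratio structure concrete, the toy sketch below evaluates a single locus of a two-person mixture with a simple Gaussian peak-height model. This is not Kongoh's model (no drop-out, stutter, or contributor-number estimation), and the allele frequencies, peak heights, and variance are invented.

    ```python
    # Deliberately tiny toy LR calculation for one locus of a two-person mixture,
    # illustrating the structure of a quantitative continuous model: peak heights
    # are modeled and unknown genotypes are marginalized. Not Kongoh's model.
    import itertools
    import numpy as np
    from scipy.stats import norm

    alleles = ["A", "B", "C"]
    freqs   = {"A": 0.2, "B": 0.3, "C": 0.5}             # hypothetical allele frequencies
    heights = {"A": 1200.0, "B": 800.0, "C": 400.0}      # observed peak heights (RFU), invented
    total   = sum(heights.values())
    cv      = 0.15                                        # peak-height coefficient of variation, assumed
    poi     = ("A", "B")                                  # person of interest's genotype, assumed

    genotypes = list(itertools.combinations_with_replacement(alleles, 2))

    def geno_prior(g):
        p, q = freqs[g[0]], freqs[g[1]]
        return p * q * (2 if g[0] != g[1] else 1)         # Hardy-Weinberg proportions

    def lik_heights(g1, g2, phi):
        """Likelihood of the observed heights given two genotypes and mixture ratio phi."""
        lik = 1.0
        for a in alleles:
            dose = phi * g1.count(a) / 2 + (1 - phi) * g2.count(a) / 2
            expected = total * dose
            if expected == 0:
                return 0.0                                # an observed peak left unexplained
            lik *= norm.pdf(heights[a], expected, cv * expected)
        return lik

    def lik_hypothesis(fixed_first=None):
        """Marginalize over unknown genotypes and a grid of mixture ratios."""
        phis = np.linspace(0.1, 0.9, 17)
        g1_choices = [fixed_first] if fixed_first else genotypes
        vals = []
        for phi in phis:
            s = sum((1.0 if fixed_first else geno_prior(g1)) * geno_prior(g2) *
                    lik_heights(g1, g2, phi)
                    for g1 in g1_choices for g2 in genotypes)
            vals.append(s)
        return np.mean(vals)

    lr = lik_hypothesis(fixed_first=poi) / lik_hypothesis()
    print(f"toy LR supporting the POI's contribution: {lr:.1f}")
    ```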

  19. Traffic load spectra for implementing and using the mechanistic-empirical pavement design guide in Georgia.

    DOT National Transportation Integrated Search

    2014-02-01

    The GDOT is preparing for implementation of the Mechanistic-Empirical Pavement Design : Guide (MEPDG). As part of this preparation, a statewide traffic load spectra program is being : developed for gathering truck axle loading data. This final report...

  20. Divergence of mechanistic pathways mediating cardiovascular aging and developmental programming of cardiovascular disease

    PubMed Central

    Allison, Beth J.; Kaandorp, Joepe J.; Kane, Andrew D.; Camm, Emily J.; Lusby, Ciara; Cross, Christine M.; Nevin-Dolan, Rhianon; Thakor, Avnesh S.; Derks, Jan B.; Tarry-Adkins, Jane L.; Ozanne, Susan E.; Giussani, Dino A.

    2016-01-01

    Aging and developmental programming are both associated with oxidative stress and endothelial dysfunction, suggesting common mechanistic origins. However, their interrelationship has been little explored. In a rodent model of programmed cardiovascular dysfunction we determined endothelial function and vascular telomere length in young (4 mo) and aged (15 mo) adult offspring of normoxic or hypoxic pregnancy with or without maternal antioxidant treatment. We show loss of endothelial function [maximal arterial relaxation to acetylcholine: 71 ± 3 vs. 55 ± 3%] and increased vascular short telomere (4.2–1.3 kb) abundance (43.0 ± 1.5 vs. 55.1 ± 3.8%) in aged vs. young offspring of normoxic pregnancy (P < 0.05). Hypoxic pregnancy in young offspring accelerated endothelial dysfunction (maximal arterial relaxation to acetylcholine: 42 ± 1%, P < 0.05) but this was dissociated from increased vascular short telomere length abundance. Maternal allopurinol rescued maximal arterial relaxation to acetylcholine in aged offspring of normoxic or hypoxic pregnancy but not in young offspring of hypoxic pregnancy. Aged offspring of hypoxic allopurinol pregnancy compared with aged offspring of untreated hypoxic pregnancy had lower levels of short telomeres (vascular short telomere length abundance 35.1 ± 2.5 vs. 48.2 ± 2.6%) and of plasma proinflammatory chemokine (24.6 ± 2.8 vs. 36.8 ± 5.5 pg/ml, P < 0.05). These data provide evidence for divergence of mechanistic pathways mediating cardiovascular aging and developmental programming of cardiovascular disease, and aging being decelerated by antioxidants even prior to birth.—Allison, B. J., Kaandorp, J. J., Kane, A. D., Camm, E. J., Lusby, C., Cross, C. M., Nevin-Dolan, R., Thakor, A. S., Derks, J. B., Tarry-Adkins, J. L., Ozanne, S. E., Giussani, D. A. Divergence of mechanistic pathways mediating cardiovascular aging and developmental programming of cardiovascular disease. PMID:26932929

  1. Divergence of mechanistic pathways mediating cardiovascular aging and developmental programming of cardiovascular disease.

    PubMed

    Allison, Beth J; Kaandorp, Joepe J; Kane, Andrew D; Camm, Emily J; Lusby, Ciara; Cross, Christine M; Nevin-Dolan, Rhianon; Thakor, Avnesh S; Derks, Jan B; Tarry-Adkins, Jane L; Ozanne, Susan E; Giussani, Dino A

    2016-05-01

    Aging and developmental programming are both associated with oxidative stress and endothelial dysfunction, suggesting common mechanistic origins. However, their interrelationship has been little explored. In a rodent model of programmed cardiovascular dysfunction we determined endothelial function and vascular telomere length in young (4 mo) and aged (15 mo) adult offspring of normoxic or hypoxic pregnancy with or without maternal antioxidant treatment. We show loss of endothelial function [maximal arterial relaxation to acetylcholine: 71 ± 3 vs. 55 ± 3%] and increased vascular short telomere (4.2-1.3 kb) abundance (43.0 ± 1.5 vs. 55.1 ± 3.8%) in aged vs. young offspring of normoxic pregnancy (P < 0.05). Hypoxic pregnancy in young offspring accelerated endothelial dysfunction (maximal arterial relaxation to acetylcholine: 42 ± 1%, P < 0.05) but this was dissociated from increased vascular short telomere length abundance. Maternal allopurinol rescued maximal arterial relaxation to acetylcholine in aged offspring of normoxic or hypoxic pregnancy but not in young offspring of hypoxic pregnancy. Aged offspring of hypoxic allopurinol pregnancy compared with aged offspring of untreated hypoxic pregnancy had lower levels of short telomeres (vascular short telomere length abundance 35.1 ± 2.5 vs. 48.2 ± 2.6%) and of plasma proinflammatory chemokine (24.6 ± 2.8 vs. 36.8 ± 5.5 pg/ml, P < 0.05). These data provide evidence for divergence of mechanistic pathways mediating cardiovascular aging and developmental programming of cardiovascular disease, and aging being decelerated by antioxidants even prior to birth.-Allison, B. J., Kaandorp, J. J., Kane, A. D., Camm, E. J., Lusby, C., Cross, C. M., Nevin-Dolan, R., Thakor, A. S., Derks, J. B., Tarry-Adkins, J. L., Ozanne, S. E., Giussani, D. A. Divergence of mechanistic pathways mediating cardiovascular aging and developmental programming of cardiovascular disease. © FASEB.

  2. A Radioactivity Based Quantitative Analysis of the Amount of Thorium Present in Ores and Metallurgical Products; ANALYSE QUANTITATIVE DU THORIUM DANS LES MINERAIS ET LES PRODUITS THORIFERES PAR UNE METHODE BASEE SUR LA RADIOACTIVITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collee, R.; Govaerts, J.; Winand, L.

    1959-10-31

    A brief review of the classical methods for the quantitative determination of thorium in ores and thoriferous products is given to show that a rapid, accurate, and precise physical method based on the radioactivity of thorium would be of great utility. A method based on the characteristic spectrum of the thorium gamma radiation is presented. The preparation of the samples and the instruments needed for the measurements are discussed. The experimental results show that the reproducibility is very satisfactory and that it is possible to detect Th contents of 1% or less. (J.S.R.)

  3. Comparison of Two-Phase Pipe Flow in OpenFOAM with a Mechanistic Model

    NASA Astrophysics Data System (ADS)

    Shuard, Adrian M.; Mahmud, Hisham B.; King, Andrew J.

    2016-03-01

    Two-phase pipe flow is a common occurrence in many industrial applications such as power generation and oil and gas transportation. Accurate prediction of liquid holdup and pressure drop is of vast importance to ensure effective design and operation of fluid transport systems. In this paper, a Computational Fluid Dynamics (CFD) study of a two-phase flow of air and water is performed using OpenFOAM. The two-phase solver interFoam is used to identify flow patterns and generate values of liquid holdup and pressure drop, which are compared to results obtained from a two-phase mechanistic model developed by Petalas and Aziz (2002). A total of 60 simulations have been performed at three separate pipe inclinations of 0°, +10° and -10°. A three-dimensional pipe of 0.052 m diameter and 4 m length is used, with the Shear Stress Transport (SST) k-ω turbulence model employed to resolve the turbulent mixture of air and water. Results show that the flow pattern behaviour and the numerical values of liquid holdup and pressure drop compare reasonably well with the mechanistic model.

  4. Application of Pulse Radiolysis to Mechanistic Investigations of Catalysis Relevant to Artificial Photosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fujita, Etsuko; Grills, David C.; Polyansky, Dmitry E.

    Taking inspiration from natural photosystems, the goal of artificial photosynthesis is to harness solar energy to convert abundant materials, such as CO2 and H2O, into solar fuels. Catalysts are required to ensure that the necessary redox half-reactions proceed in the most energy-efficient manner. It is thus critical to gain a detailed mechanistic understanding of these catalytic reactions in order to develop new and improved catalysts. Many of the key catalytic intermediates are short-lived transient species, requiring time-resolved spectroscopic techniques for their observation. The two main methods for rapidly generating such species on the sub-microsecond timescale are laser flash photolysis and pulse radiolysis. These methods complement one another, and both can provide important spectroscopic and kinetic information. However, pulse radiolysis proves to be superior in systems with significant spectroscopic overlap between photosensitizer and other species present during the reaction. In this paper, we review the pulse radiolysis technique and how it has been applied to mechanistic investigations of half-reactions relevant to artificial photosynthesis.

  5. Application of Pulse Radiolysis to Mechanistic Investigations of Catalysis Relevant to Artificial Photosynthesis

    DOE PAGES

    Fujita, Etsuko; Grills, David C.; Polyansky, Dmitry E.

    2017-09-12

    Taking inspiration from natural photosystems, the goal of artificial photosynthesis is to harness solar energy to convert abundant materials, such as CO2 and H2O, into solar fuels. Catalysts are required to ensure that the necessary redox half-reactions proceed in the most energy-efficient manner. It is thus critical to gain a detailed mechanistic understanding of these catalytic reactions in order to develop new and improved catalysts. Many of the key catalytic intermediates are short-lived transient species, requiring time-resolved spectroscopic techniques for their observation. The two main methods for rapidly generating such species on the sub-microsecond timescale are laser flash photolysis and pulse radiolysis. These methods complement one another, and both can provide important spectroscopic and kinetic information. However, pulse radiolysis proves to be superior in systems with significant spectroscopic overlap between photosensitizer and other species present during the reaction. In this paper, we review the pulse radiolysis technique and how it has been applied to mechanistic investigations of half-reactions relevant to artificial photosynthesis.

  6. Toward a Mechanistic Understanding of Environmentally Forced Zoonotic Disease Emergence: Sin Nombre Hantavirus

    PubMed Central

    Carver, Scott; Mills, James N.; Parmenter, Cheryl A.; Parmenter, Robert R.; Richardson, Kyle S.; Harris, Rachel L.; Douglass, Richard J.; Kuenzi, Amy J.; Luis, Angela D.

    2015-01-01

    Understanding the environmental drivers of zoonotic reservoir and human interactions is crucial to understanding disease risk, but these drivers are poorly predicted. We propose a mechanistic understanding of human–reservoir interactions, using hantavirus pulmonary syndrome as a case study. Crucial processes underpinning the disease's incidence remain poorly studied, including the connectivity among natural and peridomestic deer mouse host activity, virus transmission, and human exposure. We found that disease cases were greatest in arid states and declined exponentially with increasing precipitation. Within arid environments, relatively rare climatic conditions (e.g., El Niño) are associated with increased rainfall and reservoir abundance, producing more frequent virus transmission and host dispersal. We suggest that deer mice increase their occupancy of peridomestic structures during spring–summer, amplifying intraspecific transmission and human infection risk. Disease incidence in arid states may increase with predicted climatic changes. Mechanistic approaches incorporating reservoir behavior, reservoir–human interactions, and pathogen spillover could enhance our understanding of global hantavirus ecology, with applications to other directly transmitted zoonoses. PMID:26955081

  7. Mechanistic models versus machine learning, a fight worth fighting for the biological community?

    PubMed

    Baker, Ruth E; Peña, Jose-Maria; Jayamohan, Jayaratnam; Jérusalem, Antoine

    2018-05-01

    Ninety per cent of the world's data have been generated in the last 5 years (Machine Learning: The Power and Promise of Computers that Learn by Example, report no. DES4702, Royal Society, April 2017). A small fraction of these data is collected with the aim of validating specific hypotheses. These studies are led by the development of mechanistic models focused on the causality of input-output relationships. However, the vast majority is aimed at supporting statistical or correlation studies that bypass the need for causality and focus exclusively on prediction. Along these lines, there has been a vast increase in the use of machine learning models, in particular in the biomedical and clinical sciences, to try and keep pace with the rate of data generation. Recent successes now beg the question of whether mechanistic models are still relevant in this area. Put differently, why should we try to understand the mechanisms of disease progression when we can use machine learning tools to directly predict disease outcome? © 2018 The Author(s).

  8. Mechanistic studies of high-density lipoproteins.

    PubMed

    Kashyap, M L

    1998-12-17

    There is increasing evidence that high-density lipoprotein (HDL) and its subfractions are protective against atherosclerotic cardiovascular disease. Physical exercise, weight reduction, smoking cessation, diabetes mellitus control, and specific drugs, including niacin, fibrates, and estrogens, are effective methods to increase HDL levels. Niacin is the oldest and most powerful clinical agent for raising HDL levels. Niaspan, an extended-release niacin formulation, is as potent as immediate-release niacin in increasing levels of HDL cholesterol; subfractions HDL2 and HDL3; apolipoprotein A-I, the major protein of HDL, and its cardioprotective subfraction lipoprotein A-I. Recent research from our laboratory suggests a novel mechanism by which niacin inhibits hepatic removal of HDL-apoprotein A-I without interfering with the removal of cholesterol carried by HDL, thus augmenting reverse cholesterol transport. Other mechanistic studies indicate that fibrates and estrogens stimulate the synthesis and production of HDL-apoprotein A-I. Because niacin decreases HDL-apoprotein A-I removal, and fibrates and estrogens increase HDL-apoprotein A-I production, combinations of niacin with these agents may raise HDL levels more than fibrates or estrogens alone.

  9. Mechanistic systems modeling to guide drug discovery and development

    PubMed Central

    Schmidt, Brian J.; Papin, Jason A.; Musante, Cynthia J.

    2013-01-01

    A crucial question that must be addressed in the drug development process is whether the proposed therapeutic target will yield the desired effect in the clinical population. Pharmaceutical and biotechnology companies make large investments in research and development, long before confirmatory data are available from human trials. Basic science has greatly expanded the computable knowledge of disease processes, both through the generation of large omics data sets and a compendium of studies assessing cellular and systemic responses to physiologic and pathophysiologic stimuli. Given inherent uncertainties in drug development, mechanistic systems models can better inform target selection and the decision process for advancing compounds through preclinical and clinical research. PMID:22999913

  10. FDTD-based quantitative analysis of terahertz wave detection for multilayered structures.

    PubMed

    Tu, Wanli; Zhong, Shuncong; Shen, Yaochun; Zhou, Qing; Yao, Ligang

    2014-10-01

    Experimental investigations have shown that terahertz pulsed imaging (TPI) is able to quantitatively characterize a range of multilayered media (e.g., biological tissues, pharmaceutical tablet coatings, layered polymer composites, etc.). Advanced modeling of the interaction of terahertz radiation with a multilayered medium is required to enable the wide application of terahertz technology in a number of emerging fields, including nondestructive testing. Indeed, there have already been many theoretical analyses performed on the propagation of terahertz radiation in various multilayered media. However, to date, most of these studies used 1D or 2D models, and the dispersive nature of the dielectric layers was not considered or was simplified. In the present work, the theoretical framework of using terahertz waves for the quantitative characterization of multilayered media was established. A 3D model based on the finite difference time domain (FDTD) method is proposed. A batch of pharmaceutical tablets with a single coating layer of different coating thicknesses and different refractive indices was modeled. The reflected terahertz wave from such a sample was computed using the FDTD method, assuming that the incident terahertz wave is broadband, covering a frequency range up to 3.5 THz. The simulated results for all of the pharmaceutical-coated tablets considered were found to be in good agreement with the experimental results obtained using a commercial TPI system. In addition, we studied a three-layered medium to mimic the occurrence of defects in the sample.
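
    A bare-bones 1D FDTD sketch is given below to illustrate the kind of time-stepped field update on which the paper's 3D model is built. The grid, source, and slab permittivity are arbitrary, and the dispersion of the layers (which the paper accounts for) is ignored.

    ```python
    # Minimal 1D FDTD sketch: free space, a soft Gaussian source, and one
    # non-dispersive dielectric slab standing in for a coating layer.
    import numpy as np

    nz, nt = 400, 1000
    eps = np.ones(nz)
    eps[200:260] = 2.89                   # "coating" slab, refractive index ~1.7 (assumed)
    ez, hy = np.zeros(nz), np.zeros(nz)

    record = []                           # field at a detector cell in front of the slab
    for n in range(nt):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])               # update H from the curl of E
        ez[1:] += 0.5 / eps[1:] * (hy[1:] - hy[:-1])      # update E from the curl of H
        ez[20] += np.exp(-0.5 * ((n - 60) / 15.0) ** 2)   # soft Gaussian source
        record.append(ez[120])

    record = np.array(record)
    print("peak field amplitude at the detector cell:", record.max().round(3))
    ```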

  11. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification occurs (if it can occur at all), and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  12. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  13. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). Current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study showing that the peptide-level analysis of EBprot provides better receiver operating characteristics and more accurate estimation of the false discovery rates than methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for DE analysis of labeling-based quantitative datasets. The software suite is freely available on the SourceForge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Precise non-steady-state characterization of solid active materials with no preliminary mechanistic assumptions

    DOE PAGES

    Constales, Denis; Yablonsky, Gregory S.; Wang, Lucun; ...

    2017-04-25

    This paper presents a straightforward and user-friendly procedure for extracting a reactivity characterization of catalytic reactions on solid materials under non-steady-state conditions, particularly in temporal analysis of products (TAP) experiments. The kinetic parameters derived by this procedure can help with the development of detailed mechanistic understanding. The procedure consists of the following two major steps: 1) three "Laplace reactivities" are first determined based on the moments of the exit flow pulse response data; 2) depending on the selected kinetic model, kinetic constants of elementary reaction steps can then be expressed as a function of the reactivities and determined accordingly. In particular, we distinguish two calculation methods based on the availability and reliability of reactant and product data. The theoretical results are illustrated using a reverse example with given parameters as well as an experimental example of CO oxidation over a supported Au/SiO2 catalyst. The procedure presented here provides an efficient tool for kinetic characterization of many complex chemical reactions.
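
    The moment-based first step can be illustrated with a small numerical sketch: integrate synthetic exit-flow pulses to obtain zeroth and first moments, then a conversion and a mean residence time. The mapping from moments to the three Laplace reactivities of the paper is not reproduced here, and the pulse shapes below are invented.

    ```python
    # Toy moment analysis of TAP-style exit flow pulses (synthetic data).
    import numpy as np

    t = np.linspace(0.0, 2.0, 2000)                       # time (s)
    dt = t[1] - t[0]
    inert_flow    = t * np.exp(-8.0 * t)                  # hypothetical inert (reference) pulse
    reactant_flow = 0.6 * t * np.exp(-8.0 * t)            # hypothetical reactant pulse (partly consumed)

    def moment(flow, order):
        """n-th temporal moment of an exit-flow pulse on a uniform grid."""
        return np.sum(flow * t ** order) * dt

    m0_inert, m0_react = moment(inert_flow, 0), moment(reactant_flow, 0)
    conversion = 1.0 - m0_react / m0_inert                # fraction of reactant consumed
    mean_residence = moment(reactant_flow, 1) / m0_react  # first moment / zeroth moment

    print(f"conversion = {conversion:.2f}, mean residence time = {mean_residence:.3f} s")
    ```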

  15. Quantitative Analyses about Market- and Prevalence-Based Needs for Adapted Physical Education Teachers in the Public Schools in the United States

    ERIC Educational Resources Information Center

    Zhang, Jiabei

    2011-01-01

    The purpose of this study was to analyze quantitative needs for more adapted physical education (APE) teachers based on both market- and prevalence-based models. The market-based need for more APE teachers was examined based on APE teacher positions funded, while the prevalence-based need for additional APE teachers was analyzed based on students…

  16. Quantification and Comparison of Anti-Fibrotic Therapies by Polarized SRM and SHG-Based Morphometry in Rat UUO Model

    PubMed Central

    Weldon, Steve M.; Matera, Damian; Lee, ChungWein; Yang, Haichun; Fryer, Ryan M.; Fogo, Agnes B.; Reinhart, Glenn A.

    2016-01-01

    Renal interstitial fibrosis (IF) is an important pathologic manifestation of disease progression in a variety of chronic kidney diseases (CKD). However, the quantitative and reproducible analysis of IF remains a challenge, especially in experimental animal models of progressive IF. In this study, we compare traditional polarized Sirius Red morphometry (SRM) to novel Second Harmonic Generation (SHG)-based morphometry of unstained tissues for quantitative analysis of IF in the rat 5-day unilateral ureteral obstruction (UUO) model. To validate the specificity of SHG for detecting fibrillar collagen components in IF, co-localization studies for collagen types I, III, and IV were performed using IHC. In addition, we examined the correlation, dynamic range, sensitivity, and ability of polarized SRM and SHG-based morphometry to detect an anti-fibrotic effect of three different treatment regimens. Comparisons were made across three separate studies in which animals were treated with three mechanistically distinct pharmacologic agents: enalapril (ENA, 15, 30, 60 mg/kg), mycophenolate mofetil (MMF, 2, 20 mg/kg) or the connective tissue growth factor (CTGF) neutralizing antibody, EX75606 (1, 3, 10 mg/kg). Our results demonstrate a strong co-localization of the SHG signal with fibrillar collagens I and III but not non-fibrillar collagen IV. Quantitative IF, calculated as percent cortical area of fibrosis, demonstrated a similar response profile for both polarized SRM and SHG-based morphometry. The two methodologies exhibited a strong correlation across all three pharmacology studies (r2 = 0.89–0.96). However, compared with polarized SRM, SHG-based morphometry delivered a greater dynamic range and absolute magnitude of reduction of IF after treatment. In summary, we demonstrate that SHG-based morphometry in unstained kidney tissues is comparable to polarized SRM for quantitation of fibrillar collagens, but with an enhanced sensitivity to detect treatment-induced reductions in
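
    The shared readout of both methods, percent cortical area of fibrosis, amounts to a thresholded area fraction. The toy sketch below computes it on a synthetic image with an arbitrary threshold; real SRM/SHG pipelines use calibrated, channel-specific thresholds and exclude non-cortical regions.

    ```python
    # Toy "percent cortical area of fibrosis" from a synthetic intensity image.
    import numpy as np

    rng = np.random.default_rng(42)
    shg_image = rng.gamma(shape=1.5, scale=20.0, size=(512, 512))   # fake SHG intensities
    cortex_mask = np.ones_like(shg_image, dtype=bool)                # pretend the whole field is cortex
    threshold = 80.0                                                  # arbitrary intensity cutoff

    fibrotic = (shg_image > threshold) & cortex_mask
    percent_fibrosis = 100.0 * fibrotic.sum() / cortex_mask.sum()
    print(f"percent cortical area of fibrosis: {percent_fibrosis:.1f}%")
    ```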

  17. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, which is based on Aristotelian thinking, and the associative-quantitative, which is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  18. Quantitative methods in psychology: inevitable and useless.

    PubMed

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, which is based on Aristotelian thinking, and the associative-quantitative, which is based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  19. A Mechatronic System for Quantitative Application and Assessment of Massage-Like Actions in Small Animals

    PubMed Central

    Wang, Qian; Zeng, Hansong; Best, Thomas M.; Haas, Caroline; Heffner, Ned T.; Agarwal, Sudha; Zhao, Yi

    2013-01-01

    Massage therapy has a long history and has been widely believed effective in restoring tissue function, relieving pain and stress, and promoting overall well-being. However, the application of massage-like actions and the efficacy of massage are largely based on anecdotal experiences that are difficult to define and measure. This leads to a somewhat limited evidence-based interface of massage therapy with modern medicine. In this study, we introduce a mechatronic device that delivers highly reproducible massage-like mechanical loads to the hind limbs of small animals (rats and rabbits), where various massage-like actions are quantified by the loading parameters (magnitude, frequency and duration) of the compressive and transverse forces on the subject tissues. The effect of massage is measured by the difference in passive viscoelastic properties of the subject tissues before and after mechanical loading, both obtained by the same device. Results show that this device is useful in identifying the loading parameters that are most conducive to a change in tissue mechanical properties, and can determine the range of loading parameters that result in sustained changes in tissue mechanical properties and function. This device represents the first step in our effort to quantify the application of massage-like actions used clinically and to measure their efficacy, which can readily be combined with various quantitative measures (e.g., active mechanical properties and physiological assays) for determining the therapeutic and mechanistic effects of massage therapies. PMID:23943071

  20. Characterizing seasonal variations in pavement material properties for use in a mechanistic-empirical design procedure

    DOT National Transportation Integrated Search

    2000-12-01

    Recent advances in flexible pavement design have prompted agencies to move toward the development and use of mechanistic-empirical (M-E) design procedures. This report analyzed seasonal trends in flexible pavement layer moduli to calibrate a M-E desi...