Comparing two-zone models of dust exposure.
Jones, Rachael M; Simmons, Catherine E; Boelter, Fred W
2011-09-01
The selection and application of mathematical models to work tasks is challenging. Previously, we developed and evaluated a semi-empirical two-zone model that predicts time-weighted average (TWA) concentrations (Ctwa) of dust emitted during the sanding of drywall joint compound. Here, we fit the emission rate and random air speed variables of a mechanistic two-zone model to testing event data and apply and evaluate the model using data from two field studies. We found that the fitted random air speed values and emission rate were sensitive to (i) the size of the near-field and (ii) the objective function used for fitting, but this did not substantially impact predicted dust Ctwa. The mechanistic model predictions were lower than the semi-empirical model predictions and measured respirable dust Ctwa at Site A but were within an acceptable range. At Site B, a 10.5 m3 room, the mechanistic model did not capture the observed difference between PBZ and area Ctwa. The model predicted uniform mixing and predicted dust Ctwa up to an order of magnitude greater than was measured. We suggest that applications of the mechanistic model be limited to contexts where the near-field volume is very small relative to the far-field volume.
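For readers unfamiliar with the two-zone (near-field/far-field) framework discussed above, the sketch below shows a generic mechanistic two-zone mass-balance model of the kind referenced; the emission rate, zone volumes, room airflow, and random air speed values are illustrative assumptions, not parameters from this study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (assumptions, not values from the study)
G = 5.0        # emission rate, mg/min
V_nf = 1.0     # near-field volume, m^3
V_ff = 40.0    # far-field (room) volume, m^3
Q = 1.5        # room supply/exhaust airflow, m^3/min
s = 5.0        # random air speed, m/min
A_nf = 3.0     # free surface area of the near-field boundary, m^2
beta = 0.5 * A_nf * s  # inter-zonal airflow, m^3/min

def two_zone(t, C):
    """Well-mixed mass balances on the near-field and far-field zones."""
    C_nf, C_ff = C
    dC_nf = (G + beta * C_ff - beta * C_nf) / V_nf
    dC_ff = (beta * C_nf - beta * C_ff - Q * C_ff) / V_ff
    return [dC_nf, dC_ff]

T = 30.0  # task duration, min
sol = solve_ivp(two_zone, (0, T), [0.0, 0.0], dense_output=True)
t = np.linspace(0, T, 301)
C_nf, C_ff = sol.sol(t)

# Time-weighted average concentrations over the task
print("TWA near-field:", np.trapz(C_nf, t) / T, "mg/m^3")
print("TWA far-field :", np.trapz(C_ff, t) / T, "mg/m^3")
```

In this formulation the inter-zonal airflow beta is commonly taken as half the free surface area of the near field times the random air speed, which is one way to see why the fitted air speed and the assumed near-field size can trade off against each other without greatly changing the predicted TWA.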
A comprehensive mechanistic model for upward two-phase flow in wellbores
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sylvester, N.D.; Sarica, C.; Shoham, O.
1994-05-01
A comprehensive model is formulated to predict the flow behavior for upward two-phase flow. This model is composed of a model for flow-pattern prediction and a set of independent mechanistic models for predicting such flow characteristics as holdup and pressure drop in bubble, slug, and annular flow. The comprehensive model is evaluated by using a well data bank made up of 1,712 well cases covering a wide variety of field data. Model performance is also compared with six commonly used empirical correlations and the Hasan-Kabir mechanistic model. Overall model performance is in good agreement with the data. In comparison with other methods, the comprehensive model performed the best.
Mathewson, Paul D; Moyer-Horner, Lucas; Beever, Erik A; Briscoe, Natalie J; Kearney, Michael; Yahn, Jeremiah M; Porter, Warren P
2017-03-01
How climate constrains species' distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8-19% less habitat loss in response to annual temperature increases of ~3-5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect: climate-imposed restrictions on activity. This more complete understanding is necessary to inform climate adaptation actions, management strategies, and conservation plans. © 2016 John Wiley & Sons Ltd.
Mathewson, Paul; Moyer-Horner, Lucas; Beever, Erik; Briscoe, Natalie; Kearney, Michael T.; Yahn, Jeremiah; Porter, Warren P.
2017-01-01
How climate constrains species’ distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8–19% less habitat loss in response to annual temperature increases of ~3–5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect: climate-imposed restrictions on activity. This more complete understanding is necessary to inform climate adaptation actions, management strategies, and conservation plans.
Moore, Shannon R.; Saidel, Gerald M.; Knothe, Ulf; Knothe Tate, Melissa L.
2014-01-01
The link between mechanics and biology in the generation and the adaptation of bone has been well studied in the context of skeletal development and fracture healing. Yet, the prediction of tissue genesis within - and the spatiotemporal healing of - postnatal defects necessitates a quantitative evaluation of mechano-biological interactions using experimental and clinical parameters. To address this current gap in knowledge, this study aims to develop a mechanistic mathematical model of tissue genesis using bone morphogenetic protein (BMP) to represent a class of factors that may coordinate bone healing. Specifically, we developed a mechanistic, mathematical model to predict the dynamics of tissue genesis by periosteal progenitor cells within a long bone defect surrounded by periosteum and stabilized via an intramedullary nail. The emergent material properties and mechanical environment associated with nascent tissue genesis influence the strain stimulus sensed by progenitor cells within the periosteum. Using a mechanical finite element model, periosteal surface strains are predicted as a function of emergent, nascent tissue properties. Strains are then input to a mechanistic mathematical model, where mechanical regulation of BMP-2 production mediates rates of cellular proliferation, differentiation and tissue production, to predict healing outcomes. A parametric approach enables the spatial and temporal prediction of endochondral tissue regeneration, assessed as areas of cartilage and mineralized bone, as functions of radial distance from the periosteum and time. Comparing model results to histological outcomes from two previous studies of periosteum-mediated bone regeneration in a common ovine model, it was shown that mechanistic models incorporating mechanical feedback successfully predict patterns (spatial) and trends (temporal) of bone tissue regeneration. The novel model framework presented here integrates a mechanistic feedback system based on the mechanosensitivity of periosteal progenitor cells, which allows for modeling and prediction of tissue regeneration on multiple length and time scales. Through combination of computational, physical and engineering science approaches, the model platform provides a means to test new hypotheses in silico and to elucidate conditions conducive to endogenous tissue genesis. Next generation models will serve to unravel intrinsic differences in bone genesis by endochondral and intramembranous mechanisms. PMID:24967742
Johnson, Douglas H.; Cook, R.D.
2013-01-01
In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so generally. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop the models. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.
Assessing uncertainty in mechanistic models
Edwin J. Green; David W. MacFarlane; Harry T. Valentine
2000-01-01
Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...
Testing mechanistic models of growth in insects.
Maino, James L; Kearney, Michael R
2015-11-22
Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory to many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg(-1)) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes. © 2015 The Author(s).
Eric J. Gustafson
2013-01-01
Researchers and natural resource managers need predictions of how multiple global changes (e.g., climate change, rising levels of air pollutants, exotic invasions) will affect landscape composition and ecosystem function. Ecological predictive models used for this purpose are constructed using either a mechanistic (process-based) or a phenomenological (empirical)...
Predictive and mechanistic multivariate linear regression models for reaction development
Santiago, Celine B.; Guo, Jing-Yao
2018-01-01
Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711
Safaie, Ammar; Wendzel, Aaron; Ge, Zhongfu; Nevers, Meredith; Whitman, Richard L.; Corsi, Steven R.; Phanikumar, Mantha S.
2016-01-01
Statistical and mechanistic models are popular tools for predicting the levels of indicator bacteria at recreational beaches. Researchers tend to use one class of model or the other, and it is difficult to generalize statements about their relative performance due to differences in how the models are developed, tested, and used. We describe a cooperative modeling approach for freshwater beaches impacted by point sources in which insights derived from mechanistic modeling were used to further improve the statistical models and vice versa. The statistical models provided a basis for assessing the mechanistic models which were further improved using probability distributions to generate high-resolution time series data at the source, long-term “tracer” transport modeling based on observed electrical conductivity, better assimilation of meteorological data, and the use of unstructured-grids to better resolve nearshore features. This approach resulted in improved models of comparable performance for both classes including a parsimonious statistical model suitable for real-time predictions based on an easily measurable environmental variable (turbidity). The modeling approach outlined here can be used at other sites impacted by point sources and has the potential to improve water quality predictions resulting in more accurate estimates of beach closures.
Combining correlative and mechanistic habitat suitability models to improve ecological compensation.
Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud
2015-02-01
Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to inappropriate methods used to forecast required compensation in environmental impact assessments. These assessments are mostly descriptive and only valid at limited spatial and temporal scales. However, habitat suitability models developed to predict the impacts of environmental changes on potential species' distributions should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or synoptically to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. In contrast, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation. Moreover, among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data compared to correlative models. Hence we present two approaches that combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be advantageously used to assess ecological compensation requirements efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.
Assmus, Frauke; Houston, J Brian; Galetin, Aleksandra
2017-11-15
The prediction of tissue-to-plasma water partition coefficients (Kpu) from in vitro and in silico data using the tissue-composition based model (Rodgers & Rowland, J Pharm Sci. 2005, 94(6):1237-48.) is well established. However, distribution of basic drugs, in particular into lysosome-rich lung tissue, tends to be under-predicted by this approach. The aim of this study was to develop an extended mechanistic model for the prediction of Kpu which accounts for lysosomal sequestration and the contribution of different cell types in the tissue of interest. The extended model is based on compound-specific physicochemical properties and tissue composition data to describe drug ionization, distribution into tissue water and drug binding to neutral lipids, neutral phospholipids and acidic phospholipids in tissues, including lysosomes. Physiological data on the types of cells contributing to lung, kidney and liver, their lysosomal content and lysosomal pH were collated from the literature. The predictive power of the extended mechanistic model was evaluated using a dataset of 28 basic drugs (pKa ≥ 7.8, 17 β-blockers, 11 structurally diverse drugs) for which experimentally determined Kpu data in rat tissue have been reported. Accounting for the lysosomal sequestration in the extended mechanistic model improved the accuracy of Kpu predictions in lung compared to the original Rodgers model (56% of drugs within 2-fold or 88% within 3-fold of observed values). Reduction in the extent of Kpu under-prediction was also evident in liver and kidney. However, consideration of lysosomal sequestration increased the occurrence of over-predictions, yielding overall comparable model performances for kidney and liver, with 68% and 54% of Kpu values within 2-fold error, respectively. High lysosomal concentration ratios relative to cytosol (>1000-fold) were predicted for the drugs investigated; the extent differed depending on the lysosomal pH and concentration of acidic phospholipids among cell types. Despite this extensive lysosomal sequestration in the individual cell types, the maximal change in the overall predicted tissue Kpu was <3-fold for lysosome-rich tissues investigated here. Accounting for the variability in cellular physiological model input parameters, in particular lysosomal pH and fraction of the cellular volume occupied by the lysosomes, only partially explained discrepancies between observed and predicted Kpu data in the lung. Improved understanding of the system properties, e.g., cell/organelle composition, is required to support further development of mechanistic equations for the prediction of drug tissue distribution. Application of this revised mechanistic model is recommended for prediction of Kpu in lysosome-rich tissue to facilitate the advancement of physiologically-based prediction of volume of distribution and drug exposure in the tissues. Copyright © 2017 Elsevier B.V. All rights reserved.
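As background to the lysosomal-sequestration term discussed above, the classical pH-partition (ion-trapping) argument for a monoprotic base can be sketched directly from the Henderson-Hasselbalch relationship; the pH values below are illustrative, and the sketch omits the acidic-phospholipid binding terms that drive the >1000-fold ratios reported in the study.

```python
def base_trapping_ratio(pKa, pH_lysosome=5.0, pH_cytosol=7.0):
    """Lysosome-to-cytosol total-concentration ratio for a monoprotic base,
    assuming only the neutral species equilibrates across the membrane
    (classical ion-trapping / pH-partition argument)."""
    total_over_neutral = lambda pH: 1.0 + 10.0 ** (pKa - pH)
    return total_over_neutral(pH_lysosome) / total_over_neutral(pH_cytosol)

# Strong bases (pKa >= 7.8, as in the dataset above) accumulate markedly
# in the acidic lysosome even before any phospholipid binding is considered.
for pKa in (7.8, 9.0, 10.0):
    print(pKa, round(base_trapping_ratio(pKa), 1))
```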
Mechanistic-based disinfection and disinfection byproduct models
We propose developing a mechanistic-based numerical model for chlorine decay and regulated DBP (THM and HAA) formation derived from (free) chlorination; the model framework will allow future modifications for other DBPs and chloramination. Predicted chlorine residual and DBP r...
Investigation of mechanistic deterioration modeling for bridge design and management.
DOT National Transportation Integrated Search
2017-04-01
The ongoing deterioration of highway bridges in Colorado dictates that an effective method for allocating limited management resources be developed. In order to predict bridge deterioration in advance, mechanistic models that analyze the physical pro...
Duan, J; Kesisoglou, F; Novakovic, J; Amidon, GL; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R
2017-01-01
On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled “Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation.” The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and the drug product information, into a systemic mathematical whole-body framework. PMID:28571121
Model for estimating enteric methane emissions from United States dairy and feedlot cattle.
Kebreab, E; Johnson, K A; Archibeque, S L; Pape, D; Wirth, T
2008-10-01
Methane production from enteric fermentation in cattle is one of the major sources of anthropogenic greenhouse gas emission in the United States and worldwide. National estimates of methane emissions rely on mathematical models such as the one recommended by the Intergovernmental Panel on Climate Change (IPCC). Models used for prediction of methane emissions from cattle range from empirical to mechanistic with varying input requirements. Two empirical and 2 mechanistic models (COWPOLL and MOLLY) were evaluated for their prediction ability using individual cattle measurements. Model selection was based on mean square prediction error (MSPE), concordance correlation coefficient, and residuals vs. predicted values analyses. In dairy cattle, COWPOLL had the lowest root MSPE and greatest accuracy and precision of predicting methane emissions (correlation coefficient estimate = 0.75). The model simulated differences in diet more accurately than the other models, and the residuals vs. predicted value analysis showed no mean bias (P = 0.71). In feedlot cattle, MOLLY had the lowest root MSPE with almost all errors from random sources (correlation coefficient estimate = 0.69). The IPCC model also had good agreement with observed values, and no significant mean (P = 0.74) or linear bias (P = 0.11) was detected when residuals were plotted against predicted values. A fixed methane conversion factor (Ym) might be an easier alternative to diet-dependent variable Ym. Based on the results, the 2 mechanistic models were used to simulate methane emissions from representative US diets and were compared with the IPCC model. The average Ym in dairy cows was 5.63% of GE (range 3.78 to 7.43%) compared with 6.5% +/- 1% recommended by IPCC. In feedlot cattle, the average Ym was 3.88% (range 3.36 to 4.56%) compared with 3% +/- 1% recommended by IPCC. Based on our simulations, using IPCC values can result in an overestimate of emissions by about 12.5% for dairy cattle and an underestimate by about 9.8% for feedlot cattle. In addition to providing improved estimates of emissions based on diets, mechanistic models can be used to assess mitigation options such as changing source of carbohydrate or addition of fat to decrease methane, which is not possible with empirical models. We recommend that national inventories use diet-specific Ym values predicted by mechanistic models to estimate methane emissions from cattle.
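To make the role of the methane conversion factor (Ym) concrete, an IPCC Tier 2 style calculation converts gross energy intake and Ym into an emitted mass of methane using the energy content of methane (55.65 MJ/kg); the intake value below is an illustrative assumption.

```python
def enteric_ch4_kg_per_day(gross_energy_MJ_per_day, Ym_percent):
    """IPCC Tier 2 style estimate: the fraction Ym of gross energy intake is
    lost as methane, converted to mass with 55.65 MJ per kg CH4."""
    return gross_energy_MJ_per_day * (Ym_percent / 100.0) / 55.65

GE = 300.0  # illustrative gross energy intake for a dairy cow, MJ/day
for label, Ym in [("IPCC default (dairy)", 6.5),
                  ("Mechanistic-model mean (dairy)", 5.63)]:
    print(label, round(enteric_ch4_kg_per_day(GE, Ym), 3), "kg CH4/day")
```

With these illustrative numbers, lowering Ym from the IPCC default of 6.5% to the mechanistic-model mean of 5.63% reduces the estimate by roughly 13%, consistent with the overestimate reported above for dairy cattle.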
Nøst, Therese Haugdahl; Breivik, Knut; Wania, Frank; Rylander, Charlotta; Odland, Jon Øyvind; Sandanger, Torkjel Manning
2016-03-01
Studies on the health effects of polychlorinated biphenyls (PCBs) call for an understanding of past and present human exposure. Time-resolved mechanistic models may supplement information on concentrations in individuals obtained from measurements and/or statistical approaches if they can be shown to reproduce empirical data. Here, we evaluated the capability of one such mechanistic model to reproduce measured PCB concentrations in individual Norwegian women. We also assessed individual life-course concentrations. Concentrations of four PCB congeners in pregnant (n = 310, sampled in 2007-2009) and postmenopausal (n = 244, 2005) women were compared with person-specific predictions obtained using CoZMoMAN, an emission-based environmental fate and human food-chain bioaccumulation model. Person-specific predictions were also made using statistical regression models including dietary and lifestyle variables and concentrations. CoZMoMAN accurately reproduced medians and ranges of measured concentrations in the two study groups. Furthermore, rank correlations between measurements and predictions from both CoZMoMAN and regression analyses were strong (Spearman's r > 0.67). Precision in quartile assignments from predictions was strong overall as evaluated by weighted Cohen's kappa (> 0.6). Simulations indicated large inter-individual differences in concentrations experienced in the past. The mechanistic model reproduced all measurements of PCB concentrations within a factor of 10, and subject ranking and quartile assignments were overall largely consistent, although they were weak within each study group. Contamination histories for individuals predicted by CoZMoMAN revealed variation between study subjects, particularly in the timing of peak concentrations. Mechanistic models can provide individual PCB exposure metrics that could serve as valuable supplements to measurements.
Chen, Tao; Lian, Guoping; Kattou, Panayiotis
2016-07-01
The purpose was to develop a mechanistic mathematical model for predicting the pharmacokinetics of topically applied solutes penetrating through the skin and into the blood circulation. The model could be used to support the design of transdermal drug delivery systems and skin care products, and risk assessment of occupational or consumer exposure. A recently reported skin penetration model [Pharm Res 32 (2015) 1779] was integrated with the kinetic equations for dermis-to-capillary transport and systemic circulation. All model parameters were determined separately from the molecular, microscopic and physiological bases, without fitting to the in vivo data to be predicted. Published clinical studies of nicotine were used for model demonstration. The predicted plasma kinetics is in good agreement with observed clinical data. The simulated two-dimensional concentration profile in the stratum corneum vividly illustrates the local sub-cellular disposition kinetics, including tortuous lipid pathway for diffusion and the "reservoir" effect of the corneocytes. A mechanistic model for predicting transdermal and systemic kinetics was developed and demonstrated with published clinical data. The integrated mechanistic approach has significantly extended the applicability of a recently reported microscopic skin penetration model by providing prediction of solute concentration in the blood.
A mechanistic model to predict the capture of gas phase mercury species using in-situ generated titania nanosize particles activated by UV irradiation is developed. The model is an extension of a recently reported model for photochemical reactions that accounts for the rates of...
Towards predictive models of the human gut microbiome
2014-01-01
The intestinal microbiota is an ecosystem susceptible to external perturbations such as dietary changes and antibiotic therapies. Mathematical models of microbial communities could be of great value in the rational design of microbiota-tailoring diets and therapies. Here, we discuss how advances in another field, engineering of microbial communities for wastewater treatment bioreactors, could inspire development of mechanistic mathematical models of the gut microbiota. We review the current state-of-the-art in bioreactor modeling and current efforts in modeling the intestinal microbiota. Mathematical modeling could benefit greatly from the deluge of data emerging from metagenomic studies, but data-driven approaches such as network inference that aim to predict microbiome dynamics without explicit mechanistic knowledge seem better suited to model these data. Finally, we discuss how the integration of microbiome shotgun sequencing and metabolic modeling approaches such as flux balance analysis may fulfill the promise of a mechanistic model of the intestinal microbiota. PMID:24727124
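Since the review points to flux balance analysis as a candidate mechanistic framework for the gut microbiota, a toy example may help: FBA maximizes a biomass-like objective subject to steady-state mass balance (S·v = 0) and flux bounds. The three-reaction network below is purely illustrative and is not drawn from any gut-microbiome reconstruction.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake (v1) -> internal metabolite A -> biomass (v3),
# with a fixed maintenance drain (v2). Columns are reactions, rows metabolites.
S = np.array([[1.0, -1.0, -1.0]])       # mass balance on metabolite A
bounds = [(0, 10.0),                    # v1: substrate uptake limit
          (0.5, 0.5),                   # v2: fixed maintenance flux
          (0, None)]                    # v3: biomass flux, unbounded above
c = np.array([0.0, 0.0, -1.0])          # maximize v3 (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(1), bounds=bounds, method="highs")
print("optimal biomass flux:", res.x[2])   # expected: 9.5
```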
Bridging paradigms: hybrid mechanistic-discriminative predictive models.
Doyle, Orla M; Tsaneva-Atanasova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa
2013-03-01
Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.
Mechanistic modelling of the inhibitory effect of pH on microbial growth.
Akkermans, Simen; Van Impe, Jan F
2018-06-01
Modelling and simulation of microbial dynamics as a function of processing, transportation and storage conditions is a useful tool to improve microbial food safety and quality. The goal of this research is to improve an existing methodology for building mechanistic predictive models based on the environmental conditions. The effect of environmental conditions on microbial dynamics is often described by combining the separate effects in a multiplicative way (gamma concept). This idea was extended further in this work by including the effects of the lag and stationary growth phases on microbial growth rate as independent gamma factors. A mechanistic description of the stationary phase as a function of pH was included, based on a novel class of models that consider product inhibition. Experimental results on Escherichia coli growth dynamics indicated that also the parameters of the product inhibition equations can be modelled with the gamma approach. This work has extended a modelling methodology, resulting in predictive models that are (i) mechanistically inspired, (ii) easily identifiable with a limited work load and (iii) easily extended to additional environmental conditions. Copyright © 2017. Published by Elsevier Ltd.
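The gamma concept referred to above multiplies independent, dimensionless inhibition factors onto the optimal growth rate; a minimal sketch with a cardinal-parameter form of the pH factor is shown below. The cardinal values and optimal rate are illustrative assumptions, not the fitted Escherichia coli parameters.

```python
def gamma_pH(pH, pH_min=4.0, pH_opt=7.0, pH_max=9.5):
    """Cardinal pH model (Rosso-type gamma factor): 1 at pH_opt, 0 at the limits."""
    if pH <= pH_min or pH >= pH_max:
        return 0.0
    num = (pH - pH_min) * (pH - pH_max)
    return num / (num - (pH - pH_opt) ** 2)

def mu_max(mu_opt, pH, gamma_other=1.0):
    """Gamma concept: multiply the optimal rate by independent gamma factors.
    Temperature, water activity, lag and stationary-phase effects would enter
    the same way, here lumped into gamma_other for brevity."""
    return mu_opt * gamma_pH(pH) * gamma_other

print(mu_max(mu_opt=2.0, pH=6.0))  # slower growth away from the pH optimum
```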
Putting the psychology back into psychological models: mechanistic versus rational approaches.
Sakamoto, Yasuaki; Jones, Matt; Love, Bradley C
2008-09-01
Two basic approaches to explaining the nature of the mind are the rational and the mechanistic approaches. Rational analyses attempt to characterize the environment and the behavioral outcomes that humans seek to optimize, whereas mechanistic models attempt to simulate human behavior using processes and representations analogous to those used by humans. We compared these approaches with regard to their accounts of how humans learn the variability of categories. The mechanistic model departs in subtle ways from rational principles. In particular, the mechanistic model incrementally updates its estimates of category means and variances through error-driven learning, based on discrepancies between new category members and the current representation of each category. The model yields a prediction, which we verify, regarding the effects of order manipulations that the rational approach does not anticipate. Although both rational and mechanistic models can successfully postdict known findings, we suggest that psychological advances are driven primarily by consideration of process and representation and that rational accounts trail these breakthroughs.
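The error-driven updating described above can be written as a delta rule on the running estimates of a category's mean and variance, which is what makes the mechanistic model sensitive to presentation order; the learning rate below is an illustrative assumption.

```python
def update_category(mean, var, x, lr=0.1):
    """Error-driven (delta-rule) update of a category's mean and variance
    from the discrepancy between a new member x and the current estimates."""
    error = x - mean
    mean = mean + lr * error
    var = var + lr * (error ** 2 - var)
    return mean, var

# Order effects: the same items presented in a different order leave the
# incremental learner with different final estimates, unlike a rational
# (order-independent) estimator.
mean, var = 0.0, 1.0
for x in [2.0, 2.5, 1.5, 8.0]:
    mean, var = update_category(mean, var, x)
print(round(mean, 2), round(var, 2))
```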
Zhang, X; Duan, J; Kesisoglou, F; Novakovic, J; Amidon, G L; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R
2017-08-01
On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled "Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation." The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and the drug product information, into a systemic mathematical whole-body framework. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
Iowa calibration of MEPDG performance prediction models.
DOT National Transportation Integrated Search
2013-06-01
This study aims to improve the accuracy of AASHTO Mechanistic-Empirical Pavement Design Guide (MEPDG) pavement performance predictions for Iowa pavement systems through local calibration of MEPDG prediction models. A total of 130 representative p...
Draft user's guide for UDOT mechanistic-empirical pavement design.
DOT National Transportation Integrated Search
2009-10-01
Validation of the new AASHTO Mechanistic-Empirical Pavement Design Guide's (MEPDG) nationally calibrated pavement distress and smoothness prediction models when applied under Utah conditions, and local calibration of the new hot-mix asphalt (HMA) p...
Theil, P K; Flummer, C; Hurley, W L; Kristensen, N B; Labouriau, R L; Sørensen, M T
2014-12-01
The aims of the present study were to quantify colostrum intake (CI) of piglets using the D2O dilution technique, to develop a mechanistic model to predict CI, to compare these data with CI predicted by a previous empirical predictive model developed for bottle-fed piglets, and to study how composition of diets fed to gestating sows affected piglet CI, sow colostrum yield (CY), and colostrum composition. In total, 240 piglets from 40 litters were enriched with D2O. The CI measured by D2O from birth until 24 h after the birth of the first-born piglet was on average 443 g (SD 151). Based on measured CI, a mechanistic model to predict CI was developed using piglet characteristics (24-h weight gain [WG; g], BW at birth [BWB; kg], and duration of CI [D; min]): CI (g) = -106 + 2.26 WG + 200 BWB + 0.111 D - 1,414 WG/D + 0.0182 WG/BWB (R² = 0.944). This model was used to predict the CI for all colostrum suckling piglets within the 40 litters (n=500, mean=437 g, SD=153 g) and was compared with the CI predicted by a previous empirical predictive model (mean=305 g, SD=140 g). The previous empirical model underestimated the CI by 30% compared with that obtained by the new mechanistic model. The sows were fed 1 of 4 gestation diets (n=10 per diet) based on different fiber sources (low fiber [17%] or potato pulp, pectin residue, or sugarbeet pulp [32 to 40%]) from mating until d 108 of gestation. From d 108 of gestation until parturition, sows were fed 1 of 5 prefarrowing diets (n=8 per diet) varying in supplemented fat (3% animal fat, 8% coconut oil, 8% sunflower oil, 8% fish oil, or 4% fish oil+4% octanoic acid). Sows fed diets with pectin residue or sugarbeet pulp during gestation produced colostrum with lower protein, fat, DM, and energy concentrations and higher lactose concentrations, and their piglets had greater CI as compared with sows fed potato pulp or the low-fiber diet (P<0.05), and sows fed pectin residue had a greater CY than potato pulp-fed sows (P<0.05). Prefarrowing diets affected neither CI nor CY, but the prefarrowing diet with coconut oil decreased lactose and increased DM concentrations of colostrum compared with other prefarrowing diets (P<0.05). In conclusion, the new mechanistic predictive model for CI suggests that the previous empirical predictive model underestimates CI of sow-reared piglets by 30%. It was also concluded that nutrition of sows during gestation affected CY and colostrum composition.
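For convenience, the mechanistic prediction equation reported above can be evaluated directly; the function below simply implements it as written, and the example piglet values are illustrative.

```python
def colostrum_intake_g(WG_g, BWB_kg, D_min):
    """Mechanistic predictive equation from the study:
    CI (g) = -106 + 2.26*WG + 200*BWB + 0.111*D - 1414*WG/D + 0.0182*WG/BWB,
    with WG the 24-h weight gain (g), BWB the birth weight (kg), and
    D the duration of colostrum intake (min)."""
    return (-106 + 2.26 * WG_g + 200 * BWB_kg + 0.111 * D_min
            - 1414 * WG_g / D_min + 0.0182 * WG_g / BWB_kg)

# Illustrative piglet: 100 g gain, 1.4 kg at birth, 24 h (1440 min) of suckling
print(round(colostrum_intake_g(WG_g=100, BWB_kg=1.4, D_min=1440), 1))  # ~463 g
```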
Merkle, Jerod A.; Cross, Paul C.; Scurlock, Brandon M.; Cole, Eric K.; Courtemanch, Alyson B.; Dewey, Sarah R.; Kauffman, Matthew J.
2018-01-01
Disease models typically focus on temporal dynamics of infection, while often neglecting environmental processes that determine host movement. In many systems, however, temporal disease dynamics may be slow compared to the scale at which environmental conditions alter host space-use and accelerate disease transmission. Using a mechanistic movement modelling approach, we made space-use predictions of a mobile host (elk [Cervus canadensis] carrying the bacterial disease brucellosis) under environmental conditions that change daily and annually (e.g., plant phenology, snow depth), and we used these predictions to infer how spring phenology influences the risk of brucellosis transmission from elk (through aborted foetuses) to livestock in the Greater Yellowstone Ecosystem. Using data from 288 female elk monitored with GPS collars, we fit step selection functions (SSFs) during the spring abortion season and then implemented a master equation approach to translate SSFs into predictions of daily elk distribution for five plausible winter weather scenarios (from a heavy snow year to an extreme winter drought year). We predicted abortion events by combining elk distributions with empirical estimates of daily abortion rates, spatially varying elk seroprevalence and elk population counts. Our results reveal strong spatial variation in disease transmission risk at daily and annual scales that is strongly governed by variation in host movement in response to spring phenology. For example, in comparison with an average snow year, years with early snowmelt are predicted to have 64% of the abortions occurring on feedgrounds shift to occurring on mainly public lands, and to a lesser extent on private lands. Synthesis and applications. Linking mechanistic models of host movement with disease dynamics leads to a novel bridge between movement and disease ecology. Our analysis framework offers new avenues for predicting disease spread, while providing managers tools to proactively mitigate risks posed by mobile disease hosts. More broadly, we demonstrate how mechanistic movement models can provide predictions of ecological conditions that are consistent with climate change but may be more extreme than has been observed historically.
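A step selection function of the kind fitted above weights each candidate step by exp(beta . x) of its habitat covariates and normalizes over the choice set; the covariates and coefficients below are illustrative assumptions, not the fitted elk model.

```python
import numpy as np

def step_probabilities(X, beta):
    """Step selection function: relative use of candidate steps is
    proportional to exp(beta . x), normalized over the choice set."""
    w = np.exp(X @ beta)
    return w / w.sum()

# Three candidate steps described by (snow depth, green-up index) - illustrative
X = np.array([[0.8, 0.2],
              [0.1, 0.9],
              [0.4, 0.5]])
beta = np.array([-1.5, 2.0])   # avoid deep snow, select spring green-up
print(step_probabilities(X, beta))
```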
Varma, Manthena V S; Lin, Jian; Bi, Yi-An; Rotter, Charles J; Fahmi, Odette A; Lam, Justine L; El-Kattan, Ayman F; Goosen, Theunis C; Lai, Yurong
2013-05-01
Repaglinide is mainly metabolized by cytochrome P450 enzymes CYP2C8 and CYP3A4, and it is also a substrate to a hepatic uptake transporter, organic anion transporting polypeptide (OATP)1B1. The purpose of this study is to predict the dosing time-dependent pharmacokinetic interactions of repaglinide with rifampicin, using mechanistic models. In vitro hepatic transport of repaglinide, characterized using sandwich-cultured human hepatocytes, and intrinsic metabolic parameters were used to build a dynamic whole-body physiologically-based pharmacokinetic (PBPK) model. The PBPK model adequately described repaglinide plasma concentration-time profiles and successfully predicted area under the plasma concentration-time curve ratios of repaglinide (within ± 25% error), dosed (staggered 0-24 hours) after rifampicin treatment when primarily considering induction of CYP3A4 and reversible inhibition of OATP1B1 by rifampicin. Further, a static mechanistic "extended net-effect" model incorporating transport and metabolic disposition parameters of repaglinide and interaction potency of rifampicin was devised. Predictions based on the static model are similar to those observed in the clinic (average error ∼19%) and to those based on the PBPK model. Both the models suggested that the combined effect of increased gut extraction and decreased hepatic uptake caused minimal repaglinide systemic exposure change when repaglinide is dosed simultaneously or 1 hour after the rifampicin dose. On the other hand, isolated induction effect as a result of temporal separation of the two drugs translated to an approximate 5-fold reduction in repaglinide systemic exposure. In conclusion, both dynamic and static mechanistic models are instrumental in delineating the quantitative contribution of transport and metabolism in the dosing time-dependent repaglinide-rifampicin interactions.
Emami Riedmaier, Arian; Lindley, David J; Hall, Jeffrey A; Castleberry, Steven; Slade, Russell T; Stuart, Patricia; Carr, Robert A; Borchardt, Thomas B; Bow, Daniel A J; Nijsen, Marjoleen
2018-01-01
Venetoclax, a selective B-cell lymphoma-2 inhibitor, is a biopharmaceutics classification system class IV compound. The aim of this study was to develop a physiologically based pharmacokinetic (PBPK) model to mechanistically describe absorption and disposition of an amorphous solid dispersion formulation of venetoclax in humans. A mechanistic PBPK model was developed incorporating measured amorphous solubility, dissolution, metabolism, and plasma protein binding. A middle-out approach was used to define permeability. Model predictions of oral venetoclax pharmacokinetics were verified against clinical studies of fed and fasted healthy volunteers, and clinical drug interaction studies with strong CYP3A inhibitor (ketoconazole) and inducer (rifampicin). Model verification demonstrated accurate prediction of the observed food effect following a low-fat diet. Ratios of predicted versus observed Cmax and area under the curve of venetoclax were within 0.8- to 1.25-fold of observed ratios for strong CYP3A inhibitor and inducer interactions, indicating that the venetoclax elimination pathway was correctly specified. The verified venetoclax PBPK model is one of the first examples mechanistically capturing absorption, food effect, and exposure of an amorphous solid dispersion formulated compound. This model allows evaluation of untested drug-drug interactions, especially those primarily occurring in the intestine, and paves the way for future modeling of biopharmaceutics classification system IV compounds. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Thomas E. Dilts; Peter J. Weisberg; Camie M. Dencker; Jeanne C. Chambers
2015-01-01
We have three goals. (1) To develop a suite of functionally relevant climate variables for modelling vegetation distribution on arid and semi-arid landscapes of the Great Basin, USA. (2) To compare the predictive power of vegetation distribution models based on mechanistically proximate factors (water deficit variables) and factors that are more mechanistically removed...
A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.
Revell, Christopher; Somveille, Marius
2017-08-29
In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird and highly-mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.
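The "simple decision rules drawn from statistical mechanics" can be illustrated with a Boltzmann-weighted choice among neighbouring cells of an environmental potential landscape; the random landscape and temperature-like parameter below are toy assumptions, not the fitted two-parameter albatross model.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(pos, potential, T=1.0):
    """Move to one of the 4-neighbours with probability proportional to
    exp(-(change in potential)/T), so downhill moves are favoured but
    uphill moves remain possible."""
    i, j = pos
    nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= i + di < potential.shape[0] and 0 <= j + dj < potential.shape[1]]
    dU = np.array([potential[n] - potential[pos] for n in nbrs])
    p = np.exp(-dU / T)
    p /= p.sum()
    return nbrs[rng.choice(len(nbrs), p=p)]

U = rng.random((20, 20))          # toy environmental potential landscape
pos = (10, 10)
track = [pos]
for _ in range(50):
    pos = step(pos, U)
    track.append(pos)
print(track[:5])
```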
Nøst, Therese Haugdahl; Breivik, Knut; Wania, Frank; Rylander, Charlotta; Odland, Jon Øyvind; Sandanger, Torkjel Manning
2015-01-01
Background Studies on the health effects of polychlorinated biphenyls (PCBs) call for an understanding of past and present human exposure. Time-resolved mechanistic models may supplement information on concentrations in individuals obtained from measurements and/or statistical approaches if they can be shown to reproduce empirical data. Objectives Here, we evaluated the capability of one such mechanistic model to reproduce measured PCB concentrations in individual Norwegian women. We also assessed individual life-course concentrations. Methods Concentrations of four PCB congeners in pregnant (n = 310, sampled in 2007–2009) and postmenopausal (n = 244, 2005) women were compared with person-specific predictions obtained using CoZMoMAN, an emission-based environmental fate and human food-chain bioaccumulation model. Person-specific predictions were also made using statistical regression models including dietary and lifestyle variables and concentrations. Results CoZMoMAN accurately reproduced medians and ranges of measured concentrations in the two study groups. Furthermore, rank correlations between measurements and predictions from both CoZMoMAN and regression analyses were strong (Spearman’s r > 0.67). Precision in quartile assignments from predictions was strong overall as evaluated by weighted Cohen’s kappa (> 0.6). Simulations indicated large inter-individual differences in concentrations experienced in the past. Conclusions The mechanistic model reproduced all measurements of PCB concentrations within a factor of 10, and subject ranking and quartile assignments were overall largely consistent, although they were weak within each study group. Contamination histories for individuals predicted by CoZMoMAN revealed variation between study subjects, particularly in the timing of peak concentrations. Mechanistic models can provide individual PCB exposure metrics that could serve as valuable supplements to measurements. Citation Nøst TH, Breivik K, Wania F, Rylander C, Odland JØ, Sandanger TM. 2016. Estimating time-varying PCB exposures using person-specific predictions to supplement measured values: a comparison of observed and predicted values in two cohorts of Norwegian women. Environ Health Perspect 124:299–305; http://dx.doi.org/10.1289/ehp.1409191 PMID:26186800
Validation of pavement performance curves for the mechanistic-empirical pavement design guide.
DOT National Transportation Integrated Search
2009-02-01
The objective of this research is to determine whether the nationally calibrated performance models used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) provide a reasonable prediction of actual field performance, and if the desired accu...
Multi input single output model predictive control of non-linear bio-polymerization process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arumugasamy, Senthil Kumar; Ahmad, Z.
This paper focuses on Multi Input Single Output (MISO) Model Predictive Control of a bio-polymerization process, in which a mechanistic model is developed and linked with a feedforward neural network model to obtain a hybrid model (Mechanistic-FANN) of lipase-catalyzed ring-opening polymerization of ε-caprolactone (ε-CL) for poly(ε-caprolactone) production. In this research, a state space model was used, in which the inputs to the model were the reactor temperatures and reactor impeller speeds and the outputs were the molecular weight of the polymer (Mn) and the polymer polydispersity index. The state space model for MISO was created using the System Identification Toolbox of Matlab™. This state space model is used in the MISO MPC. Model predictive control (MPC) has been applied to predict the molecular weight of the biopolymer and consequently control the molecular weight of the biopolymer. The result shows that MPC is able to track the reference trajectory and give optimum movement of the manipulated variable.
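As a generic illustration of the receding-horizon idea described above (not the hybrid Mechanistic-FANN model itself), the sketch below optimizes a control sequence for a small discrete state-space model over a finite horizon and applies only the first move; the matrices, setpoint, and weights are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative discrete-time state-space model: x[k+1] = A x + B u, y = C x
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [0.5]])
C = np.array([[1.0, 0.0]])

def predict(x0, u_seq):
    """Simulate the output trajectory for a candidate control sequence."""
    x, ys = x0.copy(), []
    for u in u_seq:
        x = A @ x + B.flatten() * u
        ys.append(float(C @ x))
    return np.array(ys)

def mpc_move(x0, setpoint, horizon=10, u_penalty=0.01):
    """Minimize tracking error plus a small control penalty; apply the first move."""
    cost = lambda u: np.sum((predict(x0, u) - setpoint) ** 2) + u_penalty * np.sum(u ** 2)
    res = minimize(cost, np.zeros(horizon))
    return res.x[0]

x = np.array([0.0, 0.0])
print("first control move:", round(mpc_move(x, setpoint=1.0), 3))
```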
Scherrer, Stephen R; Rideout, Brendan P; Giorli, Giacomo; Nosal, Eva-Marie; Weng, Kevin C
2018-01-01
Passive acoustic telemetry using coded transmitter tags and stationary receivers is a popular method for tracking movements of aquatic animals. Understanding the performance of these systems is important in array design and in analysis. Close proximity detection interference (CPDI) is a condition where receivers fail to reliably detect tag transmissions. CPDI generally occurs when the tag and receiver are near one another in acoustically reverberant settings. Here we confirm transmission multipaths reflected off the environment arriving at a receiver with sufficient delay relative to the direct signal cause CPDI. We propose a ray-propagation based model to estimate the arrival of energy via multipaths to predict CPDI occurrence, and we show how deeper deployments are particularly susceptible. A series of experiments were designed to develop and validate our model. Deep (300 m) and shallow (25 m) ranging experiments were conducted using Vemco V13 acoustic tags and VR2-W receivers. Probabilistic modeling of hourly detections was used to estimate the average distance a tag could be detected. A mechanistic model for predicting the arrival time of multipaths was developed using parameters from these experiments to calculate the direct and multipath path lengths. This model was retroactively applied to the previous ranging experiments to validate CPDI observations. Two additional experiments were designed to validate predictions of CPDI with respect to combinations of deployment depth and distance. Playback of recorded tags in a tank environment was used to confirm multipaths arriving after the receiver's blanking interval cause CPDI effects. Analysis of empirical data estimated the average maximum detection radius (AMDR), the farthest distance at which 95% of tag transmissions went undetected by receivers, was between 840 and 846 m for the deep ranging experiment across all factor permutations. From these results, CPDI was estimated within a 276.5 m radius of the receiver. These empirical estimations were consistent with mechanistic model predictions. CPDI affected detection at distances closer than 259-326 m from receivers. AMDR determined from the shallow ranging experiment was between 278 and 290 m with CPDI neither predicted nor observed. Results of validation experiments were consistent with mechanistic model predictions. Finally, we were able to predict detection/nondetection with 95.7% accuracy using the mechanistic model's criterion when simulating transmissions with and without multipaths. Close proximity detection interference results from combinations of depth and distance that produce reflected signals arriving after a receiver's blanking interval has ended. Deployment scenarios resulting in CPDI can be predicted with the proposed mechanistic model. For deeper deployments, sea-surface reflections can produce CPDI conditions, resulting in transmission rejection, regardless of the reflective properties of the seafloor.
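The mechanistic criterion described above reduces to simple image-source ray geometry: compare the extra travel time of the sea-surface reflection with the receiver's blanking interval. The blanking-interval value below is an illustrative assumption rather than a manufacturer specification.

```python
import math

def multipath_delay_s(horiz_range_m, tag_depth_m, rx_depth_m, c=1500.0):
    """Extra arrival delay of the sea-surface reflection relative to the
    direct path (image-source geometry), in seconds."""
    direct = math.hypot(horiz_range_m, tag_depth_m - rx_depth_m)
    surface_reflected = math.hypot(horiz_range_m, tag_depth_m + rx_depth_m)
    return (surface_reflected - direct) / c

def cpdi_predicted(horiz_range_m, tag_depth_m, rx_depth_m, blanking_interval_s=0.26):
    """CPDI is predicted when the multipath arrives after the blanking interval."""
    return multipath_delay_s(horiz_range_m, tag_depth_m, rx_depth_m) > blanking_interval_s

# Deep deployment: large extra path length at short range -> CPDI predicted
print(cpdi_predicted(horiz_range_m=100, tag_depth_m=290, rx_depth_m=295))
# Shallow deployment: the reflection arrives almost with the direct signal
print(cpdi_predicted(horiz_range_m=100, tag_depth_m=20, rx_depth_m=22))
```

The deep example reproduces the qualitative finding above: at short horizontal range and large depth the reflected path is several hundred metres longer than the direct path, so its delay exceeds the blanking interval and CPDI is predicted, whereas in the shallow case it is not.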
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Zawadzki, S.A.
The primary physical/chemical models that form the basis of the FASTGRASS mechanistic computer model for calculating fission-product release from nuclear fuel are described. Calculated results are compared with test data and the major mechanisms affecting the transport of fission products during steady-state and accident conditions are identified.
Mechanistic model for catalytic recombination during aerobraking maneuvers
NASA Technical Reports Server (NTRS)
Willey, Ronald J.
1989-01-01
Several mechanistic models are developed to predict recombination coefficients for use in heat shield design for reusable surface insulation (RSI) on aerobraking vehicles such as space shuttles. The models are applied over a temperature range of 300 to 1800 K and a stagnation pressure range of 0 to 3,000 Pa. A four parameter model in temperature was found to work best; however, several models (including those with atom concentrations at the surface) were also investigated. Mechanistic models developed with atom concentration terms may be applicable when sufficient data becomes available. The requirement is shown for recombination experiments in the 300 to 1000 K and 1500 to 1850 K temperature range, with deliberate concentration variations.
Pathak, Shriram M; Ruff, Aaron; Kostewicz, Edmund S; Patel, Nikunjkumar; Turner, David B; Jamei, Masoud
2017-12-04
Mechanistic modeling of in vitro data generated from metabolic enzyme systems (viz., liver microsomes, hepatocytes, rCYP enzymes, etc.) facilitates in vitro-in vivo extrapolation (IVIV_E) of metabolic clearance, which plays a key role in the successful prediction of clearance in vivo within physiologically-based pharmacokinetic (PBPK) modeling. A similar concept can be applied to solubility and dissolution experiments whereby mechanistic modeling can be used to estimate intrinsic parameters required for mechanistic oral absorption simulation in vivo. However, this approach has not widely been applied within an integrated workflow. We present a stepwise modeling approach where relevant biopharmaceutics parameters for ketoconazole (KTZ) are determined and/or confirmed from the modeling of in vitro experiments before being directly used within a PBPK model. Modeling was applied to various in vitro experiments, namely: (a) aqueous solubility profiles to determine intrinsic solubility, salt limiting solubility factors and to verify pKa; (b) biorelevant solubility measurements to estimate bile-micelle partition coefficients; (c) fasted state simulated gastric fluid (FaSSGF) dissolution for formulation disintegration profiling; and (d) transfer experiments to estimate supersaturation and precipitation parameters. These parameters were then used within a PBPK model to predict the dissolved and total (i.e., including the precipitated fraction) concentrations of KTZ in the duodenum of a virtual population and compared against observed clinical data. The developed model well characterized the intraluminal dissolution, supersaturation, and precipitation behavior of KTZ. The mean simulated AUC0-t of the total and dissolved concentrations of KTZ were comparable to (within 2-fold of) the corresponding observed profile. Moreover, the developed PBPK model of KTZ successfully described the impact of supersaturation and precipitation on the systemic plasma concentration profiles of KTZ for 200, 300, and 400 mg doses. These results demonstrate that IVIV_E applied to biopharmaceutical experiments can be used to understand and build confidence in the quality of the input parameters and mechanistic models used for mechanistic oral absorption simulations in vivo, thereby improving the prediction performance of PBPK models. Moreover, this approach can inform the selection and design of in vitro experiments, potentially eliminating redundant experiments and thus helping to reduce the cost and time of drug product development.
Mechanistic species distribution modelling as a link between physiology and conservation.
Evans, Tyler G; Diamond, Sarah E; Kelly, Morgan W
2015-01-01
Climate change conservation planning relies heavily on correlative species distribution models that estimate future areas of occupancy based on environmental conditions encountered in present-day ranges. The approach benefits from rapid assessment of vulnerability over a large number of organisms, but can have poor predictive power when transposed to novel environments and reveals little in the way of causal mechanisms that define changes in species distribution or abundance. Having conservation planning rely largely on this single approach also increases the risk of policy failure. Mechanistic models that are parameterized with physiological information are expected to be more robust when extrapolating distributions to future environmental conditions and can identify physiological processes that set range boundaries. Implementation of mechanistic species distribution models requires knowledge of how environmental change influences physiological performance, and because this information is currently restricted to a comparatively small number of well-studied organisms, use of mechanistic modelling in the context of climate change conservation is limited. In this review, we propose that the need to develop mechanistic models that incorporate physiological data presents an opportunity for physiologists to contribute more directly to climate change conservation and advance the field of conservation physiology. We begin by describing the prevalence of species distribution modelling in climate change conservation, highlighting the benefits and drawbacks of both mechanistic and correlative approaches. Next, we emphasize the need to expand mechanistic models and discuss potential metrics of physiological performance suitable for integration into mechanistic models. We conclude by summarizing other factors, such as the need to consider demography, limiting broader application of mechanistic models in climate change conservation. Ideally, modellers, physiologists and conservation practitioners would work collaboratively to build models, interpret results and consider conservation management options, and articulating this need here may help to stimulate collaboration.
Predicting colloid transport through saturated porous media: A critical review
NASA Astrophysics Data System (ADS)
Molnar, Ian L.; Johnson, William P.; Gerhard, Jason I.; Willson, Clinton S.; O'Carroll, Denis M.
2015-09-01
Understanding and predicting colloid transport and retention in water-saturated porous media is important for the protection of human and ecological health. Early applications of colloid transport research before the 1990s included the removal of pathogens in granular drinking water filters. Since then, interest has expanded significantly to include such areas as source zone protection of drinking water systems and injection of nanometals for contaminated site remediation. This review summarizes predictive tools for colloid transport from the pore to field scales. First, we review experimental breakthrough and retention of colloids under favorable and unfavorable colloid/collector interactions (i.e., no significant and significant colloid-surface repulsion, respectively). Second, we review the continuum-scale modeling strategies used to describe observed transport behavior. Third, we review the following two components of colloid filtration theory: (i) mechanistic force/torque balance models of pore-scale colloid trajectories and (ii) approximating correlation equations used to predict colloid retention. The successes and limitations of these approaches for favorable conditions are summarized, as are recent developments to predict colloid retention under the unfavorable conditions particularly relevant to environmental applications. Fourth, we summarize the influences of physical and chemical heterogeneities on colloid transport and avenues for their prediction. Fifth, we review the upscaling of mechanistic model results to rate constants for use in continuum models of colloid behavior at the column and field scales. Overall, this paper clarifies the foundation for existing knowledge of colloid transport and retention, features recent advances in the field, critically assesses where existing approaches are successful and the limits of their application, and highlights outstanding challenges and future research opportunities. These challenges and opportunities include improving mechanistic descriptions, and subsequent correlation equations, for nanoparticle (i.e., Brownian particle) transport through soil, developing mechanistic descriptions of colloid retention in so-called "unfavorable" conditions via methods such as the "discrete heterogeneity" approach, and employing imaging techniques such as X-ray tomography to develop realistic expressions for grain topology and mineral distribution that can aid the development of these mechanistic approaches.
MATHEMATICAL MODEL OF STEROIDOGENESIS TO PREDICT DYNAMIC RESPONSE TO ENDOCRINE DISRUPTORS
WE ARE DEVELOPING A MECHANISTIC MATHEMATICAL MODEL OF THE INTRATESTICULAR AND INTRAOVARIAN METABOLIC NETWORK THAT MEDIATES STEROID SYNTHESIS, AND OF THE KINETICS OF ENZYME INHIBITION BY EDCs, TO PREDICT THE TIME- AND DOSE-RESPONSE.
(Q)SARs to predict environmental toxicities: current status and future needs.
Cronin, Mark T D
2017-03-22
The current state of the art of (Quantitative) Structure-Activity Relationships ((Q)SARs) to predict environmental toxicity is assessed along with recommendations to develop these models further. The acute toxicity of compounds acting by the non-polar narcotic mechanism of action can be well predicted; however, other approaches, including read-across, may be required for compounds acting by specific mechanisms of action. The chronic toxicity of compounds to environmental species is more difficult to predict from (Q)SARs, with robust data sets and more mechanistic information required. In addition, the toxicity of mixtures is little addressed by (Q)SAR approaches. Developments in environmental toxicology, including Adverse Outcome Pathways (AOPs) and omics responses, should be utilised to develop better, more mechanistically relevant (Q)SAR models.
Modeling of Mn/Road test sections with the CRREL mechanistic pavement design procedure
DOT National Transportation Integrated Search
1996-09-01
The U.S. Army Cold Regions Research and Engineering Laboratory is developing a mechanistic pavement design procedure for use in seasonal frost areas. The procedure was used to predict pavement performance of some test sections under construction at t...
Mechanistic materials modeling for nuclear fuel performance
Tonks, Michael R.; Andersson, David; Phillpot, Simon R.; ...
2017-03-15
Fuel performance codes are critical tools for the design, certification, and safety analysis of nuclear reactors. However, their ability to predict fuel behavior under abnormal conditions is severely limited by their considerable reliance on empirical materials models correlated to burn-up (a measure of the number of fission events that have occurred, but not a unique measure of the history of the material). In this paper, we propose a different paradigm for fuel performance codes to employ mechanistic materials models that are based on the current state of the evolving microstructure rather than burn-up. In this approach, a series of state variables are stored at material points and define the current state of the microstructure. The evolution of these state variables is defined by mechanistic models that are functions of fuel conditions and other state variables. The material properties of the fuel and cladding are determined from microstructure/property relationships that are functions of the state variables and the current fuel conditions. Multiscale modeling and simulation is being used in conjunction with experimental data to inform the development of these models. Finally, this mechanistic, microstructure-based approach has the potential to provide a more predictive fuel performance capability, but will require a team of researchers to complete the required development and to validate the approach.
Biomechanics meets the ecological niche: the importance of temporal data resolution.
Kearney, Michael R; Matzelle, Allison; Helmuth, Brian
2012-03-15
The emerging field of mechanistic niche modelling aims to link the functional traits of organisms to their environments to predict survival, reproduction, distribution and abundance. This approach has great potential to increase our understanding of the impacts of environmental change on individuals, populations and communities by providing functional connections between physiological and ecological response to increasingly available spatial environmental data. By their nature, such mechanistic models are more data intensive in comparison with the more widely applied correlative approaches but can potentially provide more spatially and temporally explicit predictions, which are often needed by decision makers. A poorly explored issue in this context is the appropriate level of temporal resolution of input data required for these models, and specifically the error in predictions that can be incurred through the use of temporally averaged data. Here, we review how biomechanical principles from heat-transfer and metabolic theory are currently being used as foundations for mechanistic niche models and consider the consequences of different temporal resolutions of environmental data for modelling the niche of a behaviourally thermoregulating terrestrial lizard. We show that fine-scale temporal resolution (daily) data can be crucial for unbiased inference of climatic impacts on survival, growth and reproduction. This is especially so for species with little capacity for behavioural buffering, because of behavioural or habitat constraints, and for detecting temporal trends. However, coarser-resolution data (long-term monthly averages) can be appropriate for mechanistic studies of climatic constraints on distribution and abundance limits in thermoregulating species at broad spatial scales.
A semi-mechanistic model of dead fine fuel moisture for Temperate and Mediterranean ecosystems
NASA Astrophysics Data System (ADS)
Resco de Dios, Víctor; Fellows, Aaron; Boer, Matthias; Bradstock, Ross; Nolan, Rachel; Goulden, Michel
2014-05-01
Fire is a major disturbance in terrestrial ecosystems globally. It has an enormous economic and social cost, and leads to fatalities in the worst cases. The moisture content of the vegetation (fuel moisture) is one of the main determinants of fire risk. Predicting the moisture content of dead fine fuel (< 2.5 cm in diameter) is particularly important, as this is often the most important component of the fuel complex for fire propagation. A variety of drought indices, empirical and mechanistic models have been proposed to model fuel moisture. A commonality across these different approaches is that they have been neither validated across large temporal datasets nor validated across broadly different vegetation types. Here, we present the results of a study performed at 6 locations in California, USA (5 sites) and New South Wales, Australia (1 site), where 10-hour fuel moisture content was continuously measured every 30 minutes during one full year at each site. We observed that drought indices did not accurately predict fuel moisture, and that empirical and mechanistic models both needed site-specific calibrations, which hinders their global application as indices of fuel moisture. We developed a novel, single-equation, semi-mechanistic model based on atmospheric vapor-pressure deficit. Across sites and years, the mean absolute error (MAE) of predicted fuel moisture was 4.7%. MAE dropped below 1% in the critical range of fuel moisture (<10%). The high simplicity, accuracy and precision of our model make it suitable for a wide range of applications: from operational purposes to global vegetation models.
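A hedged sketch of the kind of single-equation, vapour-pressure-deficit-driven model described above is shown below. Only the exponential functional form is implied by the abstract; the coefficients here are illustrative placeholders rather than the published calibration.

```python
import numpy as np

def dead_fine_fuel_moisture(vpd_kpa, fm_dry=5.0, amplitude=53.0, decay=0.6):
    """Semi-mechanistic dead fine fuel moisture (%) as an exponential decline
    with atmospheric vapour pressure deficit (kPa). Coefficients are
    illustrative assumptions, not the fitted values from the study."""
    return fm_dry + amplitude * np.exp(-decay * np.asarray(vpd_kpa))

print(dead_fine_fuel_moisture([0.5, 1.0, 2.0, 4.0]))
```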
Ecological Forecasting in Chesapeake Bay: Using a Mechanistic-Empirical Modelling Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C. W.; Hood, Raleigh R.; Long, Wen
The Chesapeake Bay Ecological Prediction System (CBEPS) automatically generates daily nowcasts and three-day forecasts of several environmental variables, such as sea-surface temperature and salinity, the concentrations of chlorophyll, nitrate, and dissolved oxygen, and the likelihood of encountering several noxious species, including harmful algal blooms and water-borne pathogens, for the purpose of monitoring the Bay's ecosystem. While the physical and biogeochemical variables are forecast mechanistically using the Regional Ocean Modeling System configured for the Chesapeake Bay, the species predictions are generated using a novel mechanistic empirical approach, whereby real-time output from the coupled physical biogeochemical model drives multivariate empirical habitat models of the target species. The predictions, in the form of digital images, are available via the World Wide Web to interested groups to guide recreational, management, and research activities. Though full validation of the integrated forecasts for all species is still a work in progress, we argue that the mechanistic–empirical approach can be used to generate a wide variety of short-term ecological forecasts, and that it can be applied in any marine system where sufficient data exist to develop empirical habitat models. This paper provides an overview of this system, its predictions, and the approach taken.
Learning to predict chemical reactions.
Kayala, Matthew A; Azencott, Chloé-Agathe; Chen, Jonathan H; Baldi, Pierre
2011-09-26
Being able to predict the course of arbitrary chemical reactions is essential to the theory and applications of organic chemistry. Approaches to the reaction prediction problems can be organized around three poles corresponding to: (1) physical laws; (2) rule-based expert systems; and (3) inductive machine learning. Previous approaches at these poles, respectively, are not high throughput, are not generalizable or scalable, and lack sufficient data and structure to be implemented. We propose a new approach to reaction prediction utilizing elements from each pole. Using a physically inspired conceptualization, we describe single mechanistic reactions as interactions between coarse approximations of molecular orbitals (MOs) and use topological and physicochemical attributes as descriptors. Using an existing rule-based system (Reaction Explorer), we derive a restricted chemistry data set consisting of 1630 full multistep reactions with 2358 distinct starting materials and intermediates, associated with 2989 productive mechanistic steps and 6.14 million unproductive mechanistic steps. And from machine learning, we pose identifying productive mechanistic steps as a statistical ranking, information retrieval problem: given a set of reactants and a description of conditions, learn a ranking model over potential filled-to-unfilled MO interactions such that the top-ranked mechanistic steps yield the major products. The machine learning implementation follows a two-stage approach, in which we first train atom level reactivity filters to prune 94.00% of nonproductive reactions with a 0.01% error rate. Then, we train an ensemble of ranking models on pairs of interacting MOs to learn a relative productivity function over mechanistic steps in a given system. Without the use of explicit transformation patterns, the ensemble perfectly ranks the productive mechanism at the top 89.05% of the time, rising to 99.86% of the time when the top four are considered. Furthermore, the system is generalizable, making reasonable predictions over reactants and conditions which the rule-based expert does not handle. A web interface to the machine learning based mechanistic reaction predictor is accessible through our chemoinformatics portal ( http://cdb.ics.uci.edu) under the Toolkits section.
A reactive transport model was developed to simultaneously predict Cryptosporidium parvum oocyst inactivation and bromate formation during ozonation of natural water. A mechanistic model previously established to predict bromate formation in organic-free synthetic waters w...
Dudley, Peter N; Bonazza, Riccardo; Jones, T Todd; Wyneken, Jeanette; Porter, Warren P
2014-01-01
As global temperatures increase throughout the coming decades, species ranges will shift. New combinations of abiotic conditions will make predicting these range shifts difficult. Biophysical mechanistic niche modeling places bounds on an animal's niche through analyzing the animal's physical interactions with the environment. Biophysical mechanistic niche modeling is flexible enough to accommodate these new combinations of abiotic conditions. However, this approach is difficult to implement for aquatic species because of complex interactions among thrust, metabolic rate and heat transfer. We use contemporary computational fluid dynamic techniques to overcome these difficulties. We model the complex 3D motion of a swimming neonate and juvenile leatherback sea turtle to find power and heat transfer rates during the stroke. We combine the results from these simulations and a numerical model to accurately predict the core temperature of a swimming leatherback. These results are the first steps in developing a highly accurate mechanistic niche model, which can assist paleontologists in understanding biogeographic shifts as well as inform contemporary species managers about potential range shifts over the coming decades.
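The core-temperature prediction step described above can be caricatured as a lumped heat balance. The sketch below is a generic single-compartment approximation, not the authors' CFD-coupled model, and every parameter value is an illustrative assumption.

```python
def core_temperature_step(t_core, t_water, q_met_w, h_w_m2k, area_m2, mass_kg,
                          c_p=3500.0, dt_s=60.0):
    """One explicit Euler step of a lumped heat balance for a swimming animal:
    dT/dt = (metabolic heat - convective loss) / (mass * specific heat).
    c_p (J kg-1 K-1) and all inputs are illustrative assumptions."""
    dTdt = (q_met_w - h_w_m2k * area_m2 * (t_core - t_water)) / (mass_kg * c_p)
    return t_core + dTdt * dt_s

# e.g. a hypothetical 10 kg juvenile producing 15 W of metabolic heat
print(core_temperature_step(t_core=25.0, t_water=22.0, q_met_w=15.0,
                            h_w_m2k=50.0, area_m2=0.25, mass_kg=10.0))
```

In the study itself, the heat production and heat transfer coefficients come from the swimming simulations rather than fixed constants; the step above only shows how such rates combine into a core-temperature prediction.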
NASA Astrophysics Data System (ADS)
Abdul-Aziz, O. I.; Ishtiaq, K. S.
2015-12-01
We present a user-friendly modeling tool on MS Excel to predict the greenhouse gas (GHG) fluxes and estimate potential carbon sequestration from the coastal wetlands. The dominant controls of wetland GHG fluxes and their relative mechanistic linkages with various hydro-climatic, sea level, biogeochemical and ecological drivers were first determined by employing a systematic data-analytics method, including Pearson correlation matrix, principal component and factor analyses, and exploratory partial least squares regressions. The mechanistic knowledge and understanding were then utilized to develop parsimonious non-linear (power-law) models to predict wetland carbon dioxide (CO2) and methane (CH4) fluxes based on a sub-set of climatic, hydrologic and environmental drivers such as the photosynthetically active radiation, soil temperature, water depth, and soil salinity. The models were tested with field data for multiple sites and seasons (2012-13) collected from the Waquoit Bay, MA. The model estimated the annual wetland carbon storage by up-scaling the instantaneous predicted fluxes to an extended growing season (e.g., May-October) and by accounting for the net annual lateral carbon fluxes between the wetlands and estuary. The Excel Spreadsheet model is a simple ecological engineering tool for coastal carbon management and the incorporation of coastal wetlands into a potential carbon market under a changing climate, sea level and environment. Specifically, the model can help to determine appropriate GHG offset protocols and monitoring plans for projects that focus on tidal wetland restoration and maintenance.
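The parsimonious power-law flux models referred to above can be sketched as a log-linear least-squares fit. The data below are hypothetical placeholders (not Waquoit Bay measurements), and only the general functional form is implied by the abstract.

```python
import numpy as np

# Hypothetical observations (placeholders, not Waquoit Bay data)
par = np.array([200.0, 400.0, 800.0, 1200.0, 1600.0])   # PAR
temp = np.array([12.0, 15.0, 18.0, 22.0, 25.0])         # soil temperature, deg C
flux = np.array([1.1, 1.9, 3.2, 4.8, 6.1])              # CO2 flux

# Fit flux = k * PAR^a * T^b by ordinary least squares in log space
X = np.column_stack([np.ones_like(par), np.log(par), np.log(temp)])
coef, *_ = np.linalg.lstsq(X, np.log(flux), rcond=None)
k, a, b = np.exp(coef[0]), coef[1], coef[2]
print(f"flux = {k:.3g} * PAR^{a:.2f} * T^{b:.2f}")
```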
NASA Astrophysics Data System (ADS)
Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.
2003-03-01
A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
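One of the mechanistically described processes mentioned above, nutrient uptake by planktonic autotrophs, has a standard diffusion-limited closed form for a spherical cell. The sketch below uses that textbook expression with illustrative values; it is not taken from the model code itself.

```python
import numpy as np

def diffusion_limited_uptake(radius_m, diffusivity_m2_s, conc_mol_m3):
    """Maximum (diffusion-limited) nutrient uptake by a spherical cell,
    4*pi*D*r*C, in mol per cell per second."""
    return 4.0 * np.pi * diffusivity_m2_s * radius_m * conc_mol_m3

# e.g. a 5-micron-radius cell in 1 mmol m-3 nitrate (illustrative values)
print(diffusion_limited_uptake(5e-6, 1.5e-9, 1.0e-3))
```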
Kayala, Matthew A; Baldi, Pierre
2012-10-22
Proposing reasonable mechanisms and predicting the course of chemical reactions is important to the practice of organic chemistry. Approaches to reaction prediction have historically used obfuscating representations and manually encoded patterns or rules. Here we present ReactionPredictor, a machine learning approach to reaction prediction that models elementary, mechanistic reactions as interactions between approximate molecular orbitals (MOs). A training data set of productive reactions known to occur at reasonable rates and yields and verified by inclusion in the literature or textbooks is derived from an existing rule-based system and expanded upon with manual curation from graduate level textbooks. Using this training data set of complex polar, hypervalent, radical, and pericyclic reactions, a two-stage machine learning prediction framework is trained and validated. In the first stage, filtering models trained at the level of individual MOs are used to reduce the space of possible reactions to consider. In the second stage, ranking models over the filtered space of possible reactions are used to order the reactions such that the productive reactions are the top ranked. The resulting model, ReactionPredictor, perfectly ranks polar reactions 78.1% of the time and recovers all productive reactions 95.7% of the time when allowing for small numbers of errors. Pericyclic and radical reactions are perfectly ranked 85.8% and 77.0% of the time, respectively, rising to >93% recovery for both reaction types with a small number of allowed errors. Decisions about which of the polar, pericyclic, or radical reaction type ranking models to use can be made with >99% accuracy. Finally, for multistep reaction pathways, we implement the first mechanistic pathway predictor using constrained tree-search to discover a set of reasonable mechanistic steps from given reactants to given products. Webserver implementations of both the single step and pathway versions of ReactionPredictor are available via the chemoinformatics portal http://cdb.ics.uci.edu/.
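The two-stage filter-then-rank strategy described in the reaction-prediction abstracts above can be sketched generically with scikit-learn. Everything below (features, labels, thresholds) is synthetic and hypothetical; the sketch illustrates the shape of the pipeline, not the ReactionPredictor implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic descriptors for candidate mechanistic steps in one reaction system
X = rng.normal(size=(200, 8))
productive = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=200)) > 1.0

# Stage 1: reactivity filter prunes clearly unproductive candidates
filter_clf = LogisticRegression(max_iter=1000).fit(X, productive)
keep = filter_clf.predict_proba(X)[:, 1] > 0.05

# Stage 2: pairwise ranking (RankSVM-style): learn w such that
# w . (x_productive - x_unproductive) > 0 for the surviving candidates
pos, neg = X[keep & productive], X[keep & ~productive]
pairs = np.array([p - n for p in pos for n in neg])
rank_clf = LogisticRegression(max_iter=1000, fit_intercept=False).fit(
    np.vstack([pairs, -pairs]),
    np.hstack([np.ones(len(pairs)), np.zeros(len(pairs))]),
)
scores = X[keep] @ rank_clf.coef_.ravel()   # higher score = ranked nearer the top
print(scores[:5])
```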
Crickenberger, Sam; Wethey, David S
2018-05-10
Range shifts due to annual variation in temperature are more tractable than range shifts linked to decadal- to century-long temperature changes due to climate change, providing natural experiments to determine the mechanisms responsible for driving long-term distributional shifts. In this study we couple physiologically grounded mechanistic models with biogeographic surveys in 2 years with high levels of annual temperature variation to disentangle the drivers of a historical range shift driven by climate change. The distribution of the barnacle Semibalanus balanoides has shifted 350 km poleward in the past half century along the east coast of the United States. Recruits were present throughout the historical range following the 2015 reproductive season, when temperatures were similar to those in the past century, and absent following the 2016 reproductive season, when temperatures were warmer than they have been since 1870, the earliest date for temperature records. Our dispersal-dependent mechanistic models of reproductive success were highly accurate and predicted patterns of reproductive success documented in field surveys throughout the historical range in 2015 and 2016. Our mechanistic models of reproductive success not only predicted recruitment dynamics near the range edge but also predicted interior range fragmentation in a number of years between 1870 and 2016. All recruits monitored within the historical range following the 2015 colonization died before 2016, suggesting juvenile survival was likely the primary driver of the historical range retraction. However, if 2016 is indicative of future temperatures, mechanisms of range limitation will shift and reproductive failure will lead to further range retraction in the future. Mechanistic models are necessary for accurately predicting the effects of climate change on ranges of species. © 2018 John Wiley & Sons Ltd.
Technical note: Bayesian calibration of dynamic ruminant nutrition models.
Reed, K F; Arhonditsis, G B; France, J; Kebreab, E
2016-08-01
Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
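A minimal sketch of Bayesian calibration and posterior prediction, in the spirit of the approach described above, is shown below. It uses a deliberately toy saturating-response model rather than a ruminant nutrition model, and all observations, priors, and tuning values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(theta, x):
    """Toy 'mechanistic' response curve (illustrative saturating model)."""
    vmax, km = theta
    return vmax * x / (km + x)

# Hypothetical observations of a response variable versus intake level x
x_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
y_obs = np.array([0.9, 1.5, 2.2, 2.8, 3.1])
sigma = 0.2                       # assumed residual standard deviation

def log_post(theta):
    if np.any(theta <= 0):
        return -np.inf            # flat priors restricted to positive values
    resid = y_obs - model(theta, x_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampling of the posterior
theta, samples = np.array([3.0, 1.0]), []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(scale=0.1, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5000:])          # discard burn-in

# Posterior predictive distribution of the response at a new intake level
y_pred = model(samples.T, 3.0) + rng.normal(scale=sigma, size=len(samples))
print(np.percentile(y_pred, [2.5, 50, 97.5]))
```

The final line is the point of the method: the prediction is reported as an interval reflecting both parameter uncertainty and residual variability, rather than a single deterministic value.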
Rougier, Thibaud; Lassalle, Géraldine; Drouineau, Hilaire; Dumoulin, Nicolas; Faure, Thierry; Deffuant, Guillaume; Rochard, Eric; Lambert, Patrick
2015-01-01
Species can respond to climate change by tracking appropriate environmental conditions in space, resulting in a range shift. Species Distribution Models (SDMs) can help forecast such range shift responses. Few species have had both correlative and mechanistic SDMs built; allis shad (Alosa alosa), an endangered anadromous fish species, is one of them. The main purpose of this study was to provide a framework for joint analyses of correlative and mechanistic SDM projections in order to strengthen conservation measures for species of conservation concern. Guidelines for joint representation and subsequent interpretation of model outputs were defined and applied. The present joint analysis was based on the novel mechanistic model GR3D (Global Repositioning Dynamics of Diadromous fish Distribution), which was parameterized for allis shad and then used to predict its future distribution along the European Atlantic coast under different climate change scenarios (RCP 4.5 and RCP 8.5). We then used a correlative SDM for this species to forecast its distribution across the same geographic area and under the same climate change scenarios. First, projections from correlative and mechanistic models provided congruent trends in probability of habitat suitability and population dynamics. This agreement was preferentially interpreted as referring to the species' vulnerability to climate change. Climate change could not accordingly be listed as a major threat for allis shad. The congruence in predicted range limits between SDM projections was the next point of interest. The difference, when noticed, required us to deepen our understanding of the niche modelled by each approach. In this respect, the relative position of the northern range limit between the two methods strongly suggested here that a key biological process related to intraspecific variability was potentially lacking in the mechanistic SDM. Based on our knowledge, we hypothesized that local adaptations to cold temperatures deserve more attention not only in terms of modelling but also in conservation planning.
DOT National Transportation Integrated Search
2007-08-01
The objective of this research study was to develop performance characteristics or variables (e.g., ride quality, rutting, : fatigue cracking, transverse cracking) of flexible pavements in Montana, and to use these characteristics in the : implementa...
Monte Carlo modeling of atomic oxygen attack of polymers with protective coatings on LDEF
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Degroh, Kim K.; Sechkar, Edward A.
1992-01-01
Characterization of the behavior of atomic oxygen interaction with materials on the Long Duration Exposure Facility (LDEF) will assist in understanding the mechanisms involved, and will lead to improved reliability in predicting in-space durability of materials based on ground laboratory testing. A computational simulation of atomic oxygen interaction with protected polymers was developed using Monte Carlo techniques. Through the use of assumed mechanistic behavior of atomic oxygen and results of both ground laboratory and LDEF data, a predictive Monte Carlo model was developed which simulates the oxidation processes that occur on polymers with applied protective coatings that have defects. The use of high atomic oxygen fluence-directed ram LDEF results has enabled mechanistic implications to be made by adjusting Monte Carlo modeling assumptions to match observed results based on scanning electron microscopy. Modeling assumptions, implications, and predictions are presented, along with comparison of observed ground laboratory and LDEF results.
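The kind of Monte Carlo simulation described above can be caricatured as a 2D random-walk erosion model of a coated polymer with a single coating defect. The geometry, reaction probability, and scattering rules below are assumptions made for illustration; they are not the parameters or mechanistic rules of the NASA model.

```python
import numpy as np

rng = np.random.default_rng(2)

# 2D cross-section of a coated polymer: rows = depth, cols = lateral position.
# True = polymer present. The protective coating (above row 0, not stored)
# is intact everywhere except for a single defect at column `defect`.
width, depth, defect = 41, 30, 20
polymer = np.ones((depth, width), dtype=bool)
moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
react_prob = 0.1            # assumed reaction probability per polymer encounter

def attack(max_steps=300):
    """One atomic oxygen atom enters at the defect and random-walks within the
    eroded cavity until it reacts, escapes back out, or exceeds max_steps."""
    r, c = 0, defect
    if polymer[r, c]:                       # first impact at the defect mouth
        polymer[r, c] = rng.uniform() >= react_prob
        return
    for _ in range(max_steps):
        dr, dc = moves[rng.integers(4)]
        nr, nc = r + dr, c + dc
        if nr < 0:
            if nc == defect:                # escapes through the defect
                return
            continue                        # otherwise blocked by the coating
        if not (0 <= nc < width and nr < depth):
            continue                        # treat the domain edge as a wall
        if polymer[nr, nc]:                 # polymer surface: react or scatter
            if rng.uniform() < react_prob:
                polymer[nr, nc] = False
            continue
        r, c = nr, nc                       # free (eroded) cell: move into it

for _ in range(5000):
    attack()
print("eroded cells (undercut cavity size):", int((~polymer).sum()))
```

Adjusting assumptions such as the reaction probability or scattering rule until the simulated undercut cavity resembles observed erosion profiles is the general idea behind calibrating such a model against LDEF and ground laboratory data.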
Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.
O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao
2017-07-01
Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.
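As a worked illustration of deriving a MaxEnt prediction from a single state-variable constraint (here, mean abundance), the sketch below solves numerically for the Lagrange multiplier of the resulting exponential-form distribution. The state-variable values are arbitrary examples, not data from the study.

```python
import numpy as np
from scipy.optimize import brentq

# MaxEnt distribution over species abundances n = 1..N subject to a fixed
# mean abundance (one "state variable"): P(n) proportional to exp(-lam * n).
N, mean_abundance = 1000, 20.0       # illustrative state-variable values
n = np.arange(1, N + 1)

def mean_given_lam(lam):
    w = np.exp(-lam * n)
    return np.sum(n * w) / np.sum(w)

lam = brentq(lambda l: mean_given_lam(l) - mean_abundance, 1e-6, 5.0)
p = np.exp(-lam * n)
p /= p.sum()
print(f"lambda = {lam:.4f}, check mean = {np.sum(n * p):.2f}")
```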
Rotary ultrasonic machining of CFRP: a mechanistic predictive model for cutting force.
Cong, W L; Pei, Z J; Sun, X; Zhang, C L
2014-02-01
Cutting force is one of the most important output variables in rotary ultrasonic machining (RUM) of carbon fiber reinforced plastic (CFRP) composites. Many experimental investigations on cutting force in RUM of CFRP have been reported. However, in the literature, there are no cutting force models for RUM of CFRP. This paper develops a mechanistic predictive model for cutting force in RUM of CFRP. The material removal mechanism of CFRP in RUM has been analyzed first. The model is based on the assumption that brittle fracture is the dominant mode of material removal. CFRP micromechanical analysis has been conducted to represent CFRP as an equivalent homogeneous material to obtain the mechanical properties of CFRP from its components. Based on this model, relationships between input variables (including ultrasonic vibration amplitude, tool rotation speed, feedrate, abrasive size, and abrasive concentration) and cutting force can be predicted. The relationships between input variables and important intermediate variables (indentation depth, effective contact time, and maximum impact force of single abrasive grain) have been investigated to explain predicted trends of cutting force. Experiments are conducted to verify the model, and experimental results agree well with predicted trends from this model. Copyright © 2013 Elsevier B.V. All rights reserved.
Specialists without spirit: limitations of the mechanistic biomedical model.
Hewa, S; Hetherington, R W
1995-06-01
This paper examines the origin and the development of the mechanistic model of the human body and health in terms of Max Weber's theory of rationalization. It is argued that the development of Western scientific medicine is a part of the broad process of rationalization that began in sixteenth century Europe as a result of the Reformation. The development of the mechanistic view of the human body in Western medicine is consistent with the ideas of calculability, predictability, and control: the major tenets of the process of rationalization as described by Weber. In recent years, however, the limitations of the mechanistic model have been the topic of many discussions. George Engel, a leading advocate of general systems theory, is one of the leading proponents of a new medical model which includes the general quality of life, clean environment, and psychological or spiritual stability of life. The paper concludes with consideration of the potential of Engel's proposed new model in the context of the current state of rationalization in modern industrialized society.
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (...
An ecophysiological perspective on likely giant panda habitat responses to climate change.
Zhang, Yuke; Mathewson, Paul D; Zhang, Qiongyue; Porter, Warren P; Ran, Jianghong
2018-04-01
Threatened and endangered species are more vulnerable to climate change due to their small populations and specific geographical distributions. Therefore, identifying and incorporating the biological processes underlying a species' adaptation to its environment are important for determining whether they can persist in situ. Correlative models are widely used to predict species' distribution changes, but generally fail to capture the buffering capacity of organisms. Giant pandas (Ailuropoda melanoleuca) live in topographically complex mountains and are known to avoid heat stress. Although many studies have found that climate change will lead to severe habitat loss and threaten previous conservation efforts, the mechanisms underlying the panda's responses to climate change have not been explored. Here, we present a case study in the Daxiangling Mountains, one of the six mountain systems in which the giant panda occurs. We used a mechanistic model, Niche Mapper, to explore likely panda habitat responses to climate change, taking physiological, behavioral and ecological responses into account, through which we map the panda's climatically suitable activity area (SAA) for the first time. We combined SAA with bamboo forest distribution to yield highly suitable habitat (HSH) and seasonal suitable habitat (SSH), and their temporal dynamics under climate change were predicted. In general, SAA in the hottest month (July) would reduce by 11.7%-52.2% by 2070, which is more moderate than the predicted bamboo habitat loss (45.6%-86.9%). Limited by the availability of bamboo and forest, the panda's suitable habitat loss increases, and only 15.5%-68.8% of current HSH would remain in 2070. Our method of mechanistic modeling can help to distinguish whether habitat loss is caused by thermal environmental deterioration or food loss under climate change. Furthermore, mechanistic models can produce robust predictions by incorporating ecophysiological feedbacks and minimizing extrapolation into novel environments. We suggest that a mechanistic approach should be incorporated into distribution predictions and conservation planning. © 2017 John Wiley & Sons Ltd.
Kirk, Devin; Jones, Natalie; Peacock, Stephanie; Phillips, Jessica; Molnár, Péter K; Krkošek, Martin; Luijckx, Pepijn
2018-02-01
The complexity of host-parasite interactions makes it difficult to predict how host-parasite systems will respond to climate change. In particular, host and parasite traits such as survival and virulence may have distinct temperature dependencies that must be integrated into models of disease dynamics. Using experimental data from Daphnia magna and a microsporidian parasite, we fitted a mechanistic model of the within-host parasite population dynamics. Model parameters comprising host aging and mortality, as well as parasite growth, virulence, and equilibrium abundance, were specified by relationships arising from the metabolic theory of ecology. The model effectively predicts host survival, parasite growth, and the cost of infection across temperature while using less than half the parameters compared to modeling temperatures discretely. Our results serve as a proof of concept that linking simple metabolic models with a mechanistic host-parasite framework can be used to predict temperature responses of parasite population dynamics at the within-host level.
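The metabolic-theory temperature dependencies referred to above typically take an Arrhenius (Boltzmann) form. The sketch below shows that generic scaling with illustrative parameter values; these are not the fitted Daphnia-microsporidian parameters.

```python
import numpy as np

K = 8.617e-5          # Boltzmann constant, eV K^-1

def mte_rate(temp_c, r0, activation_energy=0.65, t_ref_c=20.0):
    """Metabolic-theory (Arrhenius) temperature scaling of a biological rate,
    normalised to a reference temperature. activation_energy is in eV."""
    t, t_ref = temp_c + 273.15, t_ref_c + 273.15
    return r0 * np.exp(-activation_energy / K * (1.0 / t - 1.0 / t_ref))

# e.g. parasite growth and host mortality could each get their own r0 and
# activation energy; the values here are illustrative only
temps = np.array([10.0, 15.0, 20.0, 25.0])
print(mte_rate(temps, r0=0.12, activation_energy=0.65))
```

Specifying each model parameter as a function like this, rather than estimating it separately at every temperature, is what allows the within-host model to cover the full temperature range with far fewer parameters.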
Mechanistic models versus machine learning, a fight worth fighting for the biological community?
Baker, Ruth E; Peña, Jose-Maria; Jayamohan, Jayaratnam; Jérusalem, Antoine
2018-05-01
Ninety per cent of the world's data have been generated in the last 5 years (Machine learning: the power and promise of computers that learn by example. Report no. DES4702, Royal Society, April 2017). A small fraction of these data is collected with the aim of validating specific hypotheses. These studies are led by the development of mechanistic models focused on the causality of input-output relationships. However, the vast majority is aimed at supporting statistical or correlation studies that bypass the need for causality and focus exclusively on prediction. Along these lines, there has been a vast increase in the use of machine learning models, in particular in the biomedical and clinical sciences, to try and keep pace with the rate of data generation. Recent successes now beg the question of whether mechanistic models are still relevant in this area. Said otherwise, why should we try to understand the mechanisms of disease progression when we can use machine learning tools to directly predict disease outcome? © 2018 The Author(s).
DOT National Transportation Integrated Search
2017-02-08
The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (DRTC)...
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (...
Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling
USDA-ARS?s Scientific Manuscript database
We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...
A method to identify and analyze biological programs through automated reasoning
Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen
2016-01-01
Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine static interaction network models, which are descriptively rich, but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, in such approaches implicit assumptions are introduced as typically only one mechanism is considered, and exhaustively investigating all scenarios is impractical using simulation. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations to precise, predictive biological programs governing cell function. PMID:27668090
Fjodorova, Natalja; Novič, Marjana
2012-01-01
The knowledge-based Toxtree expert system (SAR approach) was integrated with the statistically based counter propagation artificial neural network (CP ANN) model (QSAR approach) to contribute to a better mechanistic understanding of a carcinogenicity model for non-congeneric chemicals using Dragon descriptors and carcinogenic potency for rats as a response. The transparency of the CP ANN algorithm was demonstrated using intrinsic mapping technique specifically Kohonen maps. Chemical structures were represented by Dragon descriptors that express the structural and electronic features of molecules such as their shape and electronic surrounding related to reactivity of molecules. It was illustrated how the descriptors are correlated with particular structural alerts (SAs) for carcinogenicity with recognized mechanistic link to carcinogenic activity. Moreover, the Kohonen mapping technique enables one to examine the separation of carcinogens and non-carcinogens (for rats) within a family of chemicals with a particular SA for carcinogenicity. The mechanistic interpretation of models is important for the evaluation of safety of chemicals. PMID:24688639
Harfoot, Michael B J; Newbold, Tim; Tittensor, Derek P; Emmott, Stephen; Hutton, Jon; Lyutsarev, Vassily; Smith, Matthew J; Scharlemann, Jörn P W; Purves, Drew W
2014-04-01
Anthropogenic activities are causing widespread degradation of ecosystems worldwide, threatening the ecosystem services upon which all human life depends. Improved understanding of this degradation is urgently needed to improve avoidance and mitigation measures. One tool to assist these efforts is predictive models of ecosystem structure and function that are mechanistic: based on fundamental ecological principles. Here we present the first mechanistic General Ecosystem Model (GEM) of ecosystem structure and function that is both global and applies in all terrestrial and marine environments. Functional forms and parameter values were derived from the theoretical and empirical literature where possible. Simulations of the fate of all organisms with body masses between 10 µg and 150,000 kg (a range of 14 orders of magnitude) across the globe led to emergent properties at individual (e.g., growth rate), community (e.g., biomass turnover rates), ecosystem (e.g., trophic pyramids), and macroecological scales (e.g., global patterns of trophic structure) that are in general agreement with current data and theory. These properties emerged from our encoding of the biology of, and interactions among, individual organisms without any direct constraints on the properties themselves. Our results indicate that ecologists have gathered sufficient information to begin to build realistic, global, and mechanistic models of ecosystems, capable of predicting a diverse range of ecosystem properties and their response to human pressures.
Forbes, Valery E; Salice, Chris J; Birnir, Bjorn; Bruins, Randy J F; Calow, Peter; Ducrot, Virginie; Galic, Nika; Garber, Kristina; Harvey, Bret C; Jager, Henriette; Kanarek, Andrew; Pastorok, Robert; Railsback, Steve F; Rebarber, Richard; Thorbek, Pernille
2017-04-01
Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. Environ Toxicol Chem 2017;36:845-859. © 2017 SETAC.
Band, Leah R.; Fozard, John A.; Godin, Christophe; Jensen, Oliver E.; Pridmore, Tony; Bennett, Malcolm J.; King, John R.
2012-01-01
Over recent decades, we have gained detailed knowledge of many processes involved in root growth and development. However, with this knowledge come increasing complexity and an increasing need for mechanistic modeling to understand how those individual processes interact. One major challenge is in relating genotypes to phenotypes, requiring us to move beyond the network and cellular scales, to use multiscale modeling to predict emergent dynamics at the tissue and organ levels. In this review, we highlight recent developments in multiscale modeling, illustrating how these are generating new mechanistic insights into the regulation of root growth and development. We consider how these models are motivating new biological data analysis and explore directions for future research. This modeling progress will be crucial as we move from a qualitative to an increasingly quantitative understanding of root biology, generating predictive tools that accelerate the development of improved crop varieties. PMID:23110897
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose response and time-course...
Modeling food matrix effects on chemical reactivity: Challenges and perspectives.
Capuano, Edoardo; Oliviero, Teresa; van Boekel, Martinus A J S
2017-06-29
The same chemical reaction may be different in terms of the position of its equilibrium (i.e., thermodynamics) and its kinetics when studied in different foods. The diversity in the chemical composition of food and in its structural organization at macro-, meso-, and microscopic levels, that is, the food matrix, is responsible for this difference. In this viewpoint paper, the multiple and interconnected ways the food matrix can affect chemical reactivity are summarized. Moreover, mechanistic and empirical approaches to explain and predict the effect of the food matrix on chemical reactivity are described. Mechanistic models aim to quantify the effect of the food matrix based on a detailed understanding of the chemical and physical phenomena occurring in food. Their applicability is limited at the moment to very simple food systems. Empirical modeling based on machine learning combined with data-mining techniques may represent an alternative, useful option to predict the effect of the food matrix on chemical reactivity and to identify chemical and physical properties to be further tested. In such a way, the mechanistic understanding of the effect of the food matrix on chemical reactions can be improved.
STATISTICAL METHODOLOGY FOR ESTIMATING PARAMETERS IN PBPK/PD MODELS
PBPK/PD models are large dynamic models that predict tissue concentration and biological effects of a toxicant before PBPK/PD models can be used in risk assessments in the arena of toxicological hypothesis testing, models allow the consequences of alternative mechanistic hypothes...
Battista, C; Woodhead, JL; Stahl, SH; Mettetal, JT; Watkins, PB; Siler, SQ; Howell, BA
2017-01-01
Elevations in serum bilirubin during drug treatment may indicate global liver dysfunction and a high risk of liver failure. However, drugs also can increase serum bilirubin in the absence of hepatic injury by inhibiting specific enzymes/transporters. We constructed a mechanistic model of bilirubin disposition based on known functional polymorphisms in bilirubin metabolism/transport. Using physiologically based pharmacokinetic (PBPK) model‐predicted drug exposure and enzyme/transporter inhibition constants determined in vitro, our model correctly predicted indinavir‐mediated hyperbilirubinemia in humans and rats. Nelfinavir was predicted not to cause hyperbilirubinemia, consistent with clinical observations. We next examined a new drug candidate that caused both elevations in serum bilirubin and biochemical evidence of liver injury in rats. Simulations suggest that bilirubin elevation primarily resulted from inhibition of transporters rather than global liver dysfunction. We conclude that mechanistic modeling of bilirubin can help elucidate underlying mechanisms of drug‐induced hyperbilirubinemia, and thereby distinguish benign from clinically important elevations in serum bilirubin. PMID:28074467
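A static, single-pathway caricature of how enzyme/transporter inhibition alone can raise serum bilirubin, in the spirit of the mechanistic model above: steady-state concentration scales inversely with clearance, and competitive inhibition scales the affected pathway by 1/(1 + I/Ki). The function and the numbers below are illustrative assumptions, not the published model or the indinavir parameters.

```python
def bilirubin_fold_change(i_u, ki, fm=1.0):
    """Approximate steady-state fold change in serum bilirubin when a drug
    competitively inhibits a single clearance pathway (e.g. hepatic uptake).

    i_u : unbound inhibitor concentration at the transporter/enzyme
    ki  : in vitro inhibition constant
    fm  : fraction of total bilirubin clearance via the inhibited pathway

    Static approximation: CL_inhibited = CL * (fm / (1 + i_u/ki) + (1 - fm)),
    and the steady-state concentration scales as 1/CL.
    """
    return 1.0 / (fm / (1.0 + i_u / ki) + (1.0 - fm))

# Illustrative numbers only
print(bilirubin_fold_change(i_u=2.0, ki=1.0, fm=0.8))   # roughly 2.1-fold
```

The full model replaces this static picture with PBPK-predicted inhibitor exposure over time and with the known polymorphic steps in bilirubin metabolism and transport, which is what lets it separate transporter-mediated elevations from true liver dysfunction.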
Bouhaddou, Mehdi; Koch, Rick J.; DiStefano, Matthew S.; Tan, Annie L.; Mertz, Alex E.
2018-01-01
Most cancer cells harbor multiple drivers whose epistasis and interactions with expression context clouds drug and drug combination sensitivity prediction. We constructed a mechanistic computational model that is context-tailored by omics data to capture regulation of stochastic proliferation and death by pan-cancer driver pathways. Simulations and experiments explore how the coordinated dynamics of RAF/MEK/ERK and PI-3K/AKT kinase activities in response to synergistic mitogen or drug combinations control cell fate in a specific cellular context. In this MCF10A cell context, simulations suggest that synergistic ERK and AKT inhibitor-induced death is likely mediated by BIM rather than BAD, which is supported by prior experimental studies. AKT dynamics explain S-phase entry synergy between EGF and insulin, but simulations suggest that stochastic ERK, and not AKT, dynamics seem to drive cell-to-cell proliferation variability, which in simulations is predictable from pre-stimulus fluctuations in C-Raf/B-Raf levels. Simulations suggest MEK alteration negligibly influences transformation, consistent with clinical data. Tailoring the model to an alternate cell expression and mutation context, a glioma cell line, allows prediction of increased sensitivity of cell death to AKT inhibition. Our model mechanistically interprets context-specific landscapes between driver pathways and cell fates, providing a framework for designing more rational cancer combination therapy. PMID:29579036
From documentation to prediction: Raising the bar for thermokarst research
Rowland, Joel C.; Coon, Ethan T.
2015-11-12
Here we report that, to date, the majority of published research on thermokarst has been directed at documenting its form, occurrence, and rates of occurrence. The fundamental processes driving thermokarst have long been largely understood. However, the detailed physical couplings between water, air, soil, the thermal dynamics governing freeze-thaw, and soil mechanics are less well understood and are not captured in models aimed at predicting the response of frozen soils to warming and thaw. As computational resources increase, more sophisticated mechanistic models can be applied; these show great promise as predictive tools. These models will be capable of simulating the response of soil deformation to thawing/freezing cycles and the long-term, non-recoverable response of the land surface to the loss of ice. At the same time, advances in remote sensing of permafrost environments also show promise in providing detailed and spatially extensive estimates of the rates and patterns of subsidence. These datasets provide key constraints to calibrate and evaluate the predictive power of mechanistic models. In conclusion, in the coming decade, these emerging technologies will greatly increase our ability to predict when, where, and how thermokarst will occur in a changing climate.
Simulating malaria transmission in the current and future climate of West Africa
NASA Astrophysics Data System (ADS)
Yamana, T. K.; Bomblies, A.; Eltahir, E. A. B.
2015-12-01
Malaria transmission in West Africa is closely tied to climate, as rain fed water pools provide breeding habitat for the anopheles mosquito vector, and temperature affects the mosquito's ability to spread disease. We present results of a highly detailed, spatially explicit mechanistic modelling study exploring the relationships between the environment and malaria in the current and future climate of West Africa. A mechanistic model of human immunity was incorporated into an existing agent-based model of malaria transmission, allowing us to move beyond entomological measures such as mosquito density and vectorial capacity to analyzing the prevalence of the malaria parasite within human populations. The result is a novel modelling tool that mechanistically simulates all of the key processes linking environment to malaria transmission. Simulations were conducted across climate zones in West Africa, linking temperature and rainfall to entomological and epidemiological variables with a focus on nonlinearities due to threshold effects and interannual variability. Comparisons to observations from the region confirmed that the model provides a reasonable representation of the entomological and epidemiological conditions in this region. We used the predictions of future climate from the most credible CMIP5 climate models to predict the change in frequency and severity of malaria epidemics in West Africa as a result of climate change.
Estimating Cumulative Traffic Loads, Final Report for Phase 1
DOT National Transportation Integrated Search
2000-07-01
The knowledge of traffic loads is a prerequisite for the pavement analysis process, especially for the development of load-related distress prediction models. Furthermore, the emerging mechanistically based pavement performance models and pavement de...
Predicting neuroblastoma using developmental signals and a logic-based model.
Kasemeier-Kulesa, Jennifer C; Schnell, Santiago; Woolley, Thomas; Spengler, Jennifer A; Morrison, Jason A; McKinney, Mary C; Pushel, Irina; Wolfe, Lauren A; Kulesa, Paul M
2018-07-01
Genomic information from human patient samples of pediatric neuroblastoma cancers and known outcomes have led to specific gene lists put forward as high risk for disease progression. However, the reliance on gene expression correlations rather than mechanistic insight has shown limited potential and suggests a critical need for molecular network models that better predict neuroblastoma progression. In this study, we construct and simulate a molecular network of developmental genes and downstream signals in a 6-gene input logic model that predicts a favorable/unfavorable outcome based on the outcomes of four cell states: cell differentiation, proliferation, apoptosis, and angiogenesis. We simulate the mis-expression of the tyrosine receptor kinases, trkA and trkB, two prognostic indicators of neuroblastoma, and find differences in the number and probability distribution of steady state outcomes. We validate the mechanistic model assumptions using RNAseq of the SHSY5Y human neuroblastoma cell line to define the input states and confirm the predicted outcome with antibody staining. Lastly, we apply input gene signatures from 77 published human patient samples and show that our model makes more accurate disease outcome predictions for early stage disease than any current neuroblastoma gene list. These findings highlight the predictive strength of a logic-based model based on developmental genes and offer a better understanding of the molecular network interactions during neuroblastoma disease progression. Copyright © 2018. Published by Elsevier B.V.
MacLeod, Miles; Nersessian, Nancy J
2015-02-01
In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.
DOT National Transportation Integrated Search
2013-06-01
This report summarizes a research project aimed at developing degradation models for bridge decks in the state of Michigan based on durability mechanics. A probabilistic framework to implement local-level mechanistic-based models for predicting the c...
Livestock Helminths in a Changing Climate: Approaches and Restrictions to Meaningful Predictions.
Fox, Naomi J; Marion, Glenn; Davidson, Ross S; White, Piran C L; Hutchings, Michael R
2012-03-06
Climate change is a driving force for livestock parasite risk. This is especially true for helminths including the nematodes Haemonchus contortus, Teladorsagia circumcincta, Nematodirus battus, and the trematode Fasciola hepatica, since survival and development of free-living stages is chiefly affected by temperature and moisture. The paucity of long term predictions of helminth risk under climate change has driven us to explore optimal modelling approaches and identify current bottlenecks to generating meaningful predictions. We classify approaches as correlative or mechanistic, exploring their strengths and limitations. Climate is one aspect of a complex system and, at the farm level, husbandry has a dominant influence on helminth transmission. Continuing environmental change will necessitate the adoption of mitigation and adaptation strategies in husbandry. Long term predictive models need to have the architecture to incorporate these changes. Ultimately, an optimal modelling approach is likely to combine mechanistic processes and physiological thresholds with correlative bioclimatic modelling, incorporating changes in livestock husbandry and disease control. Irrespective of approach, the principal limitation to parasite predictions is the availability of active surveillance data and empirical data on physiological responses to climate variables. By combining improved empirical data and refined models with a broad view of the livestock system, robust projections of helminth risk can be developed.
Influence of landscape-scale factors in limiting brook trout populations in Pennsylvania streams
Kocovsky, P.M.; Carline, R.F.
2006-01-01
Landscapes influence the capacity of streams to produce trout through their effect on water chemistry and other factors at the reach scale. Trout abundance also fluctuates over time; thus, to thoroughly understand how spatial factors at landscape scales affect trout populations, one must assess the changes in populations over time to provide a context for interpreting the importance of spatial factors. We used data from the Pennsylvania Fish and Boat Commission's fisheries management database to investigate spatial factors that affect the capacity of streams to support brook trout Salvelinus fontinalis and to provide models useful for their management. We assessed the relative importance of spatial and temporal variation by calculating variance components and comparing relative standard errors for spatial and temporal variation. We used binary logistic regression to predict the presence of harvestable-length brook trout and multiple linear regression to assess the mechanistic links between landscapes and trout populations and to predict population density. The variance in trout density among streams was equal to or greater than the temporal variation for several streams, indicating that differences among sites affect population density. Logistic regression models correctly predicted the absence of harvestable-length brook trout in 60% of validation samples. The r²-value for the linear regression model predicting density was 0.3, indicating low predictive ability. Both logistic and linear regression models supported buffering capacity against acid episodes as an important mechanistic link between landscapes and trout populations. Although our models fail to predict trout densities precisely, their success at elucidating the mechanistic links between landscapes and trout populations, in concert with the importance of spatial variation, increases our understanding of factors affecting brook trout abundance and will help managers and private groups to protect and enhance populations of wild brook trout. © Copyright by the American Fisheries Society 2006.
Changes in Black-legged Tick Population in New England with Future Climate Change
NASA Astrophysics Data System (ADS)
Krishnan, S.; Huber, M.
2015-12-01
Lyme disease is one of the most frequently reported vector-borne diseases in the United States. In the Northeastern United States, vector transmission is maintained in a horizontal transmission cycle between the vector, the black-legged tick, and the vertebrate reservoir hosts, which include white-tailed deer, rodents and other medium to large sized mammals. Predicting how vector populations change with future climate change is critical to understanding disease spread in the future, and for developing suitable regional adaptation strategies. For the United States, these predictions have mostly been made using regressions based on field and lab studies, or using spatial suitability studies. However, the relations between tick populations at various life-cycle stages and climate variables are complex, necessitating a mechanistic approach. In this study, we present a framework for driving a mechanistic tick population model with high-resolution regional climate modeling projections. The goal is to estimate changes in black-legged tick populations in New England for the 21st century. The tick population model used is based on the mechanistic approach of Ogden et al. (2005) developed for Canada. Dynamically downscaled climate projections at a 3-km resolution using the Weather Research and Forecasting (WRF) model are used to drive the tick population model.
A framework for predicting impacts on ecosystem services ...
Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. The framework introduced here represents an ongoing initiative supported by the National Institute of Mathematical and Biological Synthesis (NIMBioS; http://www.nimbi
Modeling behavioral thermoregulation in a climate change sentinel.
Moyer-Horner, Lucas; Mathewson, Paul D; Jones, Gavin M; Kearney, Michael R; Porter, Warren P
2015-12-01
When possible, many species will shift in elevation or latitude in response to rising temperatures. However, before such shifts occur, individuals will first tolerate environmental change and then modify their behavior to maintain heat balance. Behavioral thermoregulation allows animals a range of climatic tolerances and makes predicting geographic responses under future warming scenarios challenging. Because behavioral modification may reduce an individual's fecundity by, for example, limiting foraging time and thus caloric intake, we must consider the range of behavioral options available for thermoregulation to accurately predict climate change impacts on individual species. To date, few studies have identified mechanistic links between an organism's daily activities and the need to thermoregulate. We used a biophysical model, Niche Mapper, to mechanistically model microclimate conditions and thermoregulatory behavior for a temperature-sensitive mammal, the American pika (Ochotona princeps). Niche Mapper accurately simulated microclimate conditions, as well as empirical metabolic chamber data for a range of fur properties, animal sizes, and environmental parameters. Niche Mapper predicted pikas would be behaviorally constrained because of the need to thermoregulate during the hottest times of the day. We also showed that pikas at low elevations could receive energetic benefits by being smaller in size and maintaining summer pelage during longer stretches of the active season under a future warming scenario. We observed pika behavior for 288 h in Glacier National Park, Montana, and thermally characterized their rocky, montane environment. We found that pikas were most active when temperatures were cooler, and at sites characterized by high elevations and north-facing slopes. Pikas became significantly less active across a suite of behaviors in the field when temperatures surpassed 20°C, which supported a metabolic threshold predicted by Niche Mapper. In general, mechanistic predictions and empirical observations were congruent. This research is unique in providing both an empirical and mechanistic description of the effects of temperature on a mammalian sentinel of climate change, the American pika. Our results suggest that previously underinvestigated characteristics, specifically fur properties and body size, may play critical roles in pika populations' response to climate change. We also demonstrate the potential importance of considering behavioral thermoregulation and microclimate variability when predicting animal responses to climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment.
Langenbucher, Frieder
2007-08-01
This paper discusses Excel applications related to the prediction of drug absorbability from physicochemical constants. PHDISSOC provides a generalized model for pH profiles of electrolytic dissociation, water solubility, and partition coefficient. SKMODEL predicts drug absorbability, based on a log-log plot of water solubility and O/W partitioning, augmented by additional features such as electrolytic dissociation, melting point, and the dose administered. GIABS presents a mechanistic model of g.i. drug absorption. BIODATCO presents a database compiling relevant drug data to be used for quantitative predictions.
Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models
Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.
2011-01-01
We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) a simulation study and (b) a lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer model experimental output) based on the computer model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than the second-order (covariance) structure.
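The emulation step described here (decompose an ensemble of mechanistic-model runs, then statistically relate the run-wise singular-vector coefficients to the physical parameters) can be sketched in a few lines. The toy "computer model", the design ranges, and the use of a plain linear regression for the first-order statistical model below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a first-order emulator: SVD of an ensemble of model runs, then a
# simple statistical model mapping parameters to the run-wise coefficients.
import numpy as np
from sklearn.linear_model import LinearRegression

def computer_model(theta, t):
    # stand-in nonlinear mechanistic model: damped oscillation whose decay
    # rate and frequency are the physical parameters to be estimated
    return np.exp(-theta[0] * t) * np.cos(theta[1] * t)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
design = rng.uniform([0.1, 0.5], [1.0, 3.0], size=(50, 2))   # parameter design
runs = np.array([computer_model(th, t) for th in design])    # 50 runs x 200 outputs

# Decompose the output matrix (outputs x runs); rows of V index the runs
U, s, Vt = np.linalg.svd(runs.T, full_matrices=False)
k = 3
V_k = Vt.T[:, :k]                                            # run-wise coefficients

# First-order statistical model of the right singular vectors vs. parameters
emulator = LinearRegression().fit(design, V_k)

# Emulate the model output at a new parameter setting without a new model run
theta_new = np.array([[0.4, 1.7]])
v_new = emulator.predict(theta_new)                          # (1, k)
y_emulated = (U[:, :k] * s[:k]) @ v_new.T                    # (200, 1)
print(y_emulated.shape)
```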
Modeling Rabbit Responses to Single and Multiple Aerosol ...
Survival models are developed here to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple dose dataset to predict the probability of death through specifying dose-response functions and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) has an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed employ different underlying dose-response functions and use the assumption that, in a multiple dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this paper. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit datasets. More accurate survival models depend upon future development of dose-response datasets specifically designed to assess potential multiple dose effects on response and time-to-response. The process used in this paper to dev
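As a concrete illustration of the baseline model structure named above (an exponential dose-response for the probability of death combined with a Weibull time-to-death distribution), the following sketch computes a survival curve for a single dose. All parameter values are made up for illustration; the fitted values are reported in the original work.

```python
# Exponential dose-response combined with a Weibull time-to-death distribution.
import numpy as np

def p_death(dose, k):
    """Exponential dose-response: probability of eventual death at a given spore dose."""
    return 1.0 - np.exp(-k * dose)

def survival(t_days, dose, k, shape, scale):
    """Probability that an exposed rabbit is still alive t_days after exposure."""
    weibull_cdf = 1.0 - np.exp(-(t_days / scale) ** shape)   # time-to-death distribution
    return 1.0 - p_death(dose, k) * weibull_cdf

k, shape, scale = 1e-6, 2.0, 4.0           # hypothetical parameter values
print(p_death(1e6, k))                     # ~0.63 probability of death at this dose
print(survival(3.0, 1e6, k, shape, scale)) # survival probability at day 3
```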
Sarkar, Joydeep
2018-01-01
Iron plays vital roles in the human body including enzymatic processes, oxygen-transport via hemoglobin and immune response. Iron metabolism is characterized by ~95% recycling and minor replenishment through diet. Anemia of chronic kidney disease (CKD) is characterized by a lack of synthesis of erythropoietin leading to reduced red blood cell (RBC) formation and aberrant iron recycling. Treatment of CKD anemia aims to normalize RBC count and serum hemoglobin. Clinically, the various fluxes of iron transport and accumulation are not measured so that changes during disease (e.g., CKD) and treatment are unknown. Unwanted iron accumulation in patients is known to lead to adverse effects. Current whole-body models lack the mechanistic details of iron transport related to RBC maturation, transferrin (Tf and TfR) dynamics and assume passive iron efflux from macrophages. Hence, they are not predictive of whole-body iron dynamics and cannot be used to design individualized patient treatment. For prediction, we developed a mechanistic, multi-scale computational model of whole-body iron metabolism incorporating four compartments containing major pools of iron and RBC generation process. The model accounts for multiple forms of iron in vivo, mechanisms involved in iron uptake and release and their regulation. Furthermore, the model is interfaced with drug pharmacokinetics to allow simulation of treatment dynamics. We calibrated our model with experimental and clinical data from peer-reviewed literature to reliably simulate CKD anemia and the effects of current treatment involving combination of epoietin-alpha and iron dextran. This in silico whole-body model of iron metabolism predicts that a year of treatment can potentially lead to 90% downregulation of ferroportin (FPN) levels, 15-fold increase in iron stores with only a 20% increase in iron flux from the reticulo-endothelial system (RES). Model simulations quantified unmeasured iron fluxes, previously unknown effects of treatment on FPN-level and iron stores in the RES. This mechanistic whole-body model can be the basis for future studies that incorporate iron metabolism together with related clinical experiments. Such an approach could pave the way for development of effective personalized treatment of CKD anemia. PMID:29659573
Modelling the mating system of polar bears: a mechanistic approach to the Allee effect.
Molnár, Péter K; Derocher, Andrew E; Lewis, Mark A; Taylor, Mitchell K
2008-01-22
Allee effects may render exploited animal populations extinction prone, but empirical data are often lacking to describe the circumstances leading to an Allee effect. Arbitrary assumptions regarding Allee effects could lead to erroneous management decisions so that predictive modelling approaches are needed that identify the circumstances leading to an Allee effect before such a scenario occurs. We present a predictive approach of Allee effects for polar bears where low population densities, an unpredictable habitat and harvest-depleted male populations result in infrequent mating encounters. We develop a mechanistic model for the polar bear mating system that predicts the proportion of fertilized females at the end of the mating season given population density and operational sex ratio. The model is parametrized using pairing data from Lancaster Sound, Canada, and describes the observed pairing dynamics well. Female mating success is shown to be a nonlinear function of the operational sex ratio, so that a sudden and rapid reproductive collapse could occur if males are severely depleted. The operational sex ratio where an Allee effect is expected is dependent on population density. We focus on the prediction of Allee effects in polar bears but our approach is also applicable to other species.
A new model integrating short- and long-term aging of copper added to soils
Zeng, Saiqi; Li, Jumei; Wei, Dongpu
2017-01-01
Aging refers to the processes by which the bioavailability/toxicity, isotopic exchangeability, and extractability of metals added to soils decline over time. We studied the characteristics of the aging process in copper (Cu) added to soils and the factors that affect this process. Then we developed a semi-mechanistic model to predict the lability of Cu during the aging process with descriptions of the diffusion process using the complementary error function. In previous studies, two semi-mechanistic models to separately predict short-term and long-term aging of Cu added to soils were developed with individual descriptions of the diffusion process. In the short-term model, the diffusion process was linearly related to the square root of incubation time (t1/2), and in the long-term model, the diffusion process was linearly related to the natural logarithm of incubation time (lnt). Both models could predict short-term or long-term aging processes separately, but neither could predict both within a single model. By analyzing and combining the two models, we found that the short- and long-term behaviors of the diffusion process could be described adequately using the complementary error function. The effect of temperature on the diffusion process was also incorporated in this model. The model can predict the aging process continuously based on four factors: soil pH, incubation time, soil organic matter content, and temperature. PMID:28820888
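To make the role of the complementary error function concrete, the sketch below shows one way a bounded erfc term in incubation time can drive a labile-Cu fraction that declines quickly at first and then slowly. The functional form and all parameter values are assumptions for illustration only, not the fitted model, which also depends on soil pH, organic matter content, and temperature.

```python
# Illustrative erfc-based decline of the labile Cu fraction with incubation time.
import numpy as np
from scipy.special import erfc

def labile_fraction(t_days, e_0=1.0, e_inf=0.4, c=1.0):
    """Fraction of added Cu still labile after t_days of incubation.

    The erfc term rises from 0 toward 1 with time, moving the labile
    fraction from its initial value e_0 toward a long-term plateau e_inf.
    """
    diffused = erfc(c / np.sqrt(t_days))
    return e_0 - (e_0 - e_inf) * diffused

for t in (1, 7, 30, 180, 720):
    print(t, f"{labile_fraction(t):.3f}")
```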
Mechanistic modeling of developmental defects through computational embryology (WC10th)
Abstract: An important consideration for 3Rs is to identify developmental hazards utilizing mechanism-based in vitro assays (e.g., ToxCast) and in silico predictive models. Steady progress has been made with agent-based models that recapitulate morphogenetic drivers for angiogen...
DOT National Transportation Integrated Search
2010-08-01
This study was intended to recommend future directions for the development of TxDOT's Mechanistic-Empirical (TexME) design system. For stress predictions, a multi-layer linear elastic system was evaluated and its validity was verified by compar...
Most predictions of the effect of climate change on species’ ranges are based on correlations between climate and current species’ distributions. These so-called envelope models may be a good first approximation, but we need demographically mechanistic models to incorporate the ...
Bashir Surfraz, M; Fowkes, Adrian; Plante, Jeffrey P
2017-08-01
The need to find an alternative to costly animal studies for developmental and reproductive toxicity testing has shifted the focus considerably to the assessment of in vitro developmental toxicology models and the exploitation of pharmacological data for relevant molecular initiating events. We hereby demonstrate how automation can be applied successfully to handle heterogeneous oestrogen receptor data from ChEMBL. Applying expert-derived thresholds to specific bioactivities allowed an activity call to be attributed to each data entry. Human intervention further improved this mechanistic dataset which was mined to develop structure-activity relationship alerts and an expert model covering 45 chemical classes for the prediction of oestrogen receptor modulation. The evaluation of the model using FDA EDKB and Tox21 data was quite encouraging. This model can also provide a teratogenicity prediction along with the additional information it provides relevant to the query compound, all of which will require careful assessment of potential risk by experts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Franek, F; Jarlfors, A; Larsen, F; Holm, P; Steffansen, B
2015-09-18
Desvenlafaxine is a biopharmaceutics classification system (BCS) class 1 (high solubility, high permeability) and biopharmaceutical drug disposition classification system (BDDCS) class 3, (high solubility, poor metabolism; implying low permeability) compound. Thus the rate-limiting step for desvenlafaxine absorption (i.e. intestinal dissolution or permeation) is not fully clarified. The aim of this study was to investigate whether dissolution and/or intestinal permeability rate-limit desvenlafaxine absorption from an immediate-release formulation (IRF) and Pristiq(®), an extended release formulation (ERF). Semi-mechanistic models of desvenlafaxine were built (using SimCyp(®)) by combining in vitro data on dissolution and permeation (mechanistic part of model) with clinical data (obtained from literature) on distribution and clearance (non-mechanistic part of model). The model predictions of desvenlafaxine pharmacokinetics after IRF and ERF administration were compared with published clinical data from 14 trials. Desvenlafaxine in vivo dissolution from the IRF and ERF was predicted from in vitro solubility studies and biorelevant dissolution studies (using the USP3 dissolution apparatus), respectively. Desvenlafaxine apparent permeability (Papp) at varying apical pH was investigated using the Caco-2 cell line and extrapolated to effective intestinal permeability (Peff) in human duodenum, jejunum, ileum and colon. Desvenlafaxine pKa-values and octanol-water partition coefficients (Do:w) were determined experimentally. Due to predicted rapid dissolution after IRF administration, desvenlafaxine was predicted to be available for permeation in the duodenum. Desvenlafaxine Do:w and Papp increased approximately 13-fold when increasing apical pH from 5.5 to 7.4. Desvenlafaxine Peff thus increased with pH down the small intestine. Consequently, desvenlafaxine absorption from an IRF appears rate-limited by low Peff in the upper small intestine, which "delays" the predicted time to the maximal plasma concentration (tmax), consistent with clinical data. Conversely, desvenlafaxine absorption from the ERF appears rate-limited by dissolution due to the formulation, which tends to negate the influence of pH-dependent permeability on absorption. We suggest that desvenlafaxine Peff is mainly driven by transcellular diffusion of the unionized form. In the case of desvenlafaxine, poor metabolism does not imply low intestinal permeability, as indicated by the BDDCS, merely low duodenal/jejunal permeability. Copyright © 2015 Elsevier B.V. All rights reserved.
Computational Modeling of Inflammation and Wound Healing
Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram
2013-01-01
Objective: Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach: To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histo-pathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation: We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results: A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions: The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362
Public Databases Supporting Computational Toxicology
A major goal of the emerging field of computational toxicology is the development of screening-level models that predict potential toxicity of chemicals from a combination of mechanistic in vitro assay data and chemical structure descriptors. In order to build these models, resea...
Body Fineness Ratio as a Predictor of Maximum Prolonged-Swimming Speed in Coral Reef Fishes
Walker, Jeffrey A.; Alfaro, Michael E.; Noble, Mae M.; Fulton, Christopher J.
2013-01-01
The ability to sustain high swimming speeds is believed to be an important factor affecting resource acquisition in fishes. While we have gained insights into how fin morphology and motion influence swimming performance in coral reef fishes, the role of other traits, such as body shape, remains poorly understood. We explore the ability of two mechanistic models of the causal relationship between body fineness ratio and endurance swimming performance to predict maximum prolonged-swimming speed (Umax) among 84 fish species from the Great Barrier Reef, Australia. A drag model, based on semi-empirical data on the drag of rigid, submerged bodies of revolution, was applied to species that employ pectoral-fin propulsion with a rigid body at Umax. An alternative model, based on the results of computer simulations of optimal shape in self-propelled undulating bodies, was applied to the species that swim by body-caudal-fin propulsion at Umax. For pectoral-fin swimmers, Umax increased with fineness, and the rate of increase decreased with fineness, as predicted by the drag model. While the mechanistic and statistical models of the relationship between fineness and Umax were very similar, the mechanistic (and statistical) model explained only a small fraction of the variance in Umax. For body-caudal-fin swimmers, we found a non-linear relationship between fineness and Umax, which was largely negative over most of the range of fineness. This pattern fails to support either predictions from the computational models or standard functional interpretations of body shape variation in fishes. Our results suggest that the widespread hypothesis that a more optimal fineness increases endurance swimming performance via reduced drag should be limited to fishes that swim with rigid bodies. PMID:24204575
Solvation models, based on fundamental chemical structure theory, were developed in the SPARC mechanistic tool box to predict a large array of physical properties of organic compounds in water and in non-aqueous solvents strictly from molecular structure. The SPARC self-interact...
Verifiable metamodels for nitrate losses to drains and groundwater in the corn belt, USA
USDA-ARS's Scientific Manuscript database
Metamodels (MMs) consisting of artificial neural networks were developed to simplify and upscale mechanistic fate and transport models for prediction of nitrate losses to drains and groundwater in the Corn Belt, USA. The two final MMs predicted nitrate concentration and flux, respectively, in the sh...
Perception of mind and dehumanization: Human, animal, or machine?
Morera, María D; Quiles, María N; Correa, Ana D; Delgado, Naira; Leyens, Jacques-Philippe
2016-08-02
Dehumanization is reached through several approaches, including the attribute-based model of mind perception and the metaphor-based model of dehumanization. We performed two studies to find different (de)humanized images for three targets: Professional people, Evil people, and Lowest of the low. In Study 1, we examined dimensions of mind, expecting the last two categories to be dehumanized through denial of agency (Lowest of the low) or experience (Evil people), compared with humanized targets (Professional people). Study 2 aimed to distinguish these targets using metaphors. We predicted that Evil and Lowest of the low targets would suffer mechanistic and animalistic dehumanization, respectively; our predictions were confirmed, but the metaphor-based model nuanced these results: animalistic and mechanistic dehumanization were shown as overlapping rather than independent. Evil persons were perceived as "killing machines" and "predators." Finally, Lowest of the low were not animalized but considered human beings. We discuss possible interpretations. © 2016 International Union of Psychological Science.
Corrosion fatigue crack propagation in metals
NASA Technical Reports Server (NTRS)
Gangloff, Richard P.
1990-01-01
This review assesses fracture mechanics data and mechanistic models for corrosion fatigue crack propagation in structural alloys exposed to ambient temperature gases and electrolytes. Extensive stress intensity-crack growth rate data exist for ferrous, aluminum and nickel based alloys in a variety of environments. Interactive variables (viz., stress intensity range, mean stress, alloy composition and microstructure, loading frequency, temperature, gas pressure and electrode potential) strongly affect crack growth kinetics and complicate fatigue control. Mechanistic models to predict crack growth rates were formulated by coupling crack tip mechanics with occluded crack chemistry, and from both the hydrogen embrittlement and anodic dissolution/film rupture perspectives. Research is required to better define: (1) environmental effects near threshold and on crack closure; (2) damage tolerant life prediction codes and the validity of similitude; (3) the behavior of microcracks; (4) probes and improved models of crack tip damage; and (5) the cracking performance of advanced alloys and composites.
Pradeep, Prachi; Povinelli, Richard J; Merrill, Stephen J; Bozdag, Serdar; Sem, Daniel S
2015-04-01
The availability of large in vitro datasets enables better insight into the mode of action of chemicals and better identification of potential mechanism(s) of toxicity. Several studies have shown that not all in vitro assays can contribute as equal predictors of in vivo carcinogenicity for development of hybrid Quantitative Structure Activity Relationship (QSAR) models. We propose two novel approaches for the use of mechanistically relevant in vitro assay data in the identification of relevant biological descriptors and development of Quantitative Biological Activity Relationship (QBAR) models for carcinogenicity prediction. We demonstrate that in vitro assay data can be used to develop QBAR models for in vivo carcinogenicity prediction via two case studies corroborated with firm scientific rationale. The case studies demonstrate the similarities between QBAR and QSAR modeling in: (i) the selection of relevant descriptors to be used in the machine learning algorithm, and (ii) the development of a computational model that maps chemical or biological descriptors to a toxic endpoint. The results of both the case studies show: (i) improved accuracy and sensitivity which is especially desirable under regulatory requirements, and (ii) overall adherence with the OECD/REACH guidelines. Such mechanism based models can be used along with QSAR models for prediction of mechanistically complex toxic endpoints. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Musther, Helen; Harwood, Matthew D; Yang, Jiansong; Turner, David B; Rostami-Hodjegan, Amin; Jamei, Masoud
2017-09-01
The use of in vitro-in vivo extrapolation (IVIVE) techniques, mechanistically incorporated within physiologically based pharmacokinetic (PBPK) models, can harness in vitro drug data and enhance understanding of in vivo pharmacokinetics. This study's objective was to develop a user-friendly rat (250 g, male Sprague-Dawley) IVIVE-linked PBPK model. A 13-compartment PBPK model including mechanistic absorption models was developed, with required system data (anatomical, physiological, and relevant IVIVE scaling factors) collated from literature and analyzed. Overall, 178 system parameter values for the model are provided. This study also highlights gaps in available system data required for strain-specific rat PBPK model development. The model's functionality and performance were assessed using previous literature-sourced in vitro properties for diazepam, metoprolol, and midazolam. The results of simulations were compared against observed pharmacokinetic rat data. Predicted and observed concentration profiles in 10 tissues for diazepam after a single intravenous (i.v.) dose, making use of either observed i.v. clearance (CLiv) or in vitro hepatocyte intrinsic clearance (CLint) for simulations, generally led to good predictions in various tissue compartments. Overall, all i.v. plasma concentration profiles were successfully predicted. However, there were challenges in predicting oral plasma concentration profiles for metoprolol and midazolam, and the potential reasons and corresponding solutions are discussed. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
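A central IVIVE step in such a model is scaling an in vitro hepatocyte intrinsic clearance up to a whole-liver clearance, for example with the well-stirred liver model. The sketch below uses generic, assumed physiological values for a roughly 250 g rat rather than the system parameters collated in the paper.

```python
# Well-stirred liver model applied to an assumed in vitro intrinsic clearance.
hepatocellularity = 120e6      # hepatocytes per g liver (assumed)
liver_weight_g = 10.0          # g (assumed)
Q_h = 14.0                     # hepatic blood flow, mL/min (assumed)
fu_b = 0.2                     # unbound fraction in blood (assumed)

cl_int_assay = 20.0            # uL/min per 10^6 cells, hypothetical assay result
cl_int_liver = cl_int_assay * 1e-3 * (hepatocellularity / 1e6) * liver_weight_g  # mL/min

cl_hepatic = Q_h * fu_b * cl_int_liver / (Q_h + fu_b * cl_int_liver)
print(f"scaled hepatic clearance ~ {cl_hepatic:.1f} mL/min")
```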
Verifiable metamodels for nitrate losses to drains and groundwater in the Corn Belt, USA
Nolan, Bernard T.; Malone, Robert W.; Gronberg, Jo Ann M.; Thorp, K.R.; Ma, Liwang
2012-01-01
Nitrate leaching in the unsaturated zone poses a risk to groundwater, whereas nitrate in tile drainage is conveyed directly to streams. We developed metamodels (MMs) consisting of artificial neural networks to simplify and upscale mechanistic fate and transport models for prediction of nitrate losses by drains and leaching in the Corn Belt, USA. The two final MMs predicted nitrate concentration and flux, respectively, in the shallow subsurface. Because each MM considered both tile drainage and leaching, they represent an integrated approach to vulnerability assessment. The MMs used readily available data comprising farm fertilizer nitrogen (N), weather data, and soil properties as inputs; therefore, they were well suited for regional extrapolation. The MMs effectively related the outputs of the underlying mechanistic model (Root Zone Water Quality Model) to the inputs (R2 = 0.986 for the nitrate concentration MM). Predicted nitrate concentration was compared with measured nitrate in 38 samples of recently recharged groundwater, yielding a Pearson’s r of 0.466 (p = 0.003). Predicted nitrate generally was higher than that measured in groundwater, possibly as a result of the time-lag for modern recharge to reach well screens, denitrification in groundwater, or interception of recharge by tile drains. In a qualitative comparison, predicted nitrate concentration also compared favorably with results from a previous regression model that predicted total N in streams.
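The metamodelling idea here (train a small artificial neural network to reproduce the mechanistic model's output from readily available inputs) can be sketched as follows. The synthetic data merely stand in for RZWQM simulations, and the input variables and ranges are assumptions for illustration.

```python
# Neural-network metamodel fitted to stand-in "mechanistic model" output.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.uniform(0, 250, n),      # fertilizer N, kg/ha (assumed range)
    rng.uniform(600, 1200, n),   # annual precipitation, mm (assumed range)
    rng.uniform(1, 6, n),        # soil organic matter, % (assumed range)
])
# stand-in for the mechanistic model's simulated nitrate concentration, mg/L
y = 0.05 * X[:, 0] + 0.01 * X[:, 1] - 2.0 * X[:, 2] + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
metamodel = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 8),
                                       max_iter=2000, random_state=0))
metamodel.fit(X_tr, y_tr)
print("metamodel R^2 against mechanistic output:", round(metamodel.score(X_te, y_te), 3))
```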
Farrell, Tracy L; Poquet, Laure; Dew, Tristan P; Barber, Stuart; Williamson, Gary
2012-02-01
There is a considerable need to rationalize the membrane permeability and mechanism of transport for potential nutraceuticals. The aim of this investigation was to develop a theoretical permeability equation, based on a reported descriptive absorption model, enabling calculation of the transcellular component of absorption across Caco-2 monolayers. Published data for Caco-2 permeability of 30 drugs transported by the transcellular route were correlated with the descriptors 1-octanol/water distribution coefficient (log D, pH 7.4) and size, based on molecular mass. Nonlinear regression analysis was used to derive a set of model parameters a', β', and b' with an integrated molecular mass function. The new theoretical transcellular permeability (TTP) model obtained a good fit of the published data (R² = 0.93) and predicted reasonably well (R² = 0.86) the experimental apparent permeability coefficient (Papp) for nine non-training set compounds reportedly transported by the transcellular route. For the first time, the TTP model was used to predict the absorption characteristics of six phenolic acids, and this original investigation was supported by in vitro Caco-2 cell mechanistic studies, which suggested that deviation of the Papp value from the predicted transcellular permeability (Papp(trans)) may be attributed to involvement of active uptake, efflux transporters, or paracellular flux.
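The fitting step can be sketched with a generic nonlinear regression of Caco-2 permeability on log D and molecular mass. The functional form below is an assumed stand-in, not the actual TTP equation, and the toy data replace the 30 published transcellular drugs; the meaning of a', β', and b' is defined only in the original model.

```python
# Nonlinear regression of log Papp on log D and molecular mass (assumed form).
import numpy as np
from scipy.optimize import curve_fit

def log_papp_model(X, a, beta, b):
    logD, mass = X
    # assumed form: permeability rises with lipophilicity, penalized by size
    return a + beta * logD - b * np.log10(mass)

rng = np.random.default_rng(1)
logD = rng.uniform(-2, 4, 30)
mass = rng.uniform(150, 600, 30)
log_papp = -2.5 + 0.4 * logD - 1.2 * np.log10(mass) + rng.normal(0, 0.15, 30)

popt, _ = curve_fit(log_papp_model, (logD, mass), log_papp, p0=[-3.0, 0.5, 1.0])
print("fitted a, beta, b:", np.round(popt, 2))
```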
Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M
2014-01-01
Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using 1 yr of empirical data from commercial spring-calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating a poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat tariff. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
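The validation statistic quoted above can be reproduced in a few lines; here RPE is taken as the root mean square prediction error expressed as a percentage of the observed mean, a common definition assumed for illustration, and the monthly electricity figures are hypothetical.

```python
# Relative prediction error (RPE) for monthly electricity consumption.
import numpy as np

def rpe(observed, predicted):
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    rmspe = np.sqrt(np.mean((observed - predicted) ** 2))
    return 100.0 * rmspe / observed.mean()

observed  = [2100, 1950, 2300, 2500, 2800, 3100]   # kWh per month, hypothetical
predicted = [2000, 2050, 2250, 2650, 2700, 3000]
print(f"RPE = {rpe(observed, predicted):.1f}%")     # < 10% would match the overall result above
```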
Modeling the risk of water pollution by pesticides from imbalanced data.
Trajanov, Aneta; Kuzmanovski, Vladimir; Real, Benoit; Perreau, Jonathan Marks; Džeroski, Sašo; Debeljak, Marko
2018-04-30
The pollution of ground and surface waters with pesticides is a serious ecological issue that requires adequate treatment. Most of the existing water pollution models are mechanistic mathematical models. While they have made a significant contribution to understanding the transfer processes, they face the problem of validation because of their complexity, the user subjectivity in their parameterization, and the lack of empirical data for validation. In addition, the data describing water pollution with pesticides are, in most cases, very imbalanced. This is due to strict regulations for pesticide applications, which lead to only a few pollution events. In this study, we propose the use of data mining to build models for assessing the risk of water pollution by pesticides in field-drained outflow water. Unlike the mechanistic models, the models generated by data mining are based on easily obtainable empirical data, while the parameterization of the models is not influenced by the subjectivity of ecological modelers. We used empirical data from field trials at the La Jaillière experimental site in France and applied the random forests algorithm to build predictive models that predict "risky" and "not-risky" pesticide application events. To address the problems of the imbalanced classes in the data, cost-sensitive learning and different measures of predictive performance were used. Despite the high imbalance between risky and not-risky application events, we managed to build predictive models that make reliable predictions. The proposed modeling approach can be easily applied to other ecological modeling problems where we encounter empirical data with highly imbalanced classes.
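A minimal sketch of this modelling approach, assuming synthetic data in place of the La Jaillière field trials: a random forest classifier for rare "risky" application events, with class weighting as a simple form of cost-sensitive learning and an imbalance-aware performance measure.

```python
# Cost-sensitive random forest on a synthetic, highly imbalanced dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 1000
X = rng.normal(size=(n, 4))      # placeholder features, e.g. dose, rainfall, soil moisture, timing
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 2.5).astype(int)   # rare positives
print("fraction of risky events:", y.mean())

clf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print("balanced accuracy per fold:", scores.round(2))
```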
Wang, Yi; Lee, Sui Mae; Dykes, Gary
2015-01-01
Bacterial attachment to abiotic surfaces can be explained as a physicochemical process. Mechanisms of the process have been widely studied but are not yet well understood due to their complexity. Physicochemical processes can be influenced by various interactions and factors in attachment systems, including, but not limited to, hydrophobic interactions, electrostatic interactions and substratum surface roughness. Mechanistic models and control strategies for bacterial attachment to abiotic surfaces have been established based on the current understanding of the attachment process and the interactions involved. Due to a lack of process control and standardization in the methodologies used to study the mechanisms of bacterial attachment, however, various challenges are apparent in the development of models and control strategies. In this review, the physicochemical mechanisms, interactions and factors affecting the process of bacterial attachment to abiotic surfaces are described. Mechanistic models established based on these parameters are discussed in terms of their limitations. Currently employed methods to study these parameters and bacterial attachment are critically compared. The roles of these parameters in the development of control strategies for bacterial attachment are reviewed, and the challenges that arise in developing mechanistic models and control strategies are assessed.
DOT National Transportation Integrated Search
2018-01-01
This report explores the application of a discrete computational model for predicting the fracture behavior of asphalt mixtures at low temperatures based on the results of simple laboratory experiments. In this discrete element model, coarse aggregat...
Adaptive Response in Female Modeling of the Hypothalamic-pituitary-gonadal Axis
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...
Olivares-Morales, Andrés; Ghosh, Avijit; Aarons, Leon; Rostami-Hodjegan, Amin
2016-11-01
A new minimal Segmented Transit and Absorption (mSAT) model has recently been proposed and combined with intrinsic intestinal effective permeability (Peff,int) to predict the regional gastrointestinal (GI) absorption (fabs) of several drugs. Herein, this model was extended and applied for the prediction of oral bioavailability and pharmacokinetics of oxybutynin and its enantiomers to provide a mechanistic explanation of the higher relative bioavailability observed for oxybutynin's modified-release OROS® formulation compared to its immediate-release (IR) counterpart. The expansion of the model involved the incorporation of mechanistic equations for the prediction of release, transit, dissolution, permeation and first-pass metabolism. The predicted pharmacokinetics of oxybutynin enantiomers after oral administration for both the IR and OROS® formulations were in close agreement with the observed data. The predicted absolute bioavailability for the IR formulation was within 5% of the observed value, and the model adequately predicted the higher relative bioavailability observed for the OROS® formulation vs. the IR counterpart. From the model predictions, it can be seen that the higher bioavailability observed for the OROS® formulation was mainly attributable to differences in the intestinal availability (FG) rather than due to a higher colonic fabs, thus confirming previous hypotheses. The predicted fabs was almost 70% lower for the OROS® formulation compared to the IR formulation, whereas the FG was almost eightfold higher than in the IR formulation. These results provide further support to the hypothesis of an increased FG as the main factor responsible for the higher bioavailability of oxybutynin's OROS® formulation vs. the IR.
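The mechanistic explanation above reduces to simple arithmetic on the relation F = fabs × FG × FH. The sketch below combines approximate readings of the ratios reported in the abstract ("almost 70% lower", "almost eightfold higher"), with hepatic availability assumed unchanged between formulations.

```python
# Direction-of-effect check using F = f_abs * F_G * F_H with assumed ratios.
f_abs_ratio = 0.30   # OROS f_abs relative to IR (approximate reading of the abstract)
F_G_ratio = 8.0      # OROS F_G relative to IR (approximate reading of the abstract)
F_H_ratio = 1.0      # hepatic availability assumed equal for both formulations

relative_F = f_abs_ratio * F_G_ratio * F_H_ratio
print(f"implied OROS/IR bioavailability ratio ~ {relative_F:.1f}")
```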
Mechanistic modeling of pesticide exposure: The missing keystone of honey bee toxicology.
Sponsler, Douglas B; Johnson, Reed M
2017-04-01
The role of pesticides in recent honey bee losses is controversial, partly because field studies often fail to detect effects predicted by laboratory studies. This dissonance highlights a critical gap in the field of honey bee toxicology: there exists little mechanistic understanding of the patterns and processes of exposure that link honey bees to pesticides in their environment. The authors submit that 2 key processes underlie honey bee pesticide exposure: 1) the acquisition of pesticide by foraging bees, and 2) the in-hive distribution of pesticide returned by foragers. The acquisition of pesticide by foraging bees must be understood as the spatiotemporal intersection between environmental contamination and honey bee foraging activity. This implies that exposure is distributional, not discrete, and that a subset of foragers may acquire harmful doses of pesticide while the mean colony exposure would appear safe. The in-hive distribution of pesticide is a complex process driven principally by food transfer interactions between colony members, and this process differs importantly between pollen and nectar. High priority should be placed on applying the extensive literature on honey bee biology to the development of more rigorously mechanistic models of honey bee pesticide exposure. In combination with mechanistic effects modeling, mechanistic exposure modeling has the potential to integrate the field of honey bee toxicology, advancing both risk assessment and basic research. Environ Toxicol Chem 2017;36:871-881. © 2016 SETAC.
Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T
2017-10-01
In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging.
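To make the individual-to-individual, distance-based transmission idea concrete, here is a minimal sketch of one simulated time step with an exponential distance kernel. It is not the authors' likelihood or Bayesian inference machinery; the kernel form, parameter values, landscape, and time step are all assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical landscape: individuals with coordinates (km) and infection status.
n = 500
coords = rng.uniform(0, 50, size=(n, 2))
infectious = np.zeros(n, dtype=bool)
infectious[rng.choice(n, 5, replace=False)] = True  # seed cases

beta = 0.8   # assumed transmission rate per infectious neighbour per day
phi = 2.0    # assumed kernel range (km); analogous to a "distance of transmission"
dt = 1.0     # time step (days)

def infection_probabilities(coords, infectious, beta, phi, dt):
    """Probability that each susceptible is infected over dt, with an
    exponential spatial kernel K(d) = exp(-d / phi) summed over infectious hosts."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, infectious, :], axis=-1)
    force = beta * np.exp(-d / phi).sum(axis=1)   # force of infection on each host
    p = 1.0 - np.exp(-force * dt)
    p[infectious] = 0.0                           # already infected
    return p

p = infection_probabilities(coords, infectious, beta, phi, dt)
new_cases = rng.random(n) < p
print(f"expected new cases: {p.sum():.1f}, realised: {new_cases.sum()}")
```

Inference in the paper runs in the opposite direction, estimating kernel and rate parameters from observed cases; this forward sketch only shows the kind of mechanistic structure being fitted.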
Putting mechanisms into crop production models.
Boote, Kenneth J; Jones, James W; White, Jeffrey W; Asseng, Senthold; Lizaso, Jon I
2013-09-01
Crop growth models dynamically simulate processes of C, N and water balance on daily or hourly time-steps to predict crop growth and development and at season-end, final yield. Their ability to integrate effects of genetics, environment and crop management have led to applications ranging from understanding gene function to predicting potential impacts of climate change. The history of crop models is reviewed briefly, and their level of mechanistic detail for assimilation and respiration, ranging from hourly leaf-to-canopy assimilation to daily radiation-use efficiency is discussed. Crop models have improved steadily over the past 30-40 years, but much work remains. Improvements are needed for the prediction of transpiration response to elevated CO₂ and high temperature effects on phenology and reproductive fertility, and simulation of root growth and nutrient uptake under stressful edaphic conditions. Mechanistic improvements are needed to better connect crop growth to genetics and to soil fertility, soil waterlogging and pest damage. Because crop models integrate multiple processes and consider impacts of environment and management, they have excellent potential for linking research from genomics and allied disciplines to crop responses at the field scale, thus providing a valuable tool for deciphering genotype by environment by management effects. © 2013 John Wiley & Sons Ltd.
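For readers unfamiliar with the "daily radiation-use efficiency" end of the spectrum mentioned above, the following toy time step shows how compactly such a model can be written; the parameter values (RUE, extinction coefficient, PAR, canopy development) are illustrative assumptions, not values from any specific crop model.

```python
import math

def daily_growth(lai, par_mj_m2, rue_g_mj=2.5, k=0.6):
    """One daily time step of a radiation-use-efficiency crop model:
    biomass gain = RUE * PAR intercepted by the canopy (Beer's law)."""
    f_intercepted = 1.0 - math.exp(-k * lai)       # fraction of incoming PAR intercepted
    return rue_g_mj * par_mj_m2 * f_intercepted    # g dry matter per m2 ground per day

# Illustrative season: LAI ramping up, 8 MJ PAR m-2 d-1.
biomass, lai = 0.0, 0.5
for day in range(100):
    biomass += daily_growth(lai, par_mj_m2=8.0)
    lai = min(lai + 0.05, 5.0)                     # crude canopy development
print(f"above-ground biomass after 100 days: {biomass / 100:.1f} t ha-1")
```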
NASA Astrophysics Data System (ADS)
Vanwalleghem, T.; Román, A.; Giraldez, J. V.
2016-12-01
There is a need for better understanding of the processes influencing soil formation and the resulting distribution of soil properties. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas are especially uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of a geostatistical versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates explained only 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain variability in soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.
NASA Astrophysics Data System (ADS)
Ghimire, B.; Riley, W. J.; Koven, C. D.; Randerson, J. T.; Mu, M.; Kattge, J.; Rogers, A.; Reich, P. B.
2014-12-01
In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However, mechanistic representations of nitrogen uptake linked to root traits, and of functional nitrogen allocation among the leaf enzymes involved in respiration and photosynthesis, are currently lacking in Earth System models. The linkage between nitrogen availability and plant productivity is simplistically represented by potential photosynthesis rates, which are subsequently downregulated depending on nitrogen supply and other nitrogen consumers in the model (e.g., nitrification). This type of potential photosynthesis rate calculation is problematic for several reasons. Firstly, plants do not photosynthesize at potential rates and then downregulate. Secondly, there is considerable subjectivity in what potential photosynthesis rates mean. Thirdly, it is unclear how these potential photosynthesis rates should be modeled in a changing climate. In addition to these structural issues in representing photosynthesis rates, the role of plant roots in nutrient acquisition has been largely ignored in Earth System models. For example, in CLM4.5, nitrogen uptake is linked to leaf-level processes (primarily productivity) rather than to the root-scale processes involved in nitrogen uptake. We present a new plant model for CLM with an improved mechanistic representation of plant nitrogen uptake based on root-scale Michaelis-Menten kinetics, and stronger linkages between leaf nitrogen and plant productivity inferred from relationships observed in global databases of plant traits (including the TRY database and several individual studies). We also incorporate an improved representation of leaf nitrogen allocation, especially in tropical regions where CLM4.5 simulations significantly over-predict plant growth and productivity. We evaluate our improved global model simulations using the International Land Model Benchmarking (ILAMB) framework. We conclude that mechanistic representation of leaf-level nitrogen allocation and a theoretically consistent treatment of competition with belowground consumers lead to overall improvements in CLM4.5's global carbon cycling predictions.
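The root-scale Michaelis-Menten uptake referred to above has a simple functional core, sketched below; the parameter values and the linear scaling with root biomass are assumptions made for illustration, not the CLM implementation.

```python
def root_n_uptake(n_conc, root_biomass, vmax=2.0e-6, km=50.0e-6):
    """Michaelis-Menten nitrogen uptake scaled by root biomass.
    n_conc       soil mineral N concentration (mol N m-3)
    root_biomass fine-root biomass (g C m-2)
    vmax         maximum uptake rate per unit root (mol N g-1 C s-1), assumed
    km           half-saturation constant (mol N m-3), assumed
    """
    return root_biomass * vmax * n_conc / (km + n_conc)

for n in (5e-6, 50e-6, 500e-6):
    print(f"N = {n:.0e} mol m-3 -> uptake = {root_n_uptake(n, 150.0):.2e} mol N m-2 s-1")
```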
Varma, Manthena V S; Lai, Yurong; Kimoto, Emi; Goosen, Theunis C; El-Kattan, Ayman F; Kumar, Vikas
2013-04-01
Quantitative prediction of complex drug-drug interactions (DDIs) is challenging. Repaglinide is mainly metabolized by cytochrome P450 (CYP) 2C8 and CYP3A4, and is also a substrate of organic anion transporting polypeptide (OATP) 1B1. The purpose of this work was to develop a physiologically based pharmacokinetic (PBPK) model to predict the pharmacokinetics and DDIs of repaglinide. In vitro hepatic transport of repaglinide, gemfibrozil and gemfibrozil 1-O-β-glucuronide was characterized using sandwich-cultured human hepatocytes. A PBPK model, implemented in Simcyp (Sheffield, UK), was developed utilizing in vitro transport and metabolic clearance data. In vitro studies suggested significant active hepatic uptake of repaglinide. The mechanistic model adequately described repaglinide pharmacokinetics, and successfully predicted DDIs with several OATP1B1 and CYP3A4 inhibitors (<10% error). Furthermore, the repaglinide-gemfibrozil interaction at therapeutic doses was closely predicted using the in vitro fraction metabolized by CYP2C8 (0.71), when primarily considering reversible inhibition of OATP1B1 and mechanism-based inactivation of CYP2C8 by gemfibrozil and gemfibrozil 1-O-β-glucuronide. This study demonstrated that hepatic uptake is rate-determining in the systemic clearance of repaglinide. The model quantitatively predicted several repaglinide DDIs, including the complex interactions with gemfibrozil. Both OATP1B1 and CYP2C8 inhibition contribute significantly to the repaglinide-gemfibrozil interaction, and both need to be considered for quantitative rationalization of DDIs with either drug.
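As a rough, hedged illustration of how uptake-transporter and enzyme inhibition combine in such predictions, a static mechanistic approximation can be written down. This is not the Simcyp PBPK model used in the study (which handles dynamics and mechanism-based inactivation explicitly); it treats all inhibition as reversible and assumes uptake and metabolic contributions multiply, a common static simplification. Only the CYP2C8 fraction metabolized (0.71) comes from the abstract; the inhibitor concentrations and Ki values are hypothetical placeholders.

```python
def auc_ratio(fm_cyp, i_u_oatp, ki_oatp, i_u_cyp, ki_cyp):
    """Static estimate of the victim AUC ratio when hepatic uptake (OATP1B1)
    and a fraction fm_cyp of metabolism (CYP2C8) are inhibited reversibly.
    Assumes uptake-limited hepatic clearance and no change in other routes."""
    uptake_term = 1.0 + i_u_oatp / ki_oatp                                  # fold-decrease in uptake clearance
    metab_term = 1.0 / (fm_cyp / (1.0 + i_u_cyp / ki_cyp) + (1.0 - fm_cyp)) # fold-decrease in metabolic clearance
    return uptake_term * metab_term

# fm_CYP2C8 = 0.71 from the abstract; all other numbers are hypothetical.
print(f"predicted AUC ratio: "
      f"{auc_ratio(0.71, i_u_oatp=2.0, ki_oatp=1.0, i_u_cyp=5.0, ki_cyp=1.0):.1f}")
```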
How to make a tree ring: Coupling stem water flow and cambial activity in mature Alpine conifers
NASA Astrophysics Data System (ADS)
Peters, Richard L.; Frank, David C.; Treydte, Kerstin; Steppe, Kathy; Kahmen, Ansgar; Fonti, Patrick
2017-04-01
Inter-annual tree-ring measurements are used to understand tree-growth responses to climatic variability and reconstruct past climate conditions. In parallel, mechanistic models use experimentally defined plant-atmosphere interactions to explain past growth responses and predict future environmental impact on forest productivity. Yet, substantial inconsistencies within mechanistic model ensembles and mismatches with empirical data indicate that significant progress is still needed to understand the processes occurring at an intra-annual resolution that drive annual growth. However, challenges arise due to i) few datasets describing climatic responses of high-resolution physiological processes over longer time-scales, ii) uncertainties on the main mechanistic process limiting radial stem growth and iii) complex interactions between multiple environmental factors which obscure detection of the main stem growth driver, generating a gap between our understanding of intra- and inter-annual growth mechanisms. We attempt to bridge the gap between inter-annual tree-ring width and sub-daily radial stem-growth and provide a mechanistic perspective on how environmental conditions affect physiological processes that shape tree rings in conifers. We combine sub-hourly sap flow and point dendrometer measurements performed on mature Alpine conifers (Larix decidua) into an individual-based mechanistic tree-growth model to simulate sub-hourly cambial activity. The monitored trees are located along a high elevational transect in the Swiss Alps (Lötschental) to analyse the effect of increasing temperature. The model quantifies internal tree hydraulic pathways that regulate the turgidity within the cambial zone and induce cell enlargement for radial growth. The simulations are validated against intra-annual growth patterns derived from xylogenesis data and anatomical analyses. Our efforts advance the process-based understanding of how climate shapes the annual tree-ring structures and could potentially improve our ability to reconstruct the climate of the past and predict future growth under changing climate.
Wang, Gang; Briskot, Till; Hahn, Tobias; Baumann, Pascal; Hubbuch, Jürgen
2017-03-03
Mechanistic modeling has been applied repeatedly and successfully in process development and control of protein chromatography. For each combination of adsorbate and adsorbent, the mechanistic models have to be calibrated. Some of the model parameters, such as system characteristics, can be determined reliably by applying well-established experimental methods, whereas others cannot be measured directly. In common practice of protein chromatography modeling, these parameters are identified by applying time-consuming methods such as frontal analysis combined with gradient experiments, curve-fitting, or the combined Yamamoto approach. For new components in the chromatographic system, these traditional calibration approaches must be repeated. In the presented work, a novel method for the calibration of mechanistic models based on artificial neural network (ANN) modeling was applied. An in silico screening of possible model parameter combinations was performed to generate learning material for the ANN model. Once the ANN model was trained to recognize chromatograms and to respond with the corresponding model parameter set, it was used to calibrate the mechanistic model from measured chromatograms. The ANN model's capability for parameter estimation was tested by predicting gradient elution chromatograms. The time-consuming model parameter estimation step itself could be reduced to milliseconds. The functionality of the method was successfully demonstrated in a study with the calibration of the transport-dispersive model (TDM) and the stoichiometric displacement model (SDM) for a protein mixture. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
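The workflow, simulate chromatograms across an in silico screen of parameter combinations, train an ANN to map chromatogram to parameters, then apply it to measured data, can be sketched with a toy forward model. The Gaussian-peak "chromatogram", parameter ranges, and network size below are assumptions; the study itself calibrates the transport-dispersive and stoichiometric displacement models, not this toy.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
t = np.linspace(0, 60, 200)                      # elution time grid (min)

def simulate_chromatogram(k_eq, n_plates):
    """Toy forward model: one Gaussian peak whose position scales with an
    'equilibrium constant' and whose width shrinks with plate number."""
    t_r = 5.0 + 10.0 * k_eq                      # retention time
    sigma = t_r / np.sqrt(n_plates)
    return np.exp(-0.5 * ((t - t_r) / sigma) ** 2)

# In silico screening of parameter combinations -> learning material for the ANN.
params = np.column_stack([rng.uniform(0.5, 4.0, 2000),    # k_eq
                          rng.uniform(50, 500, 2000)])    # plate count
X = np.array([simulate_chromatogram(k, n) for k, n in params])

ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
ann.fit(X, params)

# "Measured" chromatogram with noise; the ANN returns a calibrated parameter set.
true = (2.3, 180.0)
measured = simulate_chromatogram(*true) + rng.normal(0, 0.01, t.size)
print("estimated (k_eq, plates):", ann.predict(measured[None, :])[0], "true:", true)
```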
Simulating polar bear energetics during a seasonal fast using a mechanistic model.
Mathewson, Paul D; Porter, Warren P
2013-01-01
In this study we tested the ability of a mechanistic model (Niche Mapper™) to accurately model adult, non-denning polar bear (Ursus maritimus) energetics while fasting during the ice-free season in the western Hudson Bay. The model uses a steady state heat balance approach, which calculates the metabolic rate that will allow an animal to maintain its core temperature in its particular microclimate conditions. Predicted weight loss for a 120 day fast typical of the 1990s was comparable to empirical studies of the population, and the model was able to reach a heat balance at the target metabolic rate for the entire fast, supporting use of the model to explore the impacts of climate change on polar bears. Niche Mapper predicted that all but the poorest condition bears would survive a 120 day fast under current climate conditions. When the fast extended to 180 days, Niche Mapper predicted mortality of up to 18% for males. Our results illustrate how environmental conditions, variation in animal properties, and thermoregulation processes may impact survival during extended fasts because polar bears were predicted to require additional energetic expenditure for thermoregulation during a 180 day fast. A uniform 3°C temperature increase reduced male mortality during a 180 day fast from 18% to 15%. Niche Mapper explicitly links an animal's energetics to environmental conditions and thus can be a valuable tool to help inform predictions of climate-related population changes. Since Niche Mapper is a generic model, it can make energetic predictions for other species threatened by climate change.
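A drastically simplified sketch of the steady-state heat-balance logic (not Niche Mapper itself) solves for the metabolic heat production that balances dry heat loss through fur and boundary-layer resistances; the geometry, insulation, and environmental values are assumptions made for illustration, not measured polar bear properties.

```python
def required_metabolic_rate(t_core, t_air, area_m2, fur_resistance, boundary_resistance):
    """Steady state: metabolic heat production equals dry heat loss through fur
    and the air boundary layer, treated as thermal resistances in series (W)."""
    total_resistance = fur_resistance + boundary_resistance   # K m2 W-1
    return area_m2 * (t_core - t_air) / total_resistance

# Illustrative values only.
for t_air in (-10.0, 0.0, 10.0, 20.0):
    m = required_metabolic_rate(t_core=37.0, t_air=t_air, area_m2=3.0,
                                fur_resistance=0.9, boundary_resistance=0.1)
    print(f"T_air = {t_air:5.1f} C -> metabolic heat needed ~ {m:6.1f} W")
```

When the heat production needed for balance exceeds the target fasting metabolic rate, the shortfall represents extra thermoregulatory expenditure, which is the mechanism behind the predicted additional energetic cost during a 180 day fast.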
Dose selection based on physiologically based pharmacokinetic (PBPK) approaches.
Jones, Hannah M; Mayawala, Kapil; Poulin, Patrick
2013-04-01
Physiologically based pharmacokinetic (PBPK) models are built using differential equations to describe the physiology/anatomy of different biological systems. Readily available in vitro and in vivo preclinical data can be incorporated into these models to not only estimate pharmacokinetic (PK) parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. They provide a mechanistic framework to understand and extrapolate PK and dose across in vitro and in vivo systems and across different species, populations and disease states. Using small molecule and large molecule examples from the literature and our own company, we have shown how PBPK techniques can be utilised for human PK and dose prediction. Such approaches have the potential to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising, however some limitations need to be addressed to realise its application and utility more broadly.
STOCHASTIC SIMULATION OF FIELD-SCALE PESTICIDE TRANSPORT USING OPUS AND GLEAMS
Incorporating variability in soil and chemical properties into root zone leaching models should provide a better representation of pollutant distribution in natural field conditions. Our objective was to determine if a more mechanistic rate-based model (Opus) would predict soil w...
Bioinformatics, or in silico biology, is a rapidly growing field that encompasses the theory and application of computational approaches to model, predict, and explain biological function at the molecular level. This information-rich field requires new ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J; Gehl, S M
1979-01-01
GRASS-SST and FASTGRASS are mechanistic computer codes for predicting fission-gas behavior in UO2-base fuels during steady-state and transient conditions. FASTGRASS was developed in order to satisfy the need for a fast-running alternative to GRASS-SST. Although based on GRASS-SST, FASTGRASS is approximately an order of magnitude quicker in execution. The GRASS-SST transient analysis has evolved through comparisons of code predictions with the fission-gas release and physical phenomena that occur during reactor operation and transient direct-electrical-heating (DEH) testing of irradiated light-water reactor fuel. The FASTGRASS calculational procedure is described in this paper, along with models of key physical processes included in both FASTGRASS and GRASS-SST. Predictions of fission-gas release obtained from GRASS-SST and FASTGRASS analyses are compared with experimental observations from a series of DEH tests. The major conclusion is that the computer codes should include an improved model for the evolution of the grain-edge porosity.
Cremer, Jonas; Arnoldini, Markus; Hwa, Terence
2017-06-20
The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth, which ultimately dictates microbiota composition. Combining measurements of bacterial physiology with analysis of published data on human physiology into a quantitative, comprehensive modeling framework, we show how water flow in the colon, in concert with other physiological factors, determines the abundances of the major bacterial phyla. Mechanistically, our model shows that local pH values in the lumen, which differentially affect the growth of different bacteria, drive changes in microbiota composition. It identifies key factors influencing the delicate regulation of colonic pH, including epithelial water absorption, nutrient inflow, and luminal buffering capacity, and generates testable predictions on their effects. Our findings show that a predictive and mechanistic understanding of microbial ecology in the gut is possible. Such predictive understanding is needed for the rational design of intervention strategies to actively control the microbiota.
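The differential effect of luminal pH on growth can be illustrated with a standard cardinal-pH growth function applied to two hypothetical taxa; the cardinal values and maximum rates below are assumptions for illustration, not fitted parameters from the study.

```python
def cardinal_ph_growth(ph, ph_min, ph_opt, ph_max, mu_max):
    """Cardinal-pH model (Rosso-type): growth rate rises from zero at ph_min,
    peaks at ph_opt, and returns to zero at ph_max."""
    if ph <= ph_min or ph >= ph_max:
        return 0.0
    num = (ph - ph_min) * (ph - ph_max)
    return mu_max * num / (num - (ph - ph_opt) ** 2)

# Two hypothetical taxa: one more acid-tolerant than the other.
taxa = {"acid-tolerant taxon": dict(ph_min=4.5, ph_opt=6.0, ph_max=7.8, mu_max=0.70),
        "acid-sensitive taxon": dict(ph_min=5.5, ph_opt=7.0, ph_max=8.5, mu_max=0.75)}

for lumen_ph in (5.7, 6.2, 6.7, 7.2):
    rates = {name: cardinal_ph_growth(lumen_ph, **p) for name, p in taxa.items()}
    favoured = max(rates, key=rates.get)
    summary = ", ".join(f"{n} {r:.2f}/h" for n, r in rates.items())
    print(f"lumen pH {lumen_ph}: {summary} -> favoured: {favoured}")
```

In this toy setting a small shift in luminal pH flips which taxon grows faster, the qualitative mechanism the abstract proposes for composition changes.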
Identifiability, reducibility, and adaptability in allosteric macromolecules.
Bohner, Gergő; Venkataraman, Gaurav
2017-05-01
The ability of macromolecules to transduce stimulus information at one site into conformational changes at a distant site, termed "allostery," is vital for cellular signaling. Here, we propose a link between the sensitivity of allosteric macromolecules to their underlying biophysical parameters, the interrelationships between these parameters, and macromolecular adaptability. We demonstrate that the parameters of a canonical model of the mSlo large-conductance Ca2+-activated K+ (BK) ion channel are non-identifiable with respect to the equilibrium open probability-voltage relationship, a common functional assay. We construct a reduced model with emergent parameters that are identifiable and expressed as combinations of the original mechanistic parameters. These emergent parameters indicate which coordinated changes in mechanistic parameters can leave assay output unchanged. We predict that these coordinated changes are used by allosteric macromolecules to adapt, and we demonstrate how this prediction can be tested experimentally. We show that these predicted parameter compensations are used in the first reported allosteric phenomenon: the Bohr effect, by which hemoglobin adapts to varying pH. © 2017 Bohner and Venkataraman.
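The flavour of the non-identifiability result can be shown with a deliberately trivial toy (not the mSlo BK-channel model analyzed in the paper): a two-state voltage gate whose zero-voltage equilibrium constant is the product of two mechanistic parameters, so only that product acts as an identifiable "emergent" parameter in the open probability-voltage assay.

```python
import numpy as np

R, T, F = 8.314, 298.0, 96485.0  # gas constant, temperature (K), Faraday constant

def p_open(v_mv, z, c_intrinsic, coupling):
    """Toy two-state voltage gate. The zero-voltage equilibrium constant is the
    product c_intrinsic * coupling, so the P_open(V) curve constrains only that
    product (plus the gating charge z), not the two factors separately."""
    L = c_intrinsic * coupling * np.exp(z * F * (v_mv * 1e-3) / (R * T))
    return L / (1.0 + L)

v = np.linspace(-100, 200, 7)
curve_a = p_open(v, z=1.2, c_intrinsic=1e-4, coupling=10.0)   # product = 1e-3
curve_b = p_open(v, z=1.2, c_intrinsic=1e-2, coupling=0.1)    # same product
print(np.allclose(curve_a, curve_b))  # True: different mechanistic parameters, identical assay output
```

Coordinated changes that hold the emergent product fixed leave the assay unchanged, which is the kind of parameter compensation the paper proposes macromolecules exploit to adapt.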
Robust PBPK/PD-Based Model Predictive Control of Blood Glucose.
Schaller, Stephan; Lippert, Jorg; Schaupp, Lukas; Pieber, Thomas R; Schuppert, Andreas; Eissing, Thomas
2016-07-01
Automated glucose control (AGC) has not yet reached the point where it can be applied clinically [3]. Challenges are the accuracy of subcutaneous (SC) glucose sensors, physiological lag times, and both inter- and intraindividual variability. To address these issues, we developed a novel scheme for model predictive control (MPC) that can be applied to AGC. An individualizable, generic whole-body physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) model of glucose, insulin, and glucagon metabolism has been used as the predictive kernel. The high level of mechanistic detail represented by the model takes full advantage of the potential of MPC and may make long-term prediction possible, as it captures at least some relevant sources of variability [4]. Robustness against uncertainties was increased by a control cascade relying on proportional-integral-derivative (PID)-based offset control. The performance of this AGC scheme was evaluated in silico and retrospectively using data from clinical trials. This analysis revealed that our approach handles sensor noise with a MARD of 10%-14%, as well as model uncertainties and disturbances. The results suggest that PBPK/PD models are well suited for MPC in a glucose control setting, and that their predictive power, in combination with the integrated database-driven (a priori individualizable) model framework, will help overcome current challenges in the development of AGC systems. This study provides a new, generic, and robust mechanistic approach to AGC using a PBPK platform with extensive a priori (database) knowledge for individualization.
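The offset-control layer of the cascade can be sketched as a standard discrete-time PID correction applied on top of the model-predicted insulin rate; this is a generic textbook controller with assumed gains and sampling interval, not the published implementation.

```python
class PIDOffsetController:
    """Discrete PID correction to the model-predicted insulin infusion rate,
    driven by the offset between measured and target glucose."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def correction(self, glucose_measured, glucose_target):
        error = glucose_measured - glucose_target          # mg/dL
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Assumed gains and 5-minute sampling; positive correction = more insulin.
pid = PIDOffsetController(kp=0.02, ki=0.001, kd=0.05, dt=5.0)
for g in (180, 170, 155, 140, 120):
    print(f"glucose {g} mg/dL -> insulin-rate correction {pid.correction(g, 110):+.2f} U/h")
```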
Freund, Anat; Drach-Zahavy, Anat
2007-06-01
Teamwork in community clinics was examined to propose and test a model that views the different kinds of commitment (job involvement and organizational commitment) and the potential conflict between them, as mediators between personal and organizational factors (mechanistic structuring and organic structuring) and the effectiveness of interprofessional teamwork. Differences among the professional groups became evident with regard to their views of the goals of teamwork and the ways to achieve them. As for mechanistic structuring, although the clinic members saw their mechanistic structuring in a more bureaucratic sense, the combination of mechanistic structuring and organic structuring led to effective teamwork. In terms of commitment, while staff members were committed primarily to their job and not the organization, commitment to the organization produced effective teamwork in the clinics.
USDA-ARS?s Scientific Manuscript database
In order to control algal blooms, stressor-response relationships between water quality metrics, environmental variables, and algal growth should be understood and modeled. Machine-learning methods were suggested to express stressor-response relationships found by application of mechanistic water qu...
DOT National Transportation Integrated Search
2001-02-01
A new version of the CRCP computer program, CRCP-9, has been developed in this study. The numerical model of the CRC pavements was developed using finite element theories, the crack spacing prediction model was developed using the Monte Carlo method,...
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predic...
Mechanistic Enzyme Models: Pyridoxal and Metal Ions.
ERIC Educational Resources Information Center
Hamilton, S. E.; And Others
1984-01-01
Background information, procedures, and results are presented for experiments on the pyridoxal/metal ion model system. These experiments illustrate catalysis through Schiff's base formation between aldehydes/ketones and primary amines, catalysis by metal ions, and the predictable manner in which metal ions inhibit or catalyze reactions. (JN)
Defence mechanisms: the role of physiology in current and future environmental protection paradigms
Glover, Chris N
2018-01-01
Ecological risk assessments principally rely on simplified metrics of organismal sensitivity that do not consider mechanism or biological traits. As such, they are unable to adequately extrapolate from standard laboratory tests to real-world settings, and largely fail to account for the diversity of organisms and environmental variables that occur in natural environments. However, an understanding of how stressors influence organism health can compensate for these limitations. Mechanistic knowledge can be used to account for species differences in basal biological function and variability in environmental factors, including spatial and temporal changes in the chemical, physical and biological milieu. Consequently, physiological understanding of biological function, and how this is altered by stressor exposure, can facilitate proactive, predictive risk assessment. In this perspective article, existing frameworks that utilize physiological knowledge (e.g. biotic ligand models, adverse outcome pathways and mechanistic effect models) are outlined, and specific examples of how mechanistic understanding has been used to predict risk are highlighted. Future research approaches and data needs for extending the incorporation of physiological information into ecological risk assessments are discussed. Although the review focuses on chemical toxicants in aquatic systems, physical and biological stressors and terrestrial environments are also briefly considered. PMID:29564135
Fugit, Kyle D; Anderson, Bradley D
2017-04-01
Actively loaded liposomal formulations of anticancer agents have been widely explored due to their high drug encapsulation efficiencies and prolonged drug retention. Mathematical models to predict and optimize drug loading and release kinetics from these nanoparticle formulations would be useful in their development and may allow researchers to tune release profiles. Such models must account for the driving forces as influenced by the physicochemical properties of the drug and the microenvironment, and the liposomal barrier properties. This study employed mechanistic modeling to describe the active liposomal loading and release kinetics of the anticancer agent topotecan (TPT). The model incorporates ammonia transport resulting in generation of a pH gradient, TPT dimerization, TPT lactone ring-opening and -closing interconversion kinetics, chloride transport, and transport of TPT-chloride ion-pairs to describe the active loading and release kinetics of TPT in the presence of varying chloride concentrations. Model-based predictions of the kinetics of active loading at varying loading concentrations of TPT and release under dynamic dialysis conditions were in reasonable agreement with experiments. These findings identify key attributes to consider in optimizing and predicting loading and release of liposomal TPT that may also be applicable to liposomal formulations of other weakly basic pharmaceuticals. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
How adverse outcome pathways can aid the development and ...
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. The present manuscript reports on expert opinion and case studies that came out of a European Commission, Joint Research Centre-sponsored work
On Spatially Explicit Models of Epidemic and Endemic Cholera: The Haiti and Lake Kivu Case Studies.
NASA Astrophysics Data System (ADS)
Rinaldo, A.; Bertuzzo, E.; Mari, L.; Finger, F.; Casagrandi, R.; Gatto, M.; Rodriguez-Iturbe, I.
2014-12-01
The first part of the Lecture deals with the predictive ability of mechanistic models for the Haitian cholera epidemic. Predictive models of epidemic cholera need to resolve at suitable aggregation levels spatial data pertaining to local communities, epidemiological records, hydrologic drivers, waterways, patterns of human mobility and proxies of exposure rates. A formal model comparison framework provides a quantitative assessment of the explanatory and predictive abilities of various model settings with different spatial aggregation levels. Intensive computations and objective model comparisons show that parsimonious spatially explicit models accounting for spatial connections have greater explanatory power than spatially disconnected ones for short- to intermediate-length calibration windows. In general, spatially connected models show better predictive ability than disconnected ones. We suggest limits and validity of the various approaches and discuss the pathway towards the development of case-specific predictive tools in the context of emergency management. The second part deals with approaches suitable to describe patterns of endemic cholera. Cholera outbreaks have been reported in the Democratic Republic of the Congo since the 1970s. Here we employ a spatially explicit, inhomogeneous Markov chain model to describe cholera incidence in eight health zones on the shore of Lake Kivu. Remotely sensed datasets of chlorophyll a concentration in the lake, precipitation and indices of global climate anomalies are used as environmental drivers in addition to baseline seasonality. The effect of human mobility is also modelled mechanistically. We test several models on a multi-year dataset of reported cholera cases. Fourteen models, accounting for different environmental drivers, are selected in calibration. Among these, the one accounting for seasonality, El Niño-Southern Oscillation, precipitation and human mobility outperforms the others in cross-validation.
The practice of prediction: What can ecologists learn from applied, ecology-related fields?
Pennekamp, Frank; Adamson, Matthew; Petchey, Owen L; Poggiale, Jean-Christophe; Aguiar, Maira; Kooi, Bob W.; Botkin, Daniel B.; DeAngelis, Donald L.
2017-01-01
The pervasive influence of human-induced global environmental change affects biodiversity across the globe, and there is great uncertainty as to how the biosphere will react on short and longer time scales. To adapt to what the future holds and to manage the impacts of global change, scientists need to predict the expected effects with some confidence and communicate these predictions to policy makers. However, recent reviews found that we currently lack a clear understanding of how predictable ecology is, with views ranging from mostly unpredictable to potentially predictable, at least over short time frames. However, in applied, ecology-related fields predictions are more commonly formulated and reported, as well as evaluated in hindsight, potentially allowing one to define baselines of predictive proficiency in these fields. We searched the literature for representative case studies in these fields and collected information about modeling approaches, target variables of prediction, predictive proficiency achieved, as well as the availability of data to parameterize predictive models. We find that some fields such as epidemiology achieve high predictive proficiency, but even in the more predictive fields proficiency is evaluated in different ways. Both phenomenological and mechanistic approaches are used in most fields, but differences are often small, with no clear superiority of one approach over the other. Data availability is limiting in most fields, with long-term studies being rare and detailed data for parameterizing mechanistic models being in short supply. We suggest that ecologists adopt a more rigorous approach to report and assess predictive proficiency, and embrace the challenges of real-world decision making to strengthen the practice of prediction in ecology.
James A. Powell; Barbara J. Bentz
2014-01-01
For species with irruptive population behavior, dispersal is an important component of outbreak dynamics. We developed and parameterized a mechanistic model describing mountain pine beetle (Dendroctonus ponderosae Hopkins) population demographics and dispersal across a landscape. Model components include temperature-dependent phenology, host tree colonization...
ACCUMULATION OF PBDE-47 IN PRIMARY CULTURES OF RAT NEOCORTICAL CELLS.
Cell culture models are often used in mechanistic studies of toxicant action. However, one area of uncertainty is the extrapolation of dose from the in vitro model to the in vivo tissue. A common assumption of in vitro studies is that media concentration is a predictive marker of...
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...
Buyukozturk, Fulden; Di Maio, Selena; Budil, David E.; Carrier, Rebecca L.
2014-01-01
Purpose: To mechanistically study and model the effect of lipids, either from food or self-emulsifying drug delivery systems (SEDDS), on drug transport in the intestinal lumen. Methods: Simultaneous lipid digestion, dissolution/release, and drug partitioning were experimentally studied and modeled for two dosing scenarios: solid drug with a food-associated lipid (soybean oil) and drug solubilized in a model SEDDS (soybean oil and Tween 80 at 1:1 ratio). Rate constants for digestion, permeability of emulsion droplets, and partition coefficients in micellar and oil phases were measured and used to numerically solve the developed model. Results: A strong influence of lipid digestion on drug release from SEDDS and on solid drug dissolution into the food-associated lipid emulsion was observed and predicted by the developed model. Ninety minutes after introduction of SEDDS, there was 9% and 70% drug release in the absence and presence of digestion, respectively. However, overall drug dissolution in the presence of food-associated lipids occurred over a longer period than without digestion. Conclusion: A systems-based mechanistic model incorporating the simultaneous dynamic processes occurring upon dosing of drug with lipids enabled prediction of the aqueous drug concentration profile. This model, once incorporated with a pharmacokinetic model considering processes of drug absorption and drug lymphatic transport in the presence of lipids, could be highly useful for quantitative prediction of the impact of lipids on bioavailability of drugs. PMID:24234918
Rodríguez, Sylian; Almquist, Catherine; Lee, Tai Gyu; Furuuchi, Masami; Hedrick, Elizabeth; Biswas, Pratim
2004-02-01
A mechanistic model to predict the capture of gas-phase mercury (Hg) species using in situ-generated nanosized titania particles activated by UV irradiation is developed. The model is an extension of a recently reported model for photochemical reactions by Almquist and Biswas that accounts for the rates of electron-hole pair generation, the adsorption of the compound to be oxidized, and the adsorption of water vapor. The role of water vapor in the removal efficiency of Hg was investigated to evaluate the rates of Hg oxidation at different water vapor concentrations. As the water vapor concentration is increased, more hydroxyl radical species are generated on the surface of the titania particles, increasing the number of active sites for the photooxidation and capture of Hg. At very high water vapor concentrations, competitive adsorption is expected to become important and reduce the number of sites available for photooxidation of Hg. The predictions of the developed phenomenological model agreed well with the measured Hg oxidation rates in this study and with the data on oxidation of organic compounds reported in the literature.
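The competing roles of water vapour described above (more surface hydroxyl radicals at low-to-moderate humidity, competitive adsorption at high humidity) can be captured with a Langmuir-Hinshelwood-type rate expression; the form and constants below are an illustrative assumption, not the exact published model.

```python
import numpy as np

def hg_oxidation_rate(c_hg, c_w, k=1.0, k_hg=2.0, k_w=0.05):
    """Langmuir-Hinshelwood-type rate: proportional to the surface coverage of
    mercury and of water-derived hydroxyl radicals, with both species competing
    for the same adsorption sites on the UV-activated titania surface."""
    denom = 1.0 + k_hg * c_hg + k_w * c_w
    theta_hg = k_hg * c_hg / denom
    theta_oh = k_w * c_w / denom
    return k * theta_hg * theta_oh

c_w = np.linspace(0, 200, 9)   # arbitrary water-vapour concentration units
for cw, rate in zip(c_w, hg_oxidation_rate(0.5, c_w)):
    print(f"C_water = {cw:6.1f} -> relative Hg oxidation rate = {rate:.3f}")
```

Printing the rates shows the non-monotonic behaviour the abstract describes: the rate first rises with water vapour concentration and then falls once competitive adsorption dominates.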
Indirect Effects of Environmental Change in Resource Competition Models.
Kleinhesselink, Andrew R; Adler, Peter B
2015-12-01
Anthropogenic environmental change can affect species directly by altering physiological rates or indirectly by changing competitive outcomes. The unknown strength of competition-mediated indirect effects makes it difficult to predict species abundances in the face of ongoing environmental change. Theory developed with phenomenological competition models shows that indirect effects are weak when coexistence is strongly stabilized, but these models lack a mechanistic link between environmental change and species performance. To extend existing theory, we examined the relationship between coexistence and indirect effects in mechanistic resource competition models. We defined environmental change as a change in resource supply points and quantified the resulting competition-mediated indirect effects on species abundances. We found that the magnitude of indirect effects increases in proportion to niche overlap. However, indirect effects also depend on differences in how competitors respond to the change in resource supply, an insight hidden in nonmechanistic models. Our analysis demonstrates the value of using niche overlap to predict the strength of indirect effects and clarifies the types of indirect effects that global change can have on competing species.
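One way to make the direct/indirect decomposition concrete is a minimal MacArthur-style consumer-resource simulation in which "environmental change" is a shift in one resource supply point. The parameter values are illustrative, and "direct" is operationalized here simply as the abundance change the focal consumer would show with its competitor absent; this is a sketch of the general model class, not the specific models analyzed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two consumers, two substitutable resources with chemostat-like supply.
C = np.array([[0.5, 0.25],     # consumption rates: row = consumer, col = resource
              [0.25, 0.5]])
m = np.array([0.2, 0.2])       # consumer mortality rates
r = 0.5                        # resource renewal rate

def dynamics(t, y, supply, present):
    n, R = y[:2] * present, y[2:]
    dn = n * (C @ R - m)                    # consumer growth on resource intake
    dR = r * (supply - R) - R * (C.T @ n)   # resource renewal minus consumption
    return np.concatenate([dn, dR])

def equilibrium_abundance(supply, present=(1, 1)):
    sol = solve_ivp(dynamics, (0, 3000), [0.5, 0.5, 1.0, 1.0],
                    args=(np.asarray(supply, float), np.asarray(present, float)),
                    rtol=1e-8, atol=1e-10)
    return sol.y[:2, -1]

old, new = [1.0, 1.0], [1.0, 0.6]   # environmental change: resource-2 supply point drops
total = equilibrium_abundance(new)[0] - equilibrium_abundance(old)[0]
alone = equilibrium_abundance(new, (1, 0))[0] - equilibrium_abundance(old, (1, 0))[0]
print(f"change in consumer 1: total {total:+.3f}, without competitor {alone:+.3f}, "
      f"competition-mediated (indirect) {total - alone:+.3f}")
```

In this particular (assumed) parameterization the supply change harms the focal consumer when its competitor is held out of the system, but the same change excludes the competitor, so the total effect on the focal consumer is positive; the difference between the two is the competition-mediated indirect effect.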
Mechanistic species distribution modeling reveals a niche shift during invasion.
Chapman, Daniel S; Scalone, Romain; Štefanić, Edita; Bullock, James M
2017-06-01
Niche shifts of nonnative plants can occur when they colonize novel climatic conditions. However, the mechanistic basis for niche shifts during invasion is poorly understood and has rarely been captured within species distribution models. We quantified the consequence of between-population variation in phenology for invasion of common ragweed (Ambrosia artemisiifolia L.) across Europe. Ragweed is of serious concern because of its harmful effects as a crop weed and because of its impact on public health as a major aeroallergen. We developed a forward mechanistic species distribution model based on responses of ragweed development rates to temperature and photoperiod. The model was parameterized and validated from the literature and by reanalyzing data from a reciprocal common garden experiment in which native and invasive populations were grown within and beyond the current invaded range. It could therefore accommodate between-population variation in the physiological requirements for flowering, and predict the potentially invaded ranges of individual populations. Northern-origin populations that were established outside the generally accepted climate envelope of the species had lower thermal requirements for bud development, suggesting local adaptation of phenology had occurred during the invasion. The model predicts that this will extend the potentially invaded range northward and increase the average suitability across Europe by 90% in the current climate and 20% in the future climate. Therefore, trait variation observed at the population scale can trigger a climatic niche shift at the biogeographic scale. For ragweed, earlier flowering phenology in established northern populations could allow the species to spread beyond its current invasive range, substantially increasing its risk to agriculture and public health. Mechanistic species distribution models offer the possibility to represent niche shifts by varying the traits and niche responses of individual populations. Ignoring such effects could substantially underestimate the extent and impact of invasions. © 2017 by the Ecological Society of America.
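A stripped-down version of the development logic (thermal-time accumulation gated by a photoperiod cue, with the bud-development threshold varying between populations) illustrates how a lower thermal requirement can shift the potentially invaded range; the thresholds, base temperature, photoperiod cue, and synthetic weather below are assumptions, not the fitted model.

```python
import math

def day_length_hours(doy, lat_deg):
    """Approximate day length from solar declination (adequate for illustration)."""
    decl = math.radians(23.44) * math.sin(2 * math.pi * (doy - 81) / 365.0)
    cos_h = -math.tan(math.radians(lat_deg)) * math.tan(decl)
    cos_h = min(1.0, max(-1.0, cos_h))
    return 24.0 * math.acos(cos_h) / math.pi

def flowers_before_frost(lat_deg, dd_requirement, t_base=5.0, frost_doy=280):
    """Accumulate degree-days for bud development only after the short-day cue
    (post-solstice day length below 15 h); return True if the population's
    thermal requirement is met before an assumed first autumn frost."""
    dd = 0.0
    for doy in range(100, frost_doy):
        t_mean = 12.0 + 8.0 * math.sin(2 * math.pi * (doy - 105) / 365.0) - 0.25 * (lat_deg - 45.0)
        if doy > 172 and day_length_hours(doy, lat_deg) < 15.0:
            dd += max(0.0, t_mean - t_base)
        if dd >= dd_requirement:
            return True
    return False

# Hypothetical thermal requirements: northern-origin populations need fewer degree-days.
for lat in (45, 50, 55, 60):
    print(f"lat {lat}N  southern-origin: {flowers_before_frost(lat, 800.0)}"
          f"  northern-origin: {flowers_before_frost(lat, 450.0)}")
```

In this toy setting, populations with a lower thermal requirement complete development before frost at higher latitudes, which is the qualitative mechanism behind the predicted northward range extension.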
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice
2017-02-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenquist, Ian; Tonks, Michael
2016-10-01
Light water reactor fuel pellets are fabricated using sintering to final densities of 95% or greater. During reactor operation, the porosity remaining in the fuel after fabrication decreases further due to irradiation-assisted densification. While empirical models have been developed to describe this densification process, a mechanistic model is needed as part of the ongoing work by the NEAMS program to develop a more predictive fuel performance code. In this work we will develop a phase field model of sintering of UO2 in the MARMOT code, and validate it by comparing to published sintering data. We will then add the capability to capture irradiation effects into the model, and use it to develop a mechanistic model of densification that will go into the BISON code and add another essential piece to the microstructure-based materials models. The final step will be to add the effects of applied fields, to model field-assisted sintering of UO2. The results of the phase field model will be validated by comparing to data from field-assisted sintering. Tasks over three years: (1) develop a sintering model for UO2 in MARMOT; (2) expand the model to account for irradiation effects; (3) develop a mechanistic macroscale model of densification for BISON.
Understanding the effect of carbon status on stem diameter variations
De Swaef, Tom; Driever, Steven M.; Van Meulebroek, Lieven; Vanhaecke, Lynn; Marcelis, Leo F. M.; Steppe, Kathy
2013-01-01
Background Carbon assimilation and leaf-to-fruit sugar transport are, along with plant water status, the driving mechanisms for fruit growth. An integrated comprehension of the plant water and carbon relationships is therefore essential to better understand water and dry matter accumulation. Variations in stem diameter result from an integrated response to plant water and carbon status and are as such a valuable source of information. Methods A mechanistic water flow and storage model was used to relate variations in stem diameter to phloem sugar loading and sugar concentration dynamics in tomato. The simulation results were compared with an independent model, simulating phloem sucrose loading at the leaf level based on photosynthesis and sugar metabolism kinetics and enabled a mechanistic interpretation of the ‘one common assimilate pool’ concept for tomato. Key Results Combining stem diameter variation measurements and mechanistic modelling allowed us to distinguish instantaneous dynamics in the plant water relations and gradual variations in plant carbon status. Additionally, the model combined with stem diameter measurements enabled prediction of dynamic variables which are difficult to measure in a continuous and non-destructive way, such as xylem water potential and phloem hydrostatic potential. Finally, dynamics in phloem sugar loading and sugar concentration were distilled from stem diameter variations. Conclusions Stem diameter variations, when used in mechanistic models, have great potential to continuously monitor and interpret plant water and carbon relations under natural growing conditions. PMID:23186836
Van Bockstal, Pieter-Jan; Mortier, Séverine Thérèse F C; De Meyer, Laurens; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2017-05-01
Conventional pharmaceutical freeze-drying is an inefficient and expensive batch-wise process, associated with several disadvantages leading to uncontrolled end-product variability. The proposed continuous alternative, based on spinning the vials during freezing and on optimal energy supply during drying, strongly increases process efficiency and improves product quality (uniformity). The heat transfer during continuous drying of the spin-frozen vials is provided via non-contact infrared (IR) radiation. The energy transfer to the spin-frozen vials should be optimised to maximise the drying efficiency while avoiding cake collapse. Therefore, a mechanistic model was developed which allows computing the optimal, dynamic IR heater temperature as a function of primary drying progress and which, hence, also allows predicting the primary drying endpoint based on the applied dynamic IR heater temperature. The model was validated by drying spin-frozen vials containing the model formulation (3.9 mL in 10R vials) according to the computed IR heater temperature profile. In total, 6 validation experiments were conducted. The primary drying endpoint was experimentally determined via in-line near-infrared (NIR) spectroscopy and compared with the endpoint predicted by the model (50 min). The mean ratio of the experimental drying time to the predicted value was 0.91, indicating good agreement between the model predictions and the experimental data. The end product had an elegant product appearance (visual inspection) and an acceptable residual moisture content (Karl Fischer). Copyright © 2017 Elsevier B.V. All rights reserved.
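The core energy balance behind a computed IR heater temperature can be illustrated with a single radiative-exchange equation: the heater temperature is chosen so that the radiant power absorbed by the spin-frozen product just sustains a target sublimation rate. The geometry, emissivity, product temperature, and the simplification of treating the 3.9 mL fill as ice removed at a constant rate over the 50 min predicted endpoint are all assumptions for illustration, not the published model.

```python
SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W m-2 K-4
DH_SUB = 2.84e6      # latent heat of ice sublimation, J kg-1

def required_heater_temperature(m_dot_kg_s, t_product_k, area_m2, emissivity):
    """Heater temperature at which net radiative exchange with the product
    equals the heat consumed by sublimation (grey-body exchange, view factor 1)."""
    q_needed = m_dot_kg_s * DH_SUB                            # W of sublimation heat
    t4 = q_needed / (SIGMA * emissivity * area_m2) + t_product_k ** 4
    return t4 ** 0.25

m_dot = 3.9e-3 / (50 * 60)                                    # assumed constant rate, kg s-1
t_heater = required_heater_temperature(m_dot, t_product_k=243.0,
                                       area_m2=3e-3, emissivity=0.9)
print(f"required IR heater temperature ~ {t_heater - 273.15:.0f} C")
```

In the actual model the sublimation rate and dried-layer resistance evolve during drying, which is why the computed heater temperature profile is dynamic rather than a single value.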
NASA Astrophysics Data System (ADS)
McNellis, B.; Hudiburg, T. W.
2017-12-01
Tree mortality due to drought is predicted to have increasing impacts on ecosystem structure and function during the 21st century. Models can attempt to predict which forests are most at risk from drought, but novel environments may preclude analysis that relies on past observations. The inclusion of more mechanistic detail may reduce uncertainty in predictions, but can also compound model complexity, especially in global models. The Community Land Model version 5 (CLM5), itself a component of the Community Earth System Model (CESM), has recently integrated cohort-based demography into its dynamic vegetation component and is in the process of coupling this demography to a model of plant hydraulic physiology (FATES-Hydro). Previous treatment of drought stress and plant mortality within CLM has been relatively broad, but a detailed hydraulics module represents a key step towards accurate mortality prognosis. Here, we examine the structure of FATES-Hydro with respect to two key physiological attributes: tissue osmotic potentials and embolism refilling. Specifically, we ask how FATES-Hydro captures mechanistic realism within each attribute and how much support there is within the physiological literature for its further elaboration within the model structure. Additionally, connections to broader aspects of carbon metabolism within FATES are explored to better resolve emergent consequences of drought stress on ecosystem function and tree demographics. An on-going field experiment in managed stands of Pinus ponderosa and mixed conifers is assessed for model parameterization and performance across PNW forests, with important implications for future forest management strategy.
Comparison of Two-Phase Pipe Flow in OpenFOAM with a Mechanistic Model
NASA Astrophysics Data System (ADS)
Shuard, Adrian M.; Mahmud, Hisham B.; King, Andrew J.
2016-03-01
Two-phase pipe flow is a common occurrence in many industrial applications such as power generation and oil and gas transportation. Accurate prediction of liquid holdup and pressure drop is of great importance to ensure effective design and operation of fluid transport systems. In this paper, a Computational Fluid Dynamics (CFD) study of a two-phase flow of air and water is performed using OpenFOAM. The two-phase solver, interFoam, is used to identify flow patterns and generate values of liquid holdup and pressure drop, which are compared to results obtained from a two-phase mechanistic model developed by Petalas and Aziz (2002). A total of 60 simulations were performed at three pipe inclinations: 0°, +10° and -10°. A three-dimensional pipe of 0.052 m diameter and 4 m length is used with the Shear Stress Transport (SST) k-ω turbulence model to solve the turbulent mixtures of air and water. Results show that the flow pattern behaviour and numerical values of liquid holdup and pressure drop compare reasonably well to the mechanistic model.
Jones, Matt; Love, Bradley C
2011-08-01
The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls that have plagued previous theoretical movements.
There is international concern about chemicals that alter endocrine system function in humans and/or wildlife and subsequently cause adverse effects. We previously developed a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minno...
TIME AND CONCENTRATION DEPENDENT ACCUMULATION OF [3H]-DELTAMETHRIN IN XENOPUS LAEVIS OOCYTES.
Cell culture models are often used in mechanistic studies of toxicant action. However, one area of uncertainty is the extrapolation of dose from the in vitro model to the in vivo tissue. A common assumption of in vitro studies is that media concentration is a predictive marker of...
Visible Machine Learning for Biomedicine.
Yu, Michael K; Ma, Jianzhu; Fisher, Jasmin; Kreisberg, Jason F; Raphael, Benjamin J; Ideker, Trey
2018-06-14
A major ambition of artificial intelligence lies in translating patient data to successful therapies. Machine learning models face particular challenges in biomedicine, however, including handling of extreme data heterogeneity and lack of mechanistic insight into predictions. Here, we argue for "visible" approaches that guide model structure with experimental biology. Copyright © 2018. Published by Elsevier Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zapol, Peter; Bourg, Ian; Criscenti, Louise Jacqueline
2011-10-01
This report summarizes research performed for the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Subcontinuum and Upscaling Task. The work conducted focused on developing a roadmap to include molecular scale, mechanistic information in continuum-scale models of nuclear waste glass dissolution. This information is derived from molecular-scale modeling efforts that are validated through comparison with experimental data. In addition to developing a master plan to incorporate a subcontinuum mechanistic understanding of glass dissolution into continuum models, methods were developed to generate constitutive dissolution rate expressions from quantum calculations, force field models were selected to generate multicomponent glass structures and gel layers, classical molecular modeling was used to study diffusion through nanopores analogous to those in the interfacial gel layer, and a micro-continuum model (KμC) was developed to study coupled diffusion and reaction at the glass-gel-solution interface.
A mechanistic investigation of the algae growth "Droop" model.
Lemesle, V; Mailleret, L
2008-06-01
In this work a mechanistic explanation of the classical algae growth model built by M. R. Droop in the late sixties is proposed. We first recall the history of the construction of the "predictive" variable yield Droop model as well as the meaning of the introduced cell quota. We then introduce some theoretical hypotheses on the biological phenomena involved in nutrient storage by the algae that lead us to a "conceptual" model. Though more complex than Droop's model, our model remains accessible to a complete mathematical study: its comparison with the Droop model shows both have the same asymptotic behavior. However, while Droop's cell quota comes from experimental bio-chemical measurements not related to intra-cellular biological phenomena, its analogue in our model follows directly from our theoretical hypotheses. This new model should then be looked at as a re-interpretation of Droop's work from a theoretical biologist's point of view.
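For orientation, the classical Droop chemostat formulation that the paper re-interprets can be written down in a few lines. The sketch below uses the standard textbook equations (substrate S, cell quota Q, biomass X) with arbitrary illustrative parameter values; it is not the authors' extended mechanistic model.

```python
# Classical Droop (variable internal quota) chemostat model: a textbook
# sketch for orientation, not the extended model proposed in the paper.
import numpy as np
from scipy.integrate import solve_ivp

def droop(t, y, D=0.3, S_in=10.0, rho_max=1.0, K_s=0.5, mu_bar=1.2, Q_min=0.05):
    S, Q, X = y
    rho = rho_max * S / (K_s + S)       # nutrient uptake per unit biomass
    mu = mu_bar * (1.0 - Q_min / Q)     # growth rate driven by the cell quota
    dS = D * (S_in - S) - rho * X       # dissolved nutrient in the chemostat
    dQ = rho - mu * Q                   # internal quota (nutrient per biomass)
    dX = (mu - D) * X                   # biomass
    return [dS, dQ, dX]

sol = solve_ivp(droop, (0.0, 100.0), [10.0, 0.1, 0.1])
S_end, Q_end, X_end = sol.y[:, -1]
print(f"steady state: S={S_end:.3f}, Q={Q_end:.3f}, X={X_end:.3f}")
```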
DOT National Transportation Integrated Search
2007-08-01
The objective of this research study was to develop performance characteristics or variables (e.g., ride quality, rutting, fatigue cracking, transverse cracking) of flexible pavements in Montana, and to use these characteristics in the implementa...
Livestock Helminths in a Changing Climate: Approaches and Restrictions to Meaningful Predictions
Fox, Naomi J.; Marion, Glenn; Davidson, Ross S.; White, Piran C. L.; Hutchings, Michael R.
2012-01-01
Simple Summary: Parasitic helminths represent one of the most pervasive challenges to livestock, and their intensity and distribution will be influenced by climate change. There is a need for long-term predictions to identify potential risks and highlight opportunities for control. We explore the approaches to modelling future helminth risk to livestock under climate change. One of the limitations to model creation is the lack of purpose-driven data collection. We also conclude that models need to include a broad view of the livestock system to generate meaningful predictions. Abstract: Climate change is a driving force for livestock parasite risk. This is especially true for helminths including the nematodes Haemonchus contortus, Teladorsagia circumcincta, Nematodirus battus, and the trematode Fasciola hepatica, since survival and development of free-living stages is chiefly affected by temperature and moisture. The paucity of long-term predictions of helminth risk under climate change has driven us to explore optimal modelling approaches and identify current bottlenecks to generating meaningful predictions. We classify approaches as correlative or mechanistic, exploring their strengths and limitations. Climate is one aspect of a complex system and, at the farm level, husbandry has a dominant influence on helminth transmission. Continuing environmental change will necessitate the adoption of mitigation and adaptation strategies in husbandry. Long-term predictive models need to have the architecture to incorporate these changes. Ultimately, an optimal modelling approach is likely to combine mechanistic processes and physiological thresholds with correlative bioclimatic modelling, incorporating changes in livestock husbandry and disease control. Irrespective of approach, the principal limitation to parasite predictions is the availability of active surveillance data and empirical data on physiological responses to climate variables. By combining improved empirical data and refined models with a broad view of the livestock system, robust projections of helminth risk can be developed. PMID:26486780
Lueken, Ulrike; Hahn, Tim
2016-01-01
The review provides an update of functional neuroimaging studies that identify neural processes underlying psychotherapy and predict outcomes following psychotherapeutic treatment in anxiety and depressive disorders. Following current developments in this field, studies were classified as 'mechanistic' or 'predictor' studies (i.e., informing neurobiological models about putative mechanisms versus aiming to provide predictive information). Mechanistic evidence points toward a dual-process model of psychotherapy in anxiety disorders with abnormally increased limbic activation being decreased, while prefrontal activity is increased. Partly overlapping findings are reported for depression, albeit with a stronger focus on prefrontal activation following treatment. No studies directly comparing neural pathways of psychotherapy between anxiety and depression were detected. Consensus is accumulating for an overarching role of the anterior cingulate cortex in modulating treatment response across disorders. When aiming to quantify clinical utility, the need for single-subject predictions is increasingly recognized and predictions based on machine learning approaches show high translational potential. Present findings encourage the search for predictors providing clinically meaningful information for single patients. However, independent validation as a crucial prerequisite for clinical use is still needed. Identifying nonresponders a priori creates the need for alternative treatment options that can be developed based on an improved understanding of those neural mechanisms underlying effective interventions.
Maling, T; Diggle, A J; Thackray, D J; Siddique, K H M; Jones, R A C
2008-12-01
A hybrid mechanistic/statistical model was developed to predict vector activity and epidemics of vector-borne viruses spreading from external virus sources to an adjacent crop. The pathosystem tested was Bean yellow mosaic virus (BYMV) spreading from annually self-regenerating, legume-based pastures to adjacent crops of narrow-leafed lupin (Lupinus angustifolius) in the winter-spring growing season in a region with a Mediterranean-type environment where the virus persists over summer within dormant seed of annual clovers. The model uses a combination of daily rainfall and mean temperature during late summer and early fall to drive aphid population increase, migration of aphids from pasture to lupin crops, and the spread of BYMV. The model predicted time of arrival of aphid vectors and resulting BYMV spread successfully for seven of eight datasets from 2 years of field observations at four sites representing different rainfall and geographic zones of the southwestern Australian grainbelt. Sensitivity analysis was performed to determine the relative importance of the main parameters that describe the pathosystem. The hybrid mechanistic/statistical approach used created a flexible analytical tool for vector-mediated plant pathosystems that made useful predictions even when field data were not available for some components of the system.
A mechanistic model for the prediction of in-use moisture uptake by packaged dosage forms.
Remmelgas, Johan; Simonutti, Anne-Laure; Ronkvist, Asa; Gradinarsky, Lubomir; Löfgren, Anders
2013-01-30
A mechanistic model for the prediction of in-use moisture uptake of solid dosage forms in bottles is developed. The model considers moisture transport into the bottle and moisture uptake by the dosage form both when the bottle is closed and when it is open. Experiments are carried out by placing tablets and desiccant canisters in bottles and monitoring their moisture content. Each bottle is opened once a day to remove one tablet or desiccant canister. Opening the bottle to remove a tablet or canister also causes some exchange of air between the bottle headspace and the environment. In order to ascertain how this air exchange might depend on the customer, tablets and desiccant canisters are removed from the bottles by either carefully removing only one or by pouring all of the tablets or desiccant canisters out of the bottle, removing one, and pouring the remaining ones back into the bottle. The predictions of the model are found to be in good agreement with experimental data for moisture sorption by desiccant canisters. Moreover, it is found experimentally that the manner in which the tablets or desiccant canisters were removed does not appreciably affect their moisture content. Copyright © 2012 Elsevier B.V. All rights reserved.
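The abstract outlines the mechanism (vapour ingress while the bottle is closed, air exchange on opening, sorption by the contents) without giving equations. The following is a minimal, purely illustrative sketch of that structure, assuming first-order vapour ingress, a fixed fractional headspace exchange at each daily opening, and linear-driving-force sorption; none of the rate constants or capacities come from the paper.

```python
# Minimal, purely illustrative sketch of in-use moisture uptake in a bottle:
# slow vapour ingress while closed, a fractional headspace exchange at each
# daily opening, and linear-driving-force sorption by the contents.
# None of the rate constants or capacities are taken from the paper.

def simulate(days=30, steps_per_hour=10,
             k_perm=5e-4,    # 1/h, vapour ingress toward ambient while closed
             k_sorb=2e-3,    # 1/h, linear-driving-force uptake by the contents
             w_eq=0.04,      # kg water / kg solids in equilibrium with ambient air
             exchange=0.8):  # fraction of headspace air replaced per opening
    dt = 1.0 / steps_per_hour           # hours
    rh_amb, rh = 1.0, 0.2               # normalised relative humidities
    w = 0.005                           # initial moisture content, kg/kg
    for step in range(days * 24 * steps_per_hour):
        if step > 0 and step % (24 * steps_per_hour) == 0:
            rh += exchange * (rh_amb - rh)      # daily opening of the bottle
        rh += dt * k_perm * (rh_amb - rh)       # ingress through the closure
        w += dt * k_sorb * (w_eq * rh - w)      # sorption by tablets/desiccant
    return w

print(f"moisture content after 30 days of daily openings: {simulate():.4f} kg/kg")
```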
Dynamic, mechanistic, molecular-level modelling of cyanobacteria: Anabaena and nitrogen interaction.
Hellweger, Ferdi L; Fredrick, Neil D; McCarthy, Mark J; Gardner, Wayne S; Wilhelm, Steven W; Paerl, Hans W
2016-09-01
Phytoplankton (eutrophication, biogeochemical) models are important tools for ecosystem research and management, but they generally have not been updated to include modern biology. Here, we present a dynamic, mechanistic, molecular-level (i.e. gene, transcript, protein, metabolite) model of Anabaena - nitrogen interaction. The model was developed using the pattern-oriented approach to model definition and parameterization of complex agent-based models. It simulates individual filaments, each with individual cells, each with genes that are expressed to yield transcripts and proteins. Cells metabolize various forms of N, grow and divide, and differentiate heterocysts when fixed N is depleted. The model is informed by observations from 269 laboratory experiments from 55 papers published from 1942 to 2014. Within this database, we identified 331 emerging patterns, and, excluding inconsistencies in observations, the model reproduces 94% of them. To explore a practical application, we used the model to simulate nutrient reduction scenarios for a hypothetical lake. For a 50% N only loading reduction, the model predicts that N fixation increases, but this fixed N does not compensate for the loading reduction, and the chlorophyll a concentration decreases substantially (by 33%). When N is reduced along with P, the model predicts an additional 8% reduction (compared to P only). © 2016 Society for Applied Microbiology and John Wiley & Sons Ltd.
Predicting subsurface uranium transport: Mechanistic modeling constrained by experimental data
NASA Astrophysics Data System (ADS)
Ottman, Michael; Schenkeveld, Walter D. C.; Kraemer, Stephan
2017-04-01
Depleted uranium (DU) munitions and their widespread use throughout conflict zones around the world pose a persistent health threat to the inhabitants of those areas long after the conclusion of active combat. However, little emphasis has been put on developing a comprehensive, quantitative tool for use in remediation and hazard avoidance planning in a wide range of environments. In this context, we report experimental data on U interaction with soils and sediments. Here, we strive to improve existing risk assessment modeling paradigms by incorporating a variety of experimental data into a mechanistic U transport model for subsurface environments. Twenty different soils and sediments from a variety of environments were chosen to represent a range of geochemical parameters that are relevant to U transport. The parameters included pH, organic matter content, CaCO3, Fe content and speciation, and clay content. pH ranged from 3 to 10, organic matter content from 6 to 120 g kg-1, CaCO3 from 0 to 700 g kg-1, amorphous Fe content from 0.3 to 6 g kg-1 and clay content from 4 to 580 g kg-1. Sorption experiments were then performed, and linear isotherms were constructed. Sorption experiment results show that, among separate sets of sediments and soils, U sorptive affinity is inversely correlated with both soil pH and CaCO3 concentration. The geological materials with the highest and lowest sorptive affinities for U differed in CaCO3 and organic matter concentrations, as well as clay content and pH. In a further step, we are testing if transport behavior in saturated porous media can be predicted based on adsorption isotherms and generic geochemical parameters, and comparing these modeling predictions with the results from column experiments. The comparison of these two data sets will examine if U transport can be effectively predicted from reactive transport modeling that incorporates the generic geochemical parameters. This work will serve to show whether a more mechanistic approach offers an improvement over statistical regression-based risk assessment models.
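Given that the study builds linear sorption isotherms and asks whether they suffice to predict transport, the standard workflow is worth spelling out: fit a distribution coefficient Kd from batch data and convert it to a retardation factor for saturated 1-D transport. The sketch below uses hypothetical batch data, bulk density and porosity purely for illustration.

```python
# Fit a linear sorption isotherm (q = Kd * C) and convert Kd to a
# retardation factor for saturated 1-D transport.  All numbers are placeholders.
import numpy as np

# Hypothetical batch-sorption data: dissolved U (ug/L) vs sorbed U (ug/kg)
C_aq = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
q_sorbed = np.array([40.0, 85.0, 190.0, 410.0, 790.0])

# Least-squares slope through the origin gives Kd in L/kg
Kd = np.sum(C_aq * q_sorbed) / np.sum(C_aq**2)

rho_b = 1.6                    # bulk density, kg/L
theta = 0.35                   # saturated volumetric water content
R = 1.0 + rho_b * Kd / theta   # retardation factor

velocity = 0.5                 # pore-water velocity, m/day
print(f"Kd = {Kd:.1f} L/kg, R = {R:.1f}, "
      f"U front velocity ~ {velocity / R:.3f} m/day")
```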
Geerts, Hugo; Spiros, Athan; Roberts, Patrick; Twyman, Roy; Alphs, Larry; Grace, Anthony A.
2012-01-01
The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published ‘Quantitative Systems Pharmacology’ computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively the clinical outcome in a blinded fashion of two experimental antipsychotic drugs; JNJ37822681, a highly selective low-affinity dopamine D2 antagonist and ocaperidone, a very high affinity dopamine D2 antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects and can be a helpful tool for drug discovery and development. PMID:23251349
Metal accumulation in the earthworm Lumbricus rubellus. Model predictions compared to field data
Veltman, K.; Huijbregts, M.A.J.; Vijver, M.G.; Peijnenburg, W.J.G.M.; Hobbelen, P.H.F.; Koolhaas, J.E.; van Gestel, C.A.M.; van Vliet, P.C.J.; Jan, Hendriks A.
2007-01-01
The mechanistic bioaccumulation model OMEGA (Optimal Modeling for Ecotoxicological Applications) is used to estimate accumulation of zinc (Zn), copper (Cu), cadmium (Cd) and lead (Pb) in the earthworm Lumbricus rubellus. Our validation against field accumulation data shows that the model accurately predicts internal cadmium concentrations. In addition, our results show that internal metal concentrations in the earthworm are less than linearly (slope < 1) related to the total concentration in soil, while risk assessment procedures often assume the biota-soil accumulation factor (BSAF) to be constant. Although predicted internal concentrations of all metals are generally within a factor of 5 of field data, incorporation of regulation in the model is necessary to improve predictability of the essential metals such as zinc and copper. © 2006 Elsevier Ltd. All rights reserved.
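The key empirical point, that internal concentrations scale less than linearly with soil concentrations so the BSAF is not constant, is easy to check with a log-log regression. The sketch below does so on made-up cadmium numbers; the data are placeholders, not the study's field measurements.

```python
# Log-log regression of earthworm vs soil metal concentrations.
# A fitted slope < 1 means the biota-soil accumulation factor (BSAF)
# decreases with soil concentration rather than being constant.
# Data values are illustrative placeholders, not the study's measurements.
import numpy as np

soil_cd = np.array([0.2, 0.5, 1.0, 3.0, 8.0, 20.0])       # mg/kg soil
worm_cd = np.array([2.5, 4.8, 7.0, 13.0, 22.0, 38.0])     # mg/kg tissue

slope, intercept = np.polyfit(np.log10(soil_cd), np.log10(worm_cd), 1)
print(f"log-log slope = {slope:.2f} (slope < 1 implies non-constant BSAF)")
print("BSAF at each soil level:", np.round(worm_cd / soil_cd, 1))
```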
NASA Astrophysics Data System (ADS)
Kafka, Orion L.; Yu, Cheng; Shakoor, Modesar; Liu, Zeliang; Wagner, Gregory J.; Liu, Wing Kam
2018-04-01
A data-driven mechanistic modeling technique is applied to a system representative of a broken-up inclusion ("stringer") within drawn nickel-titanium wire or tube, e.g., as used for arterial stents. The approach uses a decomposition of the problem into a training stage and a prediction stage. It is applied to compute the fatigue crack incubation life of a microstructure of interest under high-cycle fatigue. A parametric study of a matrix-inclusion-void microstructure is conducted. The results indicate that, within the range studied, a larger void between halves of the inclusion increases fatigue life, while larger inclusion diameter reduces fatigue life.
Ground-Based Gas-Liquid Flow Research in Microgravity Conditions: State of Knowledge
NASA Technical Reports Server (NTRS)
McQuillen, J.; Colin, C.; Fabre, J.
1999-01-01
During the last decade, ground-based microgravity facilities have been utilized in order to obtain predictions for spacecraft system designers and further the fundamental understanding of two-phase flow. Although flow regime, pressure drop and heat transfer coefficient data has been obtained for straight tubes and a limited number of fittings, measurements of the void fraction, film thickness, wall shear stress, local velocity and void information are also required in order to develop general mechanistic models that can be utilized to ascertain the effects of fluid properties, tube geometry and acceleration levels. A review of this research is presented and includes both empirical data and mechanistic models of the flow behavior.
Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration
NASA Astrophysics Data System (ADS)
Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.
2017-12-01
Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model brings inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty, which can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and to interpret the results mechanistically: because models combine many sub-systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to determine why a model produces the results it does or which assumptions are key. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We will present the argument that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and enhance our predictive understanding of biological systems.
NASA Astrophysics Data System (ADS)
Worman, Stacey; Furbish, David; Fathel, Siobhan
2014-05-01
In arid landscapes, desert shrubs individually and collectively modify how sediment is transported (e.g. by wind, overland flow, and rain-splash). Addressing how desert shrubs modify landscapes on geomorphic timescales therefore necessitates spanning multiple shrub lifetimes and accounting for how processes affecting shrub dynamics on these longer timescales (e.g. fire, grazing, drought, and climate change) may in turn impact sediment transport. To fulfill this need, we present a mechanistic model of the spatiotemporal dynamics of a desert-shrub population that uses a simple accounting framework and tracks individual shrubs as they enter, age, and exit the population (via recruitment, growth, and mortality). Our model is novel in that it (1) features a strong biophysical foundation, (2) mimics well-documented aspects of how shrub populations respond to changes in precipitation, and (3) possesses the process granularity appropriate for use in geomorphic simulations. In a complementary abstract (Fathel et al. 2014), we demonstrate the potential of this biological model by coupling it to a physical model of rain-splash sediment transport: we mechanistically reproduce the empirical observation that the erosion rate of a hillslope decreases as its vegetation coverage increases, and we predict erosion rates under different climate-change scenarios.
THE EFFECTS OF NITROGEN LOADING AND FRESHWATER RESIDENCE TIME ON THE ESTUARINE ECOSYSTEM
A simple mechanistic model, designed to predict annual average concentrations of total nitrogen (TN) concentrations from nitrogen inputs and freshwater residence time in estuaries, was applied to data for several North American estuaries from previously published literature. The ...
RISK 0301 - MOLECULAR MODELING
Risk assessment practices, in general, for a range of diseases now encourage the use of mechanistic data to enhance the ability to predict responses at low, environmental exposures. In particular, the pathway from normal biology to pathologic state can be described by a set of m...
Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara
2016-01-01
Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. PMID:26799810
Comparison of different stomatal conductance algorithms for ozone flux modelling
P. Buker; L.D. Emberson; M. R. Ashmore; H. M. Cambridge; C. M. Jacobs; W. J. Massman; J. Muller; N. Nikolov; K. Novak; E. Oksanen; M. Schaub; D. de la Torre
2007-01-01
A multiplicative and a semi-mechanistic, BWB-type [Ball, J.T., Woodrow, I.E., Berry, J.A., 1987. A model predicting stomatal conductance and its contribution to the control of photosynthesis under different environmental conditions. In: Biggens, J. (Ed.), Progress in Photosynthesis Research, vol. IV. Martinus Nijhoff, Dordrecht, pp. 221-224.] algorithm for calculating...
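The entry is truncated, but the Ball-Woodrow-Berry (BWB) formulation it cites, and a simple multiplicative (Jarvis-type) alternative, are standard; the sketch below writes both out with placeholder parameter values so the two classes of algorithm being compared are concrete.

```python
# Two common stomatal conductance formulations (placeholder parameters).

def gs_bwb(A, rh_surface, cs, g0=0.01, g1=9.0):
    """Ball-Woodrow-Berry: conductance scales with assimilation rate A
    (umol m-2 s-1), relative humidity at the leaf surface (0-1), and
    CO2 concentration at the leaf surface cs (umol mol-1)."""
    return g0 + g1 * A * rh_surface / cs

def gs_multiplicative(par, temp_c, vpd_kpa, g_max=0.4,
                      par_half=200.0, t_opt=25.0, t_width=12.0, vpd_crit=3.0):
    """Jarvis-type multiplicative model: maximum conductance reduced by
    independent 0-1 response functions for light, temperature and VPD."""
    f_light = par / (par + par_half)
    f_temp = max(0.0, 1.0 - ((temp_c - t_opt) / t_width) ** 2)
    f_vpd = max(0.0, 1.0 - vpd_kpa / vpd_crit)
    return g_max * f_light * f_temp * f_vpd

print(f"BWB gs    = {gs_bwb(12.0, 0.7, 380.0):.3f} mol m-2 s-1")
print(f"Jarvis gs = {gs_multiplicative(800.0, 22.0, 1.2):.3f} mol m-2 s-1")
```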
Varma, Manthena V; El-Kattan, Ayman F
2016-07-01
A large body of evidence suggests hepatic uptake transporters, organic anion-transporting polypeptides (OATPs), are of high clinical relevance in determining the pharmacokinetics of substrate drugs, based on which recent regulatory guidances to industry recommend appropriate assessment of investigational drugs for the potential drug interactions. We recently proposed an extended clearance classification system (ECCS) framework in which the systemic clearance of class 1B and 3B drugs is likely determined by hepatic uptake. The ECCS framework therefore predicts the possibility of drug-drug interactions (DDIs) involving OATPs and the effects of genetic variants of SLCO1B1 early in the discovery and facilitates decision making in the candidate selection and progression. Although OATP-mediated uptake is often the rate-determining process in the hepatic clearance of substrate drugs, metabolic and/or biliary components also contribute to the overall hepatic disposition and, more importantly, to liver exposure. Clinical evidence suggests that alteration in biliary efflux transport or metabolic enzymes associated with genetic polymorphism leads to change in the pharmacodynamic response of statins, for which the pharmacological target resides in the liver. Perpetrator drugs may show inhibitory and/or induction effects on transporters and enzymes simultaneously. It is therefore important to adopt models that frame these multiple processes in a mechanistic sense for quantitative DDI predictions and to deconvolute the effects of individual processes on the plasma and hepatic exposure. In vitro data-informed mechanistic static and physiologically based pharmacokinetic models are proven useful in rationalizing and predicting transporter-mediated DDIs and the complex DDIs involving transporter-enzyme interplay. © 2016, The American College of Clinical Pharmacology.
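The uptake/metabolism/efflux interplay described here is commonly framed with the extended clearance concept combined with a well-stirred liver model; the sketch below shows how systemic hepatic clearance becomes uptake-limited when intracellular elimination is fast. The equations are the standard textbook forms, but every numerical value is hypothetical.

```python
# Extended clearance concept combined with the well-stirred liver model.
# Illustrative sketch; all numerical values are hypothetical.

def intrinsic_clearance(ps_influx, ps_efflux, cl_met, cl_bile):
    """Overall hepatic intrinsic clearance from the extended clearance
    concept: uptake in series with the competition between sinusoidal
    efflux and intracellular elimination (metabolism + biliary excretion)."""
    return ps_influx * (cl_met + cl_bile) / (ps_efflux + cl_met + cl_bile)

def hepatic_clearance(cl_int, fu=0.05, q_h=97.0):
    """Well-stirred model (units L/h; q_h ~ human hepatic blood flow)."""
    return q_h * fu * cl_int / (q_h + fu * cl_int)

# An OATP substrate: slow uptake, fast metabolism -> uptake rate-determining
cl_int = intrinsic_clearance(ps_influx=200.0, ps_efflux=20.0,
                             cl_met=800.0, cl_bile=100.0)
print(f"CL_int = {cl_int:.0f} L/h, CL_h = {hepatic_clearance(cl_int):.1f} L/h")

# Halving uptake roughly halves CL_int when intracellular elimination stays fast
cl_int_low = intrinsic_clearance(100.0, 20.0, 800.0, 100.0)
print(f"with 50% uptake inhibition: CL_int = {cl_int_low:.0f} L/h")
```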
Renner, Simone; Dobenecker, Britta; Blutke, Andreas; Zöls, Susanne; Wanke, Rüdiger; Ritzmann, Mathias; Wolf, Eckhard
2016-07-01
The prevalence of diabetes mellitus, which currently affects 387 million people worldwide, is permanently rising in both adults and adolescents. Despite numerous treatment options, diabetes mellitus is a progressive disease with severe comorbidities, such as nephropathy, neuropathy, and retinopathy, as well as cardiovascular disease. Therefore, animal models predictive of the efficacy and safety of novel compounds in humans are of great value to address the unmet need for improved therapeutics. Although rodent models provide important mechanistic insights, their predictive value for therapeutic outcomes in humans is limited. In recent years, the pig has gained importance for biomedical research because of its close similarity to human anatomy, physiology, size, and, in contrast to non-human primates, better ethical acceptance. In this review, anatomic, biochemical, physiological, and morphologic aspects relevant to diabetes research will be compared between different animal species, that is, mouse, rat, rabbit, pig, and non-human primates. The value of the pig as a model organism for diabetes research will be highlighted, and (dis)advantages of the currently available approaches for the generation of pig models exhibiting characteristics of metabolic syndrome or type 2 diabetes mellitus will be discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
Modeling Physiological Processes That Relate Toxicant Exposure and Bacterial Population Dynamics
Klanjscek, Tin; Nisbet, Roger M.; Priester, John H.; Holden, Patricia A.
2012-01-01
Quantifying effects of toxicant exposure on metabolic processes is crucial to predicting microbial growth patterns in different environments. Mechanistic models, such as those based on Dynamic Energy Budget (DEB) theory, can link physiological processes to microbial growth. Here we expand the DEB framework to include explicit consideration of the role of reactive oxygen species (ROS). Extensions considered are: (i) additional terms in the equation for the “hazard rate” that quantifies mortality risk; (ii) a variable representing environmental degradation; (iii) a mechanistic description of toxic effects linked to increase in ROS production and aging acceleration, and to non-competitive inhibition of transport channels; (iv) a new representation of the “lag time” based on energy required for acclimation. We estimate model parameters using calibrated Pseudomonas aeruginosa optical density growth data for seven levels of cadmium exposure. The model reproduces growth patterns for all treatments with a single common parameter set, and bacterial growth for treatments of up to 150 mg(Cd)/L can be predicted reasonably well using parameters estimated from cadmium treatments of 20 mg(Cd)/L and lower. Our approach is an important step towards connecting levels of biological organization in ecotoxicology. The presented model reveals possible connections between processes that are not obvious from purely empirical considerations, enables validation and hypothesis testing by creating testable predictions, and identifies research required to further develop the theory. PMID:22328915
Immunogenicity of therapeutic proteins: the use of animal models.
Brinks, Vera; Jiskoot, Wim; Schellekens, Huub
2011-10-01
Immunogenicity of therapeutic proteins lowers patient well-being and drastically increases therapeutic costs. Preventing immunogenicity is an important issue to consider when developing novel therapeutic proteins and applying them in the clinic. Animal models are increasingly used to study immunogenicity of therapeutic proteins. They are employed as predictive tools to assess different aspects of immunogenicity during drug development and have become vital in studying the mechanisms underlying immunogenicity of therapeutic proteins. However, the use of animal models needs critical evaluation. Because of species differences, predictive value of such models is limited, and mechanistic studies can be restricted. This review addresses the suitability of animal models for immunogenicity prediction and summarizes the insights in immunogenicity that they have given so far.
ODE constrained mixture modelling: a method for unraveling subpopulation structures and dynamics.
Hasenauer, Jan; Hasenauer, Christine; Hucho, Tim; Theis, Fabian J
2014-07-01
Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE) models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstructed static and dynamical subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity.
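A minimal sketch of the idea follows: two subpopulations share the same first-order activation ODE but differ in rate constant, and the single-cell measurements at each time point are described by a two-component Gaussian mixture whose means come from the ODE solutions. The model, rates and noise level are illustrative stand-ins, not the paper's NGF/Erk1/2 model.

```python
# ODE-constrained mixture model, minimal illustration: two subpopulations
# share a first-order activation ODE but differ in rate constant, and the
# single-cell readout at each time is a two-component Gaussian mixture whose
# means are given by the ODE solution.  Model, rates and noise are illustrative.
import numpy as np
from scipy.optimize import minimize

t_obs = np.array([0.0, 5.0, 10.0, 20.0, 40.0])        # minutes

def ode_mean(k, t):
    """Analytical solution of dx/dt = k * (1 - x) with x(0) = 0."""
    return 1.0 - np.exp(-k * t)

def neg_log_lik(params, data):
    k1, k2, w, sigma = params
    norm = sigma * np.sqrt(2.0 * np.pi)
    nll = 0.0
    for t, cells in zip(t_obs, data):
        m1, m2 = ode_mean(k1, t), ode_mean(k2, t)
        p1 = w * np.exp(-0.5 * ((cells - m1) / sigma) ** 2) / norm
        p2 = (1.0 - w) * np.exp(-0.5 * ((cells - m2) / sigma) ** 2) / norm
        nll -= np.sum(np.log(p1 + p2 + 1e-300))
    return nll

# Synthetic single-cell data: 60% fast responders, 40% slow responders
rng = np.random.default_rng(1)
data = [np.concatenate([ode_mean(0.30, t) + 0.05 * rng.standard_normal(60),
                        ode_mean(0.03, t) + 0.05 * rng.standard_normal(40)])
        for t in t_obs]

fit = minimize(neg_log_lik, x0=[0.2, 0.05, 0.5, 0.1], args=(data,),
               bounds=[(1e-3, 2.0), (1e-3, 2.0), (0.05, 0.95), (0.01, 1.0)])
print("estimated k_fast, k_slow, weight, sigma:", np.round(fit.x, 3))
```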
Xu, Ying; Hubal, Elaine A Cohen; Clausen, Per A; Little, John C
2009-04-01
A two-room model is developed to estimate the emission rate of di-2-ethylhexyl phthalate (DEHP) from vinyl flooring and the evolving gas-phase and adsorbed surface concentrations in a realistic indoor environment. Because the DEHP emission rate measured in a test chamber may be quite different from the emission rate from the same material in the indoor environment the model provides a convenient means to predict emissions and transport in a more realistic setting. Adsorption isotherms for phthalates and plasticizers on interior surfaces, such as carpet, wood, dust, and human skin, are derived from previous field and laboratory studies. Log-linear relationships between equilibrium parameters and chemical vapor pressure are obtained. The predicted indoor air DEHP concentration at steady state is 0.15 microg/m3. Room 1 reaches steady state within about one year, while the adjacent room reaches steady state about three months later. Ventilation rate has a strong influence on DEHP emission rate while total suspended particle concentration has a substantial impact on gas-phase concentration. Exposure to DEHP via inhalation, dermal absorption, and oral ingestion of dust is evaluated. The model clarifies the mechanisms that govern the release of DEHP from vinyl flooring and the subsequent interactions with interior surfaces, airborne particles, dust, and human skin. Although further model development, parameter identification, and model validation are needed, our preliminary model provides a mechanistic framework that elucidates exposure pathways for phthalate plasticizers, and can most likely be adapted to predict emissions and transport of other semivolatile organic compounds, such as brominated flame retardants and biocides, in a residential environment.
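A compact sketch of the kind of two-room mass balance described is given below, assuming the emission flux is proportional to the gap between the gas-phase concentration in equilibrium with the vinyl (y0) and the room air concentration, with ventilation and inter-room air exchange. Sorption to surfaces, particles and dust is deliberately omitted (which is why this toy reaches steady state far faster than the published model), and all values are illustrative.

```python
# Two-room gas-phase mass balance for an SVOC emitted from vinyl flooring.
# Emission is driven by (y0 - y1), the gap between the concentration in
# equilibrium with the source material and the room air.  Sorption to
# surfaces, particles and dust is omitted; every number is illustrative.
from scipy.integrate import solve_ivp

V1, V2 = 30.0, 30.0        # room volumes, m^3 (room 1 contains the flooring)
A_src = 12.0               # emitting floor area, m^2
h_m = 1.5                  # gas-phase mass-transfer coefficient, m/h
y0 = 1.0                   # equilibrium gas concentration at the source, ug/m^3
Q_vent = 15.0              # ventilation flow through each room, m^3/h
Q_12 = 40.0                # air exchange between the two rooms, m^3/h

def rhs(t, y):
    y1, y2 = y
    emission = h_m * A_src * (y0 - y1)                 # ug/h
    dy1 = (emission - Q_vent * y1 + Q_12 * (y2 - y1)) / V1
    dy2 = (-Q_vent * y2 + Q_12 * (y1 - y2)) / V2
    return [dy1, dy2]

sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.0])
print(f"steady-state gas phase: room 1 = {sol.y[0, -1]:.2f}, "
      f"room 2 = {sol.y[1, -1]:.2f} ug/m^3")
```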
Simulating the Risk of Liver Fluke Infection using a Mechanistic Hydro-epidemiological Model
NASA Astrophysics Data System (ADS)
Beltrame, Ludovica; Dunne, Toby; Rose, Hannah; Walker, Josephine; Morgan, Eric; Vickerman, Peter; Wagener, Thorsten
2016-04-01
Liver Fluke (Fasciola hepatica) is a common parasite found in livestock and responsible for considerable economic losses throughout the world. Risk of infection is strongly influenced by climatic and hydrological conditions, which characterise the host environment for parasite development and transmission. Despite on-going control efforts, increases in fluke outbreaks have been reported in recent years in the UK, and have been often attributed to climate change. Currently used fluke risk models are based on empirical relationships derived between historical climate and incidence data. However, hydro-climate conditions are becoming increasingly non-stationary due to climate change and direct anthropogenic impacts such as land use change, making empirical models unsuitable for simulating future risk. In this study we introduce a mechanistic hydro-epidemiological model for Liver Fluke, which explicitly simulates habitat suitability for disease development in space and time, representing the parasite life cycle in connection with key environmental conditions. The model is used to assess patterns of Liver Fluke risk for two catchments in the UK under current and potential future climate conditions. Comparisons are made with a widely used empirical model employing different datasets, including data from regional veterinary laboratories. Results suggest that mechanistic models can achieve adequate predictive ability and support adaptive fluke control strategies under climate change scenarios.
Prediction of episodic acidification in North-eastern USA: An empirical/mechanistic approach
Davies, T.D.; Tranter, M.; Wigington, P.J.; Eshleman, K.N.; Peters, N.E.; Van Sickle, J.; DeWalle, David R.; Murdoch, Peter S.
1999-01-01
Observations from the US Environmental Protection Agency's Episodic Response Project (ERP) in the North-eastern United States are used to develop an empirical/mechanistic scheme for prediction of the minimum values of acid neutralizing capacity (ANC) during episodes. An acidification episode is defined as a hydrological event during which ANC decreases. The pre-episode ANC is used to index the antecedent condition, and the stream flow increase reflects how much the relative contributions of sources of waters change during the episode. As much as 92% of the total variation in the minimum ANC in individual catchments can be explained (with levels of explanation >70% for nine of the 13 streams) by a multiple linear regression model that includes pre-episode ANC and change in discharge as independent variables. The predictive scheme is demonstrated to be regionally robust, with the regional variance explained ranging from 77 to 83%. The scheme is not successful for each ERP stream, and reasons are suggested for the individual failures. The potential for applying the predictive scheme to other watersheds is demonstrated by testing the model with data from the Panola Mountain Research Watershed in the South-eastern United States, where the variance explained by the model was 74%. The model can also be utilized to assess 'chemically new' and 'chemically old' water sources during acidification episodes.
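Because the scheme is a two-predictor linear regression, its structure is easy to reproduce; the sketch below fits minimum episode ANC from pre-episode ANC and the change in discharge by ordinary least squares, using invented event data purely to illustrate the form of the model.

```python
# Ordinary least squares fit of minimum episodic ANC from pre-episode ANC
# and the change in discharge during the event.  Event values are made up
# purely to illustrate the regression structure used in the scheme.
import numpy as np

anc_pre = np.array([120.0, 85.0, 60.0, 45.0, 150.0, 30.0, 95.0])   # ueq/L
dq = np.array([2.1, 3.5, 1.2, 4.8, 0.9, 5.5, 2.8])                 # relative flow increase
anc_min = np.array([70.0, 32.0, 35.0, -5.0, 110.0, -20.0, 45.0])   # ueq/L

X = np.column_stack([np.ones_like(anc_pre), anc_pre, dq])
coef, *_ = np.linalg.lstsq(X, anc_min, rcond=None)
pred = X @ coef
r2 = 1.0 - np.sum((anc_min - pred) ** 2) / np.sum((anc_min - anc_min.mean()) ** 2)
print(f"ANC_min = {coef[0]:.1f} + {coef[1]:.2f}*ANC_pre + {coef[2]:.1f}*dQ,  R^2 = {r2:.2f}")
```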
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict doseresponse and time-course ...
Rhoden, John J; Dyas, Gregory L; Wroblewski, Victor J
2016-05-20
Despite the increasing number of multivalent antibodies, bispecific antibodies, fusion proteins, and targeted nanoparticles that have been generated and studied, the mechanism of multivalent binding to cell surface targets is not well understood. Here, we describe a conceptual and mathematical model of multivalent antibody binding to cell surface antigens. Our model predicts that properties beyond 1:1 antibody:antigen affinity to target antigens have a strong influence on multivalent binding. Predicted crucial properties include the structure and flexibility of the antibody construct, the target antigen(s) and binding epitope(s), and the density of antigens on the cell surface. For bispecific antibodies, the ratio of the expression levels of the two target antigens is predicted to be critical to target binding, particularly for the lower expressed of the antigens. Using bispecific antibodies of different valencies to cell surface antigens including MET and EGF receptor, we have experimentally validated our modeling approach and its predictions and observed several nonintuitive effects of avidity related to antigen density, target ratio, and antibody affinity. In some biological circumstances, the effect we have predicted and measured varied from the monovalent binding interaction by several orders of magnitude. Moreover, our mathematical framework affords us a mechanistic interpretation of our observations and suggests strategies to achieve the desired antibody-antigen binding goals. These mechanistic insights have implications in antibody engineering and structure/activity relationship determination in a variety of biological contexts. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
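The paper's own model is not reproduced here, but a generic textbook-style caricature of avidity helps make the antigen-density dependence concrete: a first arm binds with intrinsic affinity K1, and a tethered second arm engages a neighbouring antigen with a cross-linking term that scales with surface density. All constants below are invented.

```python
# Generic caricature of bivalent (avid) antibody binding to a cell surface,
# not the model developed in the paper.  A first arm binds with intrinsic
# affinity K1; once tethered, the second arm engages a neighbouring antigen
# with a cross-linking term that scales with antigen surface density.
# All constants are invented.

def surface_binding(ab_molar, antigens_per_cell, K1=1e8, kx_per_antigen=1e-4):
    """Return fractional antibody occupancy on the cell and the fraction of
    bound antibodies that are bivalently engaged."""
    kx = kx_per_antigen * antigens_per_cell   # dimensionless avidity term
    K_app = K1 * (1.0 + kx)                   # apparent association constant, 1/M
    bound = K_app * ab_molar / (1.0 + K_app * ab_molar)
    bivalent = kx / (1.0 + kx)
    return bound, bivalent

for density in (1e3, 1e4, 1e5):               # antigens per cell
    b, biv = surface_binding(1e-10, density)  # 0.1 nM antibody
    print(f"{density:>7.0e} antigens/cell: occupancy {b:.3f}, bivalent fraction {biv:.2f}")
```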
Mechanistic Analysis of Cocrystal Dissolution as a Function of pH and Micellar Solubilization.
Cao, Fengjuan; Amidon, Gordon L; Rodriguez-Hornedo, Nair; Amidon, Gregory E
2016-03-07
The purpose of this work is to provide a mechanistic understanding of the dissolution behavior of cocrystals under the influence of ionization and micellar solubilization. Mass transport models were developed by applying Fick's law of diffusion to dissolution with simultaneous chemical reactions in the hydrodynamic boundary layer adjacent to the dissolving cocrystal surface to predict the pH at the dissolving solid-liquid interface (i.e., interfacial pH) and the flux of cocrystals. To evaluate the predictive power of these models, dissolution studies of carbamazepine-saccharin (CBZ-SAC) and carbamazepine-salicylic acid (CBZ-SLC) cocrystals were performed at varied pH and surfactant concentrations above the critical stabilization concentration (CSC), where the cocrystals were thermodynamically stable. The findings in this work demonstrate that the pH dependent dissolution behavior of cocrystals with ionizable components is dependent on interfacial pH. This mass transport analysis demonstrates the importance of pH, cocrystal solubility, diffusivity, and micellar solubilization on the dissolution rates of cocrystals.
Health Management and Service Life for Air Force Missiles
2011-09-26
Briefing slides; only fragments are recoverable from the extraction residue: a Strategic Missile A&S Approach Overview contrasting empiricism (extrapolation of simulated data) with mechanistic methods; prediction of performance will be conducted; empiricism cannot always predict the future state; a mechanistic method enables enhanced predictions; mechanistic will not be ...
Using energy budgets to combine ecology and toxicology in a mammalian sentinel species
NASA Astrophysics Data System (ADS)
Desforges, Jean-Pierre W.; Sonne, Christian; Dietz, Rune
2017-04-01
Process-driven modelling approaches can resolve many of the shortcomings of traditional descriptive and non-mechanistic toxicology. We developed a simple dynamic energy budget (DEB) model for the mink (Mustela vison), a sentinel species in mammalian toxicology, which coupled animal physiology, ecology and toxicology, in order to mechanistically investigate the accumulation and adverse effects of lifelong dietary exposure to persistent environmental toxicants, most notably polychlorinated biphenyls (PCBs). Our novel mammalian DEB model accurately predicted, based on energy allocations to the interconnected metabolic processes of growth, development, maintenance and reproduction, lifelong patterns in mink growth, reproductive performance and dietary accumulation of PCBs as reported in the literature. Our model results were consistent with empirical data from captive and free-ranging studies in mink and other wildlife and suggest that PCB exposure can have significant population-level impacts resulting from targeted effects on fetal toxicity, kit mortality and growth and development. Our approach provides a simple and cross-species framework to explore the mechanistic interactions of physiological processes and ecotoxicology, thus allowing for a deeper understanding and interpretation of stressor-induced adverse effects at all levels of biological organization.
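The DEB model allocates energy among growth, maintenance and reproduction in far more detail than can be shown here; the following one-compartment caricature of the dietary PCB accumulation component (uptake from food minus first-order elimination, with growth dilution) is included only to make the structure of such a coupled physiology-toxicokinetics model concrete. All parameters are invented.

```python
# One-compartment caricature of dietary PCB accumulation in a growing mink
# (not the DEB model itself): body burden rises with assimilated food intake
# and declines by first-order elimination; growth dilutes the concentration.
# All parameter values are invented for illustration only.
from scipy.integrate import solve_ivp

def mink(t, y, C_diet=500.0, assim=0.9, k_elim=0.002):
    """y = [body mass W (kg), PCB body burden B (ug)]; C_diet in ug/kg food."""
    W, B = y
    intake = 0.15 * W ** 0.75                  # food intake, kg/day (allometric guess)
    growth = 0.004 * W * (1.0 - W / 1.2)       # logistic growth toward ~1.2 kg
    dB = assim * intake * C_diet - k_elim * B  # dietary uptake minus elimination
    return [growth, dB]

sol = solve_ivp(mink, (0.0, 3 * 365.0), [0.1, 0.0], max_step=5.0)
W_end, B_end = sol.y[:, -1]
print(f"after 3 years: body mass {W_end:.2f} kg, tissue PCB {B_end / W_end:.0f} ug/kg")
```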
Mechanistic ecohydrological modeling with Tethys-Chloris: an attempt to unravel complexity
NASA Astrophysics Data System (ADS)
Fatichi, S.; Ivanov, V. Y.; Caporali, E.
2010-12-01
The role of vegetation in controlling and mediating hydrological states and fluxes at the level of individual processes has been largely explored, which has led to the improvement of our understanding of mechanisms and patterns in ecohydrological systems. Nonetheless, relatively few efforts have been directed toward the development of continuous, complex, mechanistic ecohydrological models operating at the watershed scale. This study presents a novel ecohydrological model, Tethys-Chloris (T&C), and aims to discuss current limitations and perspectives of the mechanistic approach in ecohydrology. The model attempts to synthesize the state-of-the-art knowledge on individual processes and mechanisms drawn from various disciplines such as hydrology, plant physiology, ecology, and biogeochemistry. The model reproduces all essential components of the hydrological cycle, resolving the mass and energy budgets at the hourly scale; it includes energy and mass exchanges in the atmospheric boundary layer; a module of saturated and unsaturated soil water dynamics; two layers of vegetation; and a module of snowpack evolution. The vegetation component parsimoniously parameterizes essential plant life-cycle processes, including photosynthesis, phenology, carbon allocation, tissue turnover, and soil biogeochemistry. Quantitative metrics of model performance are discussed and highlight the capabilities of T&C in reproducing ecohydrological dynamics. The simulated patterns mimic the outcome of hydrological dynamics with high realism, given the uncertainty of imposed boundary conditions and limited data availability. Furthermore, highly satisfactory results are obtained without significant (e.g., automated) calibration efforts despite the large phase-space dimensionality of the model. A significant investment into model design and development leads to such desirable behavior. This suggests that while using the presented tool for high-precision predictions can still be problematic, the mechanistic nature of the model can be extremely valuable for designing virtual experiments, testing hypotheses, and focusing questions of scientific inquiry.
A life prediction model for laminated composite structural components
NASA Technical Reports Server (NTRS)
Allen, David H.
1990-01-01
A life prediction methodology for laminated continuous fiber composites subjected to fatigue loading conditions was developed. A summary of the completed research is presented. A phenomenological damage evolution law was formulated for matrix cracking that is independent of stacking sequence. Mechanistic and physical support was developed for this phenomenological evolution law. The damage evolution law was implemented in a finite element computer program, and preliminary predictions were obtained for a structural component undergoing fatigue-induced damage.
Krogseth, Ingjerd S; Breivik, Knut; Arnot, Jon A; Wania, Frank; Borgen, Anders R; Schlabach, Martin
2013-12-01
Short chain chlorinated paraffins (SCCPs) raise concerns due to their potential for persistence, bioaccumulation, long-range transport and adverse effects. An understanding of their environmental fate remains limited, partly due to the complexity of the mixture. The purpose of this study was to evaluate whether a mechanistic, integrated, dynamic environmental fate and bioaccumulation multimedia model (CoZMoMAN) can reconcile what is known about environmental emissions and human exposure of SCCPs in the Nordic environment. Realistic SCCP emission scenarios, resolved by formula group, were estimated and used to predict the composition and concentrations of SCCPs in the environment and the human food chain. Emissions at the upper end of the estimated range resulted in predicted total concentrations that were often within a factor of 6 of observations. Similar model performance for a complex group of organic contaminants as for the well-known polychlorinated biphenyls strengthens the confidence in the CoZMoMAN model and implies a relatively good mechanistic understanding of the environmental fate of SCCPs. However, the degree of chlorination predicted for SCCPs in sediments, fish, and humans was higher than observed and poorly established environmental half-lives and biotransformation rate constants contributed to the uncertainties in the predicted composition and ∑SCCP concentrations. Improving prediction of the SCCP composition will also require better constrained estimates of the composition of SCCP emissions. There is, however, also large uncertainty and lack of coherence in the existing observations, and better model-measurement agreement will require improved analytical methods and more strategic sampling. More measurements of SCCP levels and compositions in samples from background regions are particularly important.
DOT National Transportation Integrated Search
1998-04-01
The study reported here was conducted to assess how well some of the existing asphalt pavement mechanistic-empirical distress prediction models performed when used in conjunction with the data being collected as part of the national Long Term Pavemen...
A two-room model is developed to estimate the emission rate of di-2-ethylhexyl phthalate (DEHP) from vinyl flooring and the evolving gas-phase and adsorbed surface concentrations in a realistic indoor environment. Adsorption isotherms for phthalates and plasticizers on interior ...
On the predictive ability of mechanistic models for the Haitian cholera epidemic.
Mari, Lorenzo; Bertuzzo, Enrico; Finger, Flavio; Casagrandi, Renato; Gatto, Marino; Rinaldo, Andrea
2015-03-06
Predictive models of epidemic cholera need to resolve at suitable aggregation levels spatial data pertaining to local communities, epidemiological records, hydrologic drivers, waterways, patterns of human mobility and proxies of exposure rates. We address the above issue in a formal model comparison framework and provide a quantitative assessment of the explanatory and predictive abilities of various model settings with different spatial aggregation levels and coupling mechanisms. Reference is made to records of the recent Haiti cholera epidemics. Our intensive computations and objective model comparisons show that spatially explicit models accounting for spatial connections have better explanatory power than spatially disconnected ones for short-to-intermediate calibration windows, while parsimonious, spatially disconnected models perform better with long training sets. On average, spatially connected models show better predictive ability than disconnected ones. We suggest limits and validity of the various approaches and discuss the pathway towards the development of case-specific predictive tools in the context of emergency management. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Linking 3D spatial models of fuels and fire: Effects of spatial heterogeneity on fire behavior
Russell A. Parsons; William E. Mell; Peter McCauley
2011-01-01
Crown fire endangers firefighters and can have severe ecological consequences. Prediction of fire behavior in tree crowns is essential to informed decisions in fire management. Current methods used in fire management do not address variability in crown fuels. New mechanistic physics-based fire models address convective heat transfer with computational fluid dynamics (...
Pal, Siladitya; Tsamis, Alkiviadis; Pasta, Salvatore; D'Amore, Antonio; Gleason, Thomas G.; Vorp, David A.; Maiti, Spandan
2014-01-01
Aortic dissection (AoD) is a common condition that often leads to a life-threatening cardiovascular emergency. From a biomechanics viewpoint, AoD involves failure of the load-bearing microstructural components of the aortic wall, mainly elastin and collagen fibers. Delamination strength of the aortic wall depends on the load-bearing capacity and local micro-architecture of these fibers, which may vary with age, disease and aortic location. Therefore, quantifying the role of fiber micro-architecture in the delamination strength of the aortic wall may lead to improved understanding of AoD. We present an experimentally driven modeling paradigm towards this goal. Specifically, we utilize collagen fiber micro-architecture, obtained in a parallel study from multi-photon microscopy, in a predictive mechanistic framework to characterize the delamination strength. We then validate our model against peel test experiments on human aortic strips and utilize the model to predict the delamination strength of separate aortic strips, comparing with experimental findings. We observe that the number density and failure energy of the radially-running collagen fibers control the peel strength. Furthermore, our model suggests that the lower delamination strength previously found for the circumferential direction in human aorta is related to a lower number density of radially-running collagen fibers in that direction. Our model sets the stage for an expanded future study that could predict AoD propagation in patient-specific aortic geometries and better understand factors that may influence propensity for occurrence. PMID:24484644
NASA Astrophysics Data System (ADS)
Sneddon, R. V.
1982-07-01
The VESYS 3-A mechanistic design system for asphalt pavements was field-verified for three pavement sections at two test sites in Nebraska. PSI predictions from VESYS were in good agreement with field measurements for a 20-year-old, 3-layer pavement located near Elmwood, Nebraska. Field-measured PSI values for an 8 in. full-depth pavement also agreed with VESYS predictions for the study period. Rut depth estimates from the model were small and were in general agreement with field measurements. Cracking estimates were poor and tended to underestimate the time required to develop observable fatigue cracking in the field. Asphalt, base course and subgrade materials were tested in a 4.0 in. diameter modified triaxial cell. Test procedures used dynamic conditioning and rest periods to simulate service conditions.
Melin, Johanna; Parra-Guillen, Zinnia P; Hartung, Niklas; Huisinga, Wilhelm; Ross, Richard J; Whitaker, Martin J; Kloft, Charlotte
2018-04-01
Optimisation of hydrocortisone replacement therapy in children is challenging as there is currently no licensed formulation and dose in Europe for children under 6 years of age. In addition, hydrocortisone has non-linear pharmacokinetics caused by saturable plasma protein binding. A paediatric hydrocortisone formulation, Infacort® oral hydrocortisone granules with taste masking, has therefore been developed. The objective of this study was to establish a population pharmacokinetic model based on studies in healthy adult volunteers to predict hydrocortisone exposure in paediatric patients with adrenal insufficiency. Cortisol and binding protein concentrations were evaluated in the absence and presence of dexamethasone in healthy volunteers (n = 30). Dexamethasone was used to suppress endogenous cortisol concentrations prior to and after single doses of 0.5, 2, 5 and 10 mg of Infacort® or 20 mg of Infacort®/hydrocortisone tablet/hydrocortisone intravenously. A plasma protein binding model was established using unbound and total cortisol concentrations, and sequentially integrated into the pharmacokinetic model. Both specific (non-linear) and non-specific (linear) protein binding were included in the cortisol binding model. A two-compartment disposition model with saturable absorption and constant endogenous cortisol baseline (Baseline_cort, 15.5 nmol/L) described the data accurately. The predicted cortisol exposure for a given dose varied considerably within a small body weight range in individuals weighing <20 kg. Our semi-mechanistic population pharmacokinetic model for hydrocortisone captures the complex pharmacokinetics of hydrocortisone in a simplified but comprehensive framework. The predicted cortisol exposure indicated the importance of defining an accurate hydrocortisone dose to mimic physiological concentrations for neonates and infants weighing <20 kg. EudraCT number: 2013-000260-28, 2013-000259-42.
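The saturable (specific) plus linear (non-specific) plasma protein binding described above can be illustrated with a short numerical sketch. The binding parameters (BMAX, KD, NS) and the root-finding approach below are illustrative assumptions, not the estimates reported in the study.

```python
# A minimal sketch (not the authors' model) of saturable plus linear plasma
# protein binding, the mechanism the abstract describes for cortisol.
# Parameter values are illustrative placeholders, not the published estimates.
from scipy.optimize import brentq

BMAX = 500.0   # nmol/L, hypothetical specific (CBG-like) binding capacity
KD = 30.0      # nmol/L, hypothetical dissociation constant
NS = 0.4       # unitless, hypothetical non-specific (linear) binding ratio

def total_from_free(c_free):
    """Total cortisol = free + specifically bound (saturable) + non-specifically bound (linear)."""
    return c_free + BMAX * c_free / (KD + c_free) + NS * c_free

def free_from_total(c_total):
    """Invert the binding relationship numerically to recover free cortisol."""
    return brentq(lambda c: total_from_free(c) - c_total, 0.0, c_total)

if __name__ == "__main__":
    for c_tot in [100.0, 400.0, 800.0]:   # nmol/L total cortisol
        c_free = free_from_total(c_tot)
        print(f"total {c_tot:6.1f} nmol/L -> free {c_free:6.1f} nmol/L "
              f"({100 * c_free / c_tot:.1f}% unbound)")
```

Inverting total-to-free binding in this way is what makes cortisol kinetics non-linear: the unbound fraction rises as the specific binding protein saturates.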
Tilburg, Charles E.; Jordan, Linda M.; Carlson, Amy E.; Zeeman, Stephan I.; Yund, Philip O.
2015-01-01
Faecal pollution in stormwater, wastewater and direct run-off can carry zoonotic pathogens to streams, rivers and the ocean, reduce water quality, and affect both recreational and commercial fishing areas of the coastal ocean. Typically, the closure of beaches and commercial fishing areas is governed by the testing for the presence of faecal bacteria, which requires an 18–24 h period for sample incubation. As water quality can change during this testing period, the need for accurate and timely predictions of coastal water quality has become acute. In this study, we: (i) examine the relationship between water quality, precipitation and river discharge at several locations within the Gulf of Maine, and (ii) use multiple linear regression models based on readily obtainable hydrometeorological measurements to predict water quality events at five coastal locations. Analysis of a 12 year dataset revealed that high river discharge and/or precipitation events can lead to reduced water quality; however, the use of only these two parameters to predict water quality can result in a number of errors. Analysis of a higher frequency, 2 year study using multiple linear regression models revealed that precipitation, salinity, river discharge, winds, seasonality and coastal circulation correlate with variations in water quality. Although there has been extensive development of regression models for freshwater, this is one of the first attempts to create a mechanistic model to predict water quality in coastal marine waters. Model performance is similar to that of efforts in other regions, which have incorporated models into water resource managers' decisions, indicating that the use of a mechanistic model in coastal Maine is feasible. PMID:26587258
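As a rough illustration of the regression approach described above, the sketch below fits a multiple linear regression of a water-quality indicator on a few hydrometeorological predictors. All data, variable names, and coefficients are synthetic placeholders, not values from the study.

```python
# A minimal sketch of a multiple-linear-regression water-quality predictor.
# The data are synthetic; predictors and coefficients are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
precip = rng.gamma(2.0, 5.0, n)          # mm, antecedent precipitation
discharge = rng.gamma(3.0, 30.0, n)      # m^3/s, river discharge
salinity = rng.normal(30.0, 2.0, n)      # psu

# Synthetic "true" relationship with noise (for demonstration only)
log_fib = 1.0 + 0.03 * precip + 0.005 * discharge - 0.05 * salinity \
          + rng.normal(0.0, 0.3, n)      # log10 faecal indicator bacteria

X = np.column_stack([precip, discharge, salinity])
model = LinearRegression().fit(X, log_fib)
print("coefficients:", dict(zip(["precip", "discharge", "salinity"], model.coef_)))
print("R^2:", model.score(X, log_fib))

# Predict for a hypothetical high-runoff, low-salinity event
print("predicted log10 FIB:", model.predict([[40.0, 250.0, 27.0]])[0])
```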
Hessel, Ellen V S; Staal, Yvonne C M; Piersma, Aldert H
2018-03-13
Developmental neurotoxicity entails one of the most complex areas in toxicology. Animal studies provide only limited information as to human relevance. A multitude of alternative models have been developed over the years, providing insights into mechanisms of action. We give an overview of fundamental processes in neural tube formation, brain development and neural specification, aiming at illustrating complexity rather than comprehensiveness. We also give a flavor of the wealth of alternative methods in this area. Given the impressive progress in mechanistic knowledge of human biology and toxicology, the time is right for a conceptual approach for designing testing strategies that cover the integral mechanistic landscape of developmental neurotoxicity. The ontology approach provides a framework for defining this landscape, upon which an integral in silico model for predicting toxicity can be built. It subsequently directs the selection of in vitro assays for rate-limiting events in the biological network, to feed parameter tuning in the model, leading to prediction of the toxicological outcome. Validation of such models requires primary attention to coverage of the biological domain, rather than classical predictive value of individual tests. Proofs of concept for such an approach are already available. The challenge is in mining modern biology, toxicology and chemical information to feed intelligent designs, which will define testing strategies for neurodevelopmental toxicity testing. Copyright © 2018 Elsevier Inc. All rights reserved.
Farmer, William H.; Knight, Rodney R.; Eash, David A.; Kasey J. Hutchinson,; Linhart, S. Mike; Christiansen, Daniel E.; Archfield, Stacey A.; Over, Thomas M.; Kiang, Julie E.
2015-08-24
Daily records of streamflow are essential to understanding hydrologic systems and managing the interactions between human and natural systems. Many watersheds and locations lack streamgages to provide accurate and reliable records of daily streamflow. In such ungaged watersheds, statistical tools and rainfall-runoff models are used to estimate daily streamflow. Previous work compared 19 different techniques for predicting daily streamflow records in the southeastern United States. Here, five of the better-performing methods are compared in a different hydroclimatic region of the United States, in Iowa. The methods fall into three classes: (1) drainage-area ratio methods, (2) nonlinear spatial interpolations using flow duration curves, and (3) mechanistic rainfall-runoff models. The first two classes are each applied with nearest-neighbor and map-correlated index streamgages. Using a threefold validation and robust rank-based evaluation, the methods are assessed for overall goodness of fit of the hydrograph of daily streamflow, the ability to reproduce a daily, no-fail storage-yield curve, and the ability to reproduce key streamflow statistics. As in the Southeast study, a nonlinear spatial interpolation of daily streamflow using flow duration curves is found to be a method with the best predictive accuracy. Comparisons with previous work in Iowa show that the accuracy of mechanistic models with at-site calibration is substantially degraded in the ungaged framework.
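For illustration, the drainage-area ratio method (the first method class listed above) can be written in a few lines; the gage areas and flows below are hypothetical, and the unit exponent is only the simplest choice.

```python
# A minimal sketch of the drainage-area ratio method: daily flow at an ungaged
# site is estimated by scaling flow at an index streamgage by the ratio of
# drainage areas. Areas and flows are illustrative, not values from the study.
import numpy as np

def drainage_area_ratio(q_index, area_index_km2, area_ungaged_km2, exponent=1.0):
    """Transfer a daily streamflow series from an index gage to an ungaged site."""
    return q_index * (area_ungaged_km2 / area_index_km2) ** exponent

q_index = np.array([12.0, 15.3, 40.1, 33.2, 20.5])  # m^3/s at the index gage
q_est = drainage_area_ratio(q_index, area_index_km2=850.0, area_ungaged_km2=310.0)
print(q_est)
```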
Modeling process-structure-property relationships for additive manufacturing
NASA Astrophysics Data System (ADS)
Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam
2018-02-01
This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including the use of data-mining techniques to close the design-predict-optimize cycle. To illustrate the process-structure relationship, the multi-scale, multi-physics process modeling starts at the micro-scale to establish a mechanistic heat source model, proceeds to meso-scale models of individual powder particle evolution, and finally to a macro-scale model that simulates the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analysis, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance fundamental understanding of AM and to guide the monitoring and advanced diagnostics of AM processing.
To simulate the long-term effects of ozone on forests in the US, we linked TREGRO, a mechanistic model of an individual tree, to ZELIG, a forest stand model, to examine the response of forests to 5 ozone exposure regimes (0 to 100 ppm-hr SUM06 per year) in 100 year simulations. ...
Development of a Mechanistic-Based Healing Model for Self-Healing Glass Seals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Wei; Stephens, Elizabeth V.; Sun, Xin
Self-healing glass, a recent development in hermetic sealant materials, has the ability to effectively repair damage when heated to elevated temperatures, thus extending its service life. Since crack healing morphological changes in the glass material are usually temperature and stress dependent, quantitative studies to determine the effects of thermo-mechanical conditions on the healing behavior of self-healing glass sealants are extremely useful for the design and optimization of sealing systems within SOFCs. The goal of this task is to develop a mechanistic-based healing model to quantify the stress- and temperature-dependent healing behavior. A two-step healing mechanism was developed and implemented into finite element (FE) models through user subroutines. An integrated experimental/kinetic Monte Carlo (kMC) simulation methodology was used to calibrate the model parameters. The crack healing model is able to investigate the effects of various thermo-mechanical factors and can therefore determine the critical conditions under which the healing mechanism will be activated. Furthermore, the predicted results can be used to formulate the continuum damage-healing model and to assist SOFC stack-level simulations in predicting and evaluating the effectiveness and performance of various engineering seal designs.
Butler Ellis, M Clare; Kennedy, Marc C; Kuster, Christian J; Alanis, Rafael; Tuck, Clive R
2018-05-28
The BREAM (Bystander and Resident Exposure Assessment Model) (Kennedy et al. in BREAM: A probabilistic bystander and resident exposure assessment model of spray drift from an agricultural boom sprayer. Comput Electron Agric 2012;88:63-71) for bystander and resident exposure to spray drift from boom sprayers has recently been incorporated into the European Food Safety Authority (EFSA) guidance for determining non-dietary exposures of humans to plant protection products. The component of BREAM, which relates airborne spray concentrations to bystander and resident dermal exposure, has been reviewed to identify whether it is possible to improve this and its description of variability captured in the model. Two approaches have been explored: a more rigorous statistical analysis of the empirical data and a semi-mechanistic model based on established studies combined with new data obtained in a wind tunnel. A statistical comparison between field data and model outputs was used to determine which approach gave the better prediction of exposures. The semi-mechanistic approach gave the better prediction of experimental data and resulted in a reduction in the proposed regulatory values for the 75th and 95th percentiles of the exposure distribution.
Denny, M W; Dowd, W W
2012-03-15
As the air temperature of the Earth rises, ecological relationships within a community might shift, in part due to differences in the thermal physiology of species. Prediction of these shifts - an urgent task for ecologists - will be complicated if thermal tolerance itself can rapidly evolve. Here, we employ a mechanistic approach to predict the potential for rapid evolution of thermal tolerance in the intertidal limpet Lottia gigantea. Using biophysical principles to predict body temperature as a function of the state of the environment, and an environmental bootstrap procedure to predict how the environment fluctuates through time, we create hypothetical time-series of limpet body temperatures, which are in turn used as a test platform for a mechanistic evolutionary model of thermal tolerance. Our simulations suggest that environmentally driven stochastic variation of L. gigantea body temperature results in rapid evolution of a substantial 'safety margin': the average lethal limit is 5-7°C above the average annual maximum temperature. This predicted safety margin approximately matches that found in nature, and once established is sufficient, in our simulations, to allow some limpet populations to survive a drastic, century-long increase in air temperature. By contrast, in the absence of environmental stochasticity, the safety margin is dramatically reduced. We suggest that the risk of exceeding the safety margin, rather than the absolute value of the safety margin, plays an underappreciated role in the evolution of thermal tolerance. Our predictions are based on a simple, hypothetical, allelic model that connects genetics to thermal physiology. To move beyond this simple model - and thereby potentially to predict differential evolution among populations and among species - will require significant advances in our ability to translate the details of thermal histories into physiological and population-genetic consequences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.
1995-08-01
This report describes the primary physical models that form the basis of the DART mechanistic computer model for calculating fission-product-induced swelling of aluminum dispersion fuels; the calculated results are compared with test data. In addition, DART calculates irradiation-induced changes in the thermal conductivity of the dispersion fuel, as well as fuel restructuring due to aluminum-fuel reaction, amorphization, and recrystallization. Input instructions for execution on mainframe, workstation, and personal computers are provided, as is a description of DART output. The theory of fission gas behavior and its effect on fuel swelling is discussed. The behavior of these fission products in both crystalline and amorphous fuel and in the presence of irradiation-induced recrystallization and crystalline-to-amorphous-phase change phenomena is presented, as are models for these irradiation-induced processes.
Identification of mechanisms responsible for adverse developmental effects is the first step in creating predictive toxicity models. Identification of putative mechanisms was performed by co-analyzing three datasets for the effects of ToxCast phase Ia and II chemicals: 1.In vitro...
In its Computational Toxicology Program, EPA/ORD proposes to integrate genomics and computational methods to provide a mechanistic basis for the prediction of toxicity of chemicals and the pathogenicity of microorganisms. The goal of microbiological water testing is to be able to...
DOT National Transportation Integrated Search
2013-08-01
The overall goal of Global Sensitivity Analysis (GSA) is to determine sensitivity of pavement performance prediction models to the variation in the design input values. The main difference between GSA and detailed sensitivity analyses is the way the ...
Recent advances in vitro assays, in silico tools, and systems biology approaches provide opportunities for refined mechanistic understanding for chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical lan...
Quantifying fat, oil, and grease deposit formation kinetics.
Iasmin, Mahbuba; Dean, Lisa O; Ducoste, Joel J
2016-01-01
Fat, oil, and grease (FOG) deposits formed in sanitary sewers are calcium-based saponified solids that are responsible for a significant number of sanitary sewer overflows (SSOs) across the United States. In the current study, the kinetics of lab-based saponified solids were determined to understand the kinetics of FOG deposit formation in sewers for two types of fat (canola and beef tallow) and two types of calcium source (calcium chloride and calcium sulfate) under three pH (7 ± 0.5, 10 ± 0.5, and ≈14) and two temperature conditions (22 ± 0.5 and 45 ± 0.5 °C). The results showed that a fraction of the fats reacted quickly with calcium ions to form calcium-based saponified solids. Results further showed that increased palmitic fatty acid content in the source fats, the magnitude of the pH, and temperature significantly affect FOG deposit formation and saponification rates. The experimental kinetic data were compared with two empirical models, the Cotte saponification model and the Foubert crystallization model, and with a mass-action-based mechanistic model that included alkali-driven hydrolysis of triglycerides. The mass-action-based mechanistic model was able to predict changes in the rate of formation of saponified solids under the different experimental conditions, in contrast to the two empirical models. The mass-action-based saponification model also revealed that the hydrolysis of beef tallow was slower than that of liquid canola fat, resulting in smaller quantities of saponified solids. This mechanistic saponification model, with its ability to track the chemical precursors of saponified solids, may provide an initial framework to predict the spatial formation of FOG deposits in municipal sewers using system-wide sewer collection modeling software. Copyright © 2015 Elsevier Ltd. All rights reserved.
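The mass-action structure described above (alkali-driven triglyceride hydrolysis followed by calcium soap formation) can be sketched as a small ODE system. The rate constants, stoichiometric simplifications, and initial concentrations below are illustrative assumptions, not the fitted values from the study.

```python
# A minimal mass-action sketch of two-step saponification kinetics: triglyceride
# hydrolysis to free fatty acids, then reaction of fatty acids with calcium to
# form saponified solids. All parameter values are hypothetical.
from scipy.integrate import solve_ivp

K_HYD = 0.05   # L/(mol h), hypothetical hydrolysis rate constant
K_SAP = 0.50   # L/(mol h), hypothetical saponification rate constant

def rhs(t, y):
    tg, oh, fa, ca, soap = y
    r_hyd = K_HYD * tg * oh              # triglyceride + hydroxide -> fatty acids
    r_sap = K_SAP * fa * ca              # fatty acids + calcium -> saponified solid
    return [-r_hyd,                      # d[TG]/dt
            -3.0 * r_hyd,                # d[OH-]/dt (3 ester bonds per TG)
             3.0 * r_hyd - 2.0 * r_sap,  # d[FA]/dt (2 FA per Ca soap)
            -r_sap,                      # d[Ca2+]/dt
             r_sap]                      # d[soap]/dt

y0 = [0.10, 0.50, 0.0, 0.05, 0.0]        # mol/L initial conditions (illustrative)
sol = solve_ivp(rhs, (0.0, 72.0), y0)
print("saponified solids after 72 h:", sol.y[4, -1], "mol/L")
```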
ODE Constrained Mixture Modelling: A Method for Unraveling Subpopulation Structures and Dynamics
Hasenauer, Jan; Hasenauer, Christine; Hucho, Tim; Theis, Fabian J.
2014-01-01
Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE) models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstructed static and dynamical subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity. PMID:24992156
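A toy version of the ODE-constrained mixture idea is sketched below: two subpopulations share one mechanistic ODE but differ in a kinetic rate, and a two-component mixture describes the resulting single-cell snapshot data. The one-state phosphorylation ODE, parameter values, and Gaussian measurement noise are illustrative assumptions, not the published pathway model.

```python
# A minimal sketch of an ODE-constrained mixture: subpopulation means come from a
# mechanistic ODE, and a mixture density describes single-cell measurements.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import norm

def phospho_level(k_act, t_end=10.0, k_deact=0.2):
    """Fraction phosphorylated at t_end for dx/dt = k_act*(1-x) - k_deact*x."""
    sol = solve_ivp(lambda t, x: [k_act * (1.0 - x[0]) - k_deact * x[0]],
                    (0.0, t_end), [0.0])
    return sol.y[0, -1]

def mixture_loglik(data, w, k1, k2, sigma=0.05):
    """Log-likelihood of single-cell readouts under a two-subpopulation mixture."""
    mu1, mu2 = phospho_level(k1), phospho_level(k2)
    dens = w * norm.pdf(data, mu1, sigma) + (1.0 - w) * norm.pdf(data, mu2, sigma)
    return np.sum(np.log(dens))

rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(phospho_level(0.5), 0.05, 70),    # responder cells
                       rng.normal(phospho_level(0.05), 0.05, 30)])  # weak responders
print("log-likelihood at the true parameters:", mixture_loglik(data, 0.7, 0.5, 0.05))
```

In practice the mixture weights and kinetic parameters would be estimated by maximising this likelihood across all experimental conditions simultaneously, which is the step that distinguishes the approach from fitting each condition separately.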
Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform.
Marshall-Colon, Amy; Long, Stephen P; Allen, Douglas K; Allen, Gabrielle; Beard, Daniel A; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A J; Cox, Donna J; Hart, John C; Hirst, Peter M; Kannan, Kavya; Katz, Daniel S; Lynch, Jonathan P; Millar, Andrew J; Panneerselvam, Balaji; Price, Nathan D; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J; Voit, Eberhard O; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang
2017-01-01
Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvement of crop yield, sustainability, and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved both in the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop.
Sensitivity Analysis of Fatigue Crack Growth Model for API Steels in Gaseous Hydrogen.
Amaro, Robert L; Rustagi, Neha; Drexler, Elizabeth S; Slifka, Andrew J
2014-01-01
A model to predict fatigue crack growth of API pipeline steels in high pressure gaseous hydrogen has been developed and is presented elsewhere. The model currently has several parameters that must be calibrated for each pipeline steel of interest. This work provides a sensitivity analysis of the model parameters in order to provide (a) insight to the underlying mathematical and mechanistic aspects of the model, and (b) guidance for model calibration of other API steels.
Mellor, Jonathan E; Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara
2016-04-01
Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. Copyright © 2016 Elsevier B.V. All rights reserved.
Comparing mechanistic and empirical approaches to modeling the thermal niche of almond
NASA Astrophysics Data System (ADS)
Parker, Lauren E.; Abatzoglou, John T.
2017-09-01
Delineating locations that are thermally viable for cultivating high-value crops can help to guide land use planning, agronomics, and water management. Three modeling approaches were used to identify the potential distribution and key thermal constraints on almond cultivation across the southwestern United States (US), including two empirical species distribution models (SDMs)—one using commonly used bioclimatic variables (traditional SDM) and the other using more physiologically relevant climate variables (nontraditional SDM)—and a mechanistic model (MM) developed using published thermal limitations from field studies. While the models showed comparable results over the majority of the domain, including over existing croplands with high almond density, the MM suggested the greatest potential for geographic expansion of almond cultivation, with frost susceptibility and insufficient heat accumulation being the primary thermal constraints in the southwestern US. The traditional SDM over-predicted almond suitability in locations shown by the MM to be limited by frost, whereas the nontraditional SDM showed greater agreement with the MM in these locations, indicating that incorporating physiologically relevant variables in SDMs can improve predictions. Finally, opportunities for geographic expansion of almond cultivation under current climatic conditions in the region may be limited, suggesting that increasing production may rely on agronomical advances and densifying almond plantations in existing locations.
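The mechanistic-model logic described above, suitability only where published thermal limits are jointly satisfied, can be illustrated with a simple threshold mask; the heat, chill, and frost thresholds below are placeholders, not the published almond limits.

```python
# A minimal sketch of a threshold-based mechanistic suitability mask: a grid cell
# is thermally suitable only if heat accumulation, chilling, and frost constraints
# are all met. Threshold values are hypothetical placeholders.
import numpy as np

def thermally_suitable(gdd, chill_hours, min_winter_temp_c,
                       gdd_req=2500.0, chill_req=400.0, frost_limit_c=-2.0):
    """True where heat and chilling are sufficient and hard frost is absent."""
    return (gdd >= gdd_req) & (chill_hours >= chill_req) & (min_winter_temp_c >= frost_limit_c)

# Three hypothetical grid cells
gdd = np.array([2800.0, 2300.0, 3100.0])
chill = np.array([550.0, 700.0, 350.0])
tmin = np.array([-1.0, -4.0, 0.5])
print(thermally_suitable(gdd, chill, tmin))   # [ True False False]
```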
An evaluation of selected in silico models for the assessment ...
Skin sensitization remains an important endpoint for consumers, manufacturers and regulators. Although the development of alternative approaches to assess skin sensitization potential has been extremely active over many years, the implementation of regulations such as REACH and the Cosmetics Directive in the EU has provided a much stronger impetus to actualize this research into practical tools for decision making. Thus there has been considerable focus on the development, evaluation, and integration of alternative approaches for skin sensitization hazard and risk assessment. This includes in silico approaches such as (Q)SARs and expert systems. This study aimed to evaluate the predictive performance of a selection of in silico models and then to explore whether combining those models led to an improvement in accuracy. A dataset of 473 substances that had been tested in the local lymph node assay (LLNA) was compiled. This comprised 295 sensitizers and 178 non-sensitizers. Four freely available models were identified: two statistical models, VEGA and MultiCASE model A33 for skin sensitization (MCASE A33) from the Danish National Food Institute, and two mechanistic models, Toxtree's Skin sensitization Reaction domains (Toxtree SS Rxn domains) and the OASIS v1.3 protein binding alerts for skin sensitization from the OECD Toolbox (OASIS). VEGA and MCASE A33 aim to predict sensitization as a binary score whereas the mechanistic models identified reaction domains or structura
Predicting Biological Information Flow in a Model Oxygen Minimum Zone
NASA Astrophysics Data System (ADS)
Louca, S.; Hawley, A. K.; Katsev, S.; Beltran, M. T.; Bhatia, M. P.; Michiels, C.; Capelle, D.; Lavik, G.; Doebeli, M.; Crowe, S.; Hallam, S. J.
2016-02-01
Microbial activity drives marine biochemical fluxes and nutrient cycling at global scales. Geochemical measurements as well as molecular techniques such as metagenomics, metatranscriptomics and metaproteomics provide great insight into microbial activity. However, an integration of molecular and geochemical data into mechanistic biogeochemical models is still lacking. Recent work suggests that microbial metabolic pathways are, at the ecosystem level, strongly shaped by stoichiometric and energetic constraints. Hence, models rooted in fluxes of matter and energy may yield a holistic understanding of biogeochemistry. Furthermore, such pathway-centric models would allow a direct consolidation with meta'omic data. Here we present a pathway-centric biogeochemical model for the seasonal oxygen minimum zone in Saanich Inlet, a fjord off the coast of Vancouver Island. The model considers key dissimilatory nitrogen and sulfur fluxes, as well as the population dynamics of the genes that mediate them. By assuming a direct translation of biocatalyzed energy fluxes to biosynthesis rates, we make predictions about the distribution and activity of the corresponding genes. A comparison of the model to molecular measurements indicates that the model explains observed DNA, RNA, protein and cell depth profiles. This suggests that microbial activity in marine ecosystems such as oxygen minimum zones is well described by DNA abundance, which, in conjunction with geochemical constraints, determines pathway expression and process rates. Our work further demonstrates how meta'omic data can be mechanistically linked to environmental redox conditions and biogeochemical processes.
Energy efficiency drives the global seasonal distribution of birds.
Somveille, Marius; Rodrigues, Ana S L; Manica, Andrea
2018-06-01
The uneven distribution of biodiversity on Earth is one of the most general and puzzling patterns in ecology. Many hypotheses have been proposed to explain it, based on evolutionary processes or on constraints related to geography and energy. However, previous studies investigating these hypotheses have been largely descriptive due to the logistical difficulties of conducting controlled experiments on such large geographical scales. Here, we use bird migration-the seasonal redistribution of approximately 15% of bird species across the world-as a natural experiment for testing the species-energy relationship, the hypothesis that animal diversity is driven by energetic constraints. We develop a mechanistic model of bird distributions across the world, and across seasons, based on simple ecological and energetic principles. Using this model, we show that bird species distributions optimize the balance between energy acquisition and energy expenditure while taking into account competition with other species. These findings support, and provide a mechanistic explanation for, the species-energy relationship. The findings also provide a general explanation of migration as a mechanism that allows birds to optimize their energy budget in the face of seasonality and competition. Finally, our mechanistic model provides a tool for predicting how ecosystems will respond to global anthropogenic change.
An Interoceptive Predictive Coding Model of Conscious Presence
Seth, Anil K.; Suzuki, Keisuke; Critchley, Hugo D.
2011-01-01
We describe a theoretical model of the neurocognitive mechanisms underlying conscious presence and its disturbances. The model is based on interoceptive prediction error and is informed by predictive models of agency, general models of hierarchical predictive coding and dopaminergic signaling in cortex, the role of the anterior insular cortex (AIC) in interoception and emotion, and cognitive neuroscience evidence from studies of virtual reality and of psychiatric disorders of presence, specifically depersonalization/derealization disorder. The model associates presence with successful suppression by top-down predictions of informative interoceptive signals evoked by autonomic control signals and, indirectly, by visceral responses to afferent sensory signals. The model connects presence to agency by allowing that predicted interoceptive signals will depend on whether afferent sensory signals are determined, by a parallel predictive-coding mechanism, to be self-generated or externally caused. Anatomically, we identify the AIC as the likely locus of key neural comparator mechanisms. Our model integrates a broad range of previously disparate evidence, makes predictions for conjoint manipulations of agency and presence, offers a new view of emotion as interoceptive inference, and represents a step toward a mechanistic account of a fundamental phenomenological property of consciousness. PMID:22291673
Staff, T; Eken, T; Wik, L; Røislien, J; Søvik, S
2014-01-01
Current literature on motor vehicle accidents (MVAs) has few reports regarding field factors that predict the degree of injury. Also, studies of mechanistic factors rarely consider concurrent predictive effects of on-scene patient physiology. The New Injury Severity Score (NISS) has previously been found to correlate with mortality, need for ICU admission, length of hospital stay, and functional recovery after trauma. To potentially increase future precision of trauma triage, we assessed how the NISS is associated with physiologic, demographic and mechanistic variables from the accident site. Using mixed-model linear regression analyses, we explored the association between NISS and pre-hospital Glasgow Coma Scale (GCS) score, Revised Trauma Score (RTS) categories of respiratory rate (RR) and systolic blood pressure (SBP), gender, age, subject position in the vehicle, seatbelt use, airbag deployment, and the estimated squared change in vehicle velocity on impact ((Δv)²). Missing values were handled with multiple imputation. We included 190 accidents with 353 dead or injured subjects (mean NISS 17, median NISS 8, IQR 1-27). For the 307 subjects in front-impact MVAs, the mean change in NISS was -2.58 per GCS point, -2.52 per RR category level, -2.77 per SBP category level, -1.08 for male gender, 0.18 per year of age, 4.98 for driver vs. rear passengers, 4.83 for no seatbelt use, 13.52 for indeterminable seatbelt use, 5.07 for no airbag deployment, and 0.0003 per (km/h)² velocity change (all p<0.002). This study in victims of MVAs demonstrated that injury severity (NISS) was concurrently and independently predicted by poor pre-hospital physiologic status, increasing age and female gender, and several mechanistic measures of localised and generalised trauma energy. Our findings underscore the need for precise information from the site of trauma, to reduce undertriage, target diagnostic efforts, and anticipate need for high-level care and rehabilitative resources. Copyright © 2012 Elsevier Ltd. All rights reserved.
Modeling Bird Migration under Climate Change: A Mechanistic Approach
NASA Technical Reports Server (NTRS)
Smith, James A.
2009-01-01
How will migrating birds respond to changes in the environment under climate change? What are the implications for migratory success under the various accelerated climate change scenarios as forecast by the Intergovernmental Panel on Climate Change? How will reductions or increased variability in the number or quality of wetland stop-over sites affect migratory bird species? The answers to these questions have important ramifications for conservation biology and wildlife management. Here, we describe the use of continental-scale simulation modeling to explore how spatio-temporal changes along migratory flyways affect en-route migration success. We use an individually based, biophysical, mechanistic bird migration model to simulate the movement of shorebirds in North America as a tool to study how factors such as drought and wetland loss may impact migratory success and modify migration patterns. Our model is driven by remote sensing and climate data and incorporates important landscape variables. The energy budget components of the model include resting, foraging, and flight, but predation is presently ignored. Results/Conclusions: We illustrate our model by studying the spring migration of sandpipers through the Great Plains to their Arctic breeding grounds. Why many species of shorebirds have shown significant declines remains a puzzle. Shorebirds are sensitive to stop-over quality and spacing because of their need for frequent refueling stops and their opportunistic feeding patterns. We predict bird "hydrographs", that is, stop-over frequency with latitude, that are in agreement with the literature. Mean stop-over durations predicted from our model for nominal cases are also consistent with the limited, but available, data. For the shorebird species simulated, our model predicts that shorebirds exhibit significant plasticity and are able to shift their migration patterns in response to changing drought conditions. However, the question remains as to whether this behavior can be maintained over increasing and sustained environmental change. Also, the problem is much more complex than described by the current processes captured in our model. We have taken some important and interesting steps, and our model does demonstrate how local-scale information about individual stop-over sites can be linked into the migratory flyway as a whole. We are incorporating additional, species-specific, mechanistic processes to better reflect different climate change scenarios.
Gaussian process regression for forecasting battery state of health
NASA Astrophysics Data System (ADS)
Richardson, Robert R.; Osborne, Michael A.; Howey, David A.
2017-07-01
Accurately predicting the future capacity and remaining useful life of batteries is necessary to ensure reliable system operation and to minimise maintenance costs. The complex nature of battery degradation has meant that mechanistic modelling of capacity fade has thus far remained intractable; however, with the advent of cloud-connected devices, data from cells in various applications is becoming increasingly available, and the feasibility of data-driven methods for battery prognostics is increasing. Here we propose Gaussian process (GP) regression for forecasting battery state of health, and highlight various advantages of GPs over other data-driven and mechanistic approaches. GPs are a type of Bayesian non-parametric method, and hence can model complex systems whilst handling uncertainty in a principled manner. Prior information can be exploited by GPs in a variety of ways: explicit mean functions can be used if the functional form of the underlying degradation model is available, and multiple-output GPs can effectively exploit correlations between data from different cells. We demonstrate the predictive capability of GPs for short-term and long-term (remaining useful life) forecasting on a selection of capacity vs. cycle datasets from lithium-ion cells.
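A minimal sketch of GP-based capacity forecasting along the lines described above is given below, using a generic scikit-learn GP on synthetic capacity-vs-cycle data; the kernel and degradation curve are illustrative choices, not the configuration used in the paper.

```python
# A minimal sketch of Gaussian process regression for capacity-fade forecasting.
# The synthetic degradation curve and kernel are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
cycles = np.arange(0, 500, 10, dtype=float)
capacity = 1.0 - 2e-4 * cycles - 1e-7 * cycles**2 + rng.normal(0, 0.003, cycles.size)

X_train = cycles[:30, None]          # observe only the first 30 measurements
y_train = capacity[:30]

kernel = RBF(length_scale=100.0) + WhiteKernel(noise_level=1e-5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

X_test = cycles[:, None]
mean, std = gp.predict(X_test, return_std=True)   # mean forecast with uncertainty
print("predicted capacity at cycle 490: %.3f +/- %.3f" % (mean[-1], std[-1]))
```

The predictive standard deviation returned alongside the mean is the practical advantage the abstract highlights: forecasts degrade gracefully, with widening uncertainty, as the extrapolation horizon grows.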
The spatial structure of a nonlinear receptive field.
Schwartz, Gregory W; Okawa, Haruhisa; Dunn, Felice A; Morgan, Josh L; Kerschensteiner, Daniel; Wong, Rachel O; Rieke, Fred
2012-11-01
Understanding a sensory system implies the ability to predict responses to a variety of inputs from a common model. In the retina, this includes predicting how the integration of signals across visual space shapes the outputs of retinal ganglion cells. Existing models of this process generalize poorly to predict responses to new stimuli. This failure arises in part from properties of the ganglion cell response that are not well captured by standard receptive-field mapping techniques: nonlinear spatial integration and fine-scale heterogeneities in spatial sampling. Here we characterize a ganglion cell's spatial receptive field using a mechanistic model based on measurements of the physiological properties and connectivity of only the primary excitatory circuitry of the retina. The resulting simplified circuit model successfully predicts ganglion-cell responses to a variety of spatial patterns and thus provides a direct correspondence between circuit connectivity and retinal output.
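Nonlinear spatial integration of the kind the abstract describes can be illustrated with a toy subunit model: pooling within small subunits followed by rectification produces a response to a fine, zero-mean grating that a purely linear receptive field would miss. The subunit size and rectifier below are illustrative, not the fitted circuit model.

```python
# A minimal sketch of rectified-subunit spatial integration over a 1-D stimulus.
import numpy as np

def subunit_response(stimulus, subunit_size=2):
    """Sum of rectified subunit activations (pool, rectify, then sum)."""
    n_sub = stimulus.size // subunit_size
    pooled = stimulus[:n_sub * subunit_size].reshape(n_sub, subunit_size).mean(axis=1)
    return np.maximum(pooled, 0.0).sum()

grating = np.tile([1.0, 1.0, -1.0, -1.0], 8)   # zero-mean fine grating, 32 samples
print("linear sum:", grating.sum())                        # 0.0 -> a linear RF is silent
print("subunit model output:", subunit_response(grating))  # > 0: rectified subunits respond
```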
Phenomenological vs. biophysical models of thermal stress in aquatic eggs
NASA Astrophysics Data System (ADS)
Martin, B.
2016-12-01
Predicting species responses to climate change is a central challenge in ecology, with most efforts relying on lab derived phenomenological relationships between temperature and fitness metrics. We tested one of these models using the embryonic stage of a Chinook salmon population. We parameterized the model with laboratory data, applied it to predict survival in the field, and found that it significantly underestimated field-derived estimates of thermal mortality. We used a biophysical model based on mass-transfer theory to show that the discrepancy was due to the differences in water flow velocities between the lab and the field. This mechanistic approach provides testable predictions for how the thermal tolerance of embryos depends on egg size and flow velocity of the surrounding water. We found support for these predictions across more than 180 fish species, suggesting that flow and temperature mediated oxygen limitation is a general mechanism underlying the thermal tolerance of embryos.
A mechanistic physicochemical model of carbon dioxide transport in blood.
O'Neill, David P; Robbins, Peter A
2017-02-01
A number of mathematical models have been produced that, given the Pco2 and Po2 of blood, will calculate the total concentrations for CO2 and O2 in blood. However, all these models contain at least some empirical features, and thus do not represent all of the underlying physicochemical processes in an entirely mechanistic manner. The aim of this study was to develop a physicochemical model of CO2 carriage by the blood to determine whether our understanding of the physical chemistry of the major chemical components of blood, together with their interactions, is sufficiently strong to predict the physiological properties of CO2 carriage by whole blood. Standard values are used for the ionic composition of the blood, the plasma albumin concentration, and the hemoglobin concentration. All Km values required for the model are taken from the literature. The distribution of bicarbonate, chloride, and H+ ions across the red blood cell membrane follows that of a Gibbs-Donnan equilibrium. The system of equations that results is solved numerically using constraints for mass balance and electroneutrality. The model reproduces the phenomena associated with CO2 carriage, including the magnitude of the Haldane effect, very well. The structural nature of the model allows various hypothetical scenarios to be explored. Here we examine the effects of 1) removing the ability of hemoglobin to form carbamino compounds; 2) allowing a degree of Cl- binding to deoxygenated hemoglobin; and 3) removing the chloride (Hamburger) shift. The insights gained could not have been obtained from empirical models. This study is the first to incorporate a mechanistic model of chloride-bicarbonate exchange between the erythrocyte and plasma into a full physicochemical model of the carriage of carbon dioxide in blood. The mechanistic nature of the model allowed a theoretical study of the quantitative significance for carbon dioxide transport of carbamino compound formation, the putative binding of chloride to deoxygenated hemoglobin, and the chloride (Hamburger) shift. Copyright © 2017 the American Physiological Society.
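As a much-reduced illustration of the physicochemical bookkeeping involved, the sketch below computes plasma CO2 content from Pco2 and pH using CO2 solubility and the Henderson-Hasselbalch relation; it deliberately omits hemoglobin, carbamino formation, and the Gibbs-Donnan red-cell/plasma distribution that the full model resolves.

```python
# A much-simplified sketch of plasma CO2 speciation: dissolved CO2 plus
# bicarbonate from Pco2 and pH. Standard textbook constants are used; this is
# not the paper's whole-blood model.
ALPHA_CO2 = 0.0307   # mmol / (L * mmHg), CO2 solubility in plasma at 37 C
PKA = 6.1            # apparent pK of the CO2/bicarbonate system

def plasma_co2_content(pco2_mmhg, ph):
    """Dissolved CO2 + bicarbonate (mmol/L) in plasma."""
    dissolved = ALPHA_CO2 * pco2_mmhg
    bicarbonate = dissolved * 10 ** (ph - PKA)   # Henderson-Hasselbalch
    return dissolved + bicarbonate

for pco2, ph in [(40.0, 7.40), (46.0, 7.37)]:    # arterial-like vs venous-like values
    print(f"Pco2 {pco2} mmHg, pH {ph}: total plasma CO2 = "
          f"{plasma_co2_content(pco2, ph):.1f} mmol/L")
```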
BioAge: Toward A Multi-Determined, Mechanistic Account of Cognitive Aging
DeCarlo, Correne A.; Tuokko, Holly A.; Williams, Dorothy; Dixon, Roger A.; MacDonald, Stuart W.S.
2014-01-01
The search for reliable early indicators of age-related cognitive decline represents a critical avenue for progress in aging research. Chronological age is a commonly used developmental index; however, it offers little insight into the mechanisms underlying cognitive decline. In contrast, biological age (BioAge), reflecting the vitality of essential biological systems, represents a promising operationalization of developmental time. Current BioAge models have successfully predicted age-related cognitive deficits. Research on aging-related cognitive function indicates that the interaction of multiple risk and protective factors across the human lifespan confers individual risk for late-life cognitive decline, implicating a multi-causal explanation. In this review, we explore current BioAge models, describe three broad yet pathologically relevant biological processes linked to cognitive decline, and propose a novel operationalization of BioAge accounting for both moderating and causal mechanisms of cognitive decline and dementia. We argue that a multivariate and mechanistic BioAge approach will lead to a greater understanding of disease pathology as well as more accurate prediction and early identification of late-life cognitive decline. PMID:25278166
Application of PBPK modelling in drug discovery and development at Pfizer.
Jones, Hannah M; Dickins, Maurice; Youdim, Kuresh; Gosset, James R; Attkins, Neil J; Hay, Tanya L; Gurrell, Ian K; Logan, Y Raj; Bungay, Peter J; Jones, Barry C; Gardner, Iain B
2012-01-01
Early prediction of human pharmacokinetics (PK) and drug-drug interactions (DDI) in drug discovery and development allows for more informed decision making. Physiologically based pharmacokinetic (PBPK) modelling can be used to answer a number of questions throughout the process of drug discovery and development and is thus becoming a very popular tool. PBPK models provide the opportunity to integrate key input parameters from different sources to not only estimate PK parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. Using examples from the literature and our own company, we have shown how PBPK techniques can be utilized through the stages of drug discovery and development to increase efficiency, reduce the need for animal studies, replace clinical trials and to increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising, however, some limitations need to be addressed to realize its application and utility more broadly.
Finding Furfural Hydrogenation Catalysts via Predictive Modelling
Strassberger, Zea; Mooijman, Maurice; Ruijter, Eelco; Alberts, Albert H; Maldonado, Ana G; Orru, Romano V A; Rothenberg, Gadi
2010-01-01
We combine multicomponent reactions, catalytic performance studies and predictive modelling to find transfer hydrogenation catalysts. An initial set of 18 ruthenium-carbene complexes were synthesized and screened in the transfer hydrogenation of furfural to furfurol with isopropyl alcohol. The complexes gave varied yields, from 62% up to >99.9%, with no obvious structure/activity correlations. Control experiments proved that the carbene ligand remains coordinated to the ruthenium centre throughout the reaction. Deuterium-labelling studies showed a secondary isotope effect (kH:kD = 1.5). Further mechanistic studies showed that this transfer hydrogenation follows the so-called monohydride pathway. Using these data, we built a predictive model for 13 of the catalysts, based on 2D and 3D molecular descriptors. We tested and validated the model using the remaining five catalysts (cross-validation, R² = 0.913). Then, with this model, the conversion and selectivity were predicted for four completely new ruthenium-carbene complexes. These four catalysts were then synthesized and tested. The results were within 3% of the model’s predictions, demonstrating the validity and value of predictive modelling in catalyst optimization. PMID:23193388
Goya Jorge, Elizabeth; Rayar, Anita Maria; Barigye, Stephen J; Jorge Rodríguez, María Elisa; Sylla-Iyarreta Veitía, Maité
2016-06-07
A quantitative structure-activity relationship (QSAR) study of the 2,2-diphenyl-1-picrylhydrazyl (DPPH•) radical scavenging ability of 1373 chemical compounds was developed using DRAGON molecular descriptors (MD) and a neural network technique based on the multilayer perceptron (MLP). The built model demonstrated satisfactory performance for the training set (R² = 0.713) and the test set (Q²ext = 0.654). To gain greater insight into the relevance of the MD contained in the MLP model, sensitivity and principal component analyses were performed. Moreover, a structural and mechanistic interpretation was carried out to understand the relationship between the variables in the model and the modeled property. The constructed MLP model was employed to predict the radical scavenging ability of a group of coumarin-type compounds. Finally, in order to validate the model's predictions, an in vitro assay for one of the compounds (4-hydroxycoumarin) was performed, showing satisfactory proximity between the experimental and predicted pIC50 values.
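As a rough illustration of the MLP-based QSAR workflow described above (not the published model), the sketch below trains a scikit-learn multilayer perceptron on synthetic stand-in descriptors; the real study used DRAGON descriptors and measured pIC50 values, so the scores printed here only demonstrate the mechanics.

```python
# Sketch of an MLP-based QSAR workflow on synthetic descriptor data
# (stand-in for DRAGON descriptors; the reported statistics are illustrative only).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1373, 50))               # 1373 compounds x 50 mock descriptors
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=1373)   # mock pIC50 values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_tr)
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)

print("training R2:", r2_score(y_tr, model.predict(scaler.transform(X_tr))))
print("external Q2:", r2_score(y_te, model.predict(scaler.transform(X_te))))
```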
Predicting community composition from pairwise interactions
NASA Astrophysics Data System (ADS)
Friedman, Jonathan; Higgins, Logan; Gore, Jeff
The ability to predict the structure of complex, multispecies communities is crucial for understanding the impact of species extinction and invasion on natural communities, as well as for engineering novel, synthetic communities. Communities are often modeled using phenomenological models, such as the classical generalized Lotka-Volterra (gLV) model. While a lot of our intuition comes from such models, their predictive power has rarely been tested experimentally. To directly assess the predictive power of this approach, we constructed synthetic communities comprised of up to 8 soil bacteria. We measured the outcome of competition between all species pairs, and used these measurements to predict the composition of communities composed of more than 2 species. The pairwise competitions resulted in a diverse set of outcomes, including coexistence, exclusion, and bistability, and displayed evidence for both interference and facilitation. Most pair outcomes could be captured by the gLV framework, and the composition of multispecies communities could be predicted for communities composed solely of such pairs. Our results demonstrate the predictive ability and utility of simple phenomenology, which enables accurate predictions in the absence of mechanistic details.
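The fitted interaction coefficients are not given in the abstract, so the sketch below only illustrates the mechanics of the approach: a generalized Lotka-Volterra system whose (invented) pairwise parameters are integrated forward to predict the equilibrium composition of a three-species community.

```python
# Sketch: use pairwise-fitted gLV parameters to predict a 3-species outcome.
# Growth rates r and interaction matrix A are hypothetical, not the measured values.
import numpy as np
from scipy.integrate import odeint

r = np.array([0.8, 0.6, 0.7])                 # intrinsic growth rates
A = np.array([[-1.0, -0.6, -0.4],             # a_ij: effect of species j on species i
              [-0.9, -1.0, -0.3],
              [-0.5, -0.7, -1.0]])

def glv(x, t):
    return x * (r + A @ x)

t = np.linspace(0, 200, 2000)
x = odeint(glv, [0.05, 0.05, 0.05], t)
print("predicted equilibrium composition:", x[-1] / x[-1].sum())
```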
Xu, Dake; Li, Yingchao; Gu, Tingyue
2016-08-01
Biocorrosion is also known as microbiologically influenced corrosion (MIC). Most anaerobic MIC cases can be classified into two major types. Type I MIC involves non-oxygen oxidants such as sulfate and nitrate that require biocatalysis for their reduction in the cytoplasm of microbes such as sulfate reducing bacteria (SRB) and nitrate reducing bacteria (NRB). This means that the extracellular electrons from the oxidation of metal such as iron must be transported across cell walls into the cytoplasm. Type II MIC involves oxidants such as protons that are secreted by microbes such as acid producing bacteria (APB). The biofilms in this case supply the locally high concentrations of oxidants that are corrosive without biocatalysis. This work describes a mechanistic model that is based on the biocatalytic cathodic sulfate reduction (BCSR) theory. The model utilizes charge transfer and mass transfer concepts to describe the SRB biocorrosion process. The model also includes a mechanism to describe APB attack based on the local acidic pH at a pit bottom. A pitting prediction software package has been created based on the mechanisms. It predicts long-term pitting rates and worst-case scenarios after calibration using SRB short-term pit depth data. Various parameters can be investigated through computer simulation. Copyright © 2016 Elsevier B.V. All rights reserved.
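As a hedged illustration of the charge-transfer/mass-transfer reasoning described above (not the published BCSR implementation), the sketch below takes the slower of a Tafel-type charge-transfer current and a diffusion-limited sulfate flux as the cathodic rate and converts the balancing iron dissolution into a pit growth rate via Faraday's law; every kinetic and transport value is an assumed placeholder.

```python
# Sketch: pitting rate limited by the slower of charge transfer and mass transfer,
# with anodic iron dissolution converted to pit depth via Faraday's law.
# All kinetic and transport parameters below are illustrative assumptions.
F = 96485.0            # Faraday constant, C/mol
n = 8                  # electrons per sulfate reduced (SO4^2- -> S^2-)
i0 = 1e-4              # exchange current density, A/m^2 (assumed)
beta = 0.12            # Tafel slope, V/decade (assumed)
eta = 0.25             # cathodic overpotential, V (assumed)
D = 1e-9               # sulfate diffusivity in the biofilm, m^2/s
delta = 200e-6         # biofilm diffusion-layer thickness, m
c_bulk = 20.0          # bulk sulfate concentration, mol/m^3

i_ct = i0 * 10 ** (eta / beta)              # charge-transfer (Tafel) current density
i_mt = n * F * D * c_bulk / delta           # mass-transfer-limited current density
i_cathodic = min(i_ct, i_mt)                # the slower step controls

# Anodic Fe -> Fe2+ + 2e- balances the cathodic current; convert to pit growth.
M_Fe, rho_Fe, z = 55.8e-3, 7870.0, 2        # kg/mol, kg/m^3, electrons per Fe
pit_rate = i_cathodic * M_Fe / (z * F * rho_Fe)          # m/s
print("pit growth rate ~ %.3f mm/yr" % (pit_rate * 1000 * 3600 * 24 * 365))
```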
Ye, Han; Zhou, Jiadong; Er, Dequan; Price, Christopher C; Yu, Zhongyuan; Liu, Yumin; Lowengrub, John; Lou, Jun; Liu, Zheng; Shenoy, Vivek B
2017-12-26
Vertical stacking of monolayers via van der Waals (vdW) interaction opens promising routes toward engineering physical properties of two-dimensional (2D) materials and designing atomically thin devices. However, due to the lack of mechanistic understanding, challenges remain in the controlled fabrication of these structures via scalable methods such as chemical vapor deposition (CVD) onto substrates. In this paper, we develop a general multiscale model to describe the size evolution of 2D layers and predict the necessary growth conditions for vertical (initial + subsequent layers) versus in-plane lateral (monolayer) growth. An analytic thermodynamic criterion is established for subsequent layer growth that depends on the sizes of both layers, the vdW interaction energies, and the edge energy of 2D layers. Considering the time-dependent growth process, we find that temperature and adatom flux from vapor are the primary criteria affecting the self-assembled growth. The proposed model clearly demonstrates the distinct roles of thermodynamic and kinetic mechanisms governing the final structure. Our model agrees with experimental observations of various monolayer and bilayer transition metal dichalcogenides grown by CVD and provides a predictive framework to guide the fabrication of vertically stacked 2D materials.
Sun, Dajun D; Lee, Ping I
2013-11-04
The combination of a rapidly dissolving and supersaturating "spring" with a precipitation retarding "parachute" has often been pursued as an effective formulation strategy for amorphous solid dispersions (ASDs) to enhance the rate and extent of oral absorption. However, the interplay between these two rate processes in achieving and maintaining supersaturation remains inadequately understood, and the effect of rate of supersaturation buildup on the overall time evolution of supersaturation during the dissolution of amorphous solids has not been explored. The objective of this study is to investigate the effect of supersaturation generation rate on the resulting kinetic solubility profiles of amorphous pharmaceuticals and to delineate the evolution of supersaturation from a mechanistic viewpoint. Experimental concentration-time curves under varying rates of supersaturation generation and recrystallization for model drugs, indomethacin (IND), naproxen (NAP) and piroxicam (PIR), were generated from infusing dissolved drug (e.g., in ethanol) into the dissolution medium and compared with that predicted from a comprehensive mechanistic model based on the classical nucleation theory taking into account both the particle growth and ripening processes. In the absence of any dissolved polymer to inhibit drug precipitation, both our experimental and predicted results show that the maximum achievable supersaturation (i.e., kinetic solubility) of the amorphous solids increases, the time to reach maximum decreases, and the rate of concentration decline in the de-supersaturation phase increases, with increasing rate of supersaturation generation (i.e., dissolution rate). Our mechanistic model also predicts the existence of an optimal supersaturation rate which maximizes the area under the curve (AUC) of the kinetic solubility concentration-time profile, which agrees well with experimental data. In the presence of a dissolved polymer from ASD dissolution, these observed trends also hold true except the de-supersaturation phase is more extended due to the crystallization inhibition effect. Since the observed kinetic solubility of nonequilibrium amorphous solids depends on the rate of supersaturation generation, our results also highlight the underlying difficulty in determining a reproducible solubility advantage for amorphous solids.
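A minimal "spring and parachute" sketch, assuming a constant infusion (generation) rate and first-order crystallization above the crystalline solubility; it is far simpler than the nucleation-and-growth model used in the study, but reproduces the qualitative trade-off between peak supersaturation and the de-supersaturation rate. All rate constants are illustrative.

```python
# Sketch: supersaturation generated at a constant rate ("spring") and depleted by
# first-order crystallization once concentration exceeds solubility ("parachute").
import numpy as np

Cs = 10.0          # crystalline solubility (ug/mL), assumed
k_gen = 5.0        # supersaturation generation rate (ug/mL/min), assumed
k_cryst = 0.05     # crystallization rate constant (1/min), assumed
t_stop = 20.0      # infusion stops after 20 min

dt, t_end = 0.1, 240.0
t = np.arange(0.0, t_end, dt)
C = np.zeros_like(t)
for i in range(1, len(t)):                   # simple explicit Euler integration
    gen = k_gen if t[i - 1] < t_stop else 0.0
    loss = k_cryst * max(C[i - 1] - Cs, 0.0)
    C[i] = C[i - 1] + dt * (gen - loss)

print("kinetic solubility peak: %.1f, AUC: %.0f" % (C.max(), np.trapz(C, t)))
```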
Roberts, David W; Patlewicz, Grace; Kern, Petra S; Gerberick, Frank; Kimber, Ian; Dearman, Rebecca J; Ryan, Cindy A; Basketter, David A; Aptula, Aynur O
2007-07-01
The goal of eliminating animal testing in the predictive identification of chemicals with the intrinsic ability to cause skin sensitization is an important target, the attainment of which has recently been brought into even sharper relief by the EU Cosmetics Directive and the requirements of the REACH legislation. Development of alternative methods requires that the chemicals used to evaluate and validate novel approaches comprise not only confirmed skin sensitizers and non-sensitizers but also substances that span the full chemical mechanistic spectrum associated with skin sensitization. To this end, a recently published database of more than 200 chemicals tested in the mouse local lymph node assay (LLNA) has been examined in relation to various chemical reaction mechanistic domains known to be associated with sensitization. It is demonstrated here that the dataset does cover the main reaction mechanistic domains. In addition, it is shown that assignment to a reaction mechanistic domain is a critical first step in a strategic approach to understanding, ultimately on a quantitative basis, how chemical properties influence the potency of skin sensitizing chemicals. This understanding is necessary if reliable non-animal approaches, including (quantitative) structure-activity relationships (Q)SARs, read-across, and experimental chemistry based models, are to be developed.
NASA Astrophysics Data System (ADS)
Mukherjee, S.; Chauhan, P.; Osterman, M.; Dasgupta, A.; Pecht, M.
2016-07-01
Mechanistic microstructural models have been developed to capture the effect of isothermal aging on time dependent viscoplastic response of Sn3.0Ag0.5Cu (SAC305) solders. SnAgCu (SAC) solders undergo continuous microstructural coarsening during both storage and service because of their high homologous temperature. The microstructures of these low melting point alloys continuously evolve during service. This results in evolution of creep properties of the joint over time, thereby influencing the long term reliability of microelectronic packages. It is well documented that isothermal aging degrades the creep resistance of SAC solder. SAC305 alloy was aged for 24-1000 h at 25-100°C (~0.6-0.8 × Tmelt). Cross-sectioning and image processing techniques were used to periodically quantify the effect of isothermal aging on phase coarsening and evolution. The parameters monitored during isothermal aging include size, area fraction, and inter-particle spacing of nanoscale Ag3Sn intermetallic compounds (IMCs) and the volume fraction of micronscale Cu6Sn5 IMCs, as well as the area fraction of pure tin dendrites. Effects of microstructural evolution on secondary creep constitutive response of SAC305 solder joints were then modeled using a mechanistic multiscale creep model. The mechanistic phenomena modeled include: (1) dispersion strengthening by coarsened nanoscale Ag3Sn IMCs in the eutectic phase; and (2) load sharing between pro-eutectic Sn dendrites and the surrounding coarsened eutectic Sn-Ag phase and microscale Cu6Sn5 IMCs. The coarse-grained polycrystalline Sn microstructure in SAC305 solder was not captured in the above model because isothermal aging does not cause any significant change in the initial grain size and orientation of SAC305 solder joints. The above mechanistic model can successfully capture the drop in creep resistance due to the influence of isothermal aging on SAC305 single crystals. Contribution of grain boundary sliding to the creep strain of coarse grained joints has not been modeled in this study.
Equation-free mechanistic ecosystem forecasting using empirical dynamic modeling
Ye, Hao; Beamish, Richard J.; Glaser, Sarah M.; Grant, Sue C. H.; Hsieh, Chih-hao; Richards, Laura J.; Schnute, Jon T.; Sugihara, George
2015-01-01
It is well known that current equilibrium-based models fall short as predictive descriptions of natural ecosystems, and particularly of fisheries systems that exhibit nonlinear dynamics. For example, model parameters assumed to be fixed constants may actually vary in time, models may fit well to existing data but lack out-of-sample predictive skill, and key driving variables may be misidentified due to transient (mirage) correlations that are common in nonlinear systems. With these frailties, it is somewhat surprising that static equilibrium models continue to be widely used. Here, we examine empirical dynamic modeling (EDM) as an alternative to imposed model equations, one that accommodates both nonequilibrium dynamics and nonlinearity. Using time series from nine stocks of sockeye salmon (Oncorhynchus nerka) from the Fraser River system in British Columbia, Canada, we perform, for the first time to our knowledge, a real-data comparison of contemporary fisheries models with equivalent EDM formulations that explicitly use spawning stock and environmental variables to forecast recruitment. We find that EDM models produce more accurate and precise forecasts, and unlike extensions of the classic Ricker spawner–recruit equation, they show significant improvements when environmental factors are included. Our analysis demonstrates the strategic utility of EDM for incorporating environmental influences into fisheries forecasts and, more generally, for providing insight into how environmental factors can operate in forecast models, thus paving the way for equation-free mechanistic forecasting to be applied in management contexts. PMID:25733874
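For readers unfamiliar with EDM, the sketch below implements simplex projection, the most basic EDM forecasting step (time-delay embedding plus nearest-neighbour forecasting); the study's multivariate formulations with spawning stock and environmental variables are more elaborate, and the synthetic test series here is only a stand-in.

```python
# Sketch of simplex projection: embed the time series in lag space, find nearest
# neighbours among past states, and forecast as their exponentially weighted average.
import numpy as np

def simplex_forecast(series, E=3, tp=1):
    """One-step-ahead forecasts from an E-dimensional lag embedding."""
    n = len(series)
    idx = np.arange(E - 1, n - tp)
    X = np.array([series[i - np.arange(E)] for i in idx])    # lag vectors
    y = series[idx + tp]                                      # forecast targets
    preds = np.full(len(idx), np.nan)
    for k in range(len(idx)):
        d = np.linalg.norm(X - X[k], axis=1)
        d[k] = np.inf                                         # leave-one-out
        nn = np.argsort(d)[: E + 1]                           # E+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds[k] = np.sum(w * y[nn]) / np.sum(w)
    return y, preds

rng = np.random.default_rng(1)
y = np.empty(200); y[0] = 0.4
for t in range(199):                                          # noisy chaotic map as stand-in data
    y[t + 1] = np.clip(3.9 * y[t] * (1 - y[t]) + rng.normal(0, 0.01), 0.0, 1.0)
obs, pred = simplex_forecast(y)
print("forecast skill (correlation):", np.corrcoef(obs, pred)[0, 1])
```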
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, S; Queen’s University, Belfast, Belfast; McNamara, A
2016-06-15
Purpose: Uncertainty in the Relative Biological Effectiveness (RBE) of heavy charged particles compared to photons remains one of the major uncertainties in particle therapy. As RBEs depend strongly on clinical variables such as tissue type, dose, and radiation quality, more accurate individualised models are needed to fully optimise treatments. Methods: We have developed a model of DNA damage and repair following X-ray irradiation in a number of settings, incorporating mechanistic descriptions of DNA repair pathways, geometric effects on DNA repair, cell cycle effects and cell death. Our model has previously been shown to accurately predict a range of biological endpoints including chromosome aberrations, mutations, and cell death. This model was combined with nanodosimetric models of individual ion tracks to calculate the additional probability of lethal damage forming within a single track. These lethal damage probabilities can be used to predict survival and RBE for cells irradiated with ions of different Linear Energy Transfer (LET). Results: By combining the X-ray response model with nanodosimetry information, predictions of RBE can be made without cell-line specific fitting. The model's RBE predictions were found to agree well with empirical proton RBE models (mean absolute difference between models of 1.9% and 1.8% for cells with α/β ratios of 9 and 1.4, respectively, for LETs between 0 and 15 keV/µm). The model also accurately recovers the impact of high-LET carbon ion exposures, showing both the reduced efficacy of ions at extremely high LET, as well as the impact of defects in non-homologous end joining on RBE values in Chinese Hamster Ovary cells. Conclusion: Our model predicts RBE without the inclusion of empirical LET fitting parameters for a range of experimental conditions. This approach has the potential to deliver improved personalisation of particle therapy, with future developments allowing for the calculation of individualised RBEs. SJM is supported by a Marie Curie International Outgoing Fellowship from the European Commission's FP7 program (EC FP7 MC-IOF-623630)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Brian D.
2013-11-04
Biogeochemical reactive transport processes in the subsurface environment are important to many contemporary environmental issues of significance to DOE. Quantification of risks and impacts associated with environmental management options, and design of remediation systems where needed, require that we have at our disposal reliable predictive tools (usually in the form of numerical simulation models). However, it is well known that even the most sophisticated reactive transport models available today have poor predictive power, particularly when applied at the field scale. Although the lack of predictive ability is associated in part with our inability to characterize the subsurface and limitations in computational power, significant advances have been made in both of these areas in recent decades and can be expected to continue. In this research, we examined the upscaling (pore to Darcy and Darcy to field) of the problem of bioremediation via biofilms in porous media. The principal idea was to start with a conceptual description of the bioremediation process at the pore scale, and apply upscaling methods to formally develop the appropriate upscaled model at the so-called Darcy scale. The purpose was to determine (1) what forms the upscaled models would take, and (2) how one might parameterize such upscaled models for applications to bioremediation in the field. We were able to effectively upscale the bioremediation process to explain how the pore-scale phenomena were linked to the field scale. The end product of this research was a set of upscaled models that could be used to help predict field-scale bioremediation. These models were mechanistic, in the sense that they directly incorporated pore-scale information, but upscaled so that only the essential features of the process were needed to predict the effective parameters that appear in the model. In this way, a direct link between the microscale and the field scale was made, and the upscaling process helped inform potential users of the model what kinds of information would be needed to accurately characterize the system.
Predictive modeling of mosquito abundance and dengue transmission in Kenya
NASA Astrophysics Data System (ADS)
Caldwell, J.; Krystosik, A.; Mutuku, F.; Ndenga, B.; LaBeaud, D.; Mordecai, E.
2017-12-01
Approximately 390 million people are exposed to dengue virus every year, and with no widely available treatments or vaccines, predictive models of disease risk are valuable tools for vector control and disease prevention. The aim of this study was to modify and improve climate-driven predictive models of dengue vector abundance (Aedes spp. mosquitoes) and viral transmission to people in Kenya. We simulated disease transmission using a temperature-driven mechanistic model and compared model predictions with vector trap data for larvae, pupae, and adult mosquitoes collected between 2014 and 2017 at four sites across urban and rural villages in Kenya. We tested predictive capacity of our models using four temperature measurements (minimum, maximum, range, and anomalies) across daily, weekly, and monthly time scales. Our results indicate seasonal temperature variation is a key driving factor of Aedes mosquito abundance and disease transmission. These models can help vector control programs target specific locations and times when vectors are likely to be present, and can be modified for other Aedes-transmitted diseases and arboviral endemic regions around the world.
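Temperature-driven transmission models of this kind typically describe mosquito and virus traits with unimodal thermal response curves; the sketch below shows a Briere-type response with placeholder parameters (not the values fitted for Aedes in Kenya).

```python
# Sketch: a unimodal (Briere) temperature response of the kind used in
# temperature-driven mechanistic transmission models. Parameters are illustrative.
import numpy as np

def briere(T, q=2.0e-4, T_min=13.0, T_max=40.0):
    """Trait value (e.g. biting rate, per day) as a function of temperature in deg C."""
    T = np.asarray(T, dtype=float)
    out = q * T * (T - T_min) * np.sqrt(np.clip(T_max - T, 0.0, None))
    return np.where((T > T_min) & (T < T_max), out, 0.0)

temps = np.arange(10, 42, 2)
for T, a in zip(temps, briere(temps)):
    print(f"{T:>4.1f} C  biting rate ~ {a:.3f} /day")
print("thermal optimum near", temps[np.argmax(briere(temps))], "C")
```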
Song, Ling; Zhang, Yi; Jiang, Ji; Ren, Shuang; Chen, Li; Liu, Dongyang; Chen, Xijing; Hu, Pei
2018-04-06
The objective of this study was to develop a physiologically based pharmacokinetic (PBPK) model for sinogliatin (HMS-5552, dorzagliatin) by integrating allometric scaling (AS), in vitro to in vivo exploration (IVIVE), and steady-state concentration-mean residence time (Css-MRT) methods and to provide mechanistic insight into its pharmacokinetic properties in humans. Human major pharmacokinetic parameters were analyzed using AS, IVIVE, and Css-MRT methods with available preclinical in vitro and in vivo data to understand sinogliatin drug metabolism and pharmacokinetic (DMPK) characteristics and underlying mechanisms. On this basis, an initial mechanistic PBPK model of sinogliatin was developed. The initial PBPK model was verified using observed data from a single ascending dose (SAD) study and further optimized with various strategies. The final model was validated by simulating sinogliatin pharmacokinetics under a fed condition. The validated model was applied to support a clinical drug-drug interaction (DDI) study design and to evaluate the effects of intrinsic (hepatic cirrhosis, genetic) factors on drug exposure. The two-species scaling method using rat and dog data (TS-rat,dog) was the best AS method in predicting human systemic clearance in the central compartment (CL). The IVIVE method confirmed that sinogliatin was predominantly metabolized by cytochrome P450 (CYP) 3A4. The Css-MRT method suggested dog pharmacokinetic profiles were more similar to human pharmacokinetic profiles. The estimated CL using the AS and IVIVE approaches was within 1.5-fold of that observed. The Css-MRT method in dogs also provided acceptable prediction of human pharmacokinetic characteristics. For the PBPK approach, the 90% confidence intervals (CIs) of the simulated maximum concentration (Cmax), CL, and area under the plasma concentration-time curve (AUC) of sinogliatin were within those observed, and the 90% CI of the simulated time to Cmax (tmax) was close to that observed for a dose range of 5-50 mg in the SAD study. The final PBPK model was validated by simulating sinogliatin pharmacokinetics with food. The 90% CIs of the simulated Cmax, CL, and AUC values for sinogliatin were within those observed and the 90% CI of the simulated tmax was partially within that observed for the dose range of 25-200 mg in the multiple ascending dose (MAD) study. This PBPK model selected a final clinical DDI study design with itraconazole from four potential designs and also evaluated the effects of intrinsic (hepatic cirrhosis, genetic) factors on drug exposure. Sinogliatin pharmacokinetic properties were mechanistically understood by integrating all four methods and a mechanistic PBPK model was successfully developed and validated using clinical data. This PBPK model was applied to support the development of sinogliatin.
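As a toy illustration of the allometric scaling step (not the study's TS-rat,dog calculation or its data), the sketch below fits CL = a·BW^b in log-log space to two invented preclinical clearances and extrapolates to a 70 kg human.

```python
# Sketch: two-species allometric scaling of clearance from rat and dog to human.
# Preclinical clearance values below are invented for illustration.
import numpy as np

bw = np.array([0.25, 10.0])        # body weight, kg (rat, dog)
cl = np.array([0.9, 12.0])         # observed clearance, L/h (hypothetical)

# CL = a * BW^b  =>  log CL = log a + b * log BW
b, log_a = np.polyfit(np.log(bw), np.log(cl), 1)
cl_human = np.exp(log_a) * 70.0 ** b
print(f"allometric exponent b = {b:.2f}, predicted human CL = {cl_human:.1f} L/h")
```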
Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V
2017-03-01
A mechanistic model-based soft sensor is developed and validated for 550 L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor is comprised of a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves the current states of biomass, product, substrate, dissolved oxygen and mass, as well as other process parameters including kLa, viscosity and partial pressure of CO2. State estimation at this scale requires a robust mass model including evaporation, which is a factor not often considered at smaller scales of operation. The model is developed using a historical data set of 11 batches from the fermentation pilot plant (550 L) at Novozymes A/S. The model is then implemented on-line in 550 L fermentation processes operated at Novozymes A/S in order to validate the state estimator model on 14 new batches utilizing a new strain. The product concentration in the validation batches was predicted with an average root mean sum of squared error (RMSSE) of 16.6%. In addition, calculation of the Janus coefficient for the validation batches shows a suitably calibrated model. The robustness of the model prediction is assessed with respect to the accuracy of the input data, and parameter estimation uncertainty is also quantified. The application of this on-line state estimator allows for on-line monitoring of pilot scale batches, including real-time estimates of multiple parameters which are not able to be monitored on-line. With successful application of a soft sensor at this scale, this allows for improved process monitoring, as well as opening up further possibilities for on-line control algorithms utilizing these on-line model outputs. Biotechnol. Bioeng. 2017;114: 589-599. © 2016 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Ghimire, Bardan; Riley, William J.; Koven, Charles D.; Mu, Mingquan; Randerson, James T.
2016-06-01
In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However, current Earth System Models (ESMs) do not mechanistically represent functional nitrogen allocation for photosynthesis or the linkage between nitrogen uptake and root traits. The current version of CLM (4.5) links nitrogen availability and plant productivity via (1) an instantaneous downregulation of potential photosynthesis rates based on soil mineral nitrogen availability, and (2) apportionment of soil nitrogen between plants and competing nitrogen consumers assumed to be proportional to their relative N demands. However, plants do not photosynthesize at potential rates and then downregulate; instead photosynthesis rates are governed by nitrogen that has been allocated to the physiological processes underpinning photosynthesis. Furthermore, the role of plant roots in nutrient acquisition has also been largely ignored in ESMs. We therefore present a new plant nitrogen model for CLM4.5 with (1) improved representations of linkages between leaf nitrogen and plant productivity based on observed relationships in a global plant trait database and (2) plant nitrogen uptake based on root-scale Michaelis-Menten uptake kinetics. Our model improvements led to a global bias reduction in GPP, LAI, and biomass of 70%, 11%, and 49%, respectively. Furthermore, water use efficiency predictions were improved conceptually, qualitatively, and in magnitude. The new model's GPP responses to nitrogen deposition, CO2 fertilization, and climate also differed from the baseline model. The mechanistic representation of leaf-level nitrogen allocation and a theoretically consistent treatment of competition with belowground consumers led to overall improvements in global carbon cycling predictions.
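A minimal sketch of root-scale Michaelis-Menten nitrogen uptake of the kind the revised scheme uses, with placeholder Vmax, Km and root area values rather than CLM4.5 parameters.

```python
# Sketch: root-scale Michaelis-Menten nitrogen uptake (illustrative parameters only).
def n_uptake(c_soil_n, root_area, vmax=1e-6, km=0.5):
    """Plant N uptake (gN/m2/s) given soil mineral N (gN/m3) and root area index."""
    return vmax * root_area * c_soil_n / (km + c_soil_n)

for c in (0.05, 0.5, 5.0):
    print(f"soil N = {c:>4} gN/m3 -> uptake = {n_uptake(c, root_area=2.0):.2e} gN/m2/s")
```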
Price, Peter W; Hunter, Mark D
2015-06-01
The interaction between the arroyo willow, Salix lasiolepis Bentham, and its specialist herbivore, the arroyo willow stem-galling sawfly, Euura lasiolepis Smith (Hymenoptera: Tenthredinidae), was studied for 32 yr in Flagstaff, AZ, emphasizing a mechanistic understanding of insect population dynamics. Long-term weather records were evaluated to provide a climatic context for this study. Previously, predictive models of sawfly dynamics were developed from estimates of sawfly gall density made between 1981 and 2002; one model each for drier and wetter sites. Predictor variables in these models included winter precipitation and the Palmer Drought Severity Index, which impact the willow growth, with strong bottom-up effects on sawflies. We now evaluate original model predictions of sawfly population dynamics using new data (from 2003-2012). Additionally, willow resources were evaluated in 1986 and in 2012, using as criteria clone area, shoot density, and shoot length. The dry site model accounted for 40% of gall population density variation between 2003 and 2012 (69% over the 32 yr), providing strong support for the bottom-up, mechanistic hypothesis that water supply to willow hosts impacts sawfly populations. The current drying trend stressed willow clones: in drier sites, willow resources declined and gall density decreased by 98%. The wet site model accounted for 23% of variation in gall population density between 2003 and 2012 (48% over 30 yr), consistent with less water limitation. Nonetheless, gall populations were reduced by 72%. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Yamana, Teresa K.; Eltahir, Elfatih A. B.
2011-02-01
This paper describes the use of satellite-based estimates of rainfall to force the Hydrology, Entomology and Malaria Transmission Simulator (HYDREMATS), a hydrology-based mechanistic model of malaria transmission. We first examined the temporal resolution of rainfall input required by HYDREMATS. Simulations conducted over Banizoumbou village in Niger showed that for reasonably accurate simulation of mosquito populations, the model requires rainfall data with at least 1 h resolution. We then investigated whether HYDREMATS could be effectively forced by satellite-based estimates of rainfall instead of ground-based observations. The Climate Prediction Center morphing technique (CMORPH) precipitation estimates distributed by the National Oceanic and Atmospheric Administration are available at a 30 min temporal resolution and 8 km spatial resolution. We compared mosquito populations simulated by HYDREMATS when the model is forced by adjusted CMORPH estimates and by ground observations. The results demonstrate that adjusted rainfall estimates from satellites can be used with a mechanistic model to accurately simulate the dynamics of mosquito populations.
DOT National Transportation Integrated Search
2016-10-01
The Georgia Department of Transportation (GDOT) has initiated a Georgia Long-Term Pavement Performance (GALTPP) monitoring program 1) to provide data for calibrating the prediction models in the AASHTO Mechanistic-Empirical Pavement Design Guide (MEP...
Schuwirth, Nele; Reichert, Peter
2013-02-01
For the first time, we combine concepts of theoretical food web modeling, the metabolic theory of ecology, and ecological stoichiometry with the use of functional trait databases to predict the coexistence of invertebrate taxa in streams. We developed a mechanistic model that describes growth, death, and respiration of different taxa dependent on various environmental influence factors to estimate survival or extinction. Parameter and input uncertainty is propagated to model results. Such a model is needed to test our current quantitative understanding of ecosystem structure and function and to predict effects of anthropogenic impacts and restoration efforts. The model was tested using macroinvertebrate monitoring data from a catchment of the Swiss Plateau. Even without fitting model parameters, the model is able to represent key patterns of the coexistence structure of invertebrates at sites varying in external conditions (litter input, shading, water quality). This confirms the suitability of the model concept. More comprehensive testing and resulting model adaptations will further increase the predictive accuracy of the model.
Mechanistic modeling of reactive soil nitrogen emissions across agricultural management practices
NASA Astrophysics Data System (ADS)
Rasool, Q. Z.; Miller, D. J.; Bash, J. O.; Venterea, R. T.; Cooter, E. J.; Hastings, M. G.; Cohan, D. S.
2017-12-01
The global reactive nitrogen (N) budget has increased by a factor of 2-3 from pre-industrial levels. This increase is especially pronounced in highly N fertilized agricultural regions in summer. The reactive N emissions from soil to atmosphere can be in reduced (NH3) or oxidized (NO, HONO, N2O) forms, depending on complex biogeochemical transformations of soil N reservoirs. Air quality models like CMAQ typically neglect soil emissions of HONO and N2O. Previously, soil NO emissions estimated by models like CMAQ remained parametric and inconsistent with soil NH3 emissions. Thus, there is a need to more mechanistically and consistently represent the soil N processes that lead to reactive N emissions to the atmosphere. Our updated approach estimates soil NO, HONO and N2O emissions by incorporating detailed agricultural fertilizer inputs from EPIC, and CMAQ-modeled N deposition, into the soil N pool. EPIC addresses the nitrification, denitrification and volatilization rates along with soil N pools for agricultural soils. Suitable updates to account for factors like nitrite (NO2-) accumulation not addressed in EPIC, will also be made. The NO and N2O emissions from nitrification and denitrification are computed mechanistically using the N sub-model of DAYCENT. These mechanistic definitions use soil water content, temperature, NH4+ and NO3- concentrations, gas diffusivity and labile C availability as dependent parameters at various soil layers. Soil HONO emissions found to be most probable under high NO2- availability will be based on observed ratios of HONO to NO emissions under different soil moistures, pH and soil types. The updated scheme will utilize field-specific soil properties and N inputs across differing manure management practices such as tillage. Comparison of the modeled soil NO emission rates from the new mechanistic and existing schemes against field measurements will be discussed. Our updated framework will help to predict the diurnal and daily variability of different reactive N emissions (NO, HONO, N2O) with soil temperature, moisture and N inputs.
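Purely to make the idea of moisture-dependent gas partitioning concrete, the sketch below splits a given gaseous N loss between NO and N2O using an assumed exponential dependence on water-filled pore space; the functional form and constants are illustrative inventions, not the DAYCENT-based scheme described in the abstract.

```python
# Sketch: partition soil N gas losses between NO and N2O as a function of
# water-filled pore space (WFPS). Functional form and constants are assumptions.
import numpy as np

def no_to_n2o_ratio(wfps):
    """Drier, well-aerated soils favour NO; wetter soils favour N2O."""
    return 15.0 * np.exp(-6.0 * np.clip(wfps, 0.0, 1.0))

def partition_n_gas(n_gas_flux, wfps):
    r = no_to_n2o_ratio(wfps)
    no = n_gas_flux * r / (1.0 + r)
    n2o = n_gas_flux / (1.0 + r)
    return no, n2o

for wfps in (0.3, 0.6, 0.9):
    no, n2o = partition_n_gas(1.0, wfps)      # 1 gN/ha/day of total gaseous N loss
    print(f"WFPS={wfps:.1f}: NO={no:.2f}, N2O={n2o:.2f} gN/ha/day")
```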
A new mechanistic framework to predict OCS fluxes in soils
NASA Astrophysics Data System (ADS)
Sauze, Joana; Ogee, Jérôme; Launois, Thomas; Kesselmeier, Jürgen; Van Diest, Heidi; Wingate, Lisa
2015-04-01
A better description of the amplitude of photosynthetic and respiratory gross CO2 fluxes at large scales is needed to improve our predictions of the current and future global CO2 cycle. Carbonyl sulfide (COS) is the most abundant sulphur gas in the atmosphere and has been proposed as a new tracer of gross photosynthesis, as the uptake of COS from the atmosphere is dominated by the activity of carbonic anhydrase (CA), an enzyme abundant in leaves that also catalyses CO2 hydration during photosynthesis. However, soils also exchange COS with the atmosphere and there is growing evidence that this flux must also be accounted for in atmospheric budgets. In this context a new mechanistic description of soil-atmosphere COS exchange is clearly needed. Soils can take up COS from the atmosphere as the soil biota also contain CA, and COS emissions from soils have also been reported in agricultural fields or anoxic soils. Previous studies have also shown that soil COS fluxes exhibit an optimum with respect to soil water content and soil temperature. Here we propose a new mechanistic framework to predict the fluxes of COS between soils and the atmosphere. We describe the COS soil budget by a first-order reaction-diffusion-production equation, assuming that the hydrolysis of COS by CA is total and irreversible. To describe COS diffusion through the soil matrix, we use different formulations of soil air-filled pore space and temperature, depending on the turbulence level above the soil surface. Using this model we are able to explain the observed presence of an optimum temperature for soil COS uptake and show how this optimum can shift to cooler temperatures in the presence of soil COS emissions. Our model can also explain the observed optimum with soil moisture content previously described in the literature (e.g. Van Diest & Kesselmeier, 2008) as a result of diffusional constraints on COS hydrolysis. These diffusional constraints are also responsible for the response of COS uptake to soil weight and depth observed by Kesselmeier et al. (1999). In order to simulate the exact COS uptake rates and patterns observed on several soils collected from a range of biomes (Van Diest & Kesselmeier, 2008), different CA activities had to be invoked in each soil type, consistent with the expected soil microbial population size and diversity. A better description of the drivers governing soil CA activity and COS emissions from soils is needed before incorporating our new mechanistic model of soil-atmosphere COS uptake in large-scale ecosystem models and COS atmospheric budgets.
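A minimal sketch of the steady-state limit of such a first-order reaction-diffusion balance for a deep, homogeneous soil, where the uptake flux reduces to F = C_a·sqrt(k_eff·D_eff); with a Millington-Quirk gas diffusivity and a water-scaled hydrolysis constant (both assumed forms and values), it reproduces the qualitative optimum with soil moisture.

```python
# Sketch: steady-state COS uptake by a deep homogeneous soil with first-order
# hydrolysis, F = C_a * sqrt(k_eff * D_eff). All parameter values are assumptions.
import numpy as np

def cos_uptake_flux(theta_w, porosity=0.5, c_atm=500e-12 * 41.0,   # mol m-3 (~500 ppt)
                    d_air=1.2e-5, k_hydrolysis=1e-2):
    """Uptake flux (mol COS m-2 s-1) as a function of volumetric water content."""
    eps_air = np.clip(porosity - theta_w, 1e-6, None)          # air-filled porosity
    d_eff = d_air * eps_air ** (10.0 / 3.0) / porosity ** 2     # Millington-Quirk diffusivity
    k_eff = k_hydrolysis * theta_w / porosity                   # CA acts in water films
    return c_atm * np.sqrt(k_eff * d_eff)

for theta in (0.05, 0.15, 0.25, 0.35, 0.45):
    print(f"theta_w={theta:.2f} -> F ~ {cos_uptake_flux(theta):.2e} mol m-2 s-1")
```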
Automated adaptive inference of phenomenological dynamical models.
Daniels, Bryan C; Nemenman, Ilya
2015-08-21
Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.
Comparison of simplified models in the prediction of two phase flow in pipelines
NASA Astrophysics Data System (ADS)
Jerez-Carrizales, M.; Jaramillo, J. E.; Fuentes, D.
2014-06-01
Prediction of two phase flow in pipelines is a common task in engineering. It is a complex phenomenon, and many models have been developed to find an approximate solution to the problem. Some older models, such as the Hagedorn & Brown (HB) model, have been highlighted by many authors for their very good performance. Furthermore, many modifications have been applied to this method to improve its predictions. In this work two simplified models which are based on empiricism (HB and Mukherjee and Brill, MB) are considered. One mechanistic model (AN), which is based on the physics of the phenomenon but still requires correlations called closure relations, is also used. Moreover, a drift-flux model defined in steady state that is flow-pattern dependent (HK model) is implemented. The implementation of these methods was tested using data published in the scientific literature for vertical upward flows. Furthermore, a comparison of the predictive performance of the four models is made against a well from Campo Escuela Colorado. The difference among the four models is smaller than the difference between their predictions and the experimental data from the well in Campo Escuela Colorado.
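For context, the drift-flux idea used by the HK-type model relates void fraction to the superficial velocities through α = j_g / (C0·j + v_d); the sketch below uses generic textbook values for the distribution coefficient and a Harmathy-type drift velocity, not the specific closures of the implemented models.

```python
# Sketch of a drift-flux void-fraction calculation: alpha = j_g / (C0 * j + v_d).
# C0 and the drift-velocity correlation below are generic textbook choices.
g, sigma = 9.81, 0.072                   # gravity (m/s2), surface tension (N/m)
rho_l, rho_g = 1000.0, 1.2               # liquid and gas densities (kg/m3)

def void_fraction(j_l, j_g, C0=1.2):
    j = j_l + j_g                        # total superficial velocity (m/s)
    # Harmathy-type drift velocity for bubbly/slug-like upward flow (assumed form).
    v_d = 1.53 * (g * sigma * (rho_l - rho_g) / rho_l**2) ** 0.25
    return j_g / (C0 * j + v_d)

for j_g in (0.1, 0.5, 1.0, 2.0):
    print(f"j_l=1.0, j_g={j_g:.1f} m/s -> void fraction = {void_fraction(1.0, j_g):.2f}")
```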
Cunning, Ross; Muller, Erik B; Gates, Ruth D; Nisbet, Roger M
2017-10-27
Coral reef ecosystems owe their ecological success - and vulnerability to climate change - to the symbiotic metabolism of corals and Symbiodinium spp. The urgency to understand and predict the stability and breakdown of these symbioses (i.e., coral 'bleaching') demands the development and application of theoretical tools. Here, we develop a dynamic bioenergetic model of coral-Symbiodinium symbioses that demonstrates realistic steady-state patterns in coral growth and symbiont abundance across gradients of light, nutrients, and feeding. Furthermore, by including a mechanistic treatment of photo-oxidative stress, the model displays dynamics of bleaching and recovery that can be explained as transitions between alternate stable states. These dynamics reveal that "healthy" and "bleached" states correspond broadly to nitrogen- and carbon-limitation in the system, with transitions between them occurring as integrated responses to multiple environmental factors. Indeed, a suite of complex emergent behaviors reproduced by the model (e.g., bleaching is exacerbated by nutrients and attenuated by feeding) suggests it captures many important attributes of the system; meanwhile, its modular framework and open source R code are designed to facilitate further problem-specific development. We see significant potential for this modeling framework to generate testable hypotheses and predict integrated, mechanistic responses of corals to environmental change, with important implications for understanding the performance and maintenance of symbiotic systems. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Monte Carlo modeling of atomic oxygen attack of polymers with protective coatings on LDEF
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Degroh, Kim K.; Auer, Bruce M.; Gebauer, Linda; Edwards, Jonathan L.
1993-01-01
Characterization of the behavior of atomic oxygen interaction with materials on the Long Duration Exposure Facility (LDEF) assists in understanding the mechanisms involved, and thus should improve the reliability of predicting in-space durability of materials based on ground laboratory testing. A computational model which simulates atomic oxygen interaction with protected polymers was developed using Monte Carlo techniques. Through the use of an assumed mechanistic behavior of atomic oxygen interaction, based on in-space atomic oxygen erosion of unprotected polymers and ground laboratory atomic oxygen interaction with protected polymers, prediction of atomic oxygen interaction with protected polymers on LDEF was accomplished. However, the results of these predictions are not consistent with the observed LDEF results at defect sites in protected polymers. Improved agreement between observed LDEF results and Monte Carlo model predictions can be achieved by modifying the atomic oxygen interaction assumptions used in the model. LDEF atomic oxygen undercutting results, modeling assumptions, and implications are presented.
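A toy two-dimensional random-walk version of such a Monte Carlo calculation is sketched below to show how undercutting at a coating defect can be simulated; the lattice, reaction probability and number of incident atoms are arbitrary and bear no relation to the calibrated LDEF model.

```python
# Sketch: toy 2-D Monte Carlo of atomic-oxygen undercutting at a coating defect.
# Geometry and probabilities are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
width, height = 41, 30
polymer = np.ones((height, width), dtype=bool)   # True = polymer remains
polymer[0, :] = False                            # row 0 is the (impermeable) coating
defect_x = width // 2                            # single pinhole in the coating
p_react = 0.1                                    # reaction probability per encounter
moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

for _ in range(5000):                            # incoming atomic-oxygen atoms
    x, y = defect_x, 0                           # atom enters through the defect
    for _ in range(400):                         # random walk until reaction or escape
        dx, dy = moves[rng.integers(4)]
        xn, yn = x + dx, y + dy
        if yn < 0:                               # escaped back out through the defect
            break
        if not (0 <= xn < width and 0 <= yn < height):
            continue
        if yn == 0 and xn != defect_x:
            continue                             # coating blocks motion except at the defect
        if polymer[yn, xn]:
            if rng.random() < p_react:           # erode polymer and stop this atom
                polymer[yn, xn] = False
                break
            continue                             # unreacted: stay put, keep walking
        x, y = xn, yn                            # move through already-eroded cavity

eroded = (~polymer[1:, :]).sum()
print(f"eroded cells: {eroded}, undercut width just below coating: {(~polymer[1, :]).sum()}")
```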
Gomez-Ramirez, Jaime; Costa, Tommaso
2017-12-01
Here we investigate whether systems that minimize prediction error (e.g., predictive coding) can also show creativity, or whether, on the contrary, prediction error minimization disqualifies such systems from responding in creative ways to non-recurrent problems. We argue that a key ingredient has been overlooked by researchers and needs to be incorporated to understand intelligent behavior in biological and technical systems. This ingredient is boredom. We propose a mathematical model based on the Black-Scholes-Merton equation which provides mechanistic insights into the interplay between boredom and prediction pleasure as the key drivers of behavior. Copyright © 2017 Elsevier B.V. All rights reserved.
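For reference, the standard Black-Scholes-Merton partial differential equation the authors adapt is reproduced below; how its terms are mapped onto boredom and prediction pleasure is specific to the paper and is not restated here.

```latex
% Standard Black-Scholes-Merton PDE (value V of a derivative on asset price S,
% risk-free rate r, volatility sigma); the paper's reinterpretation of these
% symbols is not reproduced here.
\frac{\partial V}{\partial t}
  + \tfrac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
  + r S \frac{\partial V}{\partial S} - r V = 0
```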
Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models
NASA Astrophysics Data System (ADS)
Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.
2017-12-01
Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer greater computational efficiency than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%), but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
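A generic two-level (partial-pooling) specification of the kind hierarchical Bayesian SPARROW applications build on is sketched below; it is an illustrative skeleton, not the published model.

```latex
% Illustrative two-level hierarchical regression skeleton (not the SPARROW model):
% i indexes calibration sites within hydrological region j, f is a mechanistic
% streamflow relation, and regional coefficients beta_j are partially pooled.
y_{ij} \sim \mathcal{N}\!\left(f(x_{ij};\,\beta_j),\; \sigma^{2}\right), \qquad
\beta_j \sim \mathcal{N}\!\left(\mu_{\beta},\; \tau^{2}\right), \qquad
\mu_{\beta},\ \tau,\ \sigma \sim \text{weakly informative priors}
```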
Per Aspera ad Astra: Through Complex Population Modeling to Predictive Theory.
Topping, Christopher J; Alrøe, Hugo Fjelsted; Farrell, Katharine N; Grimm, Volker
2015-11-01
Population models in ecology are often not good at predictions, even if they are complex and seem to be realistic enough. The reason for this might be that Occam's razor, which is key for minimal models exploring ideas and concepts, has been too uncritically adopted for more realistic models of systems. This can tie models too closely to certain situations, thereby preventing them from predicting the response to new conditions. We therefore advocate a new kind of parsimony to improve the application of Occam's razor. This new parsimony balances two contrasting strategies for avoiding errors in modeling: avoiding inclusion of nonessential factors (false inclusions) and avoiding exclusion of sometimes-important factors (false exclusions). It involves a synthesis of traditional modeling and analysis, used to describe the essentials of mechanistic relationships, with elements that are included in a model because they have been reported to be or can arguably be assumed to be important under certain conditions. The resulting models should be able to reflect how the internal organization of populations change and thereby generate representations of the novel behavior necessary for complex predictions, including regime shifts.
Quantitative predictions of streamflow variability in the Susquehanna River Basin
NASA Astrophysics Data System (ADS)
Alexander, R.; Boyer, E. W.; Leonard, L. N.; Duffy, C.; Schwarz, G. E.; Smith, R. A.
2012-12-01
Hydrologic researchers and water managers have increasingly sought an improved understanding of the major processes that control fluxes of water and solutes across diverse environmental settings and large spatial scales. Regional analyses of observed streamflow data have led to advances in our knowledge of relations among land use, climate, and streamflow, with methodologies ranging from statistical assessments of multiple monitoring sites to the regionalization of the parameters of catchment-scale mechanistic simulation models. However, gaps remain in our understanding of the best ways to transfer the knowledge of hydrologic response and governing processes among locations, including methods for regionalizing streamflow measurements and model predictions. We developed an approach to predict variations in streamflow using the SPARROW (SPAtially Referenced Regression On Watershed attributes) modeling infrastructure, with mechanistic functions, mass conservation constraints, and statistical estimation of regional and sub-regional parameters. We used the model to predict discharge in the Susquehanna River Basin (SRB) under varying hydrological regimes that are representative of contemporary flow conditions. The resulting basin-scale water balance describes mean monthly flows in stream reaches throughout the entire SRB (represented at a 1:100,000 scale using the National Hydrologic Data network), with water supply and demand components that are inclusive of a range of hydrologic, climatic, and cultural properties (e.g., precipitation, evapotranspiration, soil and groundwater storage, runoff, baseflow, water use). We compare alternative models of varying complexity that reflect differences in the number and types of explanatory variables and functional expressions as well as spatial and temporal variability in the model parameters. Statistical estimation of the models reveals the levels of complexity that can be uniquely identified, subject to the information content and uncertainties of the hydrologic and climate measurements. Assessment of spatial variations in the model parameters and predictions provides an improved understanding of how much of the hydrologic response to land use, climate, and other properties is unique to specific locations versus more universally observed across catchments of the SRB. This approach advances understanding of water cycle variability at any location throughout the stream network, as a function of both landscape characteristics (e.g., soils, vegetation, land use) and external forcings (e.g., precipitation quantity and frequency). These improvements in predictions of streamflow dynamics will advance the ability to predict spatial and temporal variability in key solutes, such as nutrients, and their delivery to the Chesapeake Bay.
Novel in vitro and mathematical models for the prediction of chemical toxicity.
Williams, Dominic P; Shipley, Rebecca; Ellis, Marianne J; Webb, Steve; Ward, John; Gardner, Iain; Creton, Stuart
2013-01-01
The focus of much scientific and medical research is directed towards understanding the disease process and defining therapeutic intervention strategies. The scientific basis of drug safety is very complex and currently remains poorly understood, despite the fact that adverse drug reactions (ADRs) are a major health concern and a serious impediment to development of new medicines. Toxicity issues account for ∼21% of drug attrition during drug development, and safety testing strategies require considerable animal use. Mechanistic relationships between drug plasma levels and molecular/cellular events that culminate in whole organ toxicity underpin development of novel safety assessment strategies. Current in vitro test systems are poorly predictive of toxicity of chemicals entering the systemic circulation, particularly to the liver. Such systems fall short because of (1) the physiological gap between cells currently used and human hepatocytes existing in their native state, (2) the lack of physiological integration with other cells/systems within organs, required to amplify the initial toxicological lesion into overt toxicity, (3) the inability to assess how low level cell damage induced by chemicals may develop into overt organ toxicity in a minority of patients, (4) lack of consideration of systemic effects. Reproduction of centrilobular and periportal hepatocyte phenotypes in in vitro culture is crucial for sensitive detection of cellular stress. Hepatocyte metabolism/phenotype is dependent on cell position along the liver lobule, with corresponding differences in exposure to substrate, oxygen and hormone gradients. Application of bioartificial liver (BAL) technology can encompass in vitro predictive toxicity testing with enhanced sensitivity and improved mechanistic understanding. Combining this technology with mechanistic mathematical models describing intracellular metabolism, fluid-flow, substrate, hormone and nutrient distribution provides the opportunity to design the BAL specifically to mimic the in vivo scenario. Such mathematical models enable theoretical hypothesis testing, will inform the design of in vitro experiments, and will enable both refinement and reduction of in vivo animal trials. In this way, development of novel mathematical modelling tools will help to focus and direct in vitro and in vivo research, and can be used as a framework for other areas of drug safety science.
Planning for bird conservation: a tale of two models
Johnson, Douglas H.; Winter, Maiken
2005-01-01
Planning for bird conservation has become increasingly reliant on remote sensing, geographical information systems, and, especially, models used to predict the occurrence of bird species as well as their density and demographics. We address the role of such tools by contrasting two models used in bird conservation. One, the Mallard (Anas platyrhynchos) productivity model, is very detailed, mechanistic, and based on an enormous body of research. The Mallard model has been extensively used with success to guide management efforts for Mallards and certain other species of ducks. The other model, the concept of Bird Conservation Areas, is simpler, less mechanistic, and less well-grounded in research. This concept proposes that large patches of suitable habitat in a proper landscape will be adequate to maintain populations of birds. The Bird Conservation Area concept has recently been evaluated in the northern tallgrass prairie, where its fundamental assumptions have been found not to hold consistently. We argue that a more comprehensive understanding of the biology of individual species, and how they respond to habitat features, will be essential before we can use remotely sensed information and geographic information system products with confidence.
Mechanistic solutions to the opening of the Gulf of Mexico
Schouten, Hans; Klitgord, Kim D.
1994-01-01
Two mechanistic models-which are unlike the traditional plate-tectonic landfill models used for most proposed Pangea reconstructions of the Yucatán block-relate the Mesozoic opening of the Gulf of Mexico directly to the movement of the North and South American plates: (1) a previous piggyback model in which Yucatán moves with South America out of the western gulf and (2) a new edge-driven model in which the motion of the Yucatán block is caused by forces applied to its margins by the movement of the North and South American plates. In the second model, Yucatán moves out of the northern Gulf of Mexico as a gear or roller bearing. On the basis of magnetic edge anomalies around the gulf, this edge-driven model predicts that from the Bathonian to Tithonian (~170 to ~150 Ma), Yucatán was rotated ~60° counterclockwise as a rigid block between North and South America with rift propagation and extension occurring simultaneously in the Gulf of Mexico and Yucatán Basin.
Mao, Zhun; Saint-André, Laurent; Bourrier, Franck; Stokes, Alexia; Cordonnier, Thomas
2015-01-01
Background and Aims In mountain ecosystems, predicting root density in three dimensions (3-D) is highly challenging due to the spatial heterogeneity of forest communities. This study presents a simple and semi-mechanistic model, named ChaMRoots, that predicts root interception density (RID, number of roots m–2). ChaMRoots hypothesizes that RID at a given point is affected by the presence of roots from surrounding trees forming a polygon shape. Methods The model comprises three sub-models for predicting: (1) the spatial heterogeneity – RID of the finest roots in the top soil layer as a function of tree basal area at breast height, and the distance between the tree and a given point; (2) the diameter spectrum – the distribution of RID as a function of root diameter up to 50 mm thick; and (3) the vertical profile – the distribution of RID as a function of soil depth. The RID data used for fitting in the model were measured in two uneven-aged mountain forest ecosystems in the French Alps. These sites differ in tree density and species composition. Key Results In general, the validation of each sub-model indicated that all sub-models of ChaMRoots had good fits. The model achieved a highly satisfactory compromise between the number of aerial input parameters and the fit to the observed data. Conclusions The semi-mechanistic ChaMRoots model focuses on the spatial distribution of root density at the tree cluster scale, in contrast to the majority of published root models, which function at the level of the individual. Based on easy-to-measure characteristics, simple forest inventory protocols and three sub-models, it achieves a good compromise between the complexity of the case study area and that of the global model structure. ChaMRoots can be easily coupled with spatially explicit individual-based forest dynamics models and thus provides a highly transferable approach for modelling 3-D root spatial distribution in complex forest ecosystems. PMID:26173892
Schoolmaster, Donald; Stagg, Camille L.
2018-01-01
A trade-off between competitive ability and stress tolerance has been hypothesized and empirically supported to explain the zonation of species across stress gradients for a number of systems. Since stress often reduces plant productivity, one might expect a pattern of decreasing productivity across the zones of the stress gradient. However, this pattern is often not observed in coastal wetlands that show patterns of zonation along a salinity gradient. To address the potentially complex relationship between stress, zonation, and productivity in coastal wetlands, we developed a model of plant biomass as a function of resource competition and salinity stress. Analysis of the model confirms the conventional wisdom that a trade-off between competitive ability and stress tolerance is a necessary condition for zonation. It also suggests that a negative relationship between salinity and production can be overcome if (1) the supply of the limiting resource increases with greater salinity stress or (2) nutrient use efficiency increases with increasing salinity. We fit the equilibrium solution of the dynamic model to data from Louisiana coastal wetlands to test its ability to explain patterns of production across the landscape gradient and derive predictions that could be tested with independent data. We found support for a number of the model predictions, including patterns of decreasing competitive ability and increasing nutrient use efficiency across a gradient from freshwater to saline wetlands. In addition to providing a quantitative framework to support the mechanistic hypotheses of zonation, these results suggest that this simple model is a useful platform to further build upon, simulate and test mechanistic hypotheses of more complex patterns and phenomena in coastal wetlands.
Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P
2010-06-01
The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance determination (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
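A minimal sketch of Gaussian process regression with automatic relevance determination of the kind described above, using scikit-learn on synthetic descriptor data (the three columns stand in for log P, melting point and hydrogen-bond donor count; the dataset and hyperparameters are made up). After fitting, shorter learned length-scales flag the more influential descriptors.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

    rng = np.random.default_rng(1)
    n = 120
    X = rng.normal(size=(n, 3))                  # stand-ins for logP, melting point, HBD count
    y = 1.5 * X[:, 0] - 0.8 * X[:, 2] + 0.1 * rng.normal(size=n)   # synthetic permeability surrogate

    # One length-scale per descriptor gives ARD-style relevance weighting.
    kernel = ConstantKernel() * RBF(length_scale=np.ones(3)) + WhiteKernel()
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    fitted_rbf = gpr.kernel_.k1.k2               # the RBF factor after hyperparameter optimisation
    print("learned length-scales:", np.round(fitted_rbf.length_scale, 2))
    print("training R^2:", round(gpr.score(X, y), 3))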
Benigni, Romualdo; Bossa, Cecilia
2008-01-01
In the past decades, chemical carcinogenicity has been the object of mechanistic studies that have been translated into valuable experimental (e.g., the Salmonella assay system) and theoretical (e.g., compilations of structure alerts (SAs) for chemical carcinogenicity) models. These findings remain the basis of the science and regulation of mutagens and carcinogens. Recent advances in the organization and treatment of large databases consisting of both biological and chemical information nowadays allow for a much easier and more refined view of data. This paper reviews recent analyses on the predictive performance of various lists of structure alerts, including a new compilation of alerts that combines previous work in an optimized form for computer implementation. The revised compilation is part of the Toxtree 1.50 software (freely available from the European Chemicals Bureau website). The use of structural alerts for the chemical biological profiling of a large database of Salmonella mutagenicity results is also reported. Together with being a repository of the science on the chemical biological interactions at the basis of chemical carcinogenicity, the SAs have a crucial role in practical applications for risk assessment, for: (a) description of sets of chemicals; (b) preliminary hazard characterization; (c) formation of categories for e.g., regulatory purposes; (d) generation of subsets of congeneric chemicals to be analyzed subsequently with QSAR methods; (e) priority setting. An important aspect of SAs as predictive toxicity tools is that they derive directly from mechanistic knowledge. The crucial role of mechanistic knowledge in the process of applying (Q)SAR considerations to risk assessment should be strongly emphasized. Mechanistic knowledge provides a ground for interaction and dialogue between model developers, toxicologists and regulators, and permits the integration of the (Q)SAR results into a wider regulatory framework, where different types of evidence and data concur or complement each other as a basis for making decisions and taking actions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohanty, Subhasish; Soppet, William K.; Majumdar, Saurindranath
Argonne National Laboratory (ANL), under the sponsorship of Department of Energy’s Light Water Reactor Sustainability (LWRS) program, is trying to develop a mechanistic approach for more accurate life estimation of LWR components. In this context, ANL has conducted many fatigue experiments under different test and environment conditions on type 316 stainless steel (316SS) material which is widely used in the US reactors. Contrary to the conventional S~N curve based empirical fatigue life estimation approach, the aim of the present DOE sponsored work is to develop an understanding of the material ageing issues more mechanistically (e.g. time dependent hardening and softening) under different test and environmental conditions. Better mechanistic understanding will help develop computer-based advanced modeling tools to better extrapolate stress-strain evolution of reactor components under multi-axial stress states and hence help predict their fatigue life more accurately. In this paper (part-I) the fatigue experiments under different test and environment conditions and related stress-strain results for 316 SS are discussed. In a second paper (part-II) the related evolutionary cyclic plasticity material modeling techniques and results are discussed.
González-Domínguez, Elisa; Armengol, Josep; Rossi, Vittorio
2014-01-01
A mechanistic, dynamic model was developed to predict infection of loquat fruit by conidia of Fusicladium eriobotryae, the causal agent of loquat scab. The model simulates scab infection periods and their severity through the sub-processes of spore dispersal, infection, and latency (i.e., the state variables); change from one state to the following one depends on environmental conditions and on processes described by mathematical equations. Equations were developed using published data on F. eriobotryae mycelium growth, conidial germination, infection, and conidial dispersion pattern. The model was then validated by comparing model output with three independent data sets. The model accurately predicts the occurrence and severity of infection periods as well as the progress of loquat scab incidence on fruit (with concordance correlation coefficients >0.95). Model output agreed with expert assessment of the disease severity in seven loquat-growing seasons. Use of the model for scheduling fungicide applications in loquat orchards may help optimise scab management and reduce fungicide applications. PMID:25233340
Ufuk, Ayşe; Assmus, Frauke; Francis, Laura; Plumb, Jonathan; Damian, Valeriu; Gertz, Michael; Houston, J Brian; Galetin, Aleksandra
2017-04-03
Accumulation of respiratory drugs in human alveolar macrophages (AMs) has not been extensively studied in vitro and in silico despite its potential impact on therapeutic efficacy and/or occurrence of phospholipidosis. The current study aims to characterize the accumulation and subcellular distribution of drugs with respiratory indication in human AMs and to develop an in silico mechanistic AM model to predict lysosomal accumulation of investigated drugs. The data set included 9 drugs previously investigated in rat AM cell line NR8383. Cell-to-unbound medium concentration ratio (K_p,cell) of all drugs (5 μM) was determined to assess the magnitude of intracellular accumulation. The extent of lysosomal sequestration in freshly isolated human AMs from multiple donors (n = 5) was investigated for clarithromycin and imipramine (positive control) using an indirect in vitro method (±20 mM ammonium chloride, NH4Cl). The AM cell parameters and drug physicochemical data were collated to develop an in silico mechanistic AM model. Three in silico models differing in their description of drug membrane partitioning were evaluated; model (1) relied on octanol-water partitioning of drugs, model (2) used in vitro data to account for this process, and model (3) predicted membrane partitioning by incorporating AM phospholipid fractions. In vitro K_p,cell ranged >200-fold for respiratory drugs, with the highest accumulation seen for clarithromycin. A good agreement in K_p,cell was observed between human AMs and NR8383 (2.45-fold bias), highlighting NR8383 as a potentially useful in vitro surrogate tool to characterize drug accumulation in AMs. The mean K_p,cell of clarithromycin (81, CV = 51%) and imipramine (963, CV = 54%) were reduced in the presence of NH4Cl by up to 67% and 81%, respectively, suggesting substantial contribution of lysosomal sequestration and intracellular binding in the accumulation of these drugs in human AMs. The in vitro data showed variability in drug accumulation between individual human AM donors due to possible differences in lysosomal abundance, volume, and phospholipid content, which may have important clinical implications. Consideration of drug-acidic phospholipid interactions significantly improved the performance of the in silico models; use of in vitro K_p,cell obtained in the presence of NH4Cl as a surrogate for membrane partitioning (model (2)) captured the variability in clarithromycin and imipramine K_p,cell observed in vitro and showed the best ability to predict correctly positive and negative lysosomotropic properties. The developed mechanistic AM model represents a useful in silico tool to predict lysosomal and cellular drug concentrations based on drug physicochemical data and system specific properties, with potential application to other cell types.
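The pH-partitioning (ion-trapping) component of such a mechanistic cell model can be illustrated with the classic relation for a monoprotic base, assuming only the neutral species crosses membranes. This is a simplified sketch, not the published AM model; the compartment pH values, volume fractions and example pKa values are assumptions.

    def base_trapping_ratio(pka, ph_compartment, ph_medium=7.4):
        """Total-drug concentration ratio (compartment/medium) for a monoprotic base,
        assuming only the neutral species equilibrates across the membrane."""
        return (1 + 10 ** (pka - ph_compartment)) / (1 + 10 ** (pka - ph_medium))

    def kp_cell(pka, f_lyso=0.01, f_cyto=0.99, ph_lyso=5.0, ph_cyto=7.0):
        """Volume-weighted cell-to-medium ratio from cytosolic and lysosomal trapping only."""
        return (f_cyto * base_trapping_ratio(pka, ph_cyto)
                + f_lyso * base_trapping_ratio(pka, ph_lyso))

    for name, pka in [("weak base, pKa 8.0", 8.0), ("stronger base, pKa 9.5", 9.5)]:
        print(name, "-> Kp,cell ~", round(kp_cell(pka), 1))

Collapsing the lysosomal pH gradient, as the NH4Cl incubation does, drives the lysosomal term toward the cytosolic one; that drop in the predicted ratio is the analogue of the experimentally observed reduction attributed to lysosomal sequestration.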
Avian models for toxicity testing
Hill, E.F.; Hoffman, D.J.
1984-01-01
The use of birds as test models in experimental and environmental toxicology as related to health effects is reviewed, and an overview of descriptive tests routinely used in wildlife toxicology is provided. Toxicologic research on birds may be applicable to human health both directly by their use as models for mechanistic and descriptive studies and indirectly as monitors of environmental quality. Topics include the use of birds as models for study of teratogenesis and embryotoxicity, neurotoxicity, behavior, trends of environmental pollution, and for use in predictive wildlife toxicology. Uses of domestic and wild-captured birds are discussed.
Keane, R E; Ryan, K C; Running, S W
1996-03-01
A mechanistic, biogeochemical succession model, FIRE-BGC, was used to investigate the role of fire on long-term landscape dynamics in northern Rocky Mountain coniferous forests of Glacier National Park, Montana, USA. FIRE-BGC is an individual-tree model-created by merging the gap-phase process-based model FIRESUM with the mechanistic ecosystem biogeochemical model FOREST-BGC-that has mixed spatial and temporal resolution in its simulation architecture. Ecological processes that act at a landscape level, such as fire and seed dispersal, are simulated annually from stand and topographic information. Stand-level processes, such as tree establishment, growth and mortality, organic matter accumulation and decomposition, and undergrowth plant dynamics are simulated both daily and annually. Tree growth is mechanistically modeled based on the ecosystem process approach of FOREST-BGC where carbon is fixed daily by forest canopy photosynthesis at the stand level. Carbon allocated to the tree stem at the end of the year generates the corresponding diameter and height growth. The model also explicitly simulates fire behavior and effects on landscape characteristics. We simulated the effects of fire on ecosystem characteristics of net primary productivity, evapotranspiration, standing crop biomass, nitrogen cycling and leaf area index over 200 years for the 50,000-ha McDonald Drainage in Glacier National Park. Results show increases in net primary productivity and available nitrogen when fires are included in the simulation. Standing crop biomass and evapotranspiration decrease under a fire regime. Shade-intolerant species dominate the landscape when fires are excluded. Model tree increment predictions compared well with field data.
Isazadeh, Siavash; Feng, Min; Urbina Rivas, Luis Enrique; Frigon, Dominic
2014-04-15
Two pilot-scale activated sludge reactors were operated for 98 days to provide the necessary data to develop and validate a new mathematical model predicting the reduction of biosolids production by ozonation of the return activated sludge (RAS). Three ozone doses were tested during the study. In addition to the pilot-scale study, laboratory-scale experiments were conducted with mixed liquor suspended solids and with pure cultures to parameterize the biomass inactivation process during exposure to ozone. The experiments revealed that biomass inactivation occurred even at the lowest doses, but that it was not associated with extensive COD solubilization. For validation, the model was used to simulate the temporal dynamics of the pilot-scale operational data. Increasing the description accuracy of the inactivation process improved the precision of the model in predicting the operational data. Copyright © 2014 Elsevier B.V. All rights reserved.
Lomnitz, Jason G.; Savageau, Michael A.
2016-01-01
Mathematical models of biochemical systems provide a means to elucidate the link between the genotype, environment, and phenotype. A subclass of mathematical models, known as mechanistic models, quantitatively describe the complex non-linear mechanisms that capture the intricate interactions between biochemical components. However, the study of mechanistic models is challenging because most are analytically intractable and involve large numbers of system parameters. Conventional methods to analyze them rely on local analyses about a nominal parameter set and they do not reveal the vast majority of potential phenotypes possible for a given system design. We have recently developed a new modeling approach that does not require estimated values for the parameters initially and inverts the typical steps of the conventional modeling strategy. Instead, this approach relies on architectural features of the model to identify the phenotypic repertoire and then predict values for the parameters that yield specific instances of the system that realize desired phenotypic characteristics. Here, we present a collection of software tools, the Design Space Toolbox V2 based on the System Design Space method, that automates (1) enumeration of the repertoire of model phenotypes, (2) prediction of values for the parameters for any model phenotype, and (3) analysis of model phenotypes through analytical and numerical methods. The result is an enabling technology that facilitates this radically new, phenotype-centric, modeling approach. We illustrate the power of these new tools by applying them to a synthetic gene circuit that can exhibit multi-stability. We then predict values for the system parameters such that the design exhibits 2, 3, and 4 stable steady states. In one example, inspection of the basins of attraction reveals that the circuit can count between three stable states by transient stimulation through one of two input channels: a positive channel that increases the count, and a negative channel that decreases the count. This example shows the power of these new automated methods to rapidly identify behaviors of interest and efficiently predict parameter values for their realization. These tools may be applied to understand complex natural circuitry and to aid in the rational design of synthetic circuits. PMID:27462346
Learning To Fold Proteins Using Energy Landscape Theory
Schafer, N.P.; Kim, B.L.; Zheng, W.; Wolynes, P.G.
2014-01-01
This review is a tutorial for scientists interested in the problem of protein structure prediction, particularly those interested in using coarse-grained molecular dynamics models that are optimized using lessons learned from the energy landscape theory of protein folding. We also present a review of the results of the AMH/AMC/AMW/AWSEM family of coarse-grained molecular dynamics protein folding models to illustrate the points covered in the first part of the article. Accurate coarse-grained structure prediction models can be used to investigate a wide range of conceptual and mechanistic issues outside of protein structure prediction; specifically, the paper concludes by reviewing how AWSEM has in recent years been able to elucidate questions related to the unusual kinetic behavior of artificially designed proteins, multidomain protein misfolding, and the initial stages of protein aggregation. PMID:25308991
Temporal Prediction Errors Affect Short-Term Memory Scanning Response Time.
Limongi, Roberto; Silva, Angélica M
2016-11-01
The Sternberg short-term memory scanning task has been used to unveil cognitive operations involved in time perception. Participants produce time intervals during the task, and the researcher explores how task performance affects interval production - where time estimation error is the dependent variable of interest. The perspective of predictive behavior regards time estimation error as a temporal prediction error (PE), an independent variable that controls cognition, behavior, and learning. Based on this perspective, we investigated whether temporal PEs affect short-term memory scanning. Participants performed temporal predictions while they maintained information in memory. Model inference revealed that PEs affected memory scanning response time independently of the memory-set size effect. We discuss the results within the context of formal and mechanistic models of short-term memory scanning and predictive coding, a Bayes-based theory of brain function. We state the hypothesis that our finding could be associated with weak frontostriatal connections and weak striatal activity.
Predicting climate change impacts on polar bear litter size.
Molnár, Péter K; Derocher, Andrew E; Klanjscek, Tin; Lewis, Mark A
2011-02-08
Predicting the ecological impacts of climate warming is critical for species conservation. Incorporating future warming into population models, however, is challenging because reproduction and survival cannot be measured for yet unobserved environmental conditions. In this study, we use mechanistic energy budget models and data obtainable under current conditions to predict polar bear litter size under future conditions. In western Hudson Bay, we predict climate warming-induced litter size declines that jeopardize population viability: ∼28% of pregnant females failed to reproduce for energetic reasons during the early 1990s, but 40-73% could fail if spring sea ice break-up occurs 1 month earlier than during the 1990s, and 55-100% if break-up occurs 2 months earlier. Simultaneously, mean litter size would decrease by 22-67% and 44-100%, respectively. The expected timeline for these declines varies with climate-model-specific sea ice predictions. Similar litter size declines may occur in over one-third of the global polar bear population.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peyret, Thomas; Poulin, Patrick; Krishnan, Kannan, E-mail: kannan.krishnan@umontreal.ca
The algorithms in the literature aimed at predicting the tissue:blood PC (P_tb) for environmental chemicals and the tissue:plasma PC based on total (K_p) or unbound concentration (K_pu) for drugs differ in their consideration of binding to hemoglobin, plasma proteins and charged phospholipids. The objective of the present study was to develop a unified algorithm such that P_tb, K_p and K_pu for both drugs and environmental chemicals could be predicted. The development of the unified algorithm was accomplished by integrating all mechanistic algorithms previously published to compute the PCs. Furthermore, the algorithm was structured in such a way as to facilitate predictions of the distribution of organic compounds at the macro (i.e. whole tissue) and micro (i.e. cells and fluids) levels. The resulting unified algorithm was applied to compute the rat P_tb, K_p or K_pu of muscle (n = 174), liver (n = 139) and adipose tissue (n = 141) for acidic, neutral, zwitterionic and basic drugs as well as ketones, acetate esters, alcohols, aliphatic hydrocarbons, aromatic hydrocarbons and ethers. The unified algorithm reproduced adequately the values predicted previously by the published algorithms for a total of 142 drugs and chemicals. The sensitivity analysis demonstrated the relative importance of the various compound properties reflective of specific mechanistic determinants relevant to prediction of PC values of drugs and environmental chemicals. Overall, the present unified algorithm uniquely facilitates the computation of macro and micro level PCs for developing organ and cellular-level PBPK models for both chemicals and drugs.
Mechanistic origin of dragon-kings in a population of competing agents
NASA Astrophysics Data System (ADS)
Johnson, N.; Tivnan, B.
2012-05-01
We analyze the mechanistic origins of the extreme behaviors that arise in an idealized model of a population of competing agents, such as traders in a market. These extreme behaviors exhibit the defining characteristics of 'dragon-kings'. Our model comprises heterogeneous agents who repeatedly compete for some limited resource, making binary choices based on the strategies that they have in their possession. It generalizes the well-known Minority Game by allowing agents whose strategies have not made accurate recent predictions to step out of the competition until their strategies improve. This generates a complex dynamical interplay between the number V of active agents (mimicking market volume) and the imbalance D between the decisions made (mimicking excess demand). The wide spectrum of extreme behaviors which emerges helps to explain why no unique relationship has been identified between the price and volume during real market crashes and rallies.
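A compact sketch of a Minority-Game variant with agent drop-out, in the spirit of the model described above (the population size, memory length, scoring rule and participation threshold are illustrative assumptions, not the authors' exact specification). Tracking the number of active agents and the excess demand reproduces the V and D quantities discussed in the abstract.

    import numpy as np

    rng = np.random.default_rng(0)
    N, S, m, T, threshold = 201, 2, 3, 500, 0.0   # agents, strategies each, memory, steps, drop-out cut
    P = 2 ** m                                     # number of distinguishable history states
    strategies = rng.choice([-1, 1], size=(N, S, P))
    scores = np.zeros((N, S))
    history = int(rng.integers(P))
    volume, demand = [], []

    for _ in range(T):
        best = scores.argmax(axis=1)
        active = scores[np.arange(N), best] >= threshold      # poorly scoring agents step out
        actions = strategies[np.arange(N), best, history] * active
        D = int(actions.sum())                                 # excess demand
        volume.append(int(active.sum()))
        demand.append(D)
        minority = -int(np.sign(D)) if D != 0 else int(rng.choice([-1, 1]))
        scores += strategies[:, :, history] * minority         # reward strategies on the minority side
        history = (2 * history + (minority == 1)) % P          # append the winning bit to the history

    print("mean active volume:", round(np.mean(volume), 1),
          " max |excess demand|:", max(abs(d) for d in demand))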
NASA Astrophysics Data System (ADS)
Darmon, David
2018-03-01
In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
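A hedged sketch of the selection criterion described above: for each candidate embedding dimension, a k-nearest-neighbour Gaussian predictive density is scored on held-out points, and the dimension with the lowest average negative log-predictive likelihood is preferred. The nonparametric estimator, neighbour count and noisy logistic-map test series are illustrative choices, not the paper's estimator.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.stats import norm

    def nlpl(series, dim, k=20, train_frac=0.7):
        """Average negative log-predictive likelihood of the next value, using a
        k-nearest-neighbour Gaussian predictive density in delay-coordinate space."""
        X = np.array([series[t - dim + 1:t + 1] for t in range(dim - 1, len(series) - 1)])
        y = series[dim:]
        n_train = int(train_frac * len(X))
        tree = cKDTree(X[:n_train])
        total = 0.0
        for x, target in zip(X[n_train:], y[n_train:]):
            _, idx = tree.query(x, k=k)
            nb = y[:n_train][idx]
            mu, sigma = nb.mean(), nb.std() + 1e-3   # small floor keeps the density proper
            total += -norm.logpdf(target, mu, sigma)
        return total / (len(X) - n_train)

    # Noisy logistic map as a stand-in for an observed stochastic time series.
    rng = np.random.default_rng(2)
    s = np.empty(3000)
    s[0] = 0.4
    for i in range(1, len(s)):
        s[i] = min(max(3.9 * s[i - 1] * (1 - s[i - 1]) + 0.01 * rng.normal(), 0.0), 1.0)

    for d in (1, 2, 3, 4, 5):
        print("embedding dimension", d, "-> NLPL", round(nlpl(s, d), 3))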
The human adrenocortical carcinoma cell line H295R is being used as an in vitro steroidogenesis screening assay to assess the impact of endocrine active chemicals (EACs) capable of altering steroid biosynthesis. To enhance the interpretation and quantitative application of measur...
The US EPA ToxCast program aims to develop methods for mechanistically-based chemical prioritization using a suite of high throughput, in vitro assays that probe relevant biological pathways, and coupling them with statistical and machine learning methods that produce predictive ...
NASA Astrophysics Data System (ADS)
Smith, P. J.; Beven, K.; Panziera, L.
2012-04-01
The issuing of timely flood alerts may be dependent upon the ability to predict future values of water level or discharge at locations where observations are available. Catchments at risk of flash flooding often have a rapid natural response time, typically less than the forecast lead time desired for issuing alerts. This work focuses on the provision of short-range (up to 6 hours lead time) predictions of discharge in small catchments based on utilising radar forecasts to drive a hydrological model. An example analysis based upon the Verzasca catchment (Ticino, Switzerland) is presented. Parsimonious time series models with a mechanistic interpretation (so-called Data-Based Mechanistic models) have been shown to provide reliable accurate forecasts in many hydrological situations. In this study such a model is developed to predict the discharge at an observed location from observed precipitation data. The model is shown to capture the snow melt response at this site. Observed discharge data is assimilated to improve the forecasts, of up to two hours lead time, that can be generated from observed precipitation. To generate forecasts with greater lead time, ensemble precipitation forecasts are utilised. In this study the Nowcasting ORographic precipitation in the Alps (NORA) product outlined in more detail elsewhere (Panziera et al. Q. J. R. Meteorol. Soc. 2011; DOI:10.1002/qj.878) is utilised. NORA precipitation forecasts are derived from historical analogues based on the radar field and upper atmospheric conditions. As such, they avoid the need to explicitly model the evolution of the rainfall field through, for example, Lagrangian diffusion. The uncertainty in the forecasts is represented by characterisation of the joint distribution of the observed discharge, the discharge forecast using the (in operational conditions unknown) future observed precipitation and that forecast utilising the NORA ensembles. Constructing the joint distribution in this way allows the full historic record of data at the site to inform the predictive distribution. It is shown that, in part due to the limited availability of forecasts, the uncertainty in the relationship between the NORA-based forecasts and other variates dominated the resulting predictive uncertainty.
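A toy illustration of the data-based mechanistic idea on synthetic data: a first-order discrete-time transfer function between rainfall and discharge is identified by least squares, the forecast is started from the latest observed flow (a crude form of data assimilation), and a rainfall nowcast drives the longer lead times. The pure time delay, noise level and nowcast are assumptions, not the operational Verzasca model.

    import numpy as np

    rng = np.random.default_rng(3)
    T, delay = 400, 2
    rain = rng.gamma(shape=0.3, scale=5.0, size=T)            # synthetic precipitation series
    q = np.zeros(T)
    for t in range(delay, T):                                  # synthetic "true" catchment response
        q[t] = 0.85 * q[t - 1] + 0.4 * rain[t - delay] + 0.05 * rng.normal()

    # Identify q_t = a*q_{t-1} + b*rain_{t-delay} by ordinary least squares.
    A = np.column_stack([q[delay - 1:-1], rain[:T - delay]])
    a, b = np.linalg.lstsq(A, q[delay:], rcond=None)[0]

    # Forecast six steps ahead from the last observation, switching to the nowcast
    # once observed rainfall runs out of lead time.
    rain_nowcast = np.full(6, rain[-12:].mean())               # stand-in for one ensemble member
    y_hat, forecast = q[-1], []
    for h in range(6):
        u = rain[T + h - delay] if h < delay else rain_nowcast[h - delay]
        y_hat = a * y_hat + b * u
        forecast.append(y_hat)

    print("identified a, b:", round(a, 3), round(b, 3))
    print("6-step-ahead forecast:", np.round(forecast, 2))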
Niblett, Daniel; Porter, Stuart; Reynolds, Gavin; Morgan, Tomos; Greenamoyer, Jennifer; Hach, Ronald; Sido, Stephanie; Karan, Kapish; Gabbott, Ian
2017-08-07
A mathematical, mechanistic tablet film-coating model has been developed for pharmaceutical pan coating systems based on the mechanisms of atomisation, tablet bed movement and droplet drying with the main purpose of predicting tablet appearance quality. Two dimensionless quantities were used to characterise the product properties and operating parameters: the dimensionless Spray Flux (relating to area coverage of the spray droplets) and the Niblett Number (relating to the time available for drying of coating droplets). The Niblett Number is the ratio between the time a droplet needs to dry under given thermodynamic conditions and the time available for the droplet while on the surface of the tablet bed. The time available for drying on the tablet bed surface is critical for appearance quality. These two dimensionless quantities were used to select process parameters for a set of 22 coating experiments, performed over a wide range of multivariate process parameters. The dimensionless Regime Map created can be used to visualise the effect of interacting process parameters on overall tablet appearance quality and defects such as picking and logo bridging. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Marçais, J.; Gupta, H. V.; De Dreuzy, J. R.; Troch, P. A. A.
2016-12-01
Geomorphological structure and geological heterogeneity of hillslopes are major controls on runoff response. The diversity of hillslopes (morphological shapes and geological structures) on the one hand, and the highly nonlinear runoff response on the other, make it difficult to transpose what has been learnt at one specific hillslope to another. Making reliable predictions of runoff generation or river flow for a given hillslope is therefore a challenge. Applying classic model calibration (based on inverse-problem techniques) requires doing so for each specific hillslope and having some data available for calibration; when applied to thousands of cases this is not always feasible. Here we propose a novel modeling framework that couples process-based models with a data-based approach. First, we develop a mechanistic model, based on the hillslope-storage Boussinesq equations (Troch et al. 2003), able to model nonlinear runoff responses to rainfall at the hillslope scale. Second, we set up a model database representing thousands of uncalibrated simulations. These simulations investigate different hillslope shapes (real ones obtained by analyzing a 5 m digital elevation model of Brittany, and synthetic ones), different hillslope geological structures (i.e., different parametrizations) and different hydrologic forcing terms (i.e., different infiltration chronicles). We then use this model library to train a machine learning model on the physically based database. Machine learning performance is assessed in a classic validation phase (testing it on new hillslopes and comparing machine learning outputs with mechanistic outputs). Finally, we use the machine learning model to learn which hillslope properties control runoff. This methodology will be further tested by combining synthetic datasets with real ones.
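A sketch of the emulation step: a regression model is trained on a library of (hillslope properties, storm forcing) to peak-response pairs and then queried for unseen hillslopes. Here a toy nonlinear function stands in for the hillslope-storage Boussinesq simulations, and all parameter names, ranges and the response definition are illustrative assumptions.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)
    n = 5000
    slope = rng.uniform(0.02, 0.5, n)            # hillslope gradient (-)
    conv = rng.uniform(-0.5, 0.5, n)             # plan-shape convergence (-)
    ksat = 10 ** rng.uniform(-6, -4, n)          # saturated hydraulic conductivity (m/s)
    storm = rng.uniform(5, 80, n)                # storm depth (mm)

    # Toy stand-in for the mechanistic simulations: a nonlinear "peak runoff coefficient".
    peak = 1 / (1 + np.exp(-(0.04 * storm + 4 * slope + 2 * conv + np.log10(ksat) + 4)))
    peak = peak + 0.02 * rng.normal(size=n)

    X = np.column_stack([slope, conv, ksat, storm])
    X_tr, X_te, y_tr, y_te = train_test_split(X, peak, test_size=0.25, random_state=0)
    emulator = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    print("held-out R^2:", round(emulator.score(X_te, y_te), 3))
    print("importances (slope, convergence, ksat, storm):",
          np.round(emulator.feature_importances_, 2))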
How and why does the immunological synapse form? Physical chemistry meets cell biology.
Chakraborty, Arup K
2002-03-05
During T lymphocyte (T cell) recognition of an antigen, a highly organized and specific pattern of membrane proteins forms in the junction between the T cell and the antigen-presenting cell (APC). This specialized cell-cell junction is called the immunological synapse. It is several micrometers large and forms over many minutes. A plethora of experiments are being performed to study the mechanisms that underlie synapse formation and the way in which information transfer occurs across the synapse. The wealth of experimental data that is beginning to emerge must be understood within a mechanistic framework if it is to prove useful in developing modalities to control the immune response. Quantitative models can complement experiments in the quest for such a mechanistic understanding by suggesting experimentally testable hypotheses. Here, a quantitative synapse assembly model is described. The model uses concepts developed in physical chemistry and cell biology and is able to predict the spatiotemporal evolution of cell shape and receptor protein patterns observed during synapse formation. Attention is directed to how the juxtaposition of model predictions and experimental data has led to intriguing hypotheses regarding the role of null and self peptides during synapse assembly, as well as correlations between T cell effector functions and the robustness of synapse assembly. We remark on some ways in which synergistic experiments and modeling studies can improve current models, and we take steps toward a better understanding of information transfer across the T cell-APC junction.
Biomechanics-based in silico medicine: the manifesto of a new science.
Viceconti, Marco
2015-01-21
In this perspective article we discuss the role of contemporary biomechanics in the light of recent applications such as the development of the so-called Virtual Physiological Human technologies for physiology-based in silico medicine. In order to build Virtual Physiological Human (VPH) models, computer models that capture and integrate the complex systemic dynamics of living organisms across radically different space-time scales, we need to re-formulate a vast body of existing biology and physiology knowledge so that it is formulated as a quantitative hypothesis, which can be expressed in mathematical terms. Once the predictive accuracy of these models is confirmed against controlled experiments and against clinical observations, we will have VPH models that can reliably predict certain quantitative changes in health status of a given patient, but also, more importantly, we will have a theory, in the true meaning this word has in the scientific method. In this scenario, biomechanics plays a very important role: it is one of the few areas of life sciences where we attempt to build full mechanistic explanations based on quantitative observations; in other words, we investigate living organisms like physical systems. This is in our opinion a Copernican revolution, around which the scope of biomechanics should be re-defined. Thus, we propose a new definition for our research domain: "Biomechanics is the study of living organisms as mechanistic systems". Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Thomas, Stephanie Margarete; Beierkuhnlein, Carl
2013-05-01
The occurrence of ectotherm disease vectors outside of their previous distribution area and the emergence of vector-borne diseases can be increasingly observed at a global scale and are accompanied by a growing number of studies which investigate the vast range of determining factors and their causal links. Consequently, a broad span of scientific disciplines is involved in tackling these complex phenomena. First, we evaluate the citation behaviour of relevant scientific literature in order to clarify the question "do scientists consider results of other disciplines to extend their expertise?" We then highlight emerging tools and concepts useful for risk assessment. Correlative models (regression-based, machine-learning and profile techniques), mechanistic models (basic reproduction number R0) and methods of spatial regression, interaction and interpolation are described. We discuss further steps towards multidisciplinary approaches regarding new tools and emerging concepts to combine existing approaches such as Bayesian geostatistical modelling, mechanistic models which avoid the need for parameter fitting, joined correlative and mechanistic models, multi-criteria decision analysis and geographic profiling. We take into consideration the quality both of occurrence data for vectors, hosts and disease cases and of the predictor variables, as both determine the accuracy of risk area identification. Finally, we underline the importance of multidisciplinary research approaches. Even if the establishment of communication networks between scientific disciplines and the sharing of specific methods is time consuming, it promises new insights for the surveillance and control of vector-borne diseases worldwide.
NASA Astrophysics Data System (ADS)
Mohanty, Subhasish; Soppet, William K.; Majumdar, Saurindranath; Natesan, Krishnamurti
2016-05-01
Argonne National Laboratory (ANL), under the sponsorship of Department of Energy's Light Water Reactor Sustainability (LWRS) program, is trying to develop a mechanistic approach for more accurate life estimation of LWR components. In this context, ANL has conducted many fatigue experiments under different test and environment conditions on type 316 stainless steel (316 SS) material which is widely used in the US reactors. Contrary to the conventional S ∼ N curve based empirical fatigue life estimation approach, the aim of the present DOE sponsored work is to develop an understanding of the material ageing issues more mechanistically (e.g. time dependent hardening and softening) under different test and environmental conditions. Better mechanistic understanding will help develop computer-based advanced modeling tools to better extrapolate stress-strain evolution of reactor components under multi-axial stress states and hence help predict their fatigue life more accurately. Mechanics-based modeling of fatigue such as by using finite element (FE) tools requires the time/cycle dependent material hardening properties. Presently such time-dependent material hardening properties are hardly available in the fatigue modeling literature even under in-air conditions. Obtaining those material properties under PWR environments is even harder. Through this work we made a preliminary attempt to generate time/cycle dependent stress-strain data both under in-air and PWR water conditions for further study, such as for possible development of material models and constitutive relations for FE model implementation. Although there are open-ended possibilities to further improve the discussed test methods and related material estimation techniques, we anticipate that the data presented in this paper will help the metal fatigue research community, particularly researchers who are dealing with mechanistic modeling of metal fatigue, such as those using FE tools. In this paper the fatigue experiments under different test and environment conditions and related stress-strain results for 316 SS are discussed.
Simulating 2,368 temperate lakes reveals weak coherence in stratification phenology
Read, Jordan S.; Winslow, Luke A.; Hansen, Gretchen J. A.; Van Den Hoek, Jamon; Hanson, Paul C.; Bruce, Louise C; Markfort, Corey D.
2014-01-01
Changes in water temperatures resulting from climate warming can alter the structure and function of aquatic ecosystems. Lake-specific physical characteristics may play a role in mediating individual lake responses to climate. Past mechanistic studies of lake-climate interactions have simulated generic lake classes at large spatial scales or performed detailed analyses of small numbers of real lakes. Understanding the diversity of lake responses to climate change across landscapes requires a hybrid approach that couples site-specific lake characteristics with broad-scale environmental drivers. This study provides a substantial advancement in lake ecosystem modeling by combining open-source tools with freely available continental-scale data to mechanistically model daily temperatures for 2,368 Wisconsin lakes over three decades (1979-2011). The model accurately predicted observed surface layer temperatures (RMSE: 1.74°C) and the presence/absence of stratification (81.1% agreement). Among-lake coherence was strong for surface temperatures and weak for the timing of stratification, suggesting individual lake characteristics mediate some - but not all - ecologically relevant lake responses to climate.
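The two skill scores quoted above (surface-temperature RMSE and stratification presence/absence agreement) are straightforward to compute for any paired set of observations and predictions; the sketch below uses made-up numbers and a simple top-bottom temperature-difference criterion for stratification, which may differ from the study's definition.

    import numpy as np

    def rmse(obs, pred):
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        return float(np.sqrt(np.mean((obs - pred) ** 2)))

    def stratification_agreement(obs_topbot, mod_topbot, threshold=1.0):
        """Fraction of profiles where observed and modelled top-bottom temperature
        differences agree on stratified (difference > threshold, in °C) versus mixed."""
        obs_strat = np.asarray(obs_topbot) > threshold
        mod_strat = np.asarray(mod_topbot) > threshold
        return float(np.mean(obs_strat == mod_strat))

    obs_surface = [22.1, 24.3, 19.8, 25.0, 21.5]   # hypothetical observed surface temps (°C)
    mod_surface = [21.4, 25.1, 20.6, 24.2, 22.9]   # corresponding model output (°C)
    obs_topbot = [6.0, 0.4, 3.1, 0.2, 5.5]         # observed top-bottom differences (°C)
    mod_topbot = [5.1, 0.9, 2.4, 1.3, 4.8]

    print("surface RMSE (°C):", round(rmse(obs_surface, mod_surface), 2))
    print("stratification agreement:", stratification_agreement(obs_topbot, mod_topbot))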
Alimohammadi, Mona; Pichardo-Almarza, Cesar; Agu, Obiekezie; Díaz-Zuccarini, Vanessa
2016-01-01
Vascular calcification results in stiffening of the aorta and is associated with hypertension and atherosclerosis. Atherogenesis is a complex, multifactorial, and systemic process; the result of a number of factors, each operating simultaneously at several spatial and temporal scales. The ability to predict sites of atherogenesis would be of great use to clinicians in order to improve diagnostic and treatment planning. In this paper, we present a mathematical model as a tool to understand why atherosclerotic plaque and calcifications occur in specific locations. This model is then used to analyze vascular calcification and atherosclerotic areas in an aortic dissection patient using a mechanistic, multi-scale modeling approach, coupling patient-specific, fluid-structure interaction simulations with a model of endothelial mechanotransduction. A number of hemodynamic factors based on state-of-the-art literature are used as inputs to the endothelial permeability model, in order to investigate plaque and calcification distributions, which are compared with clinical imaging data. A significantly improved correlation between elevated hydraulic conductivity or volume flux and the presence of calcification and plaques was achieved by using a shear index comprising both mean and oscillatory shear components (HOLMES) and a non-Newtonian viscosity model as inputs, as compared to widely used hemodynamic indicators. The proposed approach shows promise as a predictive tool. The improvements obtained using the combined biomechanical/biochemical modeling approach highlight the benefits of mechanistic modeling as a powerful tool to understand complex phenomena and provides insight into the relative importance of key hemodynamic parameters. PMID:27445834
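The oscillatory-shear indicator mentioned above can be computed from a wall shear stress time series over one cardiac cycle. The sketch below uses the standard oscillatory shear index (OSI) and one published HOLMES-style combination of time-averaged and oscillatory shear; the waveform is synthetic and the exact index definition used in the study may differ.

    import numpy as np

    def shear_indices(wss):
        """Time-averaged WSS, oscillatory shear index, and a HOLMES-style combination
        for a uniformly sampled scalar wall-shear-stress signal over one cycle."""
        tawss = np.abs(wss).mean()
        osi = 0.5 * (1.0 - abs(wss.mean()) / tawss)
        holmes = tawss * (0.5 - osi)         # one published mean+oscillatory combination
        return tawss, osi, holmes

    t = np.linspace(0.0, 1.0, 500, endpoint=False)     # one cardiac cycle (s)
    wss = 1.2 * np.sin(2 * np.pi * t) + 0.4            # synthetic, partly reversing signal (Pa)
    tawss, osi, holmes = shear_indices(wss)
    print(f"TAWSS = {tawss:.2f} Pa, OSI = {osi:.2f}, HOLMES = {holmes:.2f}")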
Integrated computational model of the bioenergetics of isolated lung mitochondria.
Zhang, Xiao; Dash, Ranjan K; Jacobs, Elizabeth R; Camara, Amadou K S; Clough, Anne V; Audi, Said H
2018-01-01
Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria.
Finke, G R; Bozinovic, F; Navarrete, S A
2009-01-01
Developing mechanistic models to predict an organism's body temperature facilitates the study of physiological stresses caused by extreme climatic conditions the species might have faced in the past or making predictions about changes to come in the near future. Because the models combine empirical observation of different climatic variables with essential morphological attributes of the species, it is possible to examine specific aspects of predicted climatic changes. Here, we develop a model for the competitively dominant intertidal mussel Perumytilus purpuratus that estimates body temperature on the basis of meteorological and tidal data with an average difference (±SE) of 0.410 ± 0.0315 °C in comparison with a field-deployed temperature logger. Modeled body temperatures of P. purpuratus in central Chile regularly exceeded 30 °C in summer months, and values as high as 38 °C were found. These results suggest that the temperatures reached by mussels in the intertidal zone in central Chile are not sufficiently high to induce significant mortality on adults of this species; however, because body temperatures >40 °C can be lethal for this species, sublethal effects on physiological performance warrant further investigation. Body temperatures of mussels increased sigmoidally with increasing tidal height. Body temperatures of individuals from approximately 70% of the tidal range leveled off and did not increase any further with increasing tidal height. Finally, body size played an important role in determining body temperature. A hypothetical 5-cm-long mussel (only 1 cm longer than mussels found in nature) did reach potentially lethal body temperatures, suggesting that the biophysical environment may play a role in limiting the size of this small species.
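A heavily simplified steady-state heat budget of the kind such mechanistic body-temperature models solve: absorbed shortwave, convective exchange and net longwave radiation are balanced, and the root of the balance gives the body temperature. All coefficients, areas and forcing values below are illustrative assumptions, not the published parameterisation, which also draws on tidal data to determine when the animal is emersed.

    from scipy.optimize import brentq

    SIGMA = 5.67e-8   # Stefan-Boltzmann constant (W m-2 K-4)

    def body_temperature(solar=800.0, t_air=298.15, t_sky=283.15,
                         absorptivity=0.75, emissivity=0.95, h_conv=30.0,
                         a_proj=4e-4, a_total=1.6e-3):
        """Solve the steady-state energy balance for shell temperature (K)."""
        def net_flux(tb):
            q_sw = absorptivity * a_proj * solar                          # absorbed shortwave (W)
            q_conv = h_conv * a_total * (t_air - tb)                      # convective exchange (W)
            q_lw = emissivity * SIGMA * a_total * (t_sky ** 4 - tb ** 4)  # net longwave (W)
            return q_sw + q_conv + q_lw
        return brentq(net_flux, 250.0, 350.0)

    tb = body_temperature()
    print("predicted body temperature:", round(tb - 273.15, 1), "°C")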
DOT National Transportation Integrated Search
2015-08-01
A mechanistic-empirical (ME) pavement design procedure allows for analyzing and selecting pavement structures based on predicted distress progression resulting from stresses and strains within the pavement over its design life. The Virginia Depar...
Breed, Greg A.; Golson, Emily A.; Tinker, M. Tim
2017-01-01
The home‐range concept is central in animal ecology and behavior, and numerous mechanistic models have been developed to understand home range formation and maintenance. These mechanistic models usually assume a single, contiguous home range. Here we describe and implement a simple home‐range model that can accommodate multiple home‐range centers, form complex shapes, allow discontinuities in use patterns, and infer how external and internal variables affect movement and use patterns. The model assumes individuals associate with two or more home‐range centers and move among them with some estimable probability. Movement in and around home‐range centers is governed by a two‐dimensional Ornstein‐Uhlenbeck process, while transitions between centers are modeled as a stochastic state‐switching process. We augmented this base model by introducing environmental and demographic covariates that modify transition probabilities between home‐range centers and can be estimated to provide insight into the movement process. We demonstrate the model using telemetry data from sea otters (Enhydra lutris) in California. The model was fit using a Bayesian Markov Chain Monte Carlo method, which estimated transition probabilities, as well as unique Ornstein‐Uhlenbeck diffusion and centralizing tendency parameters. Estimated parameters could then be used to simulate movement and space use that was virtually indistinguishable from real data. We used Deviance Information Criterion (DIC) scores to assess model fit and determined that both wind and reproductive status were predictive of transitions between home‐range centers. Females were less likely to move between home‐range centers on windy days, less likely to move between centers when tending pups, and much more likely to move between centers just after weaning a pup. These tendencies are predicted by theoretical movement rules but were not previously known and show that our model can extract meaningful behavioral insight from complex movement data.
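As a rough illustration of the movement process just described (not the fitted sea otter model), the following sketch simulates a two-dimensional Ornstein-Uhlenbeck walk around whichever of two home-range centres is currently occupied, with a simple per-step switching probability standing in for the covariate-driven transition model; all parameter values are assumptions.

```python
# Illustrative sketch of an OU home-range model with stochastic switching between
# two centres; every parameter value below is an assumption for demonstration.
import numpy as np

rng = np.random.default_rng(1)
centres = np.array([[0.0, 0.0], [10.0, 4.0]])   # two home-range centres (assumed)
beta, sigma, dt = 0.5, 1.0, 0.1                 # centralising tendency, diffusion, time step (assumed)
p_switch = 0.01                                 # per-step probability of switching centre (assumed)

n_steps = 5000
x = np.zeros((n_steps, 2))
state = 0
for t in range(1, n_steps):
    if rng.random() < p_switch:                 # state-switching process between centres
        state = 1 - state
    drift = -beta * (x[t - 1] - centres[state]) # OU pull toward the active centre
    noise = sigma * np.sqrt(dt) * rng.standard_normal(2)
    x[t] = x[t - 1] + drift * dt + noise

frac_near_first = np.mean(np.linalg.norm(x - centres[0], axis=1) <
                          np.linalg.norm(x - centres[1], axis=1))
print(f"fraction of time closer to centre 0: {frac_near_first:.2f}")
```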
Rylander, Charlotta; Sandanger, Torkjel Manning; Nøst, Therese Haugdahl; Breivik, Knut; Lund, Eiliv
2015-10-01
The number of studies on persistent organic pollutants (POPs) and type 2 diabetes mellitus (T2DM) is growing steadily. Although concentrations of many POPs in humans have decreased substantially, only some studies consider temporal and inter-individual changes in POP concentrations when assessing exposure. Here we combined plasma measurements with mechanistic modeling to generate complementary exposure measures to our single blood draw after disease diagnosis. Blood was collected between 2003-2006 from 106 subjects with T2DM and 106 age-matched controls, and POP concentrations were compared after adjustment for relevant risk factors and multiple testing. Area under the curve (AUC) of PCB-153 from birth until age 18, representing early-life exposure, and AUC from birth until time of diagnosis were generated, as well as examples of life-time exposure trajectories, using a mechanistic exposure model. The rank sum of polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs, OR=16.9 (95% CI: 3.05-93.6)) as well as β-hexachlorocyclohexane (β-HCH, OR=203.8 (95% CI: 11.5-3620)) and 1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene (p,p'-DDE, OR=11.3 (95% CI: 2.55-49.9)) were associated with T2DM. Neither the AUC reflecting early-life exposure nor the AUC reflecting total life-time exposure at the time of diagnosis was associated with the disease. The predicted life-course trajectories display clear differences within and between individuals in the past and suggest that a single blood draw provides limited information on POP exposure earlier in life. The predicted AUCs for PCB-153 did not support the positive association between T2DM and measured blood concentrations of certain POPs. This may suggest that the model is too simplistic and/or that the strength of the association may vary through life and with time to/past diagnosis. Copyright © 2015 Elsevier Inc. All rights reserved.
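The AUC exposure metric used above reduces to an integral over a modelled lifetime concentration trajectory. The sketch below illustrates only the bookkeeping; the concentration curve is a synthetic stand-in, not output from the cited exposure model.

```python
# Minimal sketch of an exposure AUC computed from birth to a given age.
# The PCB-153 trajectory here is synthetic; a mechanistic exposure model would
# supply the real one.
import numpy as np

age = np.linspace(0, 60, 601)                       # years
conc = 50 * np.exp(-((age - 25) / 15) ** 2)         # ng/g lipid, illustrative trajectory

def auc_until(age_cutoff):
    mask = age <= age_cutoff
    a, c = age[mask], conc[mask]
    return float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(a)))   # trapezoidal rule

print("AUC birth-18 y:", round(auc_until(18), 1))
print("AUC birth-50 y (e.g. age at diagnosis):", round(auc_until(50), 1))
```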
Comparing spatial diversification and meta-population models in the Indo-Australian Archipelago.
Chalmandrier, Loïc; Albouy, Camille; Descombes, Patrice; Sandel, Brody; Faurby, Soren; Svenning, Jens-Christian; Zimmermann, Niklaus E; Pellissier, Loïc
2018-03-01
Reconstructing the processes that have shaped the emergence of biodiversity gradients is critical to understand the dynamics of diversification of life on Earth. Islands have traditionally been used as model systems to unravel the processes shaping biological diversity. MacArthur and Wilson's island biogeographic model predicts diversity to be based on dynamic interactions between colonization and extinction rates, while treating islands themselves as geologically static entities. The current spatial configuration of islands should influence meta-population dynamics, but long-term geological changes within archipelagos are also expected to have shaped island biodiversity, in part by driving diversification. Here, we compare two mechanistic models providing inferences on species richness at a biogeographic scale: a mechanistic spatial-temporal model of species diversification and a spatial meta-population model. While the meta-population model operates over a static landscape, the diversification model is driven by changes in the size and spatial configuration of islands through time. We compare the inferences of both models to floristic diversity patterns among land patches of the Indo-Australian Archipelago. Simulation results from the diversification model better matched observed diversity than a meta-population model constrained only by the contemporary landscape. The diversification model suggests that the dynamic re-positioning of islands promoting land disconnection and reconnection induced an accumulation of particularly high species diversity on Borneo, which is central within the island network. By contrast, the meta-population model predicts a higher diversity on the mainlands, which is less compatible with empirical data. Our analyses highlight that, by comparing models with contrasting assumptions, we can pinpoint the processes that are most compatible with extant biodiversity patterns.
Kasprak, Alan; Caster, Joshua J.; Bangen, Sara G.; Sankey, Joel B.
2017-01-01
The ability to quantify the processes driving geomorphic change in river valley margins is vital to geomorphologists seeking to understand the relative role of transport mechanisms (e.g. fluvial, aeolian, and hillslope processes) in landscape dynamics. High-resolution, repeat topographic data are becoming readily available to geomorphologists. By contrasting digital elevation models derived from repeat surveys, the transport processes driving topographic changes can be inferred, a method termed ‘mechanistic segregation.’ Unfortunately, mechanistic segregation largely relies on subjective and time consuming manual classification, which has implications both for its reproducibility and the practical scale of its application. Here we present a novel computational workflow for the mechanistic segregation of geomorphic transport processes in geospatial datasets. We apply the workflow to seven sites along the Colorado River in the Grand Canyon, where geomorphic transport is driven by a diverse suite of mechanisms. The workflow performs well when compared to field observations, with an overall predictive accuracy of 84% across 113 validation points. The approach most accurately predicts changes due to fluvial processes (100% accuracy) and aeolian processes (96%), with reduced accuracy in predictions of alluvial and colluvial processes (64% and 73%, respectively). Our workflow is designed to be applicable to a diversity of river systems and will likely provide a rapid and objective understanding of the processes driving geomorphic change at the reach and network scales. We anticipate that such an understanding will allow insight into the response of geomorphic transport processes to external forcings, such as shifts in climate, land use, or river regulation, with implications for process-based river management and restoration.
Improving the forecast for biodiversity under climate change.
Urban, M C; Bocedi, G; Hendry, A P; Mihoub, J-B; Pe'er, G; Singer, A; Bridle, J R; Crozier, L G; De Meester, L; Godsoe, W; Gonzalez, A; Hellmann, J J; Holt, R D; Huth, A; Johst, K; Krug, C B; Leadley, P W; Palmer, S C F; Pantel, J H; Schmitz, A; Zollner, P A; Travis, J M J
2016-09-09
New biological models are incorporating the realistic processes underlying biological responses to climate change and other human-caused disturbances. However, these more realistic models require detailed information, which is lacking for most species on Earth. Current monitoring efforts mainly document changes in biodiversity, rather than collecting the mechanistic data needed to predict future changes. We describe and prioritize the biological information needed to inform more realistic projections of species' responses to climate change. We also highlight how trait-based approaches and adaptive modeling can leverage sparse data to make broader predictions. We outline a global effort to collect the data necessary to better understand, anticipate, and reduce the damaging effects of climate change on biodiversity. Copyright © 2016, American Association for the Advancement of Science.
Ganusov, Vitaly V.; De Boer, Rob J.
2013-01-01
Bromodeoxyuridine (BrdU) is widely used in immunology to detect cell division, and several mathematical models have been proposed to estimate proliferation and death rates of lymphocytes from BrdU labelling and de-labelling curves. One problem in interpreting BrdU data is explaining the de-labelling curves. Because, shortly after label withdrawal, BrdU+ cells are expected to divide into BrdU+ daughter cells, one would expect a flat down-slope. Since for many cell types the fraction of BrdU+ cells nevertheless decreases during de-labelling, previous mathematical models had to make debatable assumptions to be able to account for the data. We develop a mechanistic model tracking the number of divisions that each cell has undergone in the presence and absence of BrdU, and allow cells to accumulate and dilute their BrdU content. From the same mechanistic model, one can naturally derive expressions for the mean BrdU content (MBC) of all cells, or the MBC of the BrdU+ subset, which is related to the mean fluorescence intensity of BrdU that can be measured in experiments. The model is extended to include subpopulations with different rates of division and death (i.e. kinetic heterogeneity). We fit the extended model to previously published BrdU data from memory T lymphocytes in simian immunodeficiency virus-infected and uninfected macaques, and find that the model describes the data with at least the same quality as previous models. Because the same model predicts a modest decline in the MBC of BrdU+ cells, which is consistent with experimental observations, BrdU dilution seems a natural explanation for the observed down-slopes in self-renewing populations. PMID:23034350
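A toy version of the division-tracking idea can be written in a few lines: cells are binned by the number of divisions since label withdrawal, each division halves the BrdU content of the daughters, and a cell is scored BrdU+ while its content stays above a detection threshold. This is only a sketch of the mechanism, not the published model with kinetic heterogeneity; the rates and threshold are assumed.

```python
# Toy division-tracking sketch for BrdU dilution during de-labelling.
import numpy as np
from scipy.integrate import solve_ivp

p, d = 0.1, 0.05                         # per-day division and death rates (assumed)
n_max = 12                               # divisions tracked since label withdrawal
content = 0.5 ** np.arange(n_max + 1)    # relative BrdU content after n divisions
threshold = 0.05                         # detection threshold (assumed)

def rhs(t, y):
    dy = np.empty_like(y)
    # each dividing cell leaves compartment n and produces two daughters in n+1
    dy[0] = -(p + d) * y[0]
    dy[1:] = 2 * p * y[:-1] - (p + d) * y[1:]
    return dy

y0 = np.zeros(n_max + 1)
y0[0] = 1.0                              # all cells start fully labelled
sol = solve_ivp(rhs, (0, 30), y0, t_eval=np.linspace(0, 30, 7))

for t, y in zip(sol.t, sol.y.T):
    frac_pos = y[content > threshold].sum() / y.sum()
    mbc = (y * content).sum() / y.sum()
    print(f"day {t:4.1f}: fraction BrdU+ {frac_pos:.2f}, mean BrdU content {mbc:.2f}")
```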
A Systems Model of Parkinson's Disease Using Biochemical Systems Theory.
Sasidharakurup, Hemalatha; Melethadathil, Nidheesh; Nair, Bipin; Diwakar, Shyam
2017-08-01
Parkinson's disease (PD), a neurodegenerative disorder, affects millions of people and has gained attention because of its clinical roles affecting behaviors related to motor and nonmotor symptoms. Although studies on PD from various aspects are becoming popular, few rely on predictive systems modeling approaches. Using Biochemical Systems Theory (BST), this article attempts to model and characterize dopaminergic cell death and understand the pathophysiology of PD progression. PD pathways were modeled using stochastic differential equations incorporating the law of mass action, and initial concentrations for the modeled proteins were obtained from the literature. Simulations suggest that dopamine levels are reduced significantly during PD progression, owing to an increase in dopaminergic quinones and 3,4-dihydroxyphenylacetaldehyde (DOPAL) relative to control. Consistent with clinically observed PD-related cell death, simulations show abnormal parkin and reactive oxygen species levels, with an increase in neurofibrillary tangles. By relating molecular mechanistic roles, the BST modeling helps predict dopaminergic cell-death processes involved in the progression of PD and provides a predictive understanding of neuronal dysfunction for translational neuroscience.
Modeling approaches in avian conservation and the role of field biologists
Beissinger, Steven R.; Walters, J.R.; Catanzaro, D.G.; Smith, Kimberly G.; Dunning, J.B.; Haig, Susan M.; Noon, Barry; Stith, Bradley M.
2006-01-01
This review grew out of our realization that models play an increasingly important role in conservation but are rarely used in the research of most avian biologists. Modelers are creating models that are more complex and mechanistic and that can incorporate more of the knowledge acquired by field biologists. Such models require field biologists to provide more specific information, larger sample sizes, and sometimes new kinds of data, such as habitat-specific demography and dispersal information. Field biologists need to support model development by testing key model assumptions and validating models. The best conservation decisions will occur where cooperative interaction enables field biologists, modelers, statisticians, and managers to contribute effectively. We begin by discussing the general form of ecological models—heuristic or mechanistic, "scientific" or statistical—and then highlight the structure, strengths, weaknesses, and applications of six types of models commonly used in avian conservation: (1) deterministic single-population matrix models, (2) stochastic population viability analysis (PVA) models for single populations, (3) metapopulation models, (4) spatially explicit models, (5) genetic models, and (6) species distribution models. We end by considering their unique attributes, determining whether the assumptions that underlie the structure are valid, and testing the ability of the model to predict the future correctly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szymańska, Paulina; Martin, Katie R.; MacKeigan, Jeffrey P.
We constructed a mechanistic, computational model for regulation of (macro)autophagy and protein synthesis (at the level of translation). The model was formulated to study the system-level consequences of interactions among the following proteins: two key components of MTOR complex 1 (MTORC1), namely the protein kinase MTOR (mechanistic target of rapamycin) and the scaffold protein RPTOR; the autophagy-initiating protein kinase ULK1; and the multimeric energy-sensing AMP-activated protein kinase (AMPK). Inputs of the model include intrinsic AMPK kinase activity, which is taken as an adjustable surrogate parameter for cellular energy level or AMP:ATP ratio, and rapamycin dose, which controls MTORC1 activity. Outputs of the model include the phosphorylation level of the translational repressor EIF4EBP1, a substrate of MTORC1, and the phosphorylation level of AMBRA1 (activating molecule in BECN1-regulated autophagy), a substrate of ULK1 critical for autophagosome formation. The model incorporates reciprocal regulation of MTORC1 and ULK1 by AMPK, mutual inhibition of MTORC1 and ULK1, and ULK1-mediated negative feedback regulation of AMPK. Through analysis of the model, we find that these processes may be responsible, depending on conditions, for graded responses to stress inputs, for bistable switching between autophagy and protein synthesis, or for relaxation oscillations comprising alternating periods of autophagy and protein synthesis. A sensitivity analysis indicates that the prediction of oscillatory behavior is robust to changes of the parameter values of the model. The model provides testable predictions about the behavior of the AMPK-MTORC1-ULK1 network, which plays a central role in maintaining cellular energy and nutrient homeostasis.
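The bistable switching attributed above to mutual inhibition of MTORC1 and ULK1 can be illustrated with a deliberately minimal two-variable caricature. This is not the published reaction network; the Hill coefficients, thresholds, and the lumped "stress" input standing in for AMPK activation are assumptions.

```python
# Toy mutual-inhibition model: depending on the stress input, the system is either
# monostable (MTORC1-high) or bistable (outcome depends on the starting state).
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, stress):
    m, u = y                                            # normalised MTORC1 and ULK1 activities
    dm = 1.0 / (1.0 + (u / 0.3) ** 4) - m               # ULK1 represses MTORC1
    du = stress / (1.0 + (m / 0.3) ** 4) - u            # MTORC1 represses ULK1; stress drives ULK1
    return [dm, du]

for stress in (0.4, 1.2):
    for y0 in ([1.0, 0.0], [0.0, 1.0]):                 # start from "protein synthesis" or "autophagy"
        yf = solve_ivp(rhs, (0, 200), y0, args=(stress,)).y[:, -1]
        print(f"stress={stress}, start={y0} -> MTORC1={yf[0]:.2f}, ULK1={yf[1]:.2f}")
```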
Multi-scale predictions of coniferous forest mortality in the northern hemisphere
NASA Astrophysics Data System (ADS)
McDowell, N. G.
2015-12-01
Global temperature rise and extremes accompanying drought threaten forests and their associated climatic feedbacks. Our incomplete understanding of the fundamental physiological thresholds of vegetation mortality during drought limits our ability to accurately simulate future vegetation distributions and associated climate feedbacks. Here we integrate experimental evidence with models to show potential widespread loss of needleleaf evergreen trees (NET; ~ conifers) within the Southwest USA by 2100, with rising temperature being the primary cause of mortality. Experimentally, dominant Southwest USA NET species died when they fell below predawn water potential (Ψpd) thresholds (April-August mean) beyond which photosynthesis, stomatal and hydraulic conductance, and carbohydrate availability approached zero. Empirical and mechanistic models accurately predicted NET Ψpd, and 91% of predictions (10/11) exceeded mortality thresholds within the 21st century due to temperature rise. Completely independent global models predicted >50% loss of northern hemisphere NET by 2100, consistent with the findings for the Southwest USA. The global models disagreed with the ecosystem process models regarding future mortality in the Southwest USA, however, highlighting the potential underestimates of future NET mortality as simulated by the global models and signifying the importance of improving regional predictions. Taken together, these results from the validated regional predictions and the global simulations predict global-scale conifer loss in coming decades under projected global warming.
NASA Astrophysics Data System (ADS)
Fennel, Katja; Hu, Jiatang; Laurent, Arnaud; Marta-Almeida, Martinho; Hetland, Robert
2014-05-01
Interannual variations of the hypoxic area that develops every summer over the Texas-Louisiana Shelf are large. The 2008 Action Plan put forth by an alliance of multiple state and federal agencies and tribes calls for a decrease of the hypoxic area through nutrient management in the watershed. Realistic models help build mechanistic understanding of the processes underlying hypoxia formation and are thus indispensable for devising efficient nutrient reduction strategies. Here we present such a model, evaluate its hypoxia predictions against monitoring observations and assess the sensitivity of hypoxia predictions to model resolution, variations in sediment oxygen consumption and choice of physical horizontal boundary conditions. We find that hypoxia predictions on the shelf are very sensitive to the parameterization of sediment oxygen consumption, a result of the fact that hypoxic conditions are restricted to a relatively thin layer above the bottom over most of the shelf. We also show that the strength of vertical stratification is an important predictor of oxygen concentration in bottom waters and that modification of physical horizontal boundary conditions can have a large effect on hypoxia predictions.
Computing organic stereoselectivity - from concepts to quantitative calculations and predictions.
Peng, Qian; Duarte, Fernanda; Paton, Robert S
2016-11-07
Advances in theory and processing power have established computation as a valuable interpretative and predictive tool in the discovery of new asymmetric catalysts. This tutorial review outlines the theory and practice of modeling stereoselective reactions. Recent examples illustrate how an understanding of the fundamental principles and the application of state-of-the-art computational methods may be used to gain mechanistic insight into organic and organometallic reactions. We highlight the emerging potential of this computational tool-box in providing meaningful predictions for the rational design of asymmetric catalysts. We present an accessible account of the field to encourage future synergy between computation and experiment.
Cao, Pengxing
2017-01-01
Models of within-host influenza viral dynamics have contributed to an improved understanding of viral dynamics and antiviral effects over the past decade. Existing models can be classified into two broad types based on the mechanism of viral control: models utilising target cell depletion to limit the progress of infection and models which rely on timely activation of innate and adaptive immune responses to control the infection. In this paper, we compare how two exemplar models based on these different mechanisms behave and investigate how the mechanistic difference affects the assessment and prediction of antiviral treatment. We find that the assumed mechanism for viral control strongly influences the predicted outcomes of treatment. Furthermore, we observe that for the target cell-limited model the assumed drug efficacy strongly influences the predicted treatment outcomes. The area under the viral load curve is identified as the most reliable predictor of drug efficacy, and is robust to model selection. Moreover, with support from previous clinical studies, we suggest that the target cell-limited model is more suitable for modelling in vitro assays or infection in some immunocompromised/immunosuppressed patients while the immune response model is preferred for predicting the infection/antiviral effect in immunocompetent animals/patients. PMID:28933757
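The target cell-limited class of model referred to above is typically written as a small ODE system. The sketch below shows one standard form, with an antiviral of efficacy eps scaling virion production and the area under the log viral load curve computed as the treatment-outcome summary discussed in the abstract. Parameter values are illustrative, not the fitted values from the paper.

```python
# Standard target cell-limited (T, I, V) model with an antiviral reducing virion
# production; all parameter values are assumed for illustration.
import numpy as np
from scipy.integrate import solve_ivp, trapezoid

beta, delta, p, c = 3e-5, 4.0, 1e-2, 3.0      # infection, death, production, clearance rates (assumed)

def tiv(t, y, eps):
    T, I, V = y
    dT = -beta * T * V
    dI = beta * T * V - delta * I
    dV = (1 - eps) * p * I - c * V            # drug of efficacy eps reduces virion production
    return [dT, dI, dV]

t = np.linspace(0, 10, 1001)                  # days
for eps in (0.0, 0.9):
    sol = solve_ivp(tiv, (0, 10), [4e8, 0.0, 10.0], args=(eps,), t_eval=t)
    log_v = np.log10(np.clip(sol.y[2], 1e-3, None))
    print(f"efficacy {eps}: AUC of log10 viral load = {trapezoid(log_v, t):.1f}")
```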
Research frontiers for improving our understanding of drought‐induced tree and forest mortality
Hartmann, Henrik; Moura, Catarina; Anderegg, William R. L.; Ruehr, Nadine; Salmon, Yann; Allen, Craig D.; Arndt, Stefan K.; Breshears, David D.; Davi, Hendrik; Galbraith, David; Ruthrof, Katinka X.; Wunder, Jan; Adams, Henry D.; Bloemen, Jasper; Cailleret, Maxime; Cobb, Richard; Gessler, Arthur; Grams, Thorsten E. E.; Jansen, Steven; Kautz, Markus; Lloret, Francisco; O’Brien, Michael
2018-01-01
Accumulating evidence highlights increased mortality risks for trees during severe drought, particularly under warmer temperatures and increasing vapour pressure deficit (VPD). Resulting forest die‐off events have severe consequences for ecosystem services, biophysical and biogeochemical land–atmosphere processes. Despite advances in monitoring, modelling and experimental studies of the causes and consequences of tree death from individual tree to ecosystem and global scale, a general mechanistic understanding and realistic predictions of drought mortality under future climate conditions are still lacking. We update a global tree mortality map and present a roadmap to a more holistic understanding of forest mortality across scales. We highlight priority research frontiers that promote: (1) new avenues for research on key tree ecophysiological responses to drought; (2) scaling from the tree/plot level to the ecosystem and region; (3) improvements of mortality risk predictions based on both empirical and mechanistic insights; and (4) a global monitoring network of forest mortality. In light of recent and anticipated large forest die‐off events such a research agenda is timely and needed to achieve scientific understanding for realistic predictions of drought‐induced tree mortality. The implementation of a sustainable network will require support by stakeholders and political authorities at the international level.
2016-01-01
Modeling and prediction of polar organic chemical integrative sampler (POCIS) sampling rates (Rs) for 73 compounds using artificial neural networks (ANNs) is presented for the first time. Two models were constructed: the first was developed ab initio using a genetic algorithm (GSD-model) to shortlist 24 descriptors covering constitutional, topological, geometrical and physicochemical properties and the second model was adapted for Rs prediction from a previous chromatographic retention model (RTD-model). Mechanistic evaluation of descriptors showed that models did not require comprehensive a priori information to predict Rs. Average predicted errors for the verification and blind test sets were 0.03 ± 0.02 L d⁻¹ (RTD-model) and 0.03 ± 0.03 L d⁻¹ (GSD-model) relative to experimentally determined Rs. Prediction variability in replicated models was the same or less than for measured Rs. Networks were externally validated using a measured Rs data set of six benzodiazepines. The RTD-model performed best in comparison to the GSD-model for these compounds (average absolute errors of 0.0145 ± 0.008 L d⁻¹ and 0.0437 ± 0.02 L d⁻¹, respectively). Improvements to generalizability of modeling approaches will be reliant on the need for standardized guidelines for Rs measurement. The use of in silico tools for Rs determination represents a more economical approach than laboratory calibrations. PMID:27363449
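For readers unfamiliar with the workflow, the general shape of a descriptor-to-Rs ANN regression is sketched below. The descriptor matrix and sampling rates are random placeholders and the network size is an assumption; the published models used the 24 genetic-algorithm-selected or retention-model descriptors described above.

```python
# Illustrative descriptor-to-sampling-rate ANN regression (placeholder data only).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(73, 24))                  # 73 compounds x 24 descriptors (synthetic stand-in)
rs = 0.2 + 0.05 * X[:, 0] - 0.03 * X[:, 1] + rng.normal(scale=0.01, size=73)  # L/d, synthetic

X_train, X_test, y_train, y_test = train_test_split(X, rs, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_train), y_train)
err = np.abs(model.predict(scaler.transform(X_test)) - y_test)
print(f"mean absolute error on held-out compounds: {err.mean():.3f} L/d")
```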
Toft, Nils; Boklund, Anette; Espinosa-Gongora, Carmen; Græsbøll, Kaare; Larsen, Jesper; Halasa, Tariq
2017-01-01
Before an efficient control strategy for livestock-associated methicillin resistant Staphylococcus aureus (LA-MRSA) in pigs can be decided upon, it is necessary to obtain a better understanding of how LA-MRSA spreads and persists within a pig herd, once it is introduced. We here present a mechanistic stochastic discrete-event simulation model for spread of LA-MRSA within a farrow-to-finish sow herd to aid in this. The model was individual-based and included three different disease compartments: susceptible, intermittent or persistent shedder of MRSA. The model was used for studying transmission dynamics and within-farm prevalence after different introductions of LA-MRSA into a farm. The spread of LA-MRSA throughout the farm mainly followed the movement of pigs. After spread of LA-MRSA had reached equilibrium, the prevalence of LA-MRSA shedders was predicted to be highest in the farrowing unit, independent of how LA-MRSA was introduced. LA-MRSA took longer to spread to the whole herd if introduced in the finisher stable, rather than by gilts in the mating stable. The more LA-MRSA-positive animals introduced, the shorter the time before the prevalence in the herd stabilised. Introduction of a low number of intermittently shedding pigs was predicted to frequently result in LA-MRSA fading out. The model is a potential decision support tool for assessments of short and long term consequences of proposed intervention strategies or surveillance options for LA-MRSA within pig herds. PMID:29182655
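A drastically simplified sketch of the simulation logic is given below to make the susceptible / intermittent-shedder / persistent-shedder bookkeeping concrete: a single pen, daily time steps, and assumed transition probabilities rather than the full individual-based farrow-to-finish herd structure of the published model.

```python
# Highly simplified stochastic shedding-state model for one pen (assumed rates).
import numpy as np

rng = np.random.default_rng(2)
n_pigs, n_days = 30, 200
state = np.zeros(n_pigs, dtype=int)           # 0 susceptible, 1 intermittent, 2 persistent shedder
state[:2] = 1                                 # introduce two intermittently shedding pigs

beta, p_persist, p_clear = 0.02, 0.01, 0.05   # per-day probabilities (assumed)
prevalence = []
for day in range(n_days):
    shedders = np.count_nonzero(state > 0)
    p_infect = 1 - (1 - beta) ** shedders     # infection pressure from current shedders
    for i in range(n_pigs):
        u = rng.random()
        if state[i] == 0 and u < p_infect:
            state[i] = 1                      # susceptible becomes intermittent shedder
        elif state[i] == 1:
            if u < p_persist:
                state[i] = 2                  # intermittent becomes persistent
            elif u < p_persist + p_clear:
                state[i] = 0                  # intermittent clears carriage
    prevalence.append(shedders / n_pigs)

print(f"prevalence of shedders at day {n_days}: {prevalence[-1]:.2f}")
```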
DOUBLE SHELL TANK (DST) HYDROXIDE DEPLETION MODEL FOR CARBON DIOXIDE ABSORPTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
OGDEN DM; KIRCH NW
2007-10-31
This report develops a supernatant hydroxide ion depletion model based on mechanistic principles, in which depletion is driven by carbon dioxide absorption from the tank vapor space. The report also benchmarks the model against historical tank supernatant hydroxide data and vapor space carbon dioxide data. A comparison of the newly generated mechanistic model with previously applied empirical hydroxide depletion equations is also performed.
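The underlying mass balance is simple: each mole of CO2 absorbed across the liquid surface consumes two moles of hydroxide (CO2 + 2OH- -> CO3^2- + H2O). The sketch below is only a back-of-the-envelope illustration of that bookkeeping, not the report's model; the mass-transfer coefficient, headspace CO2 level, surface area, and volume are all assumed values.

```python
# Back-of-the-envelope hydroxide depletion from headspace CO2 absorption.
kla = 1e-4            # effective gas-liquid transfer coefficient, mol CO2 / (m^2 s atm) (assumed)
area = 400.0          # supernatant surface area, m^2 (assumed)
p_co2 = 4e-4          # headspace CO2 partial pressure, atm (assumed, near ambient)
volume_l = 4.0e6      # supernatant volume, litres (assumed)
oh_initial = 0.5      # initial hydroxide concentration, mol/L (assumed)

years = 10
seconds = years * 365 * 24 * 3600
co2_absorbed = kla * area * p_co2 * seconds             # mol CO2 absorbed over the period
oh_final = oh_initial - 2.0 * co2_absorbed / volume_l   # 2 mol OH- consumed per mol CO2
print(f"hydroxide after {years} years: {oh_final:.4f} mol/L "
      f"(depletion {oh_initial - oh_final:.4f} mol/L)")
```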
Kostal, Jakub; Voutchkova-Kostal, Adelina
2016-01-19
Using computer models to accurately predict toxicity outcomes is considered to be a major challenge. However, state-of-the-art computational chemistry techniques can now be incorporated in predictive models, supported by advances in mechanistic toxicology and the exponential growth of computing resources witnessed over the past decade. The CADRE (Computer-Aided Discovery and REdesign) platform relies on quantum-mechanical modeling of molecular interactions that represent key biochemical triggers in toxicity pathways. Here, we present an external validation exercise for CADRE-SS, a variant developed to predict the skin sensitization potential of commercial chemicals. CADRE-SS is a hybrid model that evaluates skin permeability using Monte Carlo simulations, assigns reactive centers in a molecule and possible biotransformations via expert rules, and determines reactivity with skin proteins via quantum-mechanical modeling. The results were promising with an overall very good concordance of 93% between experimental and predicted values. Comparison to performance metrics yielded by other tools available for this endpoint suggests that CADRE-SS offers distinct advantages for first-round screenings of chemicals and could be used as an in silico alternative to animal tests where permissible by legislative programs.
Toxicokinetic and Dosimetry Modeling Tools for Exposure ...
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster and can provide mechanistic insights on chemical action. However, extrapolating from in vitro chemical concentrations to target tissue or blood concentrations in vivo is fraught with uncertainties, and modeling is dependent upon pharmacokinetic variables not measured in in vitro assays. To address this need, new tools have been created for characterizing, simulating, and evaluating chemical toxicokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissue microdosimetry PK models relate whole-body chemical exposures to cell-scale concentrations. These tools rely on high-throughput in vitro measurements, and successful methods exist for pharmaceutical compounds that determine PK from limited in vitro measurements and chemical structure-derived property predictions. These high throughput (HT) methods provide a more rapid and less resource-intensive alternative to traditional PK model development. We have augmented these in vitro data with chemical structure-based descriptors and mechanistic tissue partitioning models to construct HTPBPK models for over three hundred environmental and pharmace
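One common high-throughput in vitro-to-in vivo extrapolation step, estimating a steady-state plasma concentration from measured intrinsic clearance and plasma protein binding and then converting an in vitro bioactive concentration into an oral equivalent dose, is sketched below. This is a generic illustration under assumed physiological and chemical-specific values, not the specific HTPBPK models described above.

```python
# Hedged sketch of high-throughput reverse dosimetry from in vitro inputs.
q_liver = 90.0          # hepatic blood flow, L/h (assumed adult value)
gfr = 6.7               # glomerular filtration rate, L/h (assumed)
fub = 0.05              # fraction unbound in plasma (assumed chemical-specific input)
cl_int = 50.0           # scaled intrinsic hepatic clearance, L/h (from an in vitro assay, assumed)

cl_hepatic = q_liver * fub * cl_int / (q_liver + fub * cl_int)   # well-stirred liver model
cl_renal = gfr * fub                                             # passive renal filtration
cl_total = cl_hepatic + cl_renal                                 # total clearance, L/h

dose_rate = 70 * 1.0 / 24.0        # mg/h for a 1 mg/kg/day dose to a 70 kg adult
css = dose_rate / cl_total         # steady-state plasma concentration, mg/L

ac50 = 2.0                         # in vitro bioactive concentration, mg/L (assumed)
oral_equivalent = ac50 / css * 1.0
print(f"Css = {css:.3f} mg/L; oral equivalent dose = {oral_equivalent:.1f} mg/kg/day")
```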
Composite Nanomechanics: A Mechanistic Properties Prediction
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Handler, Louis M.; Manderscheid, Jane M.
2007-01-01
A unique mechanistic theory is described to predict the properties of nanocomposites. The theory is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to predict 25 properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. Most of the results show smooth distributions. Results for matrix-dependent properties show bimodal through-the-thickness distribution with discontinuous changes from mode to mode.
Linking models and data on vegetation structure
NASA Astrophysics Data System (ADS)
Hurtt, G. C.; Fisk, J.; Thomas, R. Q.; Dubayah, R.; Moorcroft, P. R.; Shugart, H. H.
2010-06-01
For more than a century, scientists have recognized the importance of vegetation structure in understanding forest dynamics. Now future satellite missions such as Deformation, Ecosystem Structure, and Dynamics of Ice (DESDynI) hold the potential to provide unprecedented global data on vegetation structure needed to reduce uncertainties in terrestrial carbon dynamics. Here, we briefly review the uses of data on vegetation structure in ecosystem models, develop and analyze theoretical models to quantify model-data requirements, and describe recent progress using a mechanistic modeling approach utilizing a formal scaling method and data on vegetation structure to improve model predictions. Generally, both limited sampling and coarse resolution averaging lead to model initialization error, which in turn is propagated in subsequent model prediction uncertainty and error. In cases with representative sampling, sufficient resolution, and linear dynamics, errors in initialization tend to compensate at larger spatial scales. However, with inadequate sampling, overly coarse resolution data or models, and nonlinear dynamics, errors in initialization lead to prediction error. A robust model-data framework will require both models and data on vegetation structure sufficient to resolve important environmental gradients and tree-level heterogeneity in forest structure globally.
2015-01-01
We developed a three-dimensional fibroblastic nodule model for fibrogenicity testing of nanomaterials and investigated the role of fibroblast stemlike cells (FSCs) in the fibrogenic process. We showed that carbon nanotubes (CNTs) induced fibroblastic nodule formation in primary human lung fibroblast cultures resembling the fibroblastic foci in clinical fibrosis and promoted FSCs that are highly fibrogenic and a potential driving force of fibrogenesis. This study provides a predictive 3D model and mechanistic insight on CNT fibrogenesis. PMID:24873662
A Mechanistic-Based Healing Model for Self-Healing Glass Seals Used in Solid Oxide Fuel Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Wei; Sun, Xin; Stephens, Elizabeth V.
The usage of self-healing glass as hermetic seals is a recent advancement in sealing technology development for the planar solid oxide fuel cells (SOFCs). Because of its capability of restoring the mechanical properties at elevated temperatures, the self-healing glass seal is expected to provide high reliability in maintaining the long-term structural integrity and functionality of SOFCs. In order to accommodate the design and to evaluate the effectiveness of such engineering seals under various thermo-mechanical operating conditions, a computational modeling framework needs to be developed to accurately capture and predict the healing behavior of the glass material. In the present work, a mechanistic-based two-stage model was developed to study the stress- and temperature-dependent crack healing of the self-healing glass materials. The model was first calibrated by experimental measurements combined with the kinetic Monte Carlo (kMC) simulation results and then implemented into the finite element analysis (FEA). The effects of various factors, i.e. stress, temperature, and crack morphology, on the healing behavior of the glass were investigated and discussed.
Bird Migration Under Climate Change - A Mechanistic Approach Using Remote Sensing
NASA Technical Reports Server (NTRS)
Smith, James A.; Blattner, Tim; Messmer, Peter
2010-01-01
The broad-scale reductions and shifts that may be expected under climate change in the availability and quality of stopover habitat for long-distance migrants are an area of increasing concern for conservation biologists. Researchers generally have taken two broad approaches to the modeling of migration behaviour to understand the impact of these changes on migratory bird populations: models based on causal processes and their response to environmental stimulation, i.e. "mechanistic models", or models that are based primarily on observed animal distribution patterns and the correlation of these patterns with environmental variables, i.e. "data driven" models. Investigators have applied the latter technique to forecast changes in migration patterns with changes in the environment, for example as might be expected under climate change, by forecasting how the underlying environmental data layers upon which the relationships are built will change over time. The learned geostatistical correlations are then applied to the modified data layers. However, this is problematic. Even if the projections of how the underlying data layers will change are correct, it is not evident that the statistical relationships will remain the same, i.e. the organism may adapt its behaviour to the changing conditions. Mechanistic models that explicitly take into account the physical, biological, and behavioural responses of an organism, as well as the underlying changes in the landscape, offer an alternative that addresses these shortcomings. The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, enables the application of mechanistic models to predict how continental bird migration patterns may change in response to environmental change. In earlier work, we simulated the effects of wetland loss and inter-annual variability on the fitness of migratory shorebirds in the central flyways of North America. We demonstrated the phenotypic plasticity of a migratory population of Pectoral sandpipers, consisting of an ensemble of 10,000 individual birds, in response to changes in stopover locations using an individual-based migration model driven by remotely sensed land surface data, climate data and biological field data. With the advent of new computing capabilities enabled by recent general-purpose GPU computing paradigms and commodity hardware, it is now possible both to simulate larger ensemble populations and to incorporate more realistic mechanistic factors into migration models. Here, we take our first steps using these tools to study the impact of long-term drought variability on shorebird survival.
Synergies Between Quantum Mechanics and Machine Learning in Reaction Prediction.
Sadowski, Peter; Fooshee, David; Subrahmanya, Niranjan; Baldi, Pierre
2016-11-28
Machine learning (ML) and quantum mechanical (QM) methods can be used in two-way synergy to build chemical reaction expert systems. The proposed ML approach identifies electron sources and sinks among reactants and then ranks all source-sink pairs. This addresses a bottleneck of QM calculations by providing a prioritized list of mechanistic reaction steps. QM modeling can then be used to compute the transition states and activation energies of the top-ranked reactions, providing additional or improved examples of ranked source-sink pairs. Retraining the ML model closes the loop, producing more accurate predictions from a larger training set. The approach is demonstrated in detail using a small set of organic radical reactions.
Mechanistic modelling of drug release from a polymer matrix using magnetic resonance microimaging.
Kaunisto, Erik; Tajarobi, Farhad; Abrahmsen-Alami, Susanna; Larsson, Anette; Nilsson, Bernt; Axelsson, Anders
2013-03-12
In this paper a new model describing drug release from a polymer matrix tablet is presented. The utilization of the model is described as a two step process where, initially, polymer parameters are obtained from a previously published pure polymer dissolution model. The results are then combined with drug parameters obtained from literature data in the new model to predict solvent and drug concentration profiles and polymer and drug release profiles. The modelling approach was applied to the case of a HPMC matrix highly loaded with mannitol (model drug). The results showed that the drug release rate can be successfully predicted, using the suggested modelling approach. However, the model was not able to accurately predict the polymer release profile, possibly due to the sparse amount of usable pure polymer dissolution data. In addition to the case study, a sensitivity analysis of model parameters relevant to drug release was performed. The analysis revealed important information that can be useful in the drug formulation process. Copyright © 2013 Elsevier B.V. All rights reserved.
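As a point of reference for the release side of such models (not the coupled solvent/polymer/drug transport model described above), the classical Fickian series solution for the fraction of drug released from a planar matrix slab is easy to evaluate; the diffusivity and thickness below are assumed values.

```python
# Fraction of drug released from a planar slab by Fickian diffusion
# (standard series solution; D and L are illustrative assumptions).
import numpy as np

D = 1e-10          # effective drug diffusivity in the swollen matrix, m^2/s (assumed)
L = 2e-3           # slab thickness, release from both faces, m (assumed)

def fraction_released(t, n_terms=50):
    n = np.arange(n_terms)
    coef = 8.0 / ((2 * n + 1) ** 2 * np.pi ** 2)
    expo = np.exp(-((2 * n + 1) ** 2) * np.pi ** 2 * D * t / L ** 2)
    return 1.0 - float(np.sum(coef * expo))

for hours in (1, 4, 12, 24):
    print(f"{hours:3d} h: fraction released = {fraction_released(hours * 3600):.2f}")
```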
DOT National Transportation Integrated Search
2010-02-01
This study developed traffic inputs for use with the Guide for the Mechanistic-Empirical Design of New & Rehabilitated Pavement Structures (MEPDG) in Virginia and sought to determine if the predicted distresses showed differences between site-specifi...
NASA Astrophysics Data System (ADS)
Siegwolf, R. T. W.; Buchmann, N.; Frank, D.; Joos, F.; Kahmen, A.; Treydte, K.; Leuenberger, M.; Saurer, M.
2012-04-01
Trees play a critical role in the carbon cycle - their photosynthetic assimilation is one of the largest terrestrial carbon fluxes and their standing biomass represents the largest carbon pool of the terrestrial biosphere. Understanding how tree physiology and growth respond to long-term environmental change is pivotal to predicting the magnitude and direction of the terrestrial carbon sink. iTREE is an interdisciplinary research framework to capitalize on synergies among leading dendroclimatologists, plant physiologists, isotope specialists, and global carbon cycle modelers with the objectives of reducing uncertainties related to tree/forest growth in the context of changing natural environments. Cross-cutting themes in our project are tree rings, stable isotopes, and mechanistic modelling. We will (i) establish a European network of tree-ring based isotope time-series to retrodict interannual to long-term tree physiological changes, (ii) conduct laboratory and field experiments to adapt a mechanistic isotope model to derive plant physiological variables from tree-ring isotopes, (iii) implement this model into a dynamic global vegetation model and perform subsequent model-data validation exercises to refine model representation of plant physiological processes, and (iv) attribute long-term variation in tree growth to plant physiological and environmental drivers, and identify how our refined knowledge revises predictions of the coupled carbon-cycle climate system. We will contribute to (i) advanced quantifications of long-term variation in tree growth across Central Europe, (ii) novel long-term information on key physiological processes that underlie variations in tree growth, and (iii) improved carbon cycle models that can be employed to revise predictions of the coupled carbon-cycle climate system. Hence iTREE will significantly contribute towards a seamless understanding of the responses of terrestrial ecosystems to long-term environmental change, and ultimately help reduce uncertainties of the magnitude and direction of the past and future terrestrial carbon sink.
Macklin, Paul; Cristini, Vittorio
2013-01-01
Simulating cancer behavior across multiple biological scales in space and time, i.e., multiscale cancer modeling, is increasingly being recognized as a powerful tool to refine hypotheses, focus experiments, and enable more accurate predictions. A growing number of examples illustrate the value of this approach in providing quantitative insight on the initiation, progression, and treatment of cancer. In this review, we introduce the most recent and important multiscale cancer modeling works that have successfully established a mechanistic link between different biological scales. Biophysical, biochemical, and biomechanical factors are considered in these models. We also discuss innovative, cutting-edge modeling methods that are moving predictive multiscale cancer modeling toward clinical application. Furthermore, because the development of multiscale cancer models requires a new level of collaboration among scientists from a variety of fields such as biology, medicine, physics, mathematics, engineering, and computer science, an innovative Web-based infrastructure is needed to support this growing community. PMID:21529163
High-Temperature Cast Aluminum for Efficient Engines
NASA Astrophysics Data System (ADS)
Bobel, Andrew C.
Accurate thermodynamic databases are the foundation of predictive microstructure and property models. An initial assessment of the commercially available Thermo-Calc TCAL2 database and the proprietary aluminum database of QuesTek demonstrated a large degree of deviation with respect to equilibrium precipitate phase prediction in the compositional region of interest when compared to 3-D atom probe tomography (3DAPT) and transmission electron microscopy (TEM) experimental results. New compositional measurements of the Q-phase (Al-Cu-Mg-Si phase) led to a remodeling of the Q-phase thermodynamic description in the CALPHAD databases which has produced significant improvements in the phase prediction capabilities of the thermodynamic model. Due to the unique morphologies of strengthening precipitate phases commonly utilized in high-strength cast aluminum alloys, the development of new microstructural evolution models to describe both rod and plate particle growth was critical for accurate mechanistic strength models which rely heavily on precipitate size and shape. Particle size measurements through both 3DAPT and TEM experiments were used in conjunction with literature results of many alloy compositions to develop a physical growth model for the independent prediction of rod radii and rod length evolution. In addition a machine learning (ML) model was developed for the independent prediction of plate thickness and plate diameter evolution as a function of alloy composition, aging temperature, and aging time. The developed models are then compared with physical growth laws developed for spheres and modified for ellipsoidal morphology effects. Analysis of the effect of particle morphology on strength enhancement has been undertaken by modification of the Orowan-Ashby equation for 〈110〉 alpha-Al oriented finite rods in addition to an appropriate version for similarly oriented plates. A mechanistic strengthening model was developed for cast aluminum alloys containing both rod and plate-like precipitates. The model accurately accounts for the temperature dependence of particle nucleation and growth, solid solution strengthening, Si eutectic strength, and base aluminum yield strength. Strengthening model predictions of tensile yield strength are in excellent agreement with experimental observations over a wide range of aluminum alloy systems, aging temperatures, and test conditions. The developed models enable the prediction of the required particle morphology and volume fraction necessary to achieve target property goals in the design of future aluminum alloys. The effect of partitioning elements to the Q-phase was also considered for the potential to control the nucleation rate, reduce coarsening, and control the evolution of particle morphology. Elements were selected based on density functional theory (DFT) calculations showing the prevalence of certain elements to partition to the Q-phase. 3DAPT experiments were performed on Q-phase containing wrought alloys with these additions and show segregation of certain elements to the Q-phase with relative agreement to DFT predictions.
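To make the precipitation-strengthening contribution concrete, the sketch below evaluates a common spherical-particle form of the Orowan-Ashby equation under assumed precipitate size and volume fraction. It is not the rod- and plate-modified version developed in the work above, and the particular spacing expression and all numeric inputs are assumptions.

```python
# Illustrative Orowan-Ashby strengthening estimate for spherical precipitates in Al.
import numpy as np

G = 26.9e9        # shear modulus of Al, Pa
b = 0.286e-9      # Burgers vector, m
nu = 0.33         # Poisson ratio
M = 3.06          # Taylor factor for an FCC polycrystal

f = 0.02          # precipitate volume fraction (assumed)
r = 10e-9         # mean particle radius, m (assumed)

r_planar = np.sqrt(2.0 / 3.0) * r                            # mean radius intersected by the slip plane
lam = 2.0 * r_planar * (np.sqrt(np.pi / (4.0 * f)) - 1.0)    # edge-to-edge inter-particle spacing
d_tau = 0.4 * G * b / (np.pi * lam * np.sqrt(1 - nu)) * np.log(2.0 * r_planar / b)
print(f"Orowan increment to yield strength: {M * d_tau / 1e6:.0f} MPa")
```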
Klinke, David J; Wang, Qing
2016-01-01
A major barrier for broadening the efficacy of immunotherapies for cancer is identifying key mechanisms that limit the efficacy of tumor infiltrating lymphocytes. Yet, identifying these mechanisms using human samples and mouse models for cancer remains a challenge. While interactions between cancer and the immune system are dynamic and non-linear, identifying the relative roles that biological components play in regulating anti-tumor immunity commonly relies on human intuition alone, which can be limited by cognitive biases. To assist natural intuition, modeling and simulation play an emerging role in identifying therapeutic mechanisms. To illustrate the approach, we developed a multi-scale mechanistic model to describe the control of tumor growth by a primary response of CD8+ T cells against defined tumor antigens using the B16 C57Bl/6 mouse model for malignant melanoma. The mechanistic model was calibrated to data obtained following adenovirus-based immunization and validated to data obtained following adoptive transfer of transgenic CD8+ T cells. More importantly, we use simulation to test whether the postulated network topology, that is the modeled biological components and their associated interactions, is sufficient to capture the observed anti-tumor immune response. Given the available data, the simulation results also provided a statistical basis for quantifying the relative importance of different mechanisms that underpin CD8+ T cell control of B16F10 growth. By identifying conditions where the postulated network topology is incomplete, we illustrate how this approach can be used as part of an iterative design-build-test cycle to expand the predictive power of the model.
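A deliberately small caricature of the tumour-T cell interaction (two ODEs, mass-action killing, antigen-driven expansion) is given below to illustrate the kind of dynamics such models encode. It is not the calibrated multi-scale B16 model described above, and every rate constant is an assumption.

```python
# Toy tumour / CD8+ T cell ODE model with logistic growth, mass-action killing,
# and antigen-driven T cell expansion; all rates are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    tumour, tcell = max(y[0], 0.0), max(y[1], 0.0)     # guard against small negative overshoot
    growth = 0.3 * tumour * (1.0 - tumour / 1e9)       # logistic tumour growth (per day)
    kill = 1e-6 * tumour * tcell                       # mass-action killing by CD8+ T cells
    expand = 0.5 * tcell * tumour / (tumour + 1e6)     # antigen-driven T cell expansion
    return [growth - kill, expand - 0.1 * tcell]

sol = solve_ivp(rhs, (0, 60), [1e5, 1e4], t_eval=np.linspace(0, 60, 7))
for t, (tum, tc) in zip(sol.t, sol.y.T):
    print(f"day {t:4.0f}: tumour ~{tum:10.3g} cells, CD8+ T cells ~{tc:10.3g}")
```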
Flood forecasting using non-stationarity in a river with tidal influence - a feasibility study
NASA Astrophysics Data System (ADS)
Killick, Rebecca; Kretzschmar, Ann; Ilic, Suzi; Tych, Wlodek
2017-04-01
Flooding is the most common natural hazard causing damage, disruption and loss of life worldwide. Despite improvements in modelling and forecasting of water levels and flood inundation (Kretzschmar et al., 2014; Hoitink and Jay, 2016), there are still large discrepancies between predictions and observations, particularly during storm events when accurate predictions are most important. Many models exist for forecasting river levels (Smith et al., 2013; Leedal et al., 2013); however, they commonly assume that the errors in the data are independent, stationary and normally distributed. This is generally not the case, especially during storm events, suggesting that existing models are not describing the drivers of river level in an appropriate fashion. Further challenges exist in the lower sections of a river influenced by both river and tidal flows and their interaction, and there is scope for improvement in prediction. This paper investigates the use of a powerful statistical technique to adaptively forecast river levels by modelling the process as locally stationary. The proposed methodology takes information on both upstream and downstream river levels and incorporates meteorological information (rainfall forecasts) and tidal levels when required to forecast river levels at a specified location. Using this approach, a single model will be capable of predicting water levels in both tidal and non-tidal river reaches. In this pilot project, the methodology of Smith et al. (2013), using harmonic tidal analysis and data-based mechanistic modelling, is compared with the methodology developed by Killick et al. (2016), which utilises data-driven wavelet decomposition to account for the information contained in the upstream and downstream river data to forecast a non-stationary time series. Preliminary modelling has been carried out using the tidal stretch of the River Lune in North-west England and initial results are presented here. Future work includes expanding the methodology to forecast river levels at a network of locations simultaneously. References: Hoitink, A. J. F., & Jay, D. A. (2016). Tidal river dynamics: Implications for deltas. Rev. Geophys., 54, 240-272. Killick, R., Knight, M., Nason, G. P., & Eckley, I. A. (2016). The local partial autocorrelation function and its application to the forecasting of locally stationary time series. Submitted. Kretzschmar, A., Tych, W., & Chappell, N. A. (2014). Reversing hydrology: estimation of sub-hourly rainfall time-series from streamflow. Environ. Modell. Softw., 60, 290-301. Leedal, D., Weerts, A. H., Smith, P. J., & Beven, K. J. (2013). Application of data-based mechanistic modelling for flood forecasting at multiple locations in the Eden catchment in the National Flood Forecasting System (England and Wales). HESS, 17(1), 177-185. Smith, P., Beven, K., Horsburgh, K., Hardaker, P., & Collier, C. (2013). Data-based mechanistic modelling of tidally affected river reaches for flood warning purposes: An example on the River Dee, UK. Q. J. R. Meteorol. Soc., 139(671), 340-349.
Olafuyi, Olusola; Coleman, Michael; Badhan, Raj K S
2017-11-01
Antimalarial therapy during pregnancy poses important safety concerns due to potential teratogenicity and maternal physiological and biochemical changes during gestation. Piperaquine (PQ) has gained interest for use in pregnancy in response to increasing resistance towards sulfadoxine-pyrimethamine in sub-Saharan Africa. Coinfection with HIV is common in many developing countries; however, little is known about the impact of antiretroviral (ARV)-mediated drug-drug interactions (DDIs) on piperaquine pharmacokinetics during pregnancy. This study applied mechanistic pharmacokinetic modelling to predict piperaquine pharmacokinetics in non-pregnant and pregnant patients, and the model was validated in distinct customised population groups from Thailand, Sudan and Papua New Guinea. In each population group, no significant differences in day 7 concentrations were observed across gestational weeks (GW) 10-40, supporting the notion that piperaquine is safe throughout pregnancy with consistent pharmacokinetics, although possible teratogenicity may limit this. Antiretroviral-mediated DDIs (efavirenz and ritonavir) had moderate effects on piperaquine across gestational weeks, with predicted AUC ratios over GW 10-40 in the range 0.56-0.8 for efavirenz and 1.64-1.79 for ritonavir; a reduction in circulating human serum albumin significantly reduced the number of subjects attaining the day 7 (post-dose) therapeutic efficacy concentration under both the efavirenz and ritonavir DDIs. The present model successfully and mechanistically predicted the pharmacokinetics of piperaquine in pregnancy to be unchanged with respect to non-pregnant women, in the light of factors such as malaria/HIV co-infection. However, antiretroviral-mediated DDIs could significantly alter piperaquine pharmacokinetics. Further model refinement will include collation of relevant physiological and biochemical alterations common to HIV/malaria patients. Copyright © 2017 John Wiley & Sons, Ltd.
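For orientation, the static mechanistic DDI calculation that underlies AUC ratios of this kind can be written in a few lines (the study itself used a full PBPK model). The fraction metabolised, interacting-drug concentrations, and potencies below are assumed illustrative values, not those of piperaquine, efavirenz, or ritonavir.

```python
# Static mechanistic DDI sketch: victim-drug AUC ratio under inhibition or induction
# of the pathway responsible for fraction fm of its clearance (all inputs assumed).
def auc_ratio_inhibition(fm, i_conc, ki):
    """AUC ratio when the pathway is competitively inhibited."""
    return 1.0 / (fm / (1.0 + i_conc / ki) + (1.0 - fm))

def auc_ratio_induction(fm, i_conc, emax, ec50):
    """AUC ratio when the same pathway is induced (Emax model)."""
    return 1.0 / (fm * (1.0 + emax * i_conc / (i_conc + ec50)) + (1.0 - fm))

fm_pathway = 0.8   # fraction of victim-drug clearance via the affected pathway (assumed)
print("inhibitor-like DDI:", round(auc_ratio_inhibition(fm_pathway, 1.0, 0.1), 2))
print("inducer-like DDI:  ", round(auc_ratio_induction(fm_pathway, 1.0, 1.0, 0.5), 2))
```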
2011-01-01
Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments on the two organisms (remission validity). The relevance of this framework is then discussed regarding various animal models of depression. PMID:22738250
Using network biology to bridge pharmacokinetics and pharmacodynamics in oncology.
Kirouac, D C; Onsum, M D
2013-09-04
If mathematical modeling is to be used effectively in cancer drug development, future models must take into account both the mechanistic details of cellular signal transduction networks and the pharmacokinetics (PK) of drugs used to inhibit their oncogenic activity. In this perspective, we present an approach to building multiscale models that capture systems-level architectural features of oncogenic signaling networks, and describe how these models can be used to design combination therapies and identify predictive biomarkers in silico. CPT: Pharmacometrics & Systems Pharmacology (2013) 2, e71; doi:10.1038/psp.2013.38; published online 4 September 2013.
A Unifying Mechanistic Model of Selective Attention in Spiking Neurons
Bobier, Bruce; Stewart, Terrence C.; Eliasmith, Chris
2014-01-01
Visuospatial attention produces myriad effects on the activity and selectivity of cortical neurons. Spiking neuron models capable of reproducing a wide variety of these effects remain elusive. We present a model called the Attentional Routing Circuit (ARC) that provides a mechanistic description of selective attentional processing in cortex. The model is described mathematically and implemented at the level of individual spiking neurons, with the computations for performing selective attentional processing being mapped to specific neuron types and laminar circuitry. The model is used to simulate three studies of attention in macaque, and is shown to quantitatively match several observed forms of attentional modulation. Specifically, ARC demonstrates that with shifts of spatial attention, neurons may exhibit shifting and shrinking of receptive fields; increases in responses without changes in selectivity for non-spatial features (i.e. response gain); and that the effect on contrast-response functions is better explained as a response-gain effect than as contrast-gain. Unlike past models, ARC embodies a single mechanism that unifies the above forms of attentional modulation, is consistent with a wide array of available data, and makes several specific and quantifiable predictions. PMID:24921249
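The distinction the abstract draws between response gain and contrast gain can be illustrated with a standard Naka-Rushton contrast-response function. The sketch below is a generic illustration with arbitrary parameters, not code from ARC: response gain scales the whole curve multiplicatively, whereas contrast gain shifts its semi-saturation point.

```python
import numpy as np

def naka_rushton(c, r_max=50.0, c50=0.3, n=2.0):
    """Contrast-response function: firing rate as a function of stimulus contrast."""
    return r_max * c**n / (c**n + c50**n)

contrast = np.linspace(0.01, 1.0, 50)

baseline      = naka_rushton(contrast)
response_gain = 1.3 * naka_rushton(contrast)           # multiplicative scaling of responses
contrast_gain = naka_rushton(contrast, c50=0.3 / 1.3)  # leftward shift of the curve

for c, b, rg, cg in zip(contrast[::10], baseline[::10], response_gain[::10], contrast_gain[::10]):
    print(f"c={c:.2f}  base={b:5.1f}  response-gain={rg:5.1f}  contrast-gain={cg:5.1f}")
```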
A discrete model of Drosophila eggshell patterning reveals cell-autonomous and juxtacrine effects.
Fauré, Adrien; Vreede, Barbara M I; Sucena, Elio; Chaouiya, Claudine
2014-03-01
The Drosophila eggshell constitutes a remarkable system for the study of epithelial patterning, both experimentally and through computational modeling. Dorsal eggshell appendages arise from specific regions in the anterior follicular epithelium that covers the oocyte: two groups of cells expressing broad (roof cells) bordered by rhomboid expressing cells (floor cells). Despite the large number of genes known to participate in defining these domains and the important modeling efforts put into this developmental system, key patterning events still lack a proper mechanistic understanding and/or genetic basis, and the literature appears to conflict on some crucial points. We tackle these issues with an original, discrete framework that considers single-cell models that are integrated to construct epithelial models. We first build a phenomenological model that reproduces wild type follicular epithelial patterns, confirming EGF and BMP signaling input as sufficient to establish the major features of this patterning system within the anterior domain. Importantly, this simple model predicts an instructive juxtacrine signal linking the roof and floor domains. To explore this prediction, we define a mechanistic model that integrates the combined effects of cellular genetic networks, cell communication and network adjustment through developmental events. Moreover, we focus on the anterior competence region, and postulate that early BMP signaling participates with early EGF signaling in its specification. This model accurately simulates wild type pattern formation and is able to reproduce, with unprecedented level of precision and completeness, various published gain-of-function and loss-of-function experiments, including perturbations of the BMP pathway previously seen as conflicting results. The result is a coherent model built upon rules that may be generalized to other epithelia and developmental systems.
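A minimal sketch of the kind of discrete (Boolean) single-cell logic, coupled between neighbouring cells by a juxtacrine input, that such frameworks integrate. The update rules and component names below are invented for illustration and do not reproduce the published eggshell network; the point is only that a neighbour-presented ligand takes a few synchronous updates to propagate between the two cell types.

```python
# Toy synchronous Boolean update for two adjacent cells with a juxtacrine input.
# Rules are illustrative placeholders, not the published eggshell model.
def update_cell(state, egf, bmp, neighbour_ligand):
    new = {}
    new["broad"]    = egf and not bmp           # cell-autonomous response to the two signals
    new["rhomboid"] = egf and neighbour_ligand  # requires a ligand presented by the neighbour
    new["ligand"]   = state["broad"]            # broad-expressing cells present the ligand
    return new

roof  = {"broad": False, "rhomboid": False, "ligand": False}
floor = {"broad": False, "rhomboid": False, "ligand": False}

for step in range(4):
    roof, floor = (update_cell(roof,  egf=True, bmp=False, neighbour_ligand=floor["ligand"]),
                   update_cell(floor, egf=True, bmp=True,  neighbour_ligand=roof["ligand"]))
    print(step, "roof:", roof, "floor:", floor)
```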
McGill, Mitchell R.; Jaeschke, Hartmut
2015-01-01
Introduction: Drug hepatotoxicity is a major clinical issue. Acetaminophen (APAP) overdose is especially common. Serum biomarkers used to follow patient progress reflect either liver injury or function, but focus on biomarkers that can provide insight into the basic mechanisms of hepatotoxicity is increasing and enabling us to translate mechanisms of toxicity from animal models to humans. Areas covered: We review recent advances in mechanistic serum biomarker research in drug hepatotoxicity. Specifically, biomarkers for reactive drug intermediates, mitochondrial dysfunction, nuclear DNA damage, mode of cell death and inflammation are discussed, as well as microRNAs. Emphasis is placed on APAP-induced liver injury. Expert opinion: Several serum biomarkers of reactive drug intermediates, mitochondrial damage, nuclear DNA damage, apoptosis and necrosis, and inflammation have been described. These studies have provided evidence that mitochondrial damage is critical in APAP hepatotoxicity in humans, while apoptosis has only a minor role, and inflammation is important for recovery and regeneration after APAP overdose. Additionally, mechanistic serum biomarkers have been shown to predict outcome as well as, or better than, some clinical scores. In the future, such biomarkers will help determine the need for liver transplantation and, with improved understanding of the human pathophysiology, identify novel therapeutic targets. PMID:24836926
Marshall, David J; McQuaid, Christopher D
2011-01-22
The universal temperature-dependence model (UTD) of the metabolic theory of ecology (MTE) proposes that temperature controls mass-scaled, whole-animal resting metabolic rate according to the first principles of physics (Boltzmann kinetics). Controversy surrounds the model's implication of a mechanistic basis for metabolism that excludes the effects of adaptive regulation, and it is unclear how this would apply to organisms that live in fringe environments and typically show considerable metabolic adaptation. We explored thermal scaling of metabolism in a rocky-shore eulittoral-fringe snail (Echinolittorina malaccana) that experiences constrained energy gain and fluctuating high temperatures (between 25°C and approximately 50°C) during prolonged emersion (weeks). In contrast to the prediction of the UTD model, metabolic rate was often negatively related to temperature over a benign range (30-40°C), the relationship depending on (i) the temperature range, (ii) the degree of metabolic depression (related to the quiescent period), and (iii) whether snails were isolated within their shells. Apparent activation energies (E) varied between 0.05 and -0.43 eV, deviating excessively from the UTD's predicted range of between 0.6 and 0.7 eV. The lowering of metabolism when heated should improve energy conservation in a high-temperature environment and challenges both the theory's generality and its mechanistic basis.
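For reference, the UTD prediction takes the form B = b0 · M^(3/4) · exp(−E/kT), and an apparent activation energy can be read off as the slope of ln(rate) against −1/kT. The sketch below uses hypothetical rates that decline with warming, as reported for the snail, so the fitted E comes out negative; all numerical values are placeholders, not data from the study.

```python
import numpy as np

k_B = 8.617e-5  # Boltzmann constant, eV/K

def utd_metabolic_rate(mass_g, temp_C, b0=1.0, E=0.65):
    """UTD prediction: B = b0 * M^(3/4) * exp(-E / kT)."""
    T = temp_C + 273.15
    return b0 * mass_g**0.75 * np.exp(-E / (k_B * T))

# Apparent activation energy from an Arrhenius regression: slope of ln(rate) vs -1/kT.
temps_C = np.array([30.0, 33.0, 36.0, 40.0])
rates   = np.array([1.00, 0.95, 0.88, 0.80])   # hypothetical rates that decline with warming
x = -1.0 / (k_B * (temps_C + 273.15))
slope, intercept = np.polyfit(x, np.log(rates), 1)
print(f"Apparent activation energy: {slope:.2f} eV (UTD expects ~0.6-0.7 eV)")
```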
Higher plant modelling for life support applications: first results of a simple mechanistic model
NASA Astrophysics Data System (ADS)
Hezard, Pauline; Dussap, Claude-Gilles; Sasidharan L, Swathy
2012-07-01
In the case of closed ecological life support systems, the air and water regeneration and food production are performed using microorganisms and higher plants. Wheat, rice, soybean, lettuce, tomato or other types of edible annual plants produce fresh food while recycling CO2 into breathable oxygen. Additionally, they evaporate a large quantity of water, which can be condensed and used as potable water. This shows that recycling functions of air revitalization and food production are completely linked. Consequently, the control of a growth chamber for higher plant production has to be performed with efficient mechanistic models, in order to ensure a realistic prediction of plant behaviour, water and gas recycling under all environmental conditions. Purely mechanistic models of plant production in controlled environments are not available yet. This is the reason why new models must be developed and validated. This work concerns the design and testing of a simplified version of a mathematical model coupling plant architecture with mass balances, in order to compare its results with available data for lettuce grown in closed and controlled chambers. The carbon exchange rate, water absorption and evaporation rate, biomass fresh weight as well as leaf surface are modelled and compared with available data. The model consists of four modules. The first one evaluates plant architecture, like total leaf surface, leaf area index and stem length data. The second one calculates the rate of matter and energy exchange depending on architectural and environmental data: light absorption in the canopy, CO2 uptake or release, water uptake and evapotranspiration. The third module evaluates which of the previous rates is limiting overall biomass growth, and the last one calculates biomass growth rate depending on matter exchange rates, using a global stoichiometric equation. These rates form a set of differential equations, which are integrated over time in order to provide total biomass fresh weight over the full growth duration. The model predicts exponential growth at the beginning, which then becomes linear towards the end of growth; this follows the experimental data rather accurately. Although this model is too simple to be realistic for more complex plants in changing environments, it is the first step towards an integrated approach to plant growth that accounts for architectural and mass transfer limitations.
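A minimal sketch of the model's overall structure, assuming a single limiting-rate rule and invented coefficients: biomass growth is integrated over time as the minimum of a light/CO2-limited and a water-limited rate, which reproduces the exponential-then-linear growth curve described above. This is a structural caricature, not the published model.

```python
import numpy as np

def grow_lettuce(days=30.0, dt=0.05, w0=1.0):
    """Toy biomass integrator: growth limited by the slower of two exchange rates."""
    t, w = 0.0, w0                                     # time (d), fresh weight (g)
    while t < days:
        lai = 0.05 * w                                 # crude proxy: leaf area scales with biomass
        light_capture = 1.0 - np.exp(-0.8 * lai)       # Beer-Lambert-like canopy absorption
        co2_limited   = 2.5 * light_capture            # g/day supported by carbon uptake
        water_limited = 0.15 * w                       # g/day supported by water/nutrient supply
        growth_rate = min(co2_limited, water_limited)  # limiting-step module
        w += growth_rate * dt
        t += dt
    return w

print(f"Final fresh weight after 30 days: {grow_lettuce():.1f} g")
```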
Plant traits determine forest flammability
NASA Astrophysics Data System (ADS)
Zylstra, Philip; Bradstock, Ross
2016-04-01
Carbon and nutrient cycles in forest ecosystems are influenced by their inherent flammability - a property determined by the traits of the component plant species that form the fuel and influence the microclimate of a fire. In the absence of a model capable of explaining the complexity of such a system, however, flammability is frequently represented by simple metrics such as surface fuel load. The implications of modelling fire-flammability feedbacks using surface fuel load were examined and compared to a biophysical, mechanistic model (Forest Flammability Model) that incorporates the influence of structural plant traits (e.g. crown shape and spacing) and leaf traits (e.g. thickness, dimensions and moisture). Fuels burn with values of combustibility modelled from leaf traits, transferring convective heat along vectors defined by flame angle and with plume temperatures that decrease with distance from the flame. Flames are re-calculated in one-second time-steps, with new leaves within the plant, neighbouring plants or higher strata ignited when the modelled time to ignition is reached, and other leaves extinguishing when their modelled flame duration is exceeded. The relative influence of surface fuels, vegetation structure and plant leaf traits was examined by comparing flame heights modelled using three treatments that successively added these components within the FFM. Validation was performed across a diverse range of eucalypt forests burnt under widely varying conditions during a forest fire in the Brindabella Ranges west of Canberra (ACT) in 2003. Flame heights ranged from 10 cm to more than 20 m, with an average of 4 m. When modelled from surface fuels alone, flame heights were on average 1.5 m smaller than observed values, and were predicted within the error range 28% of the time. The addition of plant structure produced predicted flame heights that were on average 1.5 m larger than observed, but were correct 53% of the time. The over-prediction in this case was the result of a small number of large errors, where higher strata such as forest canopy were modelled to ignite but did not. The addition of leaf traits largely addressed this error, so that the mean flame height over-prediction was reduced to 0.3 m and the fully parameterised FFM gave correct predictions 62% of the time. When small (<1 m) flames were excluded, the fully parameterised model correctly predicted flame heights 12 times more often than could be predicted using surface fuels alone, and the Mean Absolute Error was 4 times smaller. The inadequate consideration of plant traits within a mechanistic framework introduces significant error to forest fire behaviour modelling. The FFM provides a solution to this, and an avenue by which plant trait information can be used to better inform Global Vegetation Models and decision-making tools used to mitigate the impacts of fire.
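A schematic of the one-second time-stepping logic described above, assuming made-up time-to-ignition and flame-duration values and a deliberately crude exposure rule; it is not the FFM itself, only an illustration of ignition and extinction bookkeeping across strata.

```python
# Schematic 1-second time-stepping of ignition/extinction across fuel strata.
# Thresholds and the exposure rule are placeholders, not FFM parameter values.
strata = [
    {"name": "surface", "time_to_ignition": 0,  "flame_duration": 30, "ignited_at": 0},
    {"name": "shrub",   "time_to_ignition": 10, "flame_duration": 20, "ignited_at": None},
    {"name": "canopy",  "time_to_ignition": 25, "flame_duration": 15, "ignited_at": None},
]

for t in range(0, 60):
    burning = [s for s in strata
               if s["ignited_at"] is not None and t < s["ignited_at"] + s["flame_duration"]]
    for s in strata:
        if s["ignited_at"] is None and burning:
            s.setdefault("exposure", 0)
            s["exposure"] += 1                      # exposure accrues only while something is flaming
            if s["exposure"] >= s["time_to_ignition"]:
                s["ignited_at"] = t                 # ignite once the time-to-ignition is reached
    if t % 15 == 0:
        print(t, "flaming:", [s["name"] for s in burning])
```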
Shorebird Migration Patterns in Response to Climate Change: A Modeling Approach
NASA Technical Reports Server (NTRS)
Smith, James A.
2010-01-01
The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, offers new opportunities for the application of mechanistic models to predict how continental-scale bird migration patterns may change in response to environmental change. In earlier studies, we explored the phenotypic plasticity of a migratory population of Pectoral sandpipers by simulating the movement patterns of an ensemble of 10,000 individual birds in response to changes in stopover locations as an indicator of the impacts of wetland loss and inter-annual variability on the fitness of migratory shorebirds. We used an individual-based, biophysical migration model, driven by remotely sensed land surface data, climate data, and biological field data. Mean stop-over durations and stop-over frequency with latitude predicted from our model for nominal cases were consistent with results reported in the literature and available field data. In this study, we take advantage of new computing capabilities enabled by recent GP-GPU (general-purpose computing on graphics processing units) paradigms and commodity hardware. Several aspects of our individual-based (agent modeling) approach lend themselves well to GP-GPU computing. We have been able to allocate compute-intensive tasks to the graphics processing units, and now simulate ensembles of 400,000 birds at varying spatial resolutions along the central North American flyway. We are incorporating additional, species-specific, mechanistic processes to better reflect the processes underlying bird phenotypic plasticity responses to different climate change scenarios in the central U.S.
Gupta, S; Basant, N; Mohan, D; Singh, K P
2016-07-01
Experimental determination of the rate constants of the reaction of NO3 with a large number of organic chemicals is tedious and time- and resource-intensive, and the development of computational methods has been widely advocated. In this study, we have developed room-temperature (298 K) and temperature-dependent quantitative structure-reactivity relationship (QSRR) models based on ensemble learning approaches (decision tree forest (DTF) and decision treeboost (DTB)) for predicting the rate constant of the reaction of NO3 radicals with diverse organic chemicals, under OECD guidelines. Predictive powers of the developed models were established in terms of statistical coefficients. In the test phase, the QSRR models yielded a correlation (r(2)) of >0.94 between experimental and predicted rate constants. The applicability domains of the constructed models were determined. An attempt has been made to provide a mechanistic interpretation of the features selected for QSRR development. The proposed QSRR models outperformed the previous reports, and the temperature-dependent models offered a much wider applicability domain. This is the first report presenting a temperature-dependent QSRR model for predicting the nitrate radical reaction rate constant at different temperatures. The proposed models can be useful tools for predicting the reactivities of chemicals towards NO3 radicals in the atmosphere and, hence, for their persistence and exposure risk assessment.
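scikit-learn does not ship DTF/DTB implementations, so the sketch below uses GradientBoostingRegressor as an analogous tree-ensemble learner on entirely synthetic descriptors, simply to illustrate the fit/predict/r² workflow of a QSRR model; it is not the published model or data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                              # placeholder molecular descriptors (+ temperature)
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=200)    # placeholder log k(NO3) values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)

print("test r2:", round(r2_score(y_test, model.predict(X_test)), 3))
print("feature importances:", np.round(model.feature_importances_, 2))
```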
Johansson, Michael A; Reich, Nicholas G; Hota, Aditi; Brownstein, John S; Santillana, Mauricio
2016-09-26
Dengue viruses, which infect millions of people per year worldwide, cause large epidemics that strain healthcare systems. Despite diverse efforts to develop forecasting tools including autoregressive time series, climate-driven statistical, and mechanistic biological models, little work has been done to understand the contribution of different components to improved prediction. We developed a framework to assess and compare dengue forecasts produced from different types of models and evaluated the performance of seasonal autoregressive models with and without climate variables for forecasting dengue incidence in Mexico. Climate data did not significantly improve the predictive power of seasonal autoregressive models. Short-term and seasonal autocorrelation were key to improving short-term and long-term forecasts, respectively. Seasonal autoregressive models captured a substantial amount of dengue variability, but better models are needed to improve dengue forecasting. This framework contributes to the sparse literature of infectious disease prediction model evaluation, using state-of-the-art validation techniques such as out-of-sample testing and comparison to an appropriate reference model.
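A bare-bones seasonal autoregressive model can be fit by ordinary least squares on lag-1 and lag-12 terms and evaluated out of sample, as in the sketch below; the monthly series is synthetic, not the Mexican surveillance data, and the lags are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(120)
y = 50 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, size=120)  # synthetic monthly incidence

def build_design(series, lags=(1, 12)):
    """Design matrix with an intercept and the requested autoregressive lags."""
    start = max(lags)
    X = np.column_stack([np.ones(len(series) - start)]
                        + [series[start - l:len(series) - l] for l in lags])
    return X, series[start:]

X, target = build_design(y)
n_train = 84                                   # hold out the last rows for out-of-sample testing
beta, *_ = np.linalg.lstsq(X[:n_train], target[:n_train], rcond=None)
pred = X[n_train:] @ beta
mae = np.mean(np.abs(pred - target[n_train:]))
print(f"out-of-sample MAE: {mae:.1f} cases")
```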
Concepts and tools for predictive modeling of microbial dynamics.
Bernaerts, Kristel; Dens, Els; Vereecken, Karen; Geeraerd, Annemie H; Standaert, Arnout R; Devlieghere, Frank; Debevere, Johan; Van Impe, Jan F
2004-09-01
Description of microbial cell (population) behavior as influenced by dynamically changing environmental conditions intrinsically needs dynamic mathematical models. In the past, major effort has been put into the modeling of microbial growth and inactivation within a constant environment (static models). In the early 1990s, differential equation models (dynamic models) were introduced in the field of predictive microbiology. Here, we present a general dynamic model-building concept describing microbial evolution under dynamic conditions. Starting from an elementary model building block, the model structure can be gradually complexified to incorporate increasing numbers of influencing factors. Based on two case studies, the fundamentals of both macroscopic (population) and microscopic (individual) modeling approaches are revisited. These illustrations deal with the modeling of (i) microbial lag under variable temperature conditions and (ii) interspecies microbial interactions mediated by lactic acid production (product inhibition). Current and future research trends should address the need for (i) more specific measurements at the cell and/or population level, (ii) measurements under dynamic conditions, and (iii) more comprehensive (mechanistically inspired) model structures. In the context of quantitative microbial risk assessment, complexity of the mathematical model must be kept under control. An important challenge for the future is determination of a satisfactory trade-off between predictive power and manageability of predictive microbiology models.
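A minimal example of a dynamic (differential-equation) predictive microbiology model: a logistic growth law whose maximum specific growth rate follows a Ratkowsky-type square-root dependence on a time-varying temperature, integrated with explicit Euler steps. All parameter values and the temperature profile are illustrative.

```python
import numpy as np

def mu_max(T_C, b=0.04, T_min=5.0):
    """Ratkowsky-type square-root model for the maximum specific growth rate (1/h)."""
    return (b * max(T_C - T_min, 0.0)) ** 2

def simulate(hours=48.0, dt=0.05, N0=1e3, N_max=1e9):
    t, N = 0.0, N0
    while t <= hours:
        T = 15.0 + 10.0 * np.sin(2 * np.pi * t / 24.0)   # dynamic temperature profile (°C)
        dNdt = mu_max(T) * N * (1.0 - N / N_max)         # logistic growth, temperature-driven
        N += dNdt * dt
        t += dt
    return N

print(f"log10 CFU after 48 h: {np.log10(simulate()):.2f}")
```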
DOT National Transportation Integrated Search
2009-01-01
In the Mechanistic-Empirical Pavement Design Guide (M-EPDG), prediction of flexible pavement response and performance requires the dynamic modulus of hot-mix asphalt (HMA) as an input at all three levels of hierarchical inputs. This study was intended to ...
The adverse outcome pathway (AOP) framework is intended to help support greater use of mechanistic toxicology data as a basis for risk assessment and/or regulatory decision-making. While there have been clear advances in the ability to rapidly generate mechanistically-oriented da...
Ecology and the ratchet of events: climate variability, niche dimensions, and species distributions
Jackson, Stephen T.; Betancourt, Julio L.; Booth, Robert K.; Gray, Stephen T.
2009-01-01
Climate change in the coming centuries will be characterized by interannual, decadal, and multidecadal fluctuations superimposed on anthropogenic trends. Predicting ecological and biogeographic responses to these changes constitutes an immense challenge for ecologists. Perspectives from climatic and ecological history indicate that responses will be laden with contingencies, resulting from episodic climatic events interacting with demographic and colonization events. This effect is compounded by the dependency of environmental sensitivity upon life-stage for many species. Climate variables often used in empirical niche models may become decoupled from the proximal variables that directly influence individuals and populations. Greater predictive capacity, and more-fundamental ecological and biogeographic understanding, will come from integration of correlational niche modeling with mechanistic niche modeling, dynamic ecological modeling, targeted experiments, and systematic observations of past and present patterns and dynamics.
Evolution-informed forecasting of seasonal influenza A (H3N2)
Du, Xiangjun; King, Aaron A.; Woods, Robert J.; Pascual, Mercedes
2018-01-01
Inter-pandemic or seasonal influenza exacts an enormous annual burden both in terms of human health and economic impact. Incidence prediction ahead of season remains a challenge largely because of the virus’ antigenic evolution. We propose here a forecasting approach that incorporates evolutionary change into a mechanistic epidemiological model. The proposed models are simple enough that their parameters can be estimated from retrospective surveillance data. These models link amino-acid sequences of hemagglutinin epitopes with a transmission model for seasonal H3N2 influenza, also informed by H1N1 levels. With a monthly time series of H3N2 incidence in the United States over 10 years, we demonstrate the feasibility of prediction ahead of season and an accurate real-time forecast for the 2016/2017 influenza season. PMID:29070700
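The coupling of antigenic change to transmission can be caricatured as follows: each season, a fraction of the protected pool returns to the susceptible class in proportion to the number of epitope amino-acid substitutions, and a simple deterministic SIR season is then run. The escape rate, transmission parameters and substitution counts below are invented, not estimates from the paper.

```python
def seasonal_epidemic(S0, beta=0.5, gamma=0.2, days=180):
    """One deterministic SIR season (population normalized to 1)."""
    S, I = S0, 1e-4
    for _ in range(days):
        new_inf = beta * S * I
        S -= new_inf
        I += new_inf - gamma * I
    return 1.0 - S / S0, S          # attack rate among the initially susceptible, remaining S

epitope_subs = [0, 1, 4, 0, 6]      # hypothetical substitutions before each season
escape_per_sub = 0.07               # fraction of immunity lost per substitution (assumption)

S = 0.35
for season, subs in enumerate(epitope_subs, 1):
    S = min(1.0, S + escape_per_sub * subs * (1.0 - S))   # antigenic change replenishes susceptibles
    attack, S = seasonal_epidemic(S)
    print(f"season {season}: {subs} substitutions -> attack rate {attack:.2f}")
```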
Matsunaga, Norikazu; Fukuchi, Yukina; Imawaka, Haruo; Tamai, Ikumi
2018-05-01
Functional interplay between transporters and drug-metabolizing enzymes is currently one of the hottest topics in the field of drug metabolism and pharmacokinetics. Uptake transporter-enzyme interplay is important to determine intrinsic hepatic clearance based on the extended clearance concept. Enzyme and efflux transporter interplay, which includes both sinusoidal (basolateral) and canalicular efflux transporters, determines the fate of metabolites formed in the liver. As sandwich-cultured hepatocytes (SCHs) maintain metabolic activities and form a canalicular network, the whole interplay between uptake and efflux transporters and drug-metabolizing enzymes can be investigated simultaneously. In this article, we review the utility and applicability of SCHs for mechanistic understanding of hepatic disposition of both parent drugs and metabolites. In addition, the utility of SCHs for mimicking species-specific disposition of parent drugs and metabolites in vivo is described. We also review application of SCHs for clinically relevant prediction of drug-drug interactions caused by drugs and metabolites. The usefulness of mathematical modeling of hepatic disposition of parent drugs and metabolites in SCHs is described to allow a quantitative understanding of an event in vitro and to develop a more advanced model to predict in vivo disposition. Copyright © 2018 by The American Society for Pharmacology and Experimental Therapeutics.
Systems toxicology: from basic research to risk assessment.
Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C
2014-03-17
Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.
NASA Astrophysics Data System (ADS)
Lehmann, Peter; von Ruette, Jonas; Fan, Linfeng; Or, Dani
2014-05-01
Rapid debris flows initiated by rainfall-induced shallow landslides present a highly destructive natural hazard in steep terrain. The impact and run-out paths of debris flows depend on the volume, composition and initiation zone of the released material, and these quantities are required to make accurate debris flow predictions and hazard maps. For that purpose we couple the mechanistic 'Catchment-scale Hydro-mechanical Landslide Triggering (CHLT)' model, which computes the timing, location, and volume of landslides, with simple approaches to estimate debris flow runout distances. The runout models were tested using two landslide inventories obtained in the Swiss Alps following prolonged rainfall events. The predicted runout distances were in good agreement with observations, confirming the utility of such simple models for landscape-scale estimates. In a next step, debris flow paths were computed for landslides predicted with the CHLT model over a certain range of soil properties to explore their effect on runout distances. This combined approach offers a more complete spatial picture of shallow landslide and subsequent debris flow hazards. The additional information provided by the CHLT model concerning the location, shape, soil type and water content of the released mass may also be incorporated into more advanced runout models to improve the predictability and impact assessment of such abruptly released masses.
NASA Astrophysics Data System (ADS)
Gallice, A.
2015-12-01
Stream temperature controls important aspects of the riverine habitat, such as the rate of spawning or death of many fish species, or the concentration of numerous dissolved substances. In the current context of accelerating climate change, the future evolution of stream temperature is regarded as uncertain, particularly in the Alps. This uncertainty fostered the development of many prediction models, which are usually classified into two categories: mechanistic models and statistical models. Based on the numerical resolution of physical conservation laws, mechanistic models are generally considered to provide more reliable long-term estimates than regression models. However, despite their physical basis, these models are observed to differ quite significantly in some aspects of their implementation, notably (1) the routing of water in the river channel and (2) the estimation of the temperature of groundwater discharging into the stream. For each of these two aspects, we considered several of the standard modeling approaches reported in the literature and implemented them in a new modular framework. The latter is based on the spatially-distributed snow model Alpine3D, which is essentially used in the framework to compute the amount of water infiltrating into the upper soil layer. Starting from there, different methods can be selected for the computation of the water and energy fluxes in the hillslopes and in the river network. We relied on this framework to compare the various methodologies for river channel routing and groundwater temperature modeling. We notably assessed the impact of each of these approaches on the long-term stream temperature predictions of the model under a typical climate change scenario. The case study was conducted over a high Alpine catchment in Switzerland, whose hydrological and thermal regimes are expected to be markedly affected by climate change. The results show that the various modeling approaches lead to significant differences in the model predictions, and that these differences may be larger than the uncertainties in future air temperature. It is also shown that the temperature of groundwater discharging into the stream has a marked impact on the modeled stream temperature at the catchment outlet.
Mesolimbic confidence signals guide perceptual learning in the absence of external feedback
Guggenmos, Matthias; Wilbertz, Gregor; Hebart, Martin N; Sterzer, Philipp
2016-01-01
It is well established that learning can occur without external feedback, yet normative reinforcement learning theories have difficulties explaining such instances of learning. Here, we propose that human observers are capable of generating their own feedback signals by monitoring internal decision variables. We investigated this hypothesis in a visual perceptual learning task using fMRI and confidence reports as a measure for this monitoring process. Employing a novel computational model in which learning is guided by confidence-based reinforcement signals, we found that mesolimbic brain areas encoded both anticipation and prediction error of confidence—in remarkable similarity to previous findings for external reward-based feedback. We demonstrate that the model accounts for choice and confidence reports and show that the mesolimbic confidence prediction error modulation derived through the model predicts individual learning success. These results provide a mechanistic neurobiological explanation for learning without external feedback by augmenting reinforcement models with confidence-based feedback. DOI: http://dx.doi.org/10.7554/eLife.13388.001 PMID:27021283
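The core idea, learning driven by a confidence prediction error rather than an external reward, can be sketched as a delta-rule update. The stimulus model, confidence read-out and learning rates below are invented for illustration and do not reproduce the published computational model.

```python
import numpy as np

rng = np.random.default_rng(2)
w = 0.2                 # perceptual readout weight (to be learned)
v = 0.5                 # expected confidence (value estimate)
alpha_w, alpha_v = 0.05, 0.1

for trial in range(2000):
    signal = rng.choice([-1.0, 1.0]) + rng.normal(0, 1.0)   # noisy stimulus
    decision_var = w * signal
    confidence = 1.0 / (1.0 + np.exp(-abs(decision_var)))   # internal confidence read-out
    delta = confidence - v                                   # confidence prediction error
    v += alpha_v * delta                                     # update expected confidence
    w += alpha_w * delta * abs(signal)                       # reinforce readout when confidence exceeds expectation
    if trial % 500 == 0:
        print(f"trial {trial:4d}  w={w:.2f}  expected confidence={v:.2f}")
```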
NASA Astrophysics Data System (ADS)
Podder, M. S.; Majumder, C. B.
2016-01-01
The main objective of the present study was to investigate the efficiency of Corynebacterium glutamicum MTCC 2745 immobilized on a granular activated carbon/MnFe2O4 (GAC/MnFe2O4) composite in treating high concentrations of arsenic-bearing wastewater. Non-linear regression analysis was used to determine the best-fit kinetic model on the basis of three correlation coefficients and three error functions, and also to estimate the parameters involved in the kinetic models. The results showed that the Fractal-like mixed 1,2 order model for As(III), and the Brouers-Weron-Sotolongo as well as Fractal-like pseudo-second-order models for As(V), were able to provide a realistic description of the biosorption/bioaccumulation kinetics. Application of mechanistic models in the current study showed that the rate-governing step in the biosorption/bioaccumulation of both As(III) and As(V) was film diffusion rather than intraparticle diffusion. The evaluated thermodynamic parameters ΔG0, ΔH0 and ΔS0 revealed that biosorption/bioaccumulation of both As(III) and As(V) was feasible, spontaneous and exothermic under the studied conditions.
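For comparison, the classical (non-fractal) pseudo-second-order sorption model q(t) = qe²·k·t / (1 + qe·k·t) can be fit by non-linear regression as below; the uptake data are synthetic and the code is only meant to show the fitting workflow, not the fractal-like variants used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def pseudo_second_order(t, qe, k):
    """q(t) = qe^2 * k * t / (1 + qe * k * t)  (classical PSO form)."""
    return qe**2 * k * t / (1.0 + qe * k * t)

t = np.array([5, 10, 20, 40, 60, 90, 120, 180], dtype=float)       # min
q = np.array([8.1, 13.0, 18.6, 23.2, 25.1, 26.6, 27.2, 27.9])       # mg/g, synthetic uptake data

(qe_fit, k_fit), _ = curve_fit(pseudo_second_order, t, q, p0=[30.0, 0.001])
residuals = q - pseudo_second_order(t, qe_fit, k_fit)
print(f"qe = {qe_fit:.1f} mg/g, k = {k_fit:.4f} g/(mg*min), "
      f"RMSE = {np.sqrt(np.mean(residuals**2)):.2f} mg/g")
```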
Evers, J B; Vos, J; Yin, X; Romero, P; van der Putten, P E L; Struik, P C
2010-05-01
Intimate relationships exist between form and function of plants, determining many processes governing their growth and development. However, in most crop simulation models that have been created to simulate plant growth and, for example, predict biomass production, plant structure has been neglected. In this study, a detailed simulation model of growth and development of spring wheat (Triticum aestivum) is presented, which integrates degree of tillering and canopy architecture with organ-level light interception, photosynthesis, and dry-matter partitioning. An existing spatially explicit 3D architectural model of wheat development was extended with routines for organ-level microclimate, photosynthesis, assimilate distribution within the plant structure according to organ demands, and organ growth and development. Outgrowth of tiller buds was made dependent on the ratio between assimilate supply and demand of the plants. Organ-level photosynthesis, biomass production, and bud outgrowth were simulated satisfactorily. However, to improve crop simulation results, more effort is needed to mechanistically model other major plant physiological processes such as nitrogen uptake and distribution, tiller death, and leaf senescence. Nevertheless, the work presented here is a significant step towards a mechanistic functional-structural plant model, which integrates plant architecture with key plant processes.
NASA Astrophysics Data System (ADS)
López de Lacalle, Luis Norberto; Urbicain Pelayo, Gorka; Fernández-Valdivielso, Asier; Alvarez, Alvaro; González, Haizea
2017-09-01
Difficult-to-cut materials such as nickel and titanium alloys are used in the aeronautical industry, the former for their heat-resistant behavior and the latter for their high strength-to-weight ratio. Ceramic tools made of alumina reinforced with SiC whiskers are a common choice in turning for roughing and semi-finishing workpiece stages. Wear rate is high in the machining of these alloys, and consequently cutting forces tend to increase over the course of an operation. This paper establishes the cutting force relation between workpiece and tool in the turning of such difficult-to-cut alloys by means of a mechanistic cutting force model that considers the tool wear effect. The cutting force model demonstrates the sensitivity of the force to the cutting engagement parameters (ap, f) when using ceramic inserts once wear is considered. Wear is introduced through a cutting-time factor, which is useful in real conditions given that wear appears quickly when machining these alloys. Accurate cutting force model coefficients are the key to an accurate prediction of turning forces, which can be used as a criterion for tool replacement or as an input for chatter or other models.
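A sketch of the general form such a model can take: the force is proportional to the chip cross-section ap·f and scaled by a wear factor that grows with cutting time. The specific coefficient values below are placeholders, not the calibrated values from the study.

```python
def cutting_force(ap_mm, f_mm_rev, t_min, Kc=2600.0, wear_coeff=0.012):
    """Mechanistic-style tangential force estimate for turning.
    F = Kc * ap * f, scaled by a cutting-time wear factor (all coefficients are placeholders)."""
    chip_area = ap_mm * f_mm_rev             # mm^2
    wear_factor = 1.0 + wear_coeff * t_min   # force grows as the ceramic insert wears
    return Kc * chip_area * wear_factor      # N

for t in (0, 5, 10, 15):
    print(f"t = {t:2d} min  ->  Ft = {cutting_force(ap_mm=1.5, f_mm_rev=0.2, t_min=t):.0f} N")
```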
Modelling the ecological niche from functional traits
Kearney, Michael; Simpson, Stephen J.; Raubenheimer, David; Helmuth, Brian
2010-01-01
The niche concept is central to ecology but is often depicted descriptively through observing associations between organisms and habitats. Here, we argue for the importance of mechanistically modelling niches based on functional traits of organisms and explore the possibilities for achieving this through the integration of three theoretical frameworks: biophysical ecology (BE), the geometric framework for nutrition (GF) and dynamic energy budget (DEB) models. These three frameworks are fundamentally based on the conservation laws of thermodynamics, describing energy and mass balance at the level of the individual and capturing the prodigious predictive power of the concepts of ‘homeostasis’ and ‘evolutionary fitness’. BE and the GF provide mechanistic multi-dimensional depictions of climatic and nutritional niches, respectively, providing a foundation for linking organismal traits (morphology, physiology, behaviour) with habitat characteristics. In turn, they provide driving inputs and cost functions for mass/energy allocation within the individual as determined by DEB models. We show how integration of the three frameworks permits calculation of activity constraints, vital rates (survival, development, growth, reproduction) and ultimately population growth rates and species distributions. When integrated with contemporary niche theory, functional trait niche models hold great promise for tackling major questions in ecology and evolutionary biology. PMID:20921046
Effect of topological patterning on self-rolling of nanomembranes.
Chen, Cheng; Song, Pengfei; Meng, Fanchao; Ou, Pengfei; Liu, Xinyu; Song, Jun
2018-08-24
The effects of topological patterning (i.e., grating and rectangular patterns) on the self-rolling behaviors of heteroepitaxial strained nanomembranes have been systematically studied. An analytical modeling framework, validated through finite-element simulations, has been formulated to predict the resultant curvature of the patterned nanomembrane as the pattern thickness and density vary. The effectiveness of the grating pattern in regulating the rolling direction of the nanomembrane has been demonstrated and quantitatively assessed. Further to the rolling of nanomembranes, a route to achieve predictive design of helical structures has been proposed and showcased. The present study provides new knowledge and mechanistic guidance towards predictive control and tuning of roll-up nanostructures via topological patterning.
Mueller, Stefan O; Dekant, Wolfgang; Jennings, Paul; Testai, Emanuela; Bois, Frederic
2015-12-25
This special issue of Toxicology in Vitro is dedicated to disseminating the results of the EU-funded collaborative project "Profiling the toxicity of new drugs: a non animal-based approach integrating toxicodynamics and biokinetics" (Predict-IV; Grant 202222). The project's overall aim was to develop strategies to improve the assessment of drug safety in the early stage of development and late discovery phase, by an intelligent combination of non animal-based test systems, cell biology, mechanistic toxicology and in silico modeling, in a rapid and cost effective manner. This overview introduces the scope and overall achievements of Predict-IV. Copyright © 2014 Elsevier Ltd. All rights reserved.
A network of molecular switches controls the activation of the two-component response regulator NtrC
NASA Astrophysics Data System (ADS)
Vanatta, Dan K.; Shukla, Diwakar; Lawrenz, Morgan; Pande, Vijay S.
2015-06-01
Recent successes in simulating protein structure and folding dynamics have demonstrated the power of molecular dynamics to predict the long timescale behaviour of proteins. Here, we extend and improve these methods to predict molecular switches that characterize conformational change pathways between the active and inactive state of nitrogen regulatory protein C (NtrC). By employing unbiased Markov state model-based molecular dynamics simulations, we construct a dynamic picture of the activation pathways of this key bacterial signalling protein that is consistent with experimental observations and predicts new mutants that could be used for validation of the mechanism. Moreover, these results suggest a novel mechanistic paradigm for conformational switching.
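Markov state models summarize such simulations as a transition matrix over conformational states; propagating a population vector and reading relaxation timescales off the eigenvalues is then straightforward, as in the toy three-state example below. The matrix is invented and has no relation to the NtrC data.

```python
import numpy as np

# Hypothetical 3-state MSM: inactive, intermediate, active (row-stochastic transition matrix).
T = np.array([[0.97, 0.03, 0.00],
              [0.05, 0.90, 0.05],
              [0.00, 0.02, 0.98]])

p = np.array([1.0, 0.0, 0.0])        # start fully in the inactive state
for _ in range(2000):
    p = p @ T                        # propagate one lag time
print("stationary populations (inactive, intermediate, active):", np.round(p, 3))

# Slowest relaxation timescale from the second-largest eigenvalue of T.
eigvals = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
print("implied timescale (in lag times):", round(-1.0 / np.log(eigvals[1]), 1))
```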
Szymańska, Paulina; Martin, Katie R.; MacKeigan, Jeffrey P.; ...
2015-03-11
We constructed a mechanistic, computational model for regulation of (macro)autophagy and protein synthesis (at the level of translation). The model was formulated to study the system-level consequences of interactions among the following proteins: two key components of MTOR complex 1 (MTORC1), namely the protein kinase MTOR (mechanistic target of rapamycin) and the scaffold protein RPTOR; the autophagy-initiating protein kinase ULK1; and the multimeric energy-sensing AMP-activated protein kinase (AMPK). Inputs of the model include intrinsic AMPK kinase activity, which is taken as an adjustable surrogate parameter for cellular energy level or AMP:ATP ratio, and rapamycin dose, which controls MTORC1 activity. Outputs of the model include the phosphorylation level of the translational repressor EIF4EBP1, a substrate of MTORC1, and the phosphorylation level of AMBRA1 (activating molecule in BECN1-regulated autophagy), a substrate of ULK1 critical for autophagosome formation. The model incorporates reciprocal regulation of MTORC1 and ULK1 by AMPK, mutual inhibition of MTORC1 and ULK1, and ULK1-mediated negative feedback regulation of AMPK. Through analysis of the model, we find that these processes may be responsible, depending on conditions, for graded responses to stress inputs, for bistable switching between autophagy and protein synthesis, or for relaxation oscillations, comprising alternating periods of autophagy and protein synthesis. A sensitivity analysis indicates that the prediction of oscillatory behavior is robust to changes of the parameter values of the model. The model provides testable predictions about the behavior of the AMPK-MTORC1-ULK1 network, which plays a central role in maintaining cellular energy and nutrient homeostasis.
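The bistable switching between protein synthesis and autophagy can be caricatured by a two-variable mutual-inhibition system. The Hill functions, parameter values and the way energy stress enters below are invented and far simpler than the published network; they only illustrate how the steady state can depend on both the stress input and the initial condition.

```python
def hill_repression(x, K=0.5, n=4):
    return K**n / (K**n + x**n)

def simulate(stress, m0, u0, dt=0.01, steps=20000):
    """Toy mutual inhibition between MTORC1-like (m) and ULK1-like (u) activities."""
    m, u = m0, u0
    for _ in range(steps):
        dm = 1.0 * hill_repression(u) - m              # MTORC1-like activity repressed by ULK1
        du = (0.5 + stress) * hill_repression(m) - u   # ULK1-like activity driven by stress, repressed by MTORC1
        m += dm * dt
        u += du * dt
    return m, u

for stress in (0.0, 0.6):
    for m0, u0 in ((1.0, 0.0), (0.0, 1.0)):            # two different initial conditions
        m, u = simulate(stress, m0, u0)
        state = "protein synthesis" if m > u else "autophagy"
        print(f"stress={stress:.1f}  start=({m0},{u0})  ->  m={m:.2f}, u={u:.2f}  ({state})")
```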
Sjögren, Erik; Westergren, Jan; Grant, Iain; Hanisch, Gunilla; Lindfors, Lennart; Lennernäs, Hans; Abrahamsson, Bertil; Tannergren, Christer
2013-07-16
Oral drug delivery is the predominant administration route for a major part of the pharmaceutical products used worldwide. Further understanding and improvement of gastrointestinal drug absorption predictions is currently a highly prioritized area of research within the pharmaceutical industry. The fraction absorbed (fabs) of an oral dose after administration of a solid dosage form is a key parameter in the estimation of the in vivo performance of an orally administrated drug formulation. This study discloses an evaluation of the predictive performance of the mechanistic physiologically based absorption model GI-Sim. GI-Sim deploys a compartmental gastrointestinal absorption and transit model as well as algorithms describing permeability, dissolution rate, salt effects, partitioning into micelles, particle and micelle drifting in the aqueous boundary layer, particle growth and amorphous or crystalline precipitation. Twelve APIs with reported or expected absorption limitations in humans, due to permeability, dissolution and/or solubility, were investigated. Predictions of the intestinal absorption for different doses and formulations were performed based on physicochemical and biopharmaceutical properties, such as solubility in buffer and simulated intestinal fluid, molecular weight, pK(a), diffusivity and molecule density, measured or estimated human effective permeability and particle size distribution. The performance of GI-Sim was evaluated by comparing predicted plasma concentration-time profiles along with oral pharmacokinetic parameters originating from clinical studies in healthy individuals. The capability of GI-Sim to correctly predict impact of dose and particle size as well as the in vivo performance of nanoformulations was also investigated. The overall predictive performance of GI-Sim was good as >95% of the predicted pharmacokinetic parameters (C(max) and AUC) were within a 2-fold deviation from the clinical observations and the predicted plasma AUC was within one standard deviation of the observed mean plasma AUC in 74% of the simulations. GI-Sim was also able to correctly capture the trends in dose- and particle size dependent absorption for the study drugs with solubility and dissolution limited absorption, respectively. In addition, GI-Sim was also shown to be able to predict the increase in absorption and plasma exposure achieved with nanoformulations. Based on the results, the performance of GI-Sim was shown to be suitable for early risk assessment as well as to guide decision making in pharmaceutical formulation development. Copyright © 2013 Elsevier B.V. All rights reserved.
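A rough sketch of how dissolution-limited absorption responds to particle size: Noyes-Whitney-style dissolution (with surface area per unit mass scaling as 1/radius) coupled to first-order absorption over a fixed transit time. This is not GI-Sim; every parameter value is a placeholder chosen only to show the qualitative trend.

```python
def fraction_absorbed(radius_um, dose_mg=100.0, solubility_mg_ml=0.05, gut_volume_ml=250.0,
                      ka=0.03, transit_min=200.0, dt=0.5):
    """Noyes-Whitney-style dissolution + first-order absorption (illustrative, not GI-Sim)."""
    solid, dissolved, absorbed = dose_mg, 0.0, 0.0
    k_diss0 = 0.002                                    # dissolution constant for a 1 um particle (placeholder)
    t = 0.0
    while t < transit_min and (solid > 0 or dissolved > 1e-9):
        k_diss = k_diss0 / radius_um                   # smaller particles -> more surface area -> faster dissolution
        conc = dissolved / gut_volume_ml
        diss_rate = k_diss * solid * max(solubility_mg_ml - conc, 0.0) * gut_volume_ml
        abs_rate = ka * dissolved
        solid = max(solid - diss_rate * dt, 0.0)
        dissolved += (diss_rate - abs_rate) * dt
        absorbed += abs_rate * dt
        t += dt
    return absorbed / dose_mg

for r in (10.0, 1.0, 0.1):                             # micronized vs nano-sized particles
    print(f"radius {r:5.1f} um  ->  fraction absorbed {fraction_absorbed(r):.2f}")
```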
Ouzounoglou, Eleftherios; Kolokotroni, Eleni; Stanulla, Martin; Stamatakos, Georgios S
2018-02-06
Efficient use of Virtual Physiological Human (VPH)-type models for personalized treatment response prediction purposes requires a precise model parameterization. In cases where the available personalized data are not sufficient to fully determine the parameter values, an appropriate prediction task may be followed. In this study, a hybrid combination of computational optimization and machine learning methods with an already developed mechanistic model, the acute lymphoblastic leukaemia (ALL) Oncosimulator, which simulates ALL progression and treatment response, is presented. These methods are used to estimate the model parameters for retrospective cases and to predict them for prospective ones. The parameter value prediction is based on a regression model trained on retrospective cases. The proposed Hybrid ALL Oncosimulator system has been evaluated in predicting the pre-phase treatment outcome in ALL. This has been correctly achieved for a significant percentage of the patient cases tested (approx. 70% of patients). Moreover, the system is capable of declining to classify cases for which the results are not trustworthy enough. In that way, potentially misleading predictions for a number of patients are avoided, while the classification accuracy for the remaining patient cases further increases. The results obtained are particularly encouraging regarding the soundness of the proposed methodologies and their relevance to the process of achieving clinical applicability of the proposed Hybrid ALL Oncosimulator system and of VPH models in general.
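A sketch of the "predict, but decline untrustworthy cases" behaviour using an ensemble regressor, where the spread across ensemble members serves as the trust criterion. The features, target, and rejection threshold are synthetic placeholders rather than the study's regression model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
X = rng.normal(size=(150, 5))                              # placeholder patient features
y = X[:, 0] * 2.0 + X[:, 1] + rng.normal(0, 0.3, 150)      # placeholder model-parameter values

reg = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:120], y[:120])

# Per-case prediction with a simple trust criterion: spread across the trees of the forest.
for x_new, y_true in zip(X[120:125], y[120:125]):
    per_tree = np.array([tree.predict(x_new.reshape(1, -1))[0] for tree in reg.estimators_])
    mean, spread = per_tree.mean(), per_tree.std()
    verdict = f"predict {mean:.2f}" if spread < 0.5 else "decline (prediction not trustworthy)"
    print(f"true {y_true:5.2f}  spread {spread:.2f}  ->  {verdict}")
```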
Advanced modelling, monitoring, and process control of bioconversion systems
NASA Astrophysics Data System (ADS)
Schmitt, Elliott C.
Production of fuels and chemicals from lignocellulosic biomass is an increasingly important area of research and industrialization throughout the world. In order to be competitive with fossil-based fuels and chemicals, maintaining cost-effectiveness is critical. Advanced process control (APC) and optimization methods could significantly reduce operating costs in the biorefining industry. Two reasons APC has previously proven challenging to implement for bioprocesses are the lack of suitable online sensor technology for key system components and the strongly nonlinear first-principles models required to predict bioconversion behavior. To overcome these challenges, batch fermentations with the acetogen Moorella thermoacetica were monitored with Raman spectroscopy during the conversion of real lignocellulosic hydrolysates, and a kinetic model for the conversion of synthetic sugars was developed. Raman spectroscopy was shown to be effective in monitoring the fermentation of sugarcane bagasse and sugarcane straw hydrolysate, where univariate models predicted acetate concentrations with a root mean square error of prediction (RMSEP) of 1.9 and 1.0 g L-1 for bagasse and straw, respectively. Multivariate partial least squares (PLS) models were employed to predict acetate, xylose, glucose, and total sugar concentrations for both hydrolysate fermentations. The PLS models were more robust than univariate models, and yielded a percent error of approximately 5% for both sugarcane bagasse and sugarcane straw. In addition, a screening technique was discussed for improving Raman spectra of hydrolysate samples prior to collecting fermentation data. Furthermore, a mechanistic model was developed to predict batch fermentation of synthetic glucose, xylose, and a mixture of the two sugars to acetate. The models accurately described the bioconversion process with an RMSEP of approximately 1 g L-1 for each model and provided insights into how kinetic parameters changed during dual substrate fermentation with diauxic growth. Model predictive control (MPC), an advanced process control strategy, is capable of utilizing nonlinear models and sensor feedback to provide optimal input while ensuring critical process constraints are met. Building on the work performed with M. thermoacetica and using Saccharomyces cerevisiae, a microorganism commonly used for biofuel production, a nonlinear MPC was implemented on a continuous membrane cell-recycle bioreactor (MCRB) for the conversion of glucose to ethanol. The dilution rate was used to control the ethanol productivity of the system while maintaining total substrate conversion above the constraint of 98%. PLS multivariate models for glucose (RMSEP 1.5 g L-1) and ethanol (RMSEP 0.4 g L-1) were robust in predicting concentrations, and the mechanistic kinetic model accurately predicted continuous fermentation behavior. A productivity setpoint trajectory ranging from 2 to 4.5 g L-1 h-1 was closely tracked by the fermentation system, using Raman measurements and an extended Kalman filter to estimate biomass concentrations. Overall, this work demonstrates an effective approach for real-time monitoring and control of a complex fermentation system.
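Multivariate calibration of spectra to analyte concentrations of the kind used here can be sketched with scikit-learn's PLSRegression on synthetic "spectra"; only the fit/predict/RMSEP workflow mirrors the study, and all data below are randomly generated.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_samples, n_wavenumbers = 80, 300
conc = rng.uniform(0, 30, size=(n_samples, 2))             # ethanol and glucose (g/L), synthetic
peaks = rng.normal(size=(2, n_wavenumbers))                 # synthetic pure-component "spectra"
spectra = conc @ peaks + rng.normal(0, 0.5, size=(n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(spectra, conc, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=4).fit(X_train, y_train)
rmsep = np.sqrt(np.mean((pls.predict(X_test) - y_test) ** 2, axis=0))
print("RMSEP (ethanol, glucose) g/L:", np.round(rmsep, 2))
```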
Theoretical model of impact damage in structural ceramics
NASA Technical Reports Server (NTRS)
Liaw, B. M.; Kobayashi, A. S.; Emery, A. G.
1984-01-01
This paper presents a mechanistically consistent model of impact damage based on elastic failures due to tensile and shear overloading. An elastic axisymmetric finite element model is used to determine the dynamic stresses generated by a single particle impact. Local failures in a finite element are assumed to occur when the primary/secondary principal stresses or the maximum shear stress reach critical tensile or shear stresses, respectively. The succession of failed elements thus models macrocrack growth. Sliding motions of cracks, which closed during unloading, are resisted by friction and the unrecovered deformation represents the 'plastic deformation' reported in the literature. The predicted ring cracks on the contact surface, as well as the cone cracks, median cracks, radial cracks, lateral cracks, and damage-induced porous zones in the interior of hot-pressed silicon nitride plates, matched those observed experimentally. The finite element model also predicted the uplifting of the free surface surrounding the impact site.
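The element-level failure test described above can be sketched as follows: compute the in-plane principal stresses of an element, then flag tensile failure if the largest principal stress exceeds a critical tensile stress, or shear failure if the maximum shear stress exceeds a critical shear stress. The stress states and strength values below are placeholders, not material data from the study.

```python
import numpy as np

def element_fails(sigma_rr, sigma_zz, tau_rz, tensile_strength=400.0, shear_strength=250.0):
    """Flag elastic failure of an element from its in-plane stress state (MPa).
    Principal stresses via eigenvalues; max shear = half the principal-stress difference."""
    stress = np.array([[sigma_rr, tau_rz],
                       [tau_rz,  sigma_zz]])
    s1, s2 = np.sort(np.linalg.eigvalsh(stress))[::-1]
    max_shear = 0.5 * (s1 - s2)
    if s1 >= tensile_strength:
        return "tensile failure"
    if max_shear >= shear_strength:
        return "shear failure"
    return "intact"

for state in [(350.0, -50.0, 60.0), (450.0, 100.0, 0.0), (100.0, -500.0, 200.0)]:
    print(state, "->", element_fails(*state))
```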
Validity and validation of expert (Q)SAR systems.
Hulzebos, E; Sijm, D; Traas, T; Posthumus, R; Maslankiewicz, L
2005-08-01
At a recent workshop in Setubal (Portugal) principles were drafted to assess the suitability of (quantitative) structure-activity relationships ((Q)SARs) for assessing the hazards and risks of chemicals. In the present study we applied some of the Setubal principles to test the validity of three (Q)SAR expert systems and validate the results. These principles include a mechanistic basis, the availability of a training set and validation. ECOSAR, BIOWIN and DEREK for Windows have a mechanistic or empirical basis. ECOSAR has a training set for each QSAR. For half of the structural fragments the number of chemicals in the training set is >4. Based on structural fragments and log Kow, ECOSAR uses linear regression to predict ecotoxicity. Validating ECOSAR for three 'valid' classes results in predictivity of ≥64%. BIOWIN uses (non-)linear regressions to predict the probability of biodegradability based on fragments and molecular weight. It has a large training set and predicts non-ready biodegradability well. DEREK for Windows predictions are supported by a mechanistic rationale and literature references. The structural alerts in this program have been developed with a training set of positive and negative toxicity data. However, to support the prediction only a limited number of chemicals in the training set is presented to the user. DEREK for Windows predicts effects by 'if-then' reasoning. The program predicts best for mutagenicity and carcinogenicity. Each structural fragment in ECOSAR and DEREK for Windows needs to be evaluated and validated separately.
Tomkins, Melissa; Kliot, Adi; Marée, Athanasius Fm; Hogenhout, Saskia A
2018-03-13
Members of the Candidatus genus Phytoplasma are small bacterial pathogens that hijack their plant hosts via the secretion of virulence proteins (effectors) leading to a fascinating array of plant phenotypes, such as witch's brooms (stem proliferations) and phyllody (retrograde development of flowers into vegetative tissues). Phytoplasma depend on insect vectors for transmission, and interestingly, these insect vectors were found to be (in)directly attracted to plants with these phenotypes. Therefore, phytoplasma effectors appear to reprogram plant development and defence to lure insect vectors, similarly to social engineering malware, which employs tricks to lure people to infected computers and webpages. A multi-layered mechanistic modelling approach will enable a better understanding of how phytoplasma effector-mediated modulations of plant host development and insect vector behaviour contribute to phytoplasma spread, and ultimately to predict the long reach of phytoplasma effector genes. Copyright © 2018. Published by Elsevier Ltd.
Alierta, J A; Pérez, M A; Seral, B; García-Aznar, J M
2016-09-01
The aim of this study is to evaluate fracture union or non-union for a specific patient who presented with oblique fractures of the tibia and fibula, using a mechanistic-based bone healing model. Normally, this kind of fracture can be treated with an intramedullary nail in one of two possible configurations, depending on the mechanical stabilisation: static or dynamic. Both cases are simulated under different fracture geometries in order to understand the effect of the mechanical stabilisation on the fracture healing outcome. The results of both simulations are in good agreement with previous clinical experience. From the results, it is demonstrated that dynamization of the fracture improves healing in comparison with a static or rigid fixation. This work shows the versatility and potential of a mechanistic-based bone healing model to predict the final outcome (union, non-union, delayed union) of realistic 3D fractures in which more than one bone is involved.
Carlson, Hans K.; Clark, Iain C.; Melnyk, Ryan A.; Coates, John D.
2011-01-01
The anaerobic oxidation of Fe(II) by subsurface microorganisms is an important part of biogeochemical cycling in the environment, but the biochemical mechanisms used to couple iron oxidation to nitrate respiration are not well understood. Based on our own work and the evidence available in the literature, we propose a mechanistic model for anaerobic nitrate-dependent iron oxidation. We suggest that anaerobic iron-oxidizing microorganisms likely exist along a continuum including: (1) bacteria that inadvertently oxidize Fe(II) by abiotic or biotic reactions with enzymes or chemical intermediates in their metabolic pathways (e.g., denitrification) and suffer from toxicity or energetic penalty, (2) Fe(II) tolerant bacteria that gain little or no growth benefit from iron oxidation but can manage the toxic reactions, and (3) bacteria that efficiently accept electrons from Fe(II) to gain a growth advantage while preventing or mitigating the toxic reactions. Predictions of the proposed model are highlighted and experimental approaches are discussed. PMID:22363331
Emergence of tissue polarization from synergy of intracellular and extracellular auxin signaling
Wabnik, Krzysztof; Kleine-Vehn, Jürgen; Balla, Jozef; Sauer, Michael; Naramoto, Satoshi; Reinöhl, Vilém; Merks, Roeland M H; Govaerts, Willy; Friml, Jiří
2010-01-01
Plant development is exceptionally flexible as manifested by its potential for organogenesis and regeneration, which are processes involving rearrangements of tissue polarities. Fundamental questions concern how individual cells can polarize in a coordinated manner to integrate into the multicellular context. In canalization models, the signaling molecule auxin acts as a polarizing cue, and feedback on the intercellular auxin flow is key for synchronized polarity rearrangements. We provide a novel mechanistic framework for canalization, based on up-to-date experimental data and minimal, biologically plausible assumptions. Our model combines the intracellular auxin signaling for expression of PINFORMED (PIN) auxin transporters and the theoretical postulation of extracellular auxin signaling for modulation of PIN subcellular dynamics. Computer simulations faithfully and robustly recapitulated the experimentally observed patterns of tissue polarity and asymmetric auxin distribution during formation and regeneration of vascular systems and during the competitive regulation of shoot branching by apical dominance. Additionally, our model generated new predictions that could be experimentally validated, highlighting a mechanistically conceivable explanation for the PIN polarization and canalization of the auxin flow in plants. PMID:21179019
NASA Astrophysics Data System (ADS)
Vanwalleghem, T.; Román, A.; Peña, A.; Laguna, A.; Giráldez, J. V.
2017-12-01
There is a need to better understand the processes influencing soil formation and the resulting distribution of soil properties in the critical zone. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas are especially uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of traditional digital soil mapping versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing slopes as on south-facing slopes. However, validation showed that these covariates only explained 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain the variability in soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.
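As an illustration of the statistical mapping step, the sketch below (an assumption-laden stand-in, not the study's code) fits a random forest to terrain covariates such as solar radiation and NDVI and reports cross-validated explained variance.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 67                                    # 67 sampling locations, as in the study
    X = np.column_stack([
        rng.uniform(800, 1800, n),            # annual solar radiation (synthetic values)
        rng.uniform(0.1, 0.8, n),             # NDVI (synthetic values)
    ])
    y = rng.uniform(1.0, 9.0, n)              # soil carbon stock, kg m-2 (synthetic values)

    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    r2 = cross_val_score(rf, X, y, cv=5, scoring="r2")
    print("cross-validated R2:", r2.mean())   # the study found only ~25% of variance explained

With purely synthetic noise the score is near zero; the point here is the workflow, not the numbers.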
Efstathiou, Christos; Isukapalli, Sastry
2011-01-01
Allergic airway diseases represent a complex health problem which can be exacerbated by the synergistic action of pollen particles and air pollutants such as ozone. Understanding human exposures to aeroallergens requires accurate estimates of the spatial distribution of airborne pollen levels as well as of various air pollutants at different times. However, currently there are no established methods for estimating allergenic pollen emissions and concentrations over large geographic areas such as the United States. A mechanistic modeling system for describing pollen emissions and transport over extensive domains has been developed by adapting components of existing regional scale air quality models and vegetation databases. First, components of the Biogenic Emissions Inventory System (BEIS) were adapted to predict pollen emission patterns. Subsequently, the transport module of the Community Multiscale Air Quality (CMAQ) modeling system was modified to incorporate description of pollen transport. The combined model, CMAQ-pollen, allows for simultaneous prediction of multiple air pollutants and pollen levels in a single model simulation, and uses consistent assumptions related to the transport of multiple chemicals and pollen species. Application case studies for evaluating the combined modeling system included the simulation of birch and ragweed pollen levels for the year 2002, during their corresponding peak pollination periods (April for birch and September for ragweed). The model simulations were driven by previously evaluated meteorological model outputs and emissions inventories for the eastern United States for the simulation period. A semi-quantitative evaluation of CMAQ-pollen was performed using tree and ragweed pollen counts in Newark, NJ for the same time periods. The peak birch pollen concentrations were predicted to occur within two days of the peak measurements, while the temporal patterns closely followed the measured profiles of overall tree pollen. For the case of ragweed pollen, the model was able to capture the patterns observed during September 2002, but did not predict an early peak; this can be associated with a wider species pollination window and inadequate spatial information in current land cover databases. An additional sensitivity simulation was performed to comparatively evaluate the dispersion patterns predicted by CMAQ-pollen with those predicted by the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model, which is used extensively in aerobiological studies. The CMAQ estimated concentration plumes matched the equivalent pollen scenario modeled with HYSPLIT. The novel pollen modeling approach presented here allows simultaneous estimation of multiple airborne allergens and other air pollutants, and is being developed as a central component of an integrated population exposure modeling system, the Modeling Environment for Total Risk studies (MENTOR) for multiple, co-occurring contaminants that include aeroallergens and irritants. PMID:21516207
NASA Astrophysics Data System (ADS)
Efstathiou, Christos; Isukapalli, Sastry; Georgopoulos, Panos
2011-04-01
Allergic airway diseases represent a complex health problem which can be exacerbated by the synergistic action of pollen particles and air pollutants such as ozone. Understanding human exposures to aeroallergens requires accurate estimates of the spatial distribution of airborne pollen levels as well as of various air pollutants at different times. However, currently there are no established methods for estimating allergenic pollen emissions and concentrations over large geographic areas such as the United States. A mechanistic modeling system for describing pollen emissions and transport over extensive domains has been developed by adapting components of existing regional scale air quality models and vegetation databases. First, components of the Biogenic Emissions Inventory System (BEIS) were adapted to predict pollen emission patterns. Subsequently, the transport module of the Community Multiscale Air Quality (CMAQ) modeling system was modified to incorporate description of pollen transport. The combined model, CMAQ-pollen, allows for simultaneous prediction of multiple air pollutants and pollen levels in a single model simulation, and uses consistent assumptions related to the transport of multiple chemicals and pollen species. Application case studies for evaluating the combined modeling system included the simulation of birch and ragweed pollen levels for the year 2002, during their corresponding peak pollination periods (April for birch and September for ragweed). The model simulations were driven by previously evaluated meteorological model outputs and emissions inventories for the eastern United States for the simulation period. A semi-quantitative evaluation of CMAQ-pollen was performed using tree and ragweed pollen counts in Newark, NJ for the same time periods. The peak birch pollen concentrations were predicted to occur within two days of the peak measurements, while the temporal patterns closely followed the measured profiles of overall tree pollen. For the case of ragweed pollen, the model was able to capture the patterns observed during September 2002, but did not predict an early peak; this can be associated with a wider species pollination window and inadequate spatial information in current land cover databases. An additional sensitivity simulation was performed to comparatively evaluate the dispersion patterns predicted by CMAQ-pollen with those predicted by the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model, which is used extensively in aerobiological studies. The CMAQ estimated concentration plumes matched the equivalent pollen scenario modeled with HYSPLIT. The novel pollen modeling approach presented here allows simultaneous estimation of multiple airborne allergens and other air pollutants, and is being developed as a central component of an integrated population exposure modeling system, the Modeling Environment for Total Risk studies (MENTOR) for multiple, co-occurring contaminants that include aeroallergens and irritants.
A neighborhood statistics model for predicting stream pathogen indicator levels.
Pandey, Pramod K; Pasternack, Gregory B; Majumder, Mahbubul; Soupir, Michelle L; Kaiser, Mark S
2015-03-01
Because elevated levels of water-borne Escherichia coli in streams are a leading cause of water quality impairments in the U.S., water-quality managers need tools for predicting aqueous E. coli levels. Presently, E. coli levels may be predicted using complex mechanistic models that have a high degree of unchecked uncertainty or simpler statistical models. To assess spatio-temporal patterns of instream E. coli levels, herein we measured E. coli, a pathogen indicator, at 16 sites (at four different times) within the Squaw Creek watershed, Iowa, and subsequently, the Markov Random Field model was exploited to develop a neighborhood statistics model for predicting instream E. coli levels. Two observed covariates, local water temperature (degrees Celsius) and mean cross-sectional depth (meters), were used as inputs to the model. Predictions of E. coli levels in the water column were compared with independent observational data collected from 16 in-stream locations. The results revealed that spatio-temporal averages of predicted and observed E. coli levels were extremely close. Approximately 66 % of individual predicted E. coli concentrations were within a factor of 2 of the observed values. In only one event, the difference between prediction and observation was beyond one order of magnitude. The mean of all predicted values at 16 locations was approximately 1 % higher than the mean of the observed values. The approach presented here will be useful while assessing instream contaminations such as pathogen/pathogen indicator levels at the watershed scale.
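The factor-of-two agreement metric quoted above can be computed directly; the observed and predicted values below are placeholders, not the Squaw Creek data.

    import numpy as np

    observed  = np.array([120.0, 300.0, 85.0, 40.0, 510.0])   # E. coli levels (synthetic values)
    predicted = np.array([150.0, 180.0, 95.0, 70.0, 600.0])

    ratio = predicted / observed
    within_factor_2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))
    print(f"{within_factor_2:.0%} of predictions fall within a factor of 2 of the observations")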
Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P
2017-12-05
The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.
Lei, Chon Lok; Wang, Ken; Clerx, Michael; Johnstone, Ross H; Hortigon-Vinagre, Maria P; Zamora, Victor; Allan, Andrew; Smith, Godfrey L; Gavaghan, David J; Mirams, Gary R; Polonchuk, Liudmila
2017-01-01
Human induced pluripotent stem cell derived cardiomyocytes (iPSC-CMs) have applications in disease modeling, cell therapy, drug screening and personalized medicine. Computational models can be used to interpret experimental findings in iPSC-CMs, provide mechanistic insights, and translate these findings to adult cardiomyocyte (CM) electrophysiology. However, different cell lines display different expression of ion channels, pumps and receptors, and show differences in electrophysiology. In this exploratory study, we use a mathematical model based on iPSC-CMs from Cellular Dynamics International (CDI, iCell), and compare its predictions to novel experimental recordings made with the Axiogenesis Cor.4U line. We show that tailoring this model to the specific cell line, even using limited data and a relatively simple approach, leads to improved predictions of baseline behavior and response to drugs. This demonstrates the need and the feasibility to tailor models to individual cell lines, although a more refined approach will be needed to characterize individual currents, address differences in ion current kinetics, and further improve these results.
Thomas, Reuben; Thomas, Russell S.; Auerbach, Scott S.; Portier, Christopher J.
2013-01-01
Background Several groups have employed genomic data from subchronic chemical toxicity studies in rodents (90 days) to derive gene-centric predictors of chronic toxicity and carcinogenicity. Genes are annotated to belong to biological processes or molecular pathways that are mechanistically well understood and are described in public databases. Objectives To develop a molecular pathway-based prediction model of long term hepatocarcinogenicity using 90-day gene expression data and to evaluate the performance of this model with respect to both intra-species, dose-dependent and cross-species predictions. Methods Genome-wide hepatic mRNA expression was retrospectively measured in B6C3F1 mice following subchronic exposure to twenty-six (26) chemicals (10 were positive, 2 equivocal and 14 negative for liver tumors) previously studied by the US National Toxicology Program. Using these data, a pathway-based predictor model for long-term liver cancer risk was derived using random forests. The prediction model was independently validated on test sets associated with liver cancer risk obtained from mice, rats and humans. Results Using 5-fold cross validation, the developed prediction model had reasonable predictive performance with the area under receiver-operator curve (AUC) equal to 0.66. The developed prediction model was then used to extrapolate the results to data associated with rat and human liver cancer. The extrapolated model worked well for both extrapolated species (AUC value of 0.74 for rats and 0.91 for humans). The prediction models implied a balanced interplay between all pathway responses leading to carcinogenicity predictions. Conclusions Pathway-based prediction models estimated from sub-chronic data hold promise for predicting long-term carcinogenicity and also for its ability to extrapolate results across multiple species. PMID:23737943
Thomas, Reuben; Thomas, Russell S; Auerbach, Scott S; Portier, Christopher J
2013-01-01
Several groups have employed genomic data from subchronic chemical toxicity studies in rodents (90 days) to derive gene-centric predictors of chronic toxicity and carcinogenicity. Genes are annotated to belong to biological processes or molecular pathways that are mechanistically well understood and are described in public databases. To develop a molecular pathway-based prediction model of long term hepatocarcinogenicity using 90-day gene expression data and to evaluate the performance of this model with respect to both intra-species, dose-dependent and cross-species predictions. Genome-wide hepatic mRNA expression was retrospectively measured in B6C3F1 mice following subchronic exposure to twenty-six (26) chemicals (10 were positive, 2 equivocal and 14 negative for liver tumors) previously studied by the US National Toxicology Program. Using these data, a pathway-based predictor model for long-term liver cancer risk was derived using random forests. The prediction model was independently validated on test sets associated with liver cancer risk obtained from mice, rats and humans. Using 5-fold cross validation, the developed prediction model had reasonable predictive performance with the area under receiver-operator curve (AUC) equal to 0.66. The developed prediction model was then used to extrapolate the results to data associated with rat and human liver cancer. The extrapolated model worked well for both extrapolated species (AUC value of 0.74 for rats and 0.91 for humans). The prediction models implied a balanced interplay between all pathway responses leading to carcinogenicity predictions. Pathway-based prediction models estimated from sub-chronic data hold promise for predicting long-term carcinogenicity and also for its ability to extrapolate results across multiple species.
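A hedged sketch of the evaluation strategy described in both versions of this abstract: a random forest trained on pathway-level features and scored by 5-fold cross-validated AUC. The feature matrix is a synthetic stand-in, and the equivocal chemicals are grouped with the negatives purely for illustration.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(26, 200))            # 26 chemicals x 200 pathway scores (synthetic)
    y = np.repeat([1, 0], [10, 16])           # 10 liver-tumor positives, 16 negatives/equivocal

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print("mean 5-fold cross-validated AUC:", auc.mean())

On purely random features the AUC hovers near 0.5; the reported 0.66 reflects genuine pathway signal.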
Does Mechanistic Thinking Improve Student Success in Organic Chemistry?
ERIC Educational Resources Information Center
Grove, Nathaniel P.; Cooper, Melanie M.; Cox, Elizabeth L.
2012-01-01
The use of the curved-arrow notation to depict electron flow during mechanistic processes is one of the most important representational conventions in the organic chemistry curriculum. Our previous research documented a disturbing trend: when asked to predict the products of a series of reactions, many students do not spontaneously engage in…
Mechanistic Lake Modeling to Understand and Predict Heterogeneous Responses to Climate Warming
NASA Astrophysics Data System (ADS)
Read, J. S.; Winslow, L. A.; Rose, K. C.; Hansen, G. J.
2016-12-01
Substantial warming has been documented for hundreds of globally distributed lakes, with likely impacts on ecosystem processes. Despite a clear pattern of widespread warming, thermal responses of individual lakes to climate change are often heterogeneous, with the warming rates of neighboring lakes varying across depths and among seasons. We aggregated temperature observations and parameterized mechanistic models for 9,000 lakes in the U.S. states of Minnesota, Wisconsin, and Michigan to examine broad-scale lake warming trends and among-lake diversity. Daily lake temperature profiles and ice-cover dynamics were simulated using the General Lake Model for the contemporary period (1979-2015) using drivers from the North American Land Data Assimilation System (NLDAS-2) and for contemporary and future periods (1980-2100) using downscaled data from six global circulation models driven by the Representative Concentration Pathway 8.5 scenario. For the contemporary period, modeled vs. observed summer mean surface temperatures had a root mean squared error of 0.98°C, with modeled warming trends similar to observed trends. Future simulations under the extreme 8.5 scenario predicted a median lake summer surface warming rate of 0.57°C/decade until mid-century, with slower rates in the latter half of the 21st century (0.35°C/decade). Modeling scenarios and analysis of field data suggest that the lake-specific properties of size, water clarity, and depth are strong controls on the sensitivity of lakes to climate change. For example, a simulated 1% annual decline in water clarity was sufficient to override the effects of climate warming on whole-lake water temperatures in some - but not all - study lakes. Understanding heterogeneous lake responses to climate variability can help identify lake-specific features that influence resilience to climate change.
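The decadal warming rates quoted above are simple linear trends in annual summer surface temperature; a sketch with synthetic numbers:

    import numpy as np

    years = np.arange(1979, 2016)
    rng = np.random.default_rng(2)
    temps = 22.0 + 0.034 * (years - 1979) + rng.normal(0, 0.5, years.size)  # synthetic summer means, deg C

    slope_per_year = np.polyfit(years, temps, 1)[0]
    print(f"warming trend: {slope_per_year * 10:.2f} deg C per decade")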
Robinson, Joshua F; Theunissen, Peter T; van Dartel, Dorien A M; Pennings, Jeroen L; Faustman, Elaine M; Piersma, Aldert H
2011-09-01
Toxicogenomic evaluations may improve toxicity prediction of in vitro-based developmental models, such as whole embryo culture (WEC) and embryonic stem cells (ESC), by providing a robust mechanistic marker which can be linked with responses associated with developmental toxicity in vivo. While promising in theory, toxicogenomic comparisons between in vivo and in vitro models are complex due to inherent differences in model characteristics and experimental design. Determining factors which influence these global comparisons are critical in the identification of reliable mechanistic-based markers of developmental toxicity. In this study, we compared available toxicogenomic data assessing the impact of the known teratogen, methylmercury (MeHg) across a diverse set of in vitro and in vivo models to investigate the impact of experimental variables (i.e. model, dose, time) on our comparative assessments. We evaluated common and unique aspects at both the functional (Gene Ontology) and gene level of MeHg-induced response. At the functional level, we observed stronger similarity in MeHg-response between mouse embryos exposed in utero (2 studies), ESC, and WEC as compared to liver, brain and mouse embryonic fibroblast MeHg studies. These findings were strongly correlated to the presence of a MeHg-induced developmentally related gene signature. In addition, we identified specific MeHg-induced gene expression alterations associated with developmental signaling and heart development across WEC, ESC and in vivo systems. However, the significance of overlap between studies was highly dependent on traditional experimental variables (i.e. dose, time). In summary, we identify promising examples of unique gene expression responses which show in vitro-in vivo similarities supporting the relevance of in vitro developmental models for predicting in vivo developmental toxicity. Copyright © 2011 Elsevier Inc. All rights reserved.
Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan
2016-01-01
Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.
Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M.; Young, Vincent B.; Jansson, Janet K.; Fredricks, David N.
2016-01-01
ABSTRACT Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites’ abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. IMPORTANCE Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism. PMID:27239563
Mathematical modeling of drug release from lipid dosage forms.
Siepmann, J; Siepmann, F
2011-10-10
Lipid dosage forms provide an interesting potential for controlled drug delivery. In contrast to frequently used poly(ester) based devices for parenteral administration, they do not lead to acidification upon degradation and potential drug inactivation, especially in the case of protein drugs and other acid-labile active agents. The aim of this article is to give an overview on the current state of the art of mathematical modeling of drug release from this type of advanced drug delivery systems. Empirical and semi-empirical models are described as well as mechanistic theories, considering diffusional mass transport, potentially limited drug solubility and the leaching of other, water-soluble excipients into the surrounding bulk fluid. Various practical examples are given, including lipid microparticles, beads and implants, which can successfully be used to control the release of an incorporated drug during periods ranging from a few hours up to several years. The great benefit of mechanistic mathematical theories is the possibility to quantitatively predict the effects of different formulation parameters and device dimensions on the resulting drug release kinetics. Thus, in silico simulations can significantly speed up product optimization. This is particularly useful if long release periods (e.g., several months) are targeted, since experimental trial-and-error studies are highly time-consuming in these cases. In the future it would be highly desirable to combine mechanistic theories with the quantitative description of the drug fate in vivo, ideally including the pharmacodynamic efficacy of the treatments. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Pedretti, Daniele; Bianchi, Marco
2018-03-01
Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models both for the empirical fitting of these curves and for the prediction of transport using upscaling models based on best-fitted parameters (e.g. the power law slope or exponent). The predictive capacity of power law based upscaling models can, however, be questioned due to the difficulty of linking model parameters with the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implications of statistical subsampling, which often renders power laws indistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulty of reconciling fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (1.5 < αPL < 4). The PL exponent tends to lower values as the tailing becomes heavier. Strong fluctuations occur when the number of samples is limited, due to the effects of subsampling. On the other hand, when the power law model embeds a cutoff (PLCO), the best-fitted exponent (αCO) is insensitive to the degree of tailing and to the effects of subsampling and tends to a constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated to the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple mechanistic upscaling model based on the PLCO formulation is able to predict the ensemble of BTCs from the stochastic transport simulations without the need for any fitted parameters. The model embeds the constant αCO = 1 and relies on a stratified description of the transport mechanisms to estimate λ. The PL fails to reproduce the ensemble of BTCs at late time, while the LOG model provides results as consistent as the PLCO model, however without a clear mechanistic link between physical properties and model parameters. It is concluded that, while all parametric models may work equally well (or equally poorly) for the empirical fitting of the experimental BTC tails due to the effects of subsampling, for predictive purposes this is not true. A careful selection of the proper heavily tailed model and corresponding parameters is required to ensure physically based transport predictions.
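For readers who want to reproduce the fitting comparison, the following is an illustrative sketch (synthetic tail, assumed functional forms) of estimating PL and PLCO parameters from a BTC tail with scipy; in practice the fit would be done on measured late-time concentrations.

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.logspace(0, 3, 60)                                  # late-time axis (synthetic)
    rng = np.random.default_rng(3)
    c_obs = 5.0 * t**-1.0 * np.exp(-0.004 * t) * rng.lognormal(0.0, 0.05, t.size)

    def pl(t, a, alpha):
        return a * t**(-alpha)                                 # pure power law

    def plco(t, a, alpha, lam):
        return a * t**(-alpha) * np.exp(-lam * t)              # power law with late-time cutoff

    p_pl, _ = curve_fit(pl, t, c_obs, p0=[1.0, 1.5])
    p_plco, _ = curve_fit(plco, t, c_obs, p0=[1.0, 1.0, 1e-3])
    print("PL exponent:", p_pl[1])
    print("PLCO exponent:", p_plco[1], " cutoff rate:", p_plco[2])

When a cutoff is present in the data, the PL exponent absorbs it and steepens, while the PLCO exponent stays close to 1 and the cutoff rate carries the tailing information, mirroring the behaviour reported above.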
Pragmatic perspective on aerobic scope: peaking, plummeting, pejus and apportioning.
Farrell, A P
2016-01-01
A major challenge for fish biologists in the 21st century is to predict the biotic effects of global climate change. With marked changes in biogeographic distribution already in evidence for a variety of aquatic animals, mechanistic explanations for these shifts are being sought, ones that then can be used as a foundation for predictive models of future climatic scenarios. One mechanistic explanation for the thermal performance of fishes that has gained some traction is the oxygen and capacity-limited thermal tolerance (OCLTT) hypothesis, which suggests that an aquatic organism's capacity to supply oxygen to tissues becomes limited when body temperature reaches extremes. Central to this hypothesis is an optimum temperature for absolute aerobic scope (AAS, loosely defined as the capacity to deliver oxygen to tissues beyond a basic need). On either side of this peak for AAS are pejus temperatures that define when AAS falls off and thereby reduces an animal's absolute capacity for activity. This article provides a brief perspective on the potential uses and limitations of some of the key physiological indicators related to aerobic scope in fishes. The intent is that practitioners who attempt predictive ecological applications can better recognize limitations and make better use of the OCLTT hypothesis and its underlying physiology. © 2015 The Fisheries Society of the British Isles.
Comparison of kinetic models for atom recombination on high-temperature reusable surface insulation
NASA Technical Reports Server (NTRS)
Willey, Ronald J.
1993-01-01
Five kinetic models are compared for their ability to predict recombination coefficients for oxygen and nitrogen atoms over high-temperature reusable surface insulation (HRSI). Four of the models are derived using Rideal-Eley or Langmuir-Hinshelwood catalytic mechanisms to describe the reaction sequence. The fifth model is an empirical expression that offers certain features unattainable through mechanistic description. The results showed that a four-parameter model, with temperature as the only variable, works best with data currently available. The model describes recombination coefficients for oxygen and nitrogen atoms for temperatures from 300 to 1800 K. Kinetic models, with atom concentrations, demonstrate the influence of atom concentration on recombination coefficients. These models can be used for the prediction of heating rates due to catalytic recombination during re-entry or aerobraking maneuvers. The work further demonstrates a requirement for more recombination experiments in the temperature ranges of 300-1000 K, and 1500-1850 K, with deliberate concentration variation to verify model requirements.
Bayesian model aggregation for ensemble-based estimates of protein pKa values
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gosink, Luke J.; Hogan, Emilie A.; Pulsipher, Trenton C.
2014-03-01
This paper investigates an ensemble-based technique called Bayesian Model Averaging (BMA) to improve the performance of protein amino acid pKa predictions. Structure-based pKa calculations play an important role in the mechanistic interpretation of protein structure and are also used to determine a wide range of protein properties. A diverse set of methods currently exists for pKa prediction, ranging from empirical statistical models to ab initio quantum mechanical approaches. However, each of these methods is based on a set of assumptions that have inherent bias and sensitivities that can affect a model's accuracy and generalizability for pKa prediction in complicated biomolecular systems. We use BMA to combine eleven diverse prediction methods that each estimate pKa values of amino acids in staphylococcal nuclease. These methods are based on work conducted for the pKa Cooperative, and the pKa measurements are based on experimental work conducted by the García-Moreno lab. Our study demonstrates that the aggregated estimate obtained from BMA outperforms all individual prediction methods in our cross-validation study, with improvements of 40-70% over other method classes. This work illustrates a new possible mechanism for improving the accuracy of pKa prediction and lays the foundation for future work on aggregate models that balance computational cost with prediction accuracy.
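A minimal sketch of the BMA aggregation idea: the ensemble estimate is a weighted average of the individual method predictions, with the weights interpreted as posterior model probabilities. The four methods, predictions and weights below are placeholders, not the eleven pKa Cooperative methods.

    import numpy as np

    predictions = np.array([4.1, 3.6, 4.8, 4.3])   # pKa predictions from four hypothetical methods
    weights = np.array([0.4, 0.1, 0.2, 0.3])       # BMA weights (sum to 1), e.g. estimated on training residues

    bma_estimate = float(np.sum(weights * predictions))
    between_model_var = float(np.sum(weights * (predictions - bma_estimate) ** 2))
    print(bma_estimate, between_model_var)

In a full BMA treatment the predictive variance also includes each method's within-model variance; only the between-model spread is shown here.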
Critical evaluation of mechanistic two-phase flow pipeline and well simulation models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dhulesia, H.; Lopez, D.
1996-12-31
Mechanistic steady state simulation models, rather than empirical correlations, are used for the design of multiphase production systems including wells, pipelines and downstream installations. Among the available models, PEPITE, WELLSIM, OLGA, TACITE and TUFFP are widely used for this purpose and consequently, a critical evaluation of these models is needed. An extensive validation methodology is proposed which consists of two distinct steps: first, to validate the hydrodynamic point model using test loop data and, second, to validate the overall simulation model using real pipeline and well data. The test loop databank used in this analysis contains about 5952 data sets originating from four different test loops, and a majority of these data were obtained at high pressures (up to 90 bars) with real hydrocarbon fluids. Before performing the model evaluation, physical analysis of the test loop data is required to eliminate non-coherent data. The evaluation of these point models demonstrates that the TACITE and OLGA models can be applied to any configuration of pipes. The TACITE model performs better than the OLGA model because it uses the most appropriate closure laws from the literature, validated on a large number of data. The comparison of predicted and measured pressure drop for various real pipelines and wells demonstrates that the TACITE model is a reliable tool.
A white-box model of S-shaped and double S-shaped single-species population growth
Kalmykov, Lev V.
2015-01-01
Complex systems may be mechanistically modelled by white-box modelling using logical deterministic individual-based cellular automata. Mathematical models of complex systems are of three types: black-box (phenomenological), white-box (mechanistic, based on first principles) and grey-box (mixtures of phenomenological and mechanistic models). Most basic ecological models are of the black-box type, including the Malthusian, Verhulst, and Lotka–Volterra models. In black-box models, the individual-based (mechanistic) mechanisms of population dynamics remain hidden. Here we mechanistically model the S-shaped and double S-shaped population growth of vegetatively propagated rhizomatous lawn grasses. Using purely logical deterministic individual-based cellular automata, we create a white-box model. From a general physical standpoint, the vegetative propagation of plants is an analogue of excitation propagation in excitable media. Using the Monte Carlo method, we investigate the role of different initial positionings of an individual in the habitat. We have investigated mechanisms of single-species population growth limited by habitat size, intraspecific competition, regeneration time and fecundity of individuals under two types of boundary conditions and two levels of fecundity. In addition, we compared the S-shaped and J-shaped population growth. We consider this white-box modelling approach to be a method of artificial intelligence which works as automatic hyper-logical inference from the first principles of the studied subject. This approach is promising for direct mechanistic insights into the nature of any complex system. PMID:26038717
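A toy version of the white-box idea (an assumption-laden sketch, not the published automaton): a deterministic cellular automaton on a square lattice in which an occupied cell colonises empty von Neumann neighbours once it is older than a regeneration delay. The grid size, delay and single-founder seeding are illustrative choices; the population curve is S-shaped because growth is eventually limited by habitat size.

    import numpy as np

    N, STEPS, DELAY = 50, 60, 2
    age = np.full((N, N), -1, dtype=int)     # -1 = empty cell; otherwise time since colonisation
    age[N // 2, N // 2] = DELAY              # single founder individual at the habitat centre
    counts = []

    for _ in range(STEPS):
        mature = age >= DELAY                # individuals old enough to propagate vegetatively
        neigh = np.zeros_like(mature)
        neigh[1:, :]  |= mature[:-1, :]
        neigh[:-1, :] |= mature[1:, :]
        neigh[:, 1:]  |= mature[:, :-1]
        neigh[:, :-1] |= mature[:, 1:]
        newly = (age < 0) & neigh            # empty cells adjacent to a mature individual
        age[age >= 0] += 1
        age[newly] = 0
        counts.append(int((age >= 0).sum()))

    print(counts[::10])                      # population size rises along an S-shaped curve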
Predicted impacts of climate change on malaria transmission in West Africa
NASA Astrophysics Data System (ADS)
Yamana, T. K.; Eltahir, E. A. B.
2014-12-01
Increases in temperature and changes in precipitation due to climate change are expected to alter the spatial distribution of malaria transmission. This is especially true in West Africa, where malaria prevalence follows the current north-south gradients in temperature and precipitation. We assess the skill of GCMs at simulating past and present climate in West Africa in order to select the most credible climate predictions for the periods 2030-2060 and 2070-2100. We then use the Hydrology, Entomology and Malaria Transmission Simulator (HYDREMATS), a mechanistic model of malaria transmission, to translate the predicted changes in climate into predicted changes in the availability of mosquito breeding sites, mosquito populations, and malaria prevalence. We investigate the role of acquired immunity in determining a population's response to changes in exposure to the malaria parasite.
Multi-model comparison highlights consistency in predicted effect of warming on a semi-arid shrub
Renwick, Katherine M.; Curtis, Caroline; Kleinhesselink, Andrew R.; Schlaepfer, Daniel R.; Bradley, Bethany A.; Aldridge, Cameron L.; Poulter, Benjamin; Adler, Peter B.
2018-01-01
A number of modeling approaches have been developed to predict the impacts of climate change on species distributions, performance, and abundance. The stronger the agreement from models that represent different processes and are based on distinct and independent sources of information, the greater the confidence we can have in their predictions. Evaluating the level of confidence is particularly important when predictions are used to guide conservation or restoration decisions. We used a multi-model approach to predict climate change impacts on big sagebrush (Artemisia tridentata), the dominant plant species on roughly 43 million hectares in the western United States and a key resource for many endemic wildlife species. To evaluate the climate sensitivity of A. tridentata, we developed four predictive models, two based on empirically derived spatial and temporal relationships, and two that applied mechanistic approaches to simulate sagebrush recruitment and growth. This approach enabled us to produce an aggregate index of climate change vulnerability and uncertainty based on the level of agreement between models. Despite large differences in model structure, predictions of sagebrush response to climate change were largely consistent. Performance, as measured by change in cover, growth, or recruitment, was predicted to decrease at the warmest sites, but increase throughout the cooler portions of sagebrush's range. A sensitivity analysis indicated that sagebrush performance responds more strongly to changes in temperature than precipitation. Most of the uncertainty in model predictions reflected variation among the ecological models, raising questions about the reliability of forecasts based on a single modeling approach. Our results highlight the value of a multi-model approach in forecasting climate change impacts and uncertainties and should help land managers to maximize the value of conservation investments.
Balancing the stochastic description of uncertainties as a function of hydrologic model complexity
NASA Astrophysics Data System (ADS)
Del Giudice, D.; Reichert, P.; Albert, C.; Kalcic, M.; Logsdon Muenich, R.; Scavia, D.; Bosch, N. S.; Michalak, A. M.
2016-12-01
Uncertainty analysis is becoming an important component of forecasting water and pollutant fluxes in urban and rural environments. Properly accounting for errors in the modeling process can help to robustly assess the uncertainties associated with the inputs (e.g. precipitation) and outputs (e.g. runoff) of hydrological models. In recent years we have investigated several Bayesian methods to infer the parameters of a mechanistic hydrological model along with those of the stochastic error component. The latter describes the uncertainties of model outputs and possibly inputs. We have adapted our framework to a variety of applications, ranging from predicting floods in small stormwater systems to nutrient loads in large agricultural watersheds. Given practical constraints, we discuss how in general the number of quantities to infer probabilistically varies inversely with the complexity of the mechanistic model. Most often, when evaluating a hydrological model of intermediate complexity, we can infer the parameters of the model as well as of the output error model. Describing the output errors as a first order autoregressive process can realistically capture the "downstream" effect of inaccurate inputs and structure. With simpler runoff models we can additionally quantify input uncertainty by using a stochastic rainfall process. For complex hydrologic transport models, instead, we show that keeping model parameters fixed and just estimating time-dependent output uncertainties could be a viable option. The common goal across all these applications is to create time-dependent prediction intervals which are both reliable (cover the nominal amount of validation data) and precise (are as narrow as possible). In conclusion, we recommend focusing both on the choice of the hydrological model and of the probabilistic error description. The latter can include output uncertainty only, if the model is computationally-expensive, or, with simpler models, it can separately account for different sources of errors like in the inputs and the structure of the model.
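The first-order autoregressive output-error description mentioned above can be written in a few lines; the hydrograph, AR(1) coefficient and innovation standard deviation below are synthetic assumptions, used only to show how the error model widens prediction intervals and how their coverage is checked.

    import numpy as np

    rng = np.random.default_rng(4)
    T = 200
    simulated = 5.0 + np.sin(np.linspace(0.0, 8.0, T))        # mechanistic model output (synthetic)
    phi, sigma_eps = 0.8, 0.3                                 # AR(1) coefficient, innovation std. dev.

    eta = np.zeros(T)                                         # eta_t = phi * eta_{t-1} + eps_t
    for t in range(1, T):
        eta[t] = phi * eta[t - 1] + rng.normal(0.0, sigma_eps)
    observed = simulated + eta                                # synthetic "observations"

    sigma_stat = sigma_eps / np.sqrt(1.0 - phi**2)            # stationary std. dev. of the AR(1) error
    lower, upper = simulated - 1.96 * sigma_stat, simulated + 1.96 * sigma_stat
    coverage = np.mean((observed >= lower) & (observed <= upper))
    print(f"empirical coverage of the nominal 95% interval: {coverage:.2f}")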
Tokunaga, Taisuke; Yatabe, Takeshi; Matsumoto, Takahiro; Ando, Tatsuya; Yoon, Ki-Seok; Ogo, Seiji
2017-01-01
We report the mechanistic investigation of catalytic H2 evolution from formic acid in water using a formate-bridged dinuclear Ru complex as a formate hydrogen lyase model. The mechanistic study is based on isotope-labeling experiments involving hydrogen isotope exchange reaction.
Building the bridge between animal movement and population dynamics.
Morales, Juan M; Moorcroft, Paul R; Matthiopoulos, Jason; Frair, Jacqueline L; Kie, John G; Powell, Roger A; Merrill, Evelyn H; Haydon, Daniel T
2010-07-27
While the mechanistic links between animal movement and population dynamics are ecologically obvious, it is much less clear when knowledge of animal movement is a prerequisite for understanding and predicting population dynamics. GPS and other technologies enable detailed tracking of animal location concurrently with acquisition of landscape data and information on individual physiology. These tools can be used to refine our understanding of the mechanistic links between behaviour and individual condition through 'spatially informed' movement models where time allocation to different behaviours affects individual survival and reproduction. For some species, socially informed models that address the movements and average fitness of differently sized groups and how they are affected by fission-fusion processes at relevant temporal scales are required. Furthermore, as most animals revisit some places and avoid others based on their previous experiences, we foresee the incorporation of long-term memory and intention in movement models. The way animals move has important consequences for the degree of mixing that we expect to find both within a population and between individuals of different species. The mixing rate dictates the level of detail required by models to capture the influence of heterogeneity and the dynamics of intra- and interspecific interaction.
Evaluation of the energy efficiency of enzyme fermentation by mechanistic modeling.
Albaek, Mads O; Gernaey, Krist V; Hansen, Morten S; Stocks, Stuart M
2012-04-01
Modeling biotechnological processes is key to obtaining increased productivity and efficiency. Particularly crucial to successful modeling of such systems is the coupling of the physical transport phenomena and the biological activity in one model. We have applied a model for the expression of cellulosic enzymes by the filamentous fungus Trichoderma reesei and found excellent agreement with experimental data. The most influential factor was demonstrated to be viscosity and its influence on mass transfer. Not surprisingly, the biological model is also shown to have high influence on the model prediction. At different rates of agitation and aeration as well as headspace pressure, we can predict the energy efficiency of oxygen transfer, a key process parameter for economical production of industrial enzymes. An inverse relationship between the productivity and energy efficiency of the process was found. This modeling approach can be used by manufacturers to evaluate the enzyme fermentation process for a range of different process conditions with regard to energy efficiency. Copyright © 2011 Wiley Periodicals, Inc.
Pseudomonas aeruginosa dose response and bathing water infection.
Roser, D J; van den Akker, B; Boase, S; Haas, C N; Ashbolt, N J; Rice, S A
2014-03-01
Pseudomonas aeruginosa is the opportunistic pathogen mostly implicated in folliculitis and acute otitis externa in pools and hot tubs. Nevertheless, infection risks remain poorly quantified. This paper reviews disease aetiologies and bacterial skin colonization science to advance dose-response theory development. Three model forms are identified for predicting disease likelihood from pathogen density. Two are based on Furumoto & Mickey's exponential 'single-hit' model and predict infection likelihood and severity (lesions/m2), respectively. 'Third-generation', mechanistic, dose-response algorithm development is additionally scoped. The proposed formulation integrates dispersion, epidermal interaction, and follicle invasion. The review also details uncertainties needing consideration which pertain to water quality, outbreaks, exposure time, infection sites, biofilms, cerumen, environmental factors (e.g. skin saturation, hydrodynamics), and whether P. aeruginosa is endogenous or exogenous. The review's findings are used to propose a conceptual infection model and identify research priorities including pool dose-response modelling, epidermis ecology and infection likelihood-based hygiene management.
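The exponential 'single-hit' form referenced above has a one-line implementation; the rate constant and doses here are purely illustrative, not fitted P. aeruginosa parameters.

    import numpy as np

    def p_infection(dose, k):
        """Exponential single-hit dose-response: probability of infection at a given dose."""
        return 1.0 - np.exp(-k * dose)

    doses = np.array([1e2, 1e3, 1e4, 1e5])   # hypothetical exposure doses (organisms)
    print(p_infection(doses, k=1e-4))

The second model form in the review predicts lesion density rather than infection probability, and the proposed third-generation formulation would replace the single rate constant with mechanistic terms for dispersion, epidermal interaction and follicle invasion.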
Damage evolution and mechanical response of cross-ply ceramic composite laminates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weitsman, Y.; Yu, N.; Zhu, H.
1995-12-31
A mechanistic model for the damage evolution and mechanical response of cross-ply ceramic composite laminates under monotonically increasing uniaxial tension is presented. The model accounts for a variety of damage mechanisms evolving in cross-ply ceramic composite laminates, such as fiber-bridged matrix cracks in 0°-plies, transversely oriented matrix cracks in 90°-plies, and slips at 0°/90° ply interfaces as well as at the fiber/matrix interfaces. Energy criteria are developed to determine the creation and progression of matrix cracks and slip zones. The model predicts that the crack density in 0°-plies becomes higher than that within the 90°-plies as the applied load is incrementally increased, which agrees with the experimental observation. It is also shown that the model provides a reasonable prediction for the nonlinear stress-strain behavior of cross-ply SiC/CAS ceramic composites.
In praise of mechanistically-rich models
DeAngelis, Donald L.; Mooij, Wolf M.; Canham, Charles D.; Cole, Jonathan J.; Lauenroth, William K.
2003-01-01
The book opens with an overview of the status and role of modeling in ecosystem science, including perspectives on the long-running debate over the appropriate level of complexity in models. This is followed by eight chapters that address the critical issue of evaluating ecosystem models, including methods of addressing uncertainty. Next come several case studies of the role of models in environmental policy and management. A section on the future of modeling in ecosystem science focuses on increasing the use of modeling in undergraduate education and the modeling skills of professionals within the field. The benefits and limitations of predictive (versus observational) models are also considered in detail. Written by stellar contributors, this book grants access to the state of the art and science of ecosystem modeling.
NASA Astrophysics Data System (ADS)
Chen, Cheng; Song, Pengfei; Meng, Fanchao; Li, Xiao; Liu, Xinyu; Song, Jun
2017-12-01
The present work presents a quantitative modeling framework for investigating the self-rolling of nanomembranes under different lattice mismatch strain anisotropy. The effect of transverse mismatch strain on the roll-up direction and curvature has been systematically studied employing both analytical modeling and numerical simulations. The bidirectional nature of the self-rolling of nanomembranes and the critical role of transverse strain in affecting the rolling behaviors have been demonstrated. Two fabrication strategies, i.e., third-layer deposition and corner geometry engineering, have been proposed to predictively manipulate the bidirectional rolling competition of strained nanomembranes, so as to achieve controlled, unidirectional roll-up. In particular for the strategy of corner engineering, microfabrication experiments have been performed to showcase its practical application and effectiveness. Our study offers new mechanistic knowledge towards understanding and predictive engineering of self-rolling of nanomembranes with improved roll-up yield.
A mechanistic model of small intestinal starch digestion and glucose uptake in the cow.
Mills, J A N; France, J; Ellis, J L; Crompton, L A; Bannink, A; Hanigan, M D; Dijkstra, J
2017-06-01
The high contribution of postruminal starch digestion (up to 50%) to total-tract starch digestion on energy-dense, starch-rich diets demands that limitations to small intestinal starch digestion be identified. A mechanistic model of the small intestine was described and evaluated with regard to its ability to simulate observations from abomasal carbohydrate infusions in the dairy cow. The 7 state variables represent starch, oligosaccharide, glucose, and pancreatic amylase in the intestinal lumen, oligosaccharide and glucose in the unstirred water layer at the intestinal wall, and intracellular glucose of the enterocyte. Enzymatic hydrolysis of starch was modeled as a 2-stage process involving the activity of pancreatic amylase in the lumen and of oligosaccharidase at the brush border of the enterocyte confined within the unstirred water layer. The Na+-dependent glucose transport into the enterocyte was represented along with a facilitative glucose transporter 2 transport system on the basolateral membrane. The small intestine is subdivided into 3 main sections, representing the duodenum, jejunum, and ileum for parameterization. Further subsections are defined between which continual digesta flow is represented. The model predicted nonstructural carbohydrate disappearance in the small intestine for cattle unadapted to duodenal infusion with a coefficient of determination of 0.92 and a root mean square prediction error of 25.4%. Simulation of glucose disappearance for mature Holstein heifers adapted to various levels of duodenal glucose infusion yielded a coefficient of determination of 0.81 and a root mean square prediction error of 38.6%. Analysis of model behavior identified limitations to the efficiency of small intestinal starch digestion with high levels of duodenal starch flow. Limitations to individual processes, particularly starch digestion in the proximal section of the intestine, can create asynchrony between starch hydrolysis and glucose uptake capacity. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Crops in silico: A community wide multi-scale computational modeling framework of plant canopies
NASA Astrophysics Data System (ADS)
Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.
2016-12-01
Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.
Dynamic model predicting overweight, obesity, and extreme obesity prevalence trends.
Thomas, Diana M; Weedermann, Marion; Fuemmeler, Bernard F; Martin, Corby K; Dhurandhar, Nikhil V; Bredlau, Carl; Heymsfield, Steven B; Ravussin, Eric; Bouchard, Claude
2014-02-01
Obesity prevalence in the United States appears to be leveling, but the reasons behind the plateau remain unknown. Mechanistic insights can be provided from a mathematical model. The objective of this study is to model known multiple population parameters associated with changes in body mass index (BMI) classes and to establish conditions under which obesity prevalence will plateau. A differential equation system was developed that predicts population-wide obesity prevalence trends. The model considers both social and nonsocial influences on weight gain, incorporates other known parameters affecting obesity trends, and allows for country specific population growth. The dynamic model predicts that: obesity prevalence is a function of birthrate and the probability of being born in an obesogenic environment; obesity prevalence will plateau independent of current prevention strategies; and the US prevalence of overweight, obesity, and extreme obesity will plateau by about 2030 at 28%, 32%, and 9% respectively. The US prevalence of obesity is stabilizing and will plateau, independent of current preventative strategies. This trend has important implications in accurately evaluating the impact of various anti-obesity strategies aimed at reducing obesity prevalence. Copyright © 2013 The Obesity Society.
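A minimal sketch of a BMI-class compartment model in the spirit of the abstract, written as an ODE system in Python; all transition rates, the obesogenic birth fraction, and the initial class fractions are assumed values for illustration, not the paper's calibrated parameters.

import numpy as np
from scipy.integrate import solve_ivp

# Compartments: normal (N), overweight (W), obese (O), extremely obese (X).
# Rates are per year; all values below are assumed, not the paper's.
k_nw, k_wo, k_ox = 0.03, 0.02, 0.005        # upward class transitions
r_wn, r_ow, r_xo = 0.01, 0.005, 0.002       # downward (remission) transitions
birth, death = 0.013, 0.009                 # crude birth/death rates
p_obesogenic = 0.3                          # fraction of births into an obesogenic track (assumed)

def rhs(t, y):
    N, W, O, X = y
    total = N + W + O + X
    dN = birth * total * (1 - p_obesogenic) - k_nw * N + r_wn * W - death * N
    dW = birth * total * p_obesogenic + k_nw * N - (k_wo + r_wn) * W + r_ow * O - death * W
    dO = k_wo * W - (k_ox + r_ow) * O + r_xo * X - death * O
    dX = k_ox * O - r_xo * X - death * X
    return [dN, dW, dO, dX]

sol = solve_ivp(rhs, (0, 60), [0.40, 0.33, 0.23, 0.04], t_eval=np.linspace(0, 60, 7))
prevalence = sol.y / sol.y.sum(axis=0)      # class fractions through time
print(np.round(prevalence[:, -1], 3))       # fractions after 60 years (plateau behavior)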
Pappu, J Sharon Mano; Gummadi, Sathyanarayana N
2016-11-01
This study examines the use of unstructured kinetic model and artificial neural networks as predictive tools for xylitol production by Debaryomyces nepalensis NCYC 3413 in bioreactor. An unstructured kinetic model was proposed in order to assess the influence of pH (4, 5 and 6), temperature (25°C, 30°C and 35°C) and volumetric oxygen transfer coefficient kLa (0.14h(-1), 0.28h(-1) and 0.56h(-1)) on growth and xylitol production. A feed-forward back-propagation artificial neural network (ANN) has been developed to investigate the effect of process condition on xylitol production. ANN configuration of 6-10-3 layers was selected and trained with 339 experimental data points from bioreactor studies. Results showed that simulation and prediction accuracy of ANN was apparently higher when compared to unstructured mechanistic model under varying operational conditions. ANN was found to be an efficient data-driven tool to predict the optimal harvest time in xylitol production. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward
2018-04-01
A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
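A minimal reservoir-computing (echo state network) sketch in Python, trained here on a toy chaotic series; in the hybrid scheme described above, the imperfect knowledge-based model's forecast would additionally be fed to the reservoir and/or blended into the output layer, which is omitted here. Reservoir size, spectral radius, leak rate and ridge penalty are assumed values.

import numpy as np

rng = np.random.default_rng(1)
n_res, leak, rho, ridge = 300, 1.0, 0.9, 1e-6
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.normal(0, 1, (n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))       # scale to the desired spectral radius

def run_reservoir(u_seq, r0=None):
    # Drive the reservoir with a scalar input sequence and collect its states.
    r = np.zeros(n_res) if r0 is None else r0
    states = []
    for u in u_seq:
        r = (1 - leak) * r + leak * np.tanh(W @ r + W_in @ np.atleast_1d(u))
        states.append(r.copy())
    return np.array(states), r

# Toy data: one-step-ahead prediction of a chaotic logistic map.
x = np.empty(2000); x[0] = 0.4
for t in range(1999):
    x[t + 1] = 3.9 * x[t] * (1 - x[t])

R, r_last = run_reservoir(x[:1500])
W_out = np.linalg.solve(R.T @ R + ridge * np.eye(n_res), R.T @ x[1:1501])  # ridge readout

# Free-running forecast from the end of training.
pred, u, r = [], x[1500], r_last
for _ in range(20):
    r = (1 - leak) * r + leak * np.tanh(W @ r + W_in @ np.atleast_1d(u))
    u = r @ W_out
    pred.append(u)
print(np.round(pred[:5], 3), np.round(x[1501:1506], 3))   # forecast vs truth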
A TCP model for external beam treatment of intermediate-risk prostate cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walsh, Sean; Putten, Wil van der
2013-03-15
Purpose: Biological models offer the ability to predict clinical outcomes. The authors describe a model to predict the clinical response of intermediate-risk prostate cancer to external beam radiotherapy for a variety of fractionation regimes. Methods: A fully heterogeneous population averaged tumor control probability model was fit to clinical outcome data for hyper, standard, and hypofractionated treatments. The tumor control probability model was then employed to predict the clinical outcome of extreme hypofractionation regimes, as utilized in stereotactic body radiotherapy. Results: The tumor control probability model achieves an excellent level of fit, an R² value of 0.93 and a root mean squared error of 1.31%, to the clinical outcome data for hyper, standard, and hypofractionated treatments using realistic values for biological input parameters. Residuals ≤ 1.0% are produced by the tumor control probability model when compared to clinical outcome data for stereotactic body radiotherapy. Conclusions: The authors conclude that this tumor control probability model, used with the optimized radiosensitivity values obtained from the fit, is an appropriate mechanistic model for the analysis and evaluation of external beam RT plans with regard to tumor control for these clinical conditions.
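A minimal sketch of a population-averaged Poisson tumor control probability with linear-quadratic cell survival, the generic form underlying models of this kind; the radiosensitivity distribution, α/β ratio and clonogen number below are illustrative assumptions, not the authors' fitted values.

import numpy as np

def tcp_poisson_lq(D_total, n_fractions, alpha, beta, n_clonogens):
    # Poisson TCP for a uniform tumor with LQ cell survival per fraction.
    d = D_total / n_fractions
    surviving = n_clonogens * np.exp(-(alpha * D_total + beta * d * D_total))
    return np.exp(-surviving)

def tcp_population(D_total, n_fractions, alpha_mean, alpha_sd, ab_ratio,
                   n_clonogens, n_samples=10000, seed=0):
    # Population averaging: alpha varies between patients (truncated normal).
    rng = np.random.default_rng(seed)
    alphas = np.clip(rng.normal(alpha_mean, alpha_sd, n_samples), 1e-6, None)
    betas = alphas / ab_ratio
    tcps = [tcp_poisson_lq(D_total, n_fractions, a, b, n_clonogens)
            for a, b in zip(alphas, betas)]
    return float(np.mean(tcps))

# Example: 78 Gy in 39 fractions vs a hypofractionated 36.25 Gy in 5 fractions
# (all biological parameters below are assumed, illustrative values).
for D, n in [(78.0, 39), (36.25, 5)]:
    print(D, n, round(tcp_population(D, n, alpha_mean=0.15, alpha_sd=0.04,
                                     ab_ratio=1.5, n_clonogens=1e6), 3))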
NASA Technical Reports Server (NTRS)
Hoehler, Tori M.
2010-01-01
The remarkable challenges and possibilities of the coming few decades will compel the biogeochemical and astrobiological sciences to characterize the interactions between biology and its environment in a fundamental, mechanistic, and quantitative fashion. The clear need for integrative and scalable biology-environment models is exemplified in the Earth sciences by the challenge of effectively addressing anthropogenic global change, and in the space sciences by the challenge of mounting a well-constrained yet sufficiently adaptive and inclusive search for life beyond Earth. Our understanding of the life-planet interaction is still, however, largely empirical. A variety of approaches seek to move from empirical to mechanistic descriptions. One approach focuses on the relationship between biology and energy, which is at once universal (all life requires energy), unique (life manages energy flow in a fashion not seen in abiotic systems), and amenable to characterization and quantification in thermodynamic terms. Simultaneously, a focus on energy flow addresses a critical point of interface between life and its geological, chemical, and physical environment. Characterizing and quantifying this relationship for life on Earth will support the development of integrative and predictive models for biology-environment dynamics. Understanding this relationship at its most fundamental level holds potential for developing concepts of habitability and biosignatures that can optimize astrobiological exploration strategies and are extensible to all life.
Eĭdel'man, Iu A; Slanina, S V; Sal'nikov, I V; Andreev, S G
2012-12-01
The knowledge of radiation-induced chromosomal aberration (CA) mechanisms is required in many fields of radiation genetics, radiation biology, biodosimetry, etc. However, these mechanisms are yet to be quantitatively characterised. One of the reasons is that the relationships between primary lesions of DNA/chromatin/chromosomes and dose-response curves for CA are unknown, because the pathways of lesion interactions in an interphase nucleus are currently inaccessible for direct experimental observation. This article presents a comparative analysis of two principally different scenarios of formation of simple and complex interchromosomal exchange aberrations: by lesion interactions at the surface of chromosome territories vs. in the whole space of the nucleus. The analysis was based on quantitative mechanistic modelling of the different levels of structures and processes involved in CA formation: chromosome structure in an interphase nucleus, and the induction, repair and interactions of DNA lesions. It was shown that the restricted diffusion of chromosomal loci, predicted by computational modelling of chromosome organization, makes lesion interactions in the whole space of the nucleus impossible. At the same time, the predicted features of subchromosomal dynamics agree well with in vivo observations and do not contradict the mechanism of CA formation at the surface of chromosome territories. On the other hand, the "surface mechanism" of CA formation, despite certain merits, proved to be insufficient to explain the high frequency of complex exchange aberrations observed by the mFISH technique. The alternative mechanism, CA formation at nuclear centres, is expected to be sufficient to explain frequent complex exchanges.
Computational substrates of social value in interpersonal collaboration.
Fareri, Dominic S; Chang, Luke J; Delgado, Mauricio R
2015-05-27
Decisions to engage in collaborative interactions require enduring considerable risk, yet provide the foundation for building and maintaining relationships. Here, we investigate the mechanisms underlying this process and test a computational model of social value to predict collaborative decision making. Twenty-six participants played an iterated trust game and chose to invest more frequently with their friends compared with a confederate or computer despite equal reinforcement rates. This behavior was predicted by our model, which posits that people receive a social value reward signal from reciprocation of collaborative decisions conditional on the closeness of the relationship. This social value signal was associated with increased activity in the ventral striatum and medial prefrontal cortex, which significantly predicted the reward parameters from the social value model. Therefore, we demonstrate that the computation of social value drives collaborative behavior in repeated interactions and provide a mechanistic account of reward circuit function instantiating this process. Copyright © 2015 the authors 0270-6474/15/358170-11$15.00/0.
Predicting Insulin Absorption and Glucose Uptake during Exercise in Type 1 Diabetes
NASA Astrophysics Data System (ADS)
Frank, Spencer; Hinshaw, Ling; Basu, Rita; Szeri, Andrew; Basu, Ananda
2017-11-01
A dose of insulin infused into subcutaneous tissue has been shown to absorb more quickly during exercise, potentially causing hypoglycemia in persons with type 1 diabetes. We develop a model that relates exercise-induced physiological changes to enhanced insulin-absorption (k) and glucose uptake (GU). Drawing on concepts of the microcirculation we derive a relationship that reveals that k and GU are mainly determined by two physiological parameters that characterize the tissue: the tissue perfusion rate (Q) and the capillary permeability surface area (PS). Independently measured values of Q and PS from the literature are used in the model to make predictions of k and GU. We compare these predictions to experimental observations of healthy and diabetic patients that are given a meal followed by rest or exercise. The experiments show that during exercise insulin concentrations significantly increase and that glucose levels fall rapidly. The model predictions are consistent with the experiments and show that increases in Q and PS directly increase k and GU. This mechanistic understanding provides a basis for handling exercise in control algorithms for an artificial pancreas.
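A minimal sketch of the standard microcirculation relationship consistent with the abstract's framing (the Renkin-Crone expression), showing how perfusion Q and permeability-surface area PS jointly set a local extraction fraction and clearance; the rest and exercise values below are assumed for illustration only.

import numpy as np

def extraction_fraction(PS, Q):
    # Renkin-Crone relation: fraction of solute extracted in one capillary pass.
    return 1.0 - np.exp(-PS / Q)

def tissue_clearance(PS, Q):
    # Local clearance (same units as Q): Q * E, rising with both perfusion and PS.
    return Q * extraction_fraction(PS, Q)

# Assumed rest vs exercise values for subcutaneous tissue (mL/min/100 g); illustrative only.
for label, Q, PS in [("rest", 3.0, 4.0), ("exercise", 9.0, 8.0)]:
    print(label, round(tissue_clearance(PS, Q), 2))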
Zylstra, Philip; Bradstock, Ross A; Bedward, Michael; Penman, Trent D; Doherty, Michael D; Weber, Rodney O; Gill, A Malcolm; Cary, Geoffrey J
2016-01-01
The influence of plant traits on forest fire behaviour has evolutionary, ecological and management implications, but is poorly understood and frequently discounted. We use a process model to quantify that influence and provide validation in a diverse range of eucalypt forests burnt under varying conditions. Measured height of consumption was compared to heights predicted using a surface fuel fire behaviour model, then key aspects of our model were sequentially added to this with and without species-specific information. Our fully specified model had a mean absolute error 3.8 times smaller than the otherwise identical surface fuel model (p < 0.01), and correctly predicted the height of larger (≥1 m) flames 12 times more often (p < 0.001). We conclude that the primary endogenous drivers of fire severity are the species of plants present rather than the surface fuel load, and demonstrate the accuracy and versatility of the model for quantifying this.
Reconciled rat and human metabolic networks for comparative toxicogenomics and biomarker predictions
Blais, Edik M.; Rawls, Kristopher D.; Dougherty, Bonnie V.; Li, Zhuo I.; Kolling, Glynis L.; Ye, Ping; Wallqvist, Anders; Papin, Jason A.
2017-01-01
The laboratory rat has been used as a surrogate to study human biology for more than a century. Here we present the first genome-scale network reconstruction of Rattus norvegicus metabolism, iRno, and a significantly improved reconstruction of human metabolism, iHsa. These curated models comprehensively capture metabolic features known to distinguish rats from humans including vitamin C and bile acid synthesis pathways. After reconciling network differences between iRno and iHsa, we integrate toxicogenomics data from rat and human hepatocytes, to generate biomarker predictions in response to 76 drugs. We validate comparative predictions for xanthine derivatives with new experimental data and literature-based evidence delineating metabolite biomarkers unique to humans. Our results provide mechanistic insights into species-specific metabolism and facilitate the selection of biomarkers consistent with rat and human biology. These models can serve as powerful computational platforms for contextualizing experimental data and making functional predictions for clinical and basic science applications. PMID:28176778
Mishra, H; Polak, S; Jamei, M; Rostami-Hodjegan, A
2014-01-01
We aimed to investigate the application of combined mechanistic pharmacokinetic (PK) and pharmacodynamic (PD) modeling and simulation in predicting the domperidone (DOM) triggered pseudo-electrocardiogram modification in the presence of a CYP3A inhibitor, ketoconazole (KETO), using in vitro–in vivo extrapolation. In vitro metabolic and inhibitory data were incorporated into physiologically based pharmacokinetic (PBPK) models within Simcyp to simulate time course of plasma DOM and KETO concentrations when administered alone or in combination with KETO (DOM+KETO). Simulated DOM concentrations in plasma were used to predict changes in gender-specific QTcF (Fridericia correction) intervals within the Cardiac Safety Simulator platform taking into consideration DOM, KETO, and DOM+KETO triggered inhibition of multiple ionic currents in population. Combination of in vitro–in vivo extrapolation, PBPK, and systems pharmacology of electric currents in the heart was able to predict the direction and magnitude of PK and PD changes under coadministration of the two drugs although some disparities were detected. PMID:25116274
Structure-activity relationships for skin sensitization: recent improvements to Derek for Windows.
Langton, Kate; Patlewicz, Grace Y; Long, Anthony; Marchant, Carol A; Basketter, David A
2006-12-01
Derek for Windows (DfW) is a knowledge-based expert system that predicts the toxicity of a chemical from its structure. Its predictions are based in part on alerts that describe structural features or toxicophores associated with toxicity. Recently, improvements have been made to skin sensitization alerts within the DfW knowledge base in collaboration with Unilever. These include modifications to the alerts describing the skin sensitization potential of aldehydes, 1,2-diketones, and isothiazolinones and consist of enhancements to the toxicophore definition, the mechanistic classification, and the extent of supporting evidence provided. The outcomes from this collaboration demonstrate the importance of updating and refining computer models for the prediction of skin sensitization as new information from experimental and theoretical studies becomes available.
NASA Astrophysics Data System (ADS)
Robin, C.; Gérard, M.; Quinaud, M.; d'Arbigny, J.; Bultel, Y.
2016-09-01
The prediction of Proton Exchange Membrane Fuel Cell (PEMFC) lifetime is one of the major challenges to optimize both material properties and dynamic control of the fuel cell system. In this study, using a multiscale modeling approach, a mechanistic catalyst dissolution model is coupled to a dynamic PEMFC cell model to predict the performance loss of the PEMFC. Results are compared to two 2000-h experimental aging tests. More precisely, an original approach is introduced to estimate the loss of an equivalent active surface area during an aging test. Indeed, when the computed Electrochemical Catalyst Surface Area profile is fitted to the experimental measurements from Cyclic Voltammetry, the computed performance loss of the PEMFC is underestimated. To be able to predict the performance loss measured by polarization curves during the aging test, an equivalent active surface area is obtained by model inversion. This methodology successfully reproduces the experimental cell voltage decay over time. The model parameters are fitted to the polarization curves so that they include the global degradation. Moreover, the model captures the aging heterogeneities along the surface of the cell that are observed experimentally. Finally, a second 2000-h durability test in dynamic operating conditions validates the approach.
An elastic failure model of indentation damage. [of brittle structural ceramics
NASA Technical Reports Server (NTRS)
Liaw, B. M.; Kobayashi, A. S.; Emery, A. F.
1984-01-01
A mechanistically consistent model for indentation damage, based on elastic failure at tensile or shear overloads, is proposed. The model accommodates arbitrary crack orientation, stress relaxation, reduction and recovery of stiffness due to crack opening and closure, and interfacial friction due to backward sliding of closed cracks. This elastic failure model was implemented in an axisymmetric finite element program, which was used to simulate progressive damage in a silicon nitride plate indented by a tungsten carbide sphere. The predicted damage patterns and the permanent impression matched those observed experimentally. The validation of this elastic failure model shows that the plastic deformation postulated by others is not necessary to replicate the indentation damage of brittle structural ceramics.
Food allergy animal models: an overview.
Helm, Ricki M
2002-05-01
Specific food allergy is characterized by sensitization to innocuous food proteins with production of allergen-specific IgE that binds to receptors on basophils and mast cells. Upon recurrent exposure to the same allergen, an allergic response is induced by mediator release following cross-linking of cell-bound allergen-specific IgE. What makes an innocuous food protein an allergen in predisposed individuals is unknown; however, mechanistic and protein allergen predictive models are being actively investigated in a number of animal models. Currently, there is no animal model that will actively profile known food allergens, predict the allergic potential of novel food proteins, or demonstrate clinically the human food allergic sensitization/allergic response. Animal models under investigation include mice, rats, the guinea pig, the atopic dog, and neonatal swine. These models are being assessed for production of IgE, clinical responses to re-exposure, and a ranking of food allergens (based on potency) including a nonfood allergen protein source. A selection of animal models actively being investigated that will contribute to our understanding of what makes a protein an allergen, and to future predictive models for assessing the allergenicity of novel proteins, is presented in this review.
Muller, Claudia; Busignies, Virginie; Mazel, Vincent; Forestier, Christiane; Nivoliez, Adrien; Tchoreloff, Pierre
2013-01-01
Probiotics are of great current interest in the pharmaceutical industry because of their multiple effects on human health. To beneficially affect the host, an adequate dosage of the probiotic bacteria in the product must be guaranteed from the time of manufacturing to the expiration date. Stability test guidelines as laid down by ICH-Q1A stipulate a minimum testing period of 12 months. The challenge for producers is to reduce this time. In this paper, a mechanistic approach using the Arrhenius model is proposed to predict stability. Applied for the first time to laboratory and industrial probiotic powders, the model was able to provide a reliable mathematical representation of the effects of temperature on bacterial death (R² > 0.9). The destruction rate (k) was determined according to the manufacturing process, strain and storage conditions. The marketed product demonstrated better stability (k = 0.08 months−1) than the laboratory sample (k = 0.80 months−1). With industrial batches, the k obtained at 6 months of study was comparable to that obtained at 12 months, evidence of the model's robustness. In addition, predicted values at 12 months closely matched (within ±30%) those obtained in real time, supporting the model's reliability. This method could be an interesting approach to predict probiotic stability and could reduce the length of stability studies to 6 months, as against 12 (ICH guideline) or 24 months (expiration date). PMID:24244412
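A minimal sketch of the Arrhenius-based stability prediction in Python: a temperature-dependent first-order destruction rate k(T) and the resulting surviving fraction over storage time; the pre-exponential factor and activation energy are assumed, not the study's fitted values.

import numpy as np

R_GAS = 8.314  # J/(mol K)

def arrhenius_k(T_kelvin, A, Ea):
    # First-order destruction rate constant: k(T) = A * exp(-Ea / (R T))
    return A * np.exp(-Ea / (R_GAS * T_kelvin))

def surviving_fraction(months, T_celsius, A, Ea):
    # Log-linear (first-order) viability loss: N/N0 = exp(-k t)
    return np.exp(-arrhenius_k(T_celsius + 273.15, A, Ea) * months)

# A and Ea are illustrative assumptions, not the study's fitted parameters.
A, Ea = 1.0e15, 9.0e4   # 1/month, J/mol
for T in (4, 25, 40):   # typical real-time and accelerated storage temperatures
    print(T, "degC  k =", round(arrhenius_k(T + 273.15, A, Ea), 3),
          "/month  12-month survival =", round(surviving_fraction(12, T, A, Ea), 3))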
Biochar: from laboratory mechanisms through the greenhouse to field trials
NASA Astrophysics Data System (ADS)
Masiello, C. A.; Gao, X.; Dugan, B.; Silberg, J. J.; Zygourakis, K.; Alvarez, P. J. J.
2014-12-01
The biochar community is excellent at pointing to individual cases where biochar amendment has changed soil properties, with some studies showing significant improvements in crop yields, reduction in nutrient export, and remediation of pollutants. However, many studies exist which do not show improvements, and in some cases, studies clearly show detrimental outcomes. The next, crucial step in biochar science and engineering research will be to develop a process-based understanding of how biochar acts to improve soil properties. In particular, we need a better mechanistic understanding of how biochar sorbs and desorbs contaminants, how it interacts with soil water, and how it interacts with the soil microbial community. These mechanistic studies need to encompass processes that range from the nanometer to the kilometer scale. At the nanometer scale, we need a predictive model of how biochar will sorb and desorb hydrocarbons, nutrients, and toxic metals. At the micrometer scale we need models that explain biochar's effects on soil water, especially the plant-available fraction of soil water. The micrometer scale is also where mechanistic information is needed about microbial processes. At the macroscale we need physical models to describe the landscape mobility of biochar, because biochar that washes away from fields can no longer provide crop benefits. To be most informative, biochar research should occur along a lab-greenhouse-field trial trajectory. Laboratory experiments should aim to determine what mechanisms may act to control biochar-soil processes, and then greenhouse experiments can be used to test the significance of lab-derived mechanisms in short, highly replicated, controlled experiments. Once evidence of effect is determined from greenhouse experiments, field trials are merited. Field trials are the gold standard needed prior to full deployment, but results from field trials cannot be extrapolated to other field sites without the mechanistic backup provided by greenhouse and lab trials.
Al Sharif, Merilin; Tsakovska, Ivanka; Pajeva, Ilza; Alov, Petko; Fioravanzo, Elena; Bassan, Arianna; Kovarich, Simona; Yang, Chihae; Mostrag-Szlichtyng, Aleksandra; Vitcheva, Vessela; Worth, Andrew P; Richarz, Andrea-N; Cronin, Mark T D
2017-12-01
The aim of this paper was to provide a proof of concept demonstrating that molecular modelling methodologies can be employed as a part of an integrated strategy to support toxicity prediction consistent with the mode of action/adverse outcome pathway (MoA/AOP) framework. To illustrate the role of molecular modelling in predictive toxicology, a case study was undertaken in which molecular modelling methodologies were employed to predict the activation of the peroxisome proliferator-activated nuclear receptor γ (PPARγ) as a potential molecular initiating event (MIE) for liver steatosis. A stepwise procedure combining different in silico approaches (virtual screening based on docking and pharmacophore filtering, and molecular field analysis) was developed to screen for PPARγ full agonists and to predict their transactivation activity (EC50). The performance metrics of the classification model to predict PPARγ full agonists were balanced accuracy = 81%, sensitivity = 85% and specificity = 76%. The 3D QSAR model developed to predict the EC50 of PPARγ full agonists had the following statistical parameters: q²cv = 0.610, Nopt = 7, SEPcv = 0.505, r²pr = 0.552. To support the linkage of PPARγ agonism predictions to prosteatotic potential, molecular modelling was combined with independently performed mechanistic mining of available in vivo toxicity data followed by ToxPrint chemotypes analysis. The approaches investigated demonstrated a potential to predict the MIE, to facilitate the process of MoA/AOP elaboration, to increase the scientific confidence in AOP, and to become a basis for 3D chemotype development. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandina N. Rao; Subhash C. Ayirala; Madhav M. Kulkarni
This report describes the progress of the project "Development and Optimization of Gas-Assisted Gravity Drainage (GAGD) Process for Improved Light Oil Recovery" for the duration of the thirteenth project quarter (Oct 1, 2005 to Dec 30, 2005). There are three main tasks in this research project. Task 1 is a scaled physical model study of the GAGD process. Task 2 is further development of a vanishing interfacial tension (VIT) technique for miscibility determination. Task 3 is determination of multiphase displacement characteristics in reservoir rocks. Section I reports experimental work designed to investigate wettability effects of the porous medium on secondary and tertiary mode GAGD performance. The experiments showed a significant improvement of oil recovery in the oil-wet experiments versus the water-wet runs, both in secondary as well as tertiary mode. When comparing experiments conducted in secondary mode to those run in tertiary mode, an improvement in oil recovery was also evident. Additionally, this section summarizes progress made with regard to the scaled physical model construction and experimentation. The purpose of building a scaled physical model, which attempts to include various multiphase mechanics and fluid dynamic parameters operational at the field scale, was to incorporate visual verification of the gas front for viscous instabilities, capillary fingering, and stable displacement. Preliminary experimentation suggested that construction of the 2-D model from sintered glass beads was a feasible alternative. During this reporting quarter, several sintered glass mini-models were prepared and some preliminary experiments designed to visualize gas bubble development were completed. In Section II, the gas-oil interfacial tensions measured in the decane-CO2 system at 100 °F, and in live decane consisting of 25 mole% methane, 30 mole% n-butane and 45 mole% n-decane against CO2 gas at 160 °F, have been modeled using the Parachor and newly proposed mechanistic Parachor models. In the decane-CO2 binary system, the Parachor model was found to be sufficient for interfacial tension calculations. The predicted miscibility from the Parachor model deviated only by about 2.5% from the measured VIT miscibility. However, in the multicomponent live decane-CO2 system, the performance of the Parachor model was poor, while a good match between interfacial tension predictions and experiments was obtained using the proposed mechanistic Parachor model. The predicted miscibility from the mechanistic Parachor model accurately matched the measured VIT miscibility in the live decane-CO2 system, which indicates the suitability of this model to predict miscibility in complex multicomponent hydrocarbon systems. In the previous reports to the DOE (15323R07, Oct 2004; 15323R08, Jan 2005; 15323R09, Apr 2005; 15323R10, July 2005 and 154323, Oct 2005), the 1-D experimental results from dimensionally scaled GAGD and WAG corefloods were reported for Section III. Additionally, since Section I reports the experimental results from 2-D physical model experiments, this section attempts to extend this 2-D GAGD study to 3-D (4-phase) flow through porous media and evaluate the performance of these processes using reservoir simulation. Section IV includes the technology transfer efforts undertaken during the quarter.
This research work resulted in one international paper presentation in Tulsa, OK; one journal publication; three pending abstracts for the SCA 2006 Annual Conference; and an invitation to present at the Independents Day session at the IOR Symposium 2006.
Breen, Michael S; Breen, Miyuki; Williams, Ronald W; Schultz, Bradley D
2010-12-15
A critical aspect of air pollution exposure models is the estimation of the air exchange rate (AER) of individual homes, where people spend most of their time. The AER, which is the airflow into and out of a building, is a primary mechanism for entry of outdoor air pollutants and removal of indoor source emissions. The mechanistic Lawrence Berkeley Laboratory (LBL) AER model was linked to a leakage area model to predict AER from questionnaires and meteorology. The LBL model was also extended to include natural ventilation (LBLX). Using literature-reported parameter values, AER predictions from LBL and LBLX models were compared to data from 642 daily AER measurements across 31 detached homes in central North Carolina, with corresponding questionnaires and meteorological observations. Data was collected on seven consecutive days during each of four consecutive seasons. For the individual model-predicted and measured AER, the median absolute difference was 43% (0.17 h(-1)) and 40% (0.17 h(-1)) for the LBL and LBLX models, respectively. Additionally, a literature-reported empirical scale factor (SF) AER model was evaluated, which showed a median absolute difference of 50% (0.25 h(-1)). The capability of the LBL, LBLX, and SF models could help reduce the AER uncertainty in air pollution exposure models used to develop exposure metrics for health studies.
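A minimal sketch of the LBL infiltration model in its common ASHRAE form, converting an effective leakage area plus the indoor-outdoor temperature difference and wind speed into an air exchange rate; the stack and wind coefficients are typical literature values and are assumptions here.

import numpy as np

def lbl_aer(ela_cm2, volume_m3, delta_T_K, wind_ms,
            c_stack=0.000145, c_wind=0.000174):
    # LBL infiltration model (ASHRAE form): Q [L/s] = ELA [cm^2] * sqrt(Cs*|dT| + Cw*U^2).
    # Cs and Cw here are typical one-story, moderately shielded values (assumed).
    q_l_per_s = ela_cm2 * np.sqrt(c_stack * abs(delta_T_K) + c_wind * wind_ms ** 2)
    return q_l_per_s * 3.6 / volume_m3   # L/s -> m^3/h, divided by house volume -> AER (1/h)

# 500 cm^2 effective leakage area, 350 m^3 house, 10 K temperature difference, 3 m/s wind
print(round(lbl_aer(500.0, 350.0, 10.0, 3.0), 2))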
Network news: innovations in 21st century systems biology.
Arkin, Adam P; Schaffer, David V
2011-03-18
A decade ago, seminal perspectives and papers set a strong vision for the field of systems biology, and a number of these themes have flourished. Here, we describe key technologies and insights that have elucidated the evolution, architecture, and function of cellular networks, ultimately leading to the first predictive genome-scale regulatory and metabolic models of organisms. Can systems approaches bridge the gap between correlative analysis and mechanistic insights? Copyright © 2011 Elsevier Inc. All rights reserved.
Scalar utility theory and proportional processing: what does it actually imply?
Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I
2017-01-01
Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. PMID:27288541
Gupta, Pankaj; Friberg, Lena E; Karlsson, Mats O; Krishnaswami, Sriram; French, Jonathan
2010-06-01
CP-690,550, a selective inhibitor of the Janus kinase family, is being developed as an oral disease-modifying antirheumatic drug for the treatment of rheumatoid arthritis (RA). A semi-mechanistic model was developed to characterize the time course of drug-induced absolute neutrophil count (ANC) reduction in a phase 2a study. Data from 264 RA patients receiving 6-week treatment (placebo, 5, 15, 30 mg bid) followed by a 6-week off-treatment period were analyzed. The model included a progenitor cell pool, a maturation chain comprising transit compartments, a circulation pool, and a feedback mechanism. The model was adequately described by system parameters (BASE(h), ktr(h), gamma, and k(circ)), disease effect parameters (DIS), and drug effect parameters (k(off) and k(D)). The disease manifested as an increase in baseline ANC and reduced maturation time due to increased demand from the inflammation site. The drug restored the perturbed system parameters to their normal values via an indirect mechanism. ANC reduction due to a direct myelosuppressive drug effect was not supported. The final model successfully described the dose- and time-dependent changes in ANC and predicted the incidence of neutropenia at different doses reasonably well.
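A minimal Friberg-style sketch of the model structure named above (a progenitor pool, a transit-compartment maturation chain, a circulating pool, and feedback of circulating ANC on proliferation), with a crude on/off drug inhibition of proliferation; all parameter values and the drug-effect form are illustrative, not the fitted CP-690,550 model.

import numpy as np
from scipy.integrate import solve_ivp

base, mtt, gamma = 2.5, 100.0, 0.17      # baseline ANC (10^9/L), mean transit time (h), feedback power
ktr = 4.0 / mtt                          # rate constant for 3 transit compartments + circulation

def drug_effect(t):
    # Fractional inhibition of proliferation: constant while on treatment
    # (first 6 weeks), zero during the 6-week off-treatment follow-up.
    return 0.10 if t < 6 * 7 * 24 else 0.0

def rhs(t, y):
    prol, t1, t2, t3, circ = y
    feedback = (base / circ) ** gamma    # low ANC stimulates proliferation
    return [ktr * prol * ((1.0 - drug_effect(t)) * feedback - 1.0),
            ktr * (prol - t1),
            ktr * (t1 - t2),
            ktr * (t2 - t3),
            ktr * (t3 - circ)]

t_end = 12 * 7 * 24                      # 6 weeks on treatment + 6 weeks off (hours)
sol = solve_ivp(rhs, (0, t_end), [base] * 5, t_eval=np.linspace(0, t_end, 13), max_step=2.0)
print(np.round(sol.y[-1], 2))            # circulating ANC: dips on treatment, recovers off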
Mechanistic links between cellular trade-offs, gene expression, and growth.
Weiße, Andrea Y; Oyarzún, Diego A; Danos, Vincent; Swain, Peter S
2015-03-03
Intracellular processes rarely work in isolation but continually interact with the rest of the cell. In microbes, for example, we now know that gene expression across the whole genome typically changes with growth rate. The mechanisms driving such global regulation, however, are not well understood. Here we consider three trade-offs that, because of limitations in levels of cellular energy, free ribosomes, and proteins, are faced by all living cells and we construct a mechanistic model that comprises these trade-offs. Our model couples gene expression with growth rate and growth rate with a growing population of cells. We show that the model recovers Monod's law for the growth of microbes and two other empirical relationships connecting growth rate to the mass fraction of ribosomes. Further, we can explain growth-related effects in dosage compensation by paralogs and predict host-circuit interactions in synthetic biology. Simulating competitions between strains, we find that the regulation of metabolic pathways may have evolved not to match expression of enzymes to levels of extracellular substrates in changing environments but rather to balance a trade-off between exploiting one type of nutrient over another. Although coarse-grained, the trade-offs that the model embodies are fundamental, and, as such, our modeling framework has potentially wide application, including in both biotechnology and medicine.
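For reference, a minimal sketch of Monod's law, which the trade-off model recovers, coupled to substrate depletion in a batch culture; the kinetic parameters are assumed values, not derived from the paper.

import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks, yield_coef = 1.0, 0.2, 0.5   # 1/h, g/L, g biomass per g substrate (assumed)

def monod(t, y):
    # Monod's law: growth rate saturates with substrate concentration.
    biomass, substrate = y
    mu = mu_max * substrate / (Ks + substrate)
    return [mu * biomass, -mu * biomass / yield_coef]

sol = solve_ivp(monod, (0, 12), [0.05, 5.0], t_eval=np.linspace(0, 12, 7))
print(np.round(sol.y, 3))   # biomass rises, substrate falls, growth saturates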
A robust framework to predict mercury speciation in combustion flue gases.
Ticknor, Jonathan L; Hsu-Kim, Heileen; Deshusses, Marc A
2014-01-15
Mercury emissions from coal combustion have become a global concern as growing energy demands have increased the consumption of coal. The effective implementation of treatment technologies requires knowledge of mercury speciation in the flue gas, namely concentrations of elemental, oxidized and particulate mercury at the exit of the boiler. A model that can accurately predict mercury species in flue gas would be very useful in that context. Here, a Bayesian regularized artificial neural network (BRANN) that uses five coal properties and combustion temperature was developed to predict mercury speciation in flue gases before treatment technology implementation. The results of the model show that up to 97 percent of the variation in mercury species concentration is captured through the use of BRANNs. The BRANN model was used to conduct a parametric sensitivity analysis, which revealed that the coal chlorine content and coal calorific value were the most sensitive parameters, followed by the combustion temperature. The coal sulfur content was the least important parameter. The results demonstrate the applicability of BRANNs for predicting mercury concentration and speciation in combustion flue gas and provide a more efficient and effective technique when compared to other advanced non-mechanistic modeling strategies. Copyright © 2013 Elsevier B.V. All rights reserved.
Model of transient drug diffusion across cornea.
Zhang, Wensheng; Prausnitz, Mark R; Edwards, Aurélie
2004-09-30
A mathematical model of solute transient diffusion across the cornea to the anterior chamber of the eye was developed for topical drug delivery. Solute bioavailability was predicted given solute molecular radius and octanol-to-water distribution coefficient (Phi), ocular membrane ultrastructural parameters, tear fluid hydrodynamics, as well as solute distribution volume (Vd) and clearance rate (Cla) in the anterior chamber. The results suggest that drug bioavailability is primarily determined by solute lipophilicity. In human eyes, bioavailability is predicted to range between 1% and 5% for lipophilic molecules (Phi>1), and to be less than 0.5% for hydrophilic molecules (Phi<0.01). The simulations indicate that the distribution coefficient that maximizes bioavailability is on the order of 10. It was also found that the maximum solute concentration in the anterior chamber (Cmax) and the time needed to reach Cmax significantly depend on Phi, Vd, and Cla. Consistent with experimental findings, model predictions suggest that drug bioavailability can be increased by lowering the conjunctival-to-corneal permeability ratio and reducing precorneal solute drainage. Because of its mechanistic basis, this model will be useful to predict drug transport kinetics and bioavailability for new compounds and in diseased eyes.
NASA Astrophysics Data System (ADS)
Basant, Nikita; Gupta, Shikha
2018-03-01
The reactions of molecular ozone (O3), hydroxyl (•OH) and nitrate (NO3) radicals are among the major pathways of removal of volatile organic compounds (VOCs) in the atmospheric environment. The gas-phase kinetic rate constants (kO3, kOH, kNO3) are thus, important in assessing the ultimate fate and exposure risk of atmospheric VOCs. Experimental data for rate constants are not available for many emerging VOCs and the computational methods reported so far address a single target modeling only. In this study, we have developed a multi-target (mt) QSPR model for simultaneous prediction of multiple kinetic rate constants (kO3, kOH, kNO3) of diverse organic chemicals considering an experimental data set of VOCs for which values of all the three rate constants are available. The mt-QSPR model identified and used five descriptors related to the molecular size, degree of saturation and electron density in a molecule, which were mechanistically interpretable. These descriptors successfully predicted three rate constants simultaneously. The model yielded high correlations (R2 = 0.874-0.924) between the experimental and simultaneously predicted endpoint rate constant (kO3, kOH, kNO3) values in test arrays for all the three systems. The model also passed all the stringent statistical validation tests for external predictivity. The proposed multi-target QSPR model can be successfully used for predicting reactivity of new VOCs simultaneously for their exposure risk assessment.
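A minimal multi-target regression sketch in Python: one model predicting the three log rate constants simultaneously from a small descriptor matrix. The data here are synthetic stand-ins; in practice X would hold the five interpretable descriptors used in the study, and the learner need not be ridge regression.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 5))                         # 150 VOCs x 5 descriptors (synthetic)
true_W = rng.normal(size=(5, 3))
Y = X @ true_W + 0.1 * rng.normal(size=(150, 3))      # 3 targets: log kO3, log kOH, log kNO3

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)
model = MultiOutputRegressor(Ridge(alpha=1.0)).fit(X_tr, Y_tr)
# One R^2 per endpoint, evaluated on the held-out test array.
print(np.round(r2_score(Y_te, model.predict(X_te), multioutput="raw_values"), 3))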
Hauschild, L; Lovatto, P A; Pomar, J; Pomar, C
2012-07-01
The objective of this study was to develop and evaluate a mathematical model used to estimate the daily amino acid requirements of individual growing-finishing pigs. The model includes empirical and mechanistic model components. The empirical component estimates daily feed intake (DFI), BW, and daily gain (DG) based on individual pig information collected in real time. Based on DFI, BW, and DG estimates, the mechanistic component uses classic factorial equations to estimate the optimal concentration of amino acids that must be offered to each pig to meet its requirements. The model was evaluated with data from a study that investigated the effect of feeding pigs with a 3-phase or daily multiphase system. The DFI and BW values measured in this study were compared with those estimated by the empirical component of the model. The coherence of the values estimated by the mechanistic component was evaluated by analyzing if it followed a normal pattern of requirements. Lastly, the proposed model was evaluated by comparing its estimates with those generated by the existing growth model (InraPorc). The precision of the proposed model and InraPorc in estimating DFI and BW was evaluated through the mean absolute error. The empirical component results indicated that the DFI and BW trajectories of individual pigs fed ad libitum could be predicted 1 d (DFI) or 7 d (BW) ahead with the average mean absolute error of 12.45 and 1.85%, respectively. The average mean absolute error obtained with the InraPorc for the average individual of the population was 14.72% for DFI and 5.38% for BW. Major differences were observed when estimates from InraPorc were compared with individual observations. The proposed model, however, was effective in tracking the change in DFI and BW for each individual pig. The mechanistic model component estimated the optimal standardized ileal digestible Lys to NE ratio with reasonable between animal (average CV = 7%) and overtime (average CV = 14%) variation. Thus, the amino acid requirements estimated by model are animal- and time-dependent and follow, in real time, the individual DFI and BW growth patterns. The proposed model can follow the average feed intake and feed weight trajectory of each individual pig in real time with good accuracy. Based on these trajectories and using classical factorial equations, the model makes it possible to estimate dynamically the AA requirements of each animal, taking into account the intake and growth changes of the animal.
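A minimal sketch of the kind of factorial calculation the mechanistic component performs: a maintenance term plus a growth term converted to a standardized ileal digestible (SID) lysine requirement per day and per kg of feed. The coefficients are round, literature-style values used for illustration, not the model's calibrated ones.

def sid_lys_requirement(bw_kg, daily_gain_kg, dfi_kg):
    # Factorial estimate of SID lysine need for a growing pig (g/day, g/kg feed).
    maintenance = 0.036 * bw_kg ** 0.75          # g SID Lys/day for maintenance (assumed coefficient)
    protein_gain = 0.16 * daily_gain_kg * 1000   # g protein deposited/day, assuming 16% protein in gain
    growth = 0.07 * protein_gain / 0.72          # 7% Lys in body protein, 72% efficiency of use (assumed)
    total = maintenance + growth
    return total, total / dfi_kg

g_per_day, g_per_kg_feed = sid_lys_requirement(bw_kg=60.0, daily_gain_kg=0.9, dfi_kg=2.3)
print(round(g_per_day, 1), round(g_per_kg_feed, 2))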
Chesapeake Bay Forecast System: Oxygen Prediction for the Sustainable Ecosystem Management
NASA Astrophysics Data System (ADS)
Mathukumalli, B.; Long, W.; Zhang, X.; Wood, R.; Murtugudde, R. G.
2010-12-01
The Chesapeake Bay Forecast System (CBFS) is a flexible, end-to-end expert prediction tool for decision makers that will provide customizable, user-specified predictions and projections of the region's climate, air and water quality, local chemistry, and ecosystems at days to decades. As a part of CBFS, long-term water quality data were collected and assembled to develop ecological models for the sustainable management of the Chesapeake Bay. Cultural eutrophication depletes oxygen levels in this ecosystem, particularly in summer, which has several negative implications for the structure and function of the ecosystem. In order to understand the dynamics and prediction of spatially explicit oxygen levels in the Bay, an empirical, process-based ecological model was developed with long-term control variables (water temperature, salinity, nitrogen and phosphorus). Statistical validation methods were employed to demonstrate the usability of the predictions for management purposes, and the predicted oxygen levels are quite faithful to observations. The predicted oxygen values and other physical outputs from downscaling of regional weather and climate predictions, or forecasts from hydrodynamic models, can be used to forecast various ecological components. Such forecasts would be useful for both recreational and commercial users of the bay (for example, bass fishing). Furthermore, this work can also be used to predict the extent of hypoxia/anoxia not only from anthropogenic nutrient pollution, but also from global warming. Some hindcasts and forecasts are discussed along with the ongoing efforts at a mechanistic ecosystem model to provide prognostic oxygen predictions and projections and upper trophic modeling using an energetics approach.
Predicting Electrostatic Forces in RNA Folding
Tan, Zhi-Jie; Chen, Shi-Jie
2016-01-01
Metal ion-mediated electrostatic interactions are critical to RNA folding. Although considerable progress has been made in mechanistic studies, the problem of accurate predictions for the ion effects in RNA folding remains unsolved, mainly due to the complexity of several potentially important issues such as ion correlation and dehydration effects. In this chapter, after giving a brief overview of the experimental findings and theoretical approaches, we focus on a recently developed new model, the tightly bound ion (TBI) model, for ion electrostatics in RNA folding. The model is unique because it can treat ion correlation and fluctuation effects for realistic RNA 3D structures. For monovalent ion (such as Na+) solutions, where ion correlation is weak, TBI and the Poisson–Boltzmann (PB) theory give the same results and the results agree with the experimental data. For multivalent ion (such as Mg2+) solutions, where ion correlation can be strong, however, TBI gives much better predictions than PB. Moreover, the model suggests an ion correlation-induced mechanism for the unusual efficiency of Mg2+ ions in the stabilization of RNA tertiary folds. In this chapter, after introducing the theoretical framework of the TBI model, we will describe how to apply the model to predict ion-binding properties and ion-dependent folding stabilities. PMID:20946803
Reconstruction of late Holocene climate based on tree growth and mechanistic hierarchical models
Tipton, John; Hooten, Mevin B.; Pederson, Neil; Tingley, Martin; Bishop, Daniel
2016-01-01
Reconstruction of pre-instrumental, late Holocene climate is important for understanding how climate has changed in the past and how climate might change in the future. Statistical prediction of paleoclimate from tree ring widths is challenging because tree ring widths are a one-dimensional summary of annual growth that represents a multi-dimensional set of climatic and biotic influences. We develop a Bayesian hierarchical framework using a nonlinear, biologically motivated tree ring growth model to jointly reconstruct temperature and precipitation in the Hudson Valley, New York. Using a common growth function to describe the response of a tree to climate, we allow for species-specific parameterizations of the growth response. To enable predictive backcasts, we model the climate variables with a vector autoregressive process on an annual timescale coupled with a multivariate conditional autoregressive process that accounts for temporal correlation and cross-correlation between temperature and precipitation on a monthly scale. Our multi-scale temporal model allows for flexibility in the climate response through time at different temporal scales and predicts reasonable climate scenarios given tree ring width data.
Huang, Ruili; Xia, Menghang; Sakamuru, Srilatha; Zhao, Jinghua; Shahane, Sampada A.; Attene-Ramos, Matias; Zhao, Tongan; Austin, Christopher P.; Simeonov, Anton
2016-01-01
Target-specific, mechanism-oriented in vitro assays pose a promising alternative to traditional animal toxicology studies. Here we report the first comprehensive analysis of the Tox21 effort, a large-scale in vitro toxicity screening of chemicals. We test ∼10,000 chemicals in triplicate at 15 concentrations against a panel of nuclear receptor and stress response pathway assays, producing more than 50 million data points. Compound clustering by structure similarity and activity profile similarity across the assays reveals structure–activity relationships that are useful for the generation of mechanistic hypotheses. We apply structural information and activity data to build predictive models for 72 in vivo toxicity end points using a cluster-based approach. Models based on in vitro assay data perform better in predicting human toxicity end points than animal toxicity, while a combination of structural and activity data results in better models than using structure or activity data alone. Our results suggest that in vitro activity profiles can be applied as signatures of compound mechanism of toxicity and used in prioritization for more in-depth toxicological testing. PMID:26811972
Creasy, Arch; Reck, Jason; Pabst, Timothy; Hunter, Alan; Barker, Gregory; Carta, Giorgio
2018-05-29
A previously developed empirical interpolation (EI) method is extended to predict highly overloaded multicomponent elution behavior on a cation exchange (CEX) column based on batch isotherm data. Instead of a fully mechanistic model, the EI method employs an empirically modified multicomponent Langmuir equation to correlate two-component adsorption isotherm data at different salt concentrations. Piecewise cubic interpolating polynomials are then used to predict competitive binding at intermediate salt concentrations. The approach is tested for the separation of monoclonal antibody monomer and dimer mixtures by gradient elution on the cation exchange resin Nuvia HR-S. Adsorption isotherms are obtained over a range of salt concentrations with varying monomer and dimer concentrations. Coupled with a lumped kinetic model, the interpolated isotherms predict the column behavior for highly overloaded conditions. Predictions based on the EI method showed good agreement with experimental elution curves for protein loads up to 40 mg/mL column or about 50% of the column binding capacity. The approach can be extended to other chromatographic modalities and to more than two components. This article is protected by copyright. All rights reserved.
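The sketch below illustrates the structure of such an empirical interpolation: two-component competitive Langmuir isotherms are parameterized at a few salt concentrations, and piecewise cubic (PCHIP) interpolation supplies parameters at intermediate salt levels. All parameter values are invented for illustration and are not the fitted Nuvia HR-S values.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def competitive_langmuir(c1, c2, qmax, k1, k2):
    """Two-component competitive Langmuir isotherm (bound concentrations q1, q2)."""
    denom = 1.0 + k1 * c1 + k2 * c2
    return qmax * k1 * c1 / denom, qmax * k2 * c2 / denom

# Isotherm parameters measured (hypothetically) at three salt concentrations (mM).
salt = np.array([20.0, 50.0, 100.0])
qmax = np.array([150.0, 120.0, 60.0])   # mg/mL resin, illustrative
k_mono = np.array([5.0, 1.0, 0.1])      # mL/mg, monomer affinity, illustrative
k_dimer = np.array([12.0, 3.0, 0.3])    # mL/mg, dimer affinity, illustrative

# Piecewise cubic interpolation of each parameter versus salt concentration.
interp = {name: PchipInterpolator(salt, vals)
          for name, vals in [("qmax", qmax), ("k1", k_mono), ("k2", k_dimer)]}

s = 35.0  # intermediate salt concentration, mM
q1, q2 = competitive_langmuir(2.0, 0.5,
                              interp["qmax"](s), interp["k1"](s), interp["k2"](s))
print(f"Predicted bound monomer {q1:.1f}, dimer {q2:.1f} mg/mL at {s} mM salt")
```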
Predicting the effect of urban noise on the active space of avian vocal signals.
Parris, Kirsten M; McCarthy, Michael A
2013-10-01
Urbanization changes the physical environment of nonhuman species but also markedly changes their acoustic environment. Urban noise interferes with acoustic communication in a range of animals, including birds, with potentially profound impacts on fitness. However, a mechanistic theory to predict which species of birds will be most affected by urban noise is lacking. We develop a mathematical model to predict the decrease in the active space of avian vocal signals after moving from quiet forest habitats to noisy urban habitats. We find that the magnitude of the decrease is largely a function of signal frequency. However, this relationship is not monotonic. A metaregression of observed increases in the frequency of birdsong in urban noise supports the model's predictions for signals with frequencies between 1.5 and 4 kHz. Using results of the metaregression and the model described above, we show that the expected gain in active space following observed frequency shifts is up to 12% and greatest for birds with signals at the lower end of this frequency range. Our generally applicable model, along with three predictions regarding the behavioral and population-level responses of birds to urban noise, represents an important step toward a theory of acoustic communication in urban habitats.
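A back-of-the-envelope version of the active-space idea: with spherical spreading plus a frequency-dependent excess attenuation and noise floor, the active space is the distance at which the received signal just exceeds the detection threshold above noise. The attenuation coefficient, noise levels, and threshold below are assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

def received_level(source_db, distance_m, excess_atten_db_per_m):
    """Received level with spherical spreading plus linear excess attenuation."""
    spreading = 20.0 * np.log10(distance_m)  # dB relative to the level at 1 m
    return source_db - spreading - excess_atten_db_per_m * distance_m

def active_space(source_db, noise_db, excess_atten_db_per_m, threshold_db=3.0):
    """Largest distance at which the signal exceeds noise by the detection threshold."""
    distances = np.linspace(1.0, 2000.0, 20000)
    audible = received_level(source_db, distances, excess_atten_db_per_m) \
              >= noise_db + threshold_db
    return distances[audible].max() if audible.any() else 0.0

# Illustrative comparison: the same song in a quiet forest versus a noisy urban site.
print(active_space(source_db=90.0, noise_db=35.0, excess_atten_db_per_m=0.02))  # forest
print(active_space(source_db=90.0, noise_db=55.0, excess_atten_db_per_m=0.02))  # urban
```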
Computational Modeling and Simulation of Developmental ...
Developmental and Reproductive Toxicity (DART) testing is important for assessing the potential consequences of drug and chemical exposure on human health and well-being. Complexity of pregnancy and the reproductive cycle makes DART testing challenging and costly for traditional (animal-based) methods. A compendium of in vitro data from ToxCast/Tox21 high-throughput screening (HTS) programs is available for predictive toxicology. ‘Predictive DART’ will require an integrative strategy that mobilizes HTS data into in silico models that capture the relevant embryology. This lecture addresses progress on EPA's 'virtual embryo'. The question of how tissues and organs are shaped during development is crucial for understanding (and predicting) human birth defects. While ToxCast HTS data may predict developmental toxicity with reasonable accuracy, mechanistic models are still necessary to capture the relevant biology. Subtle microscopic changes induced chemically may amplify to an adverse outcome but coarse changes may override lesion propagation in any complex adaptive system. Modeling system dynamics in a developing tissue is a multiscale problem that challenges our ability to predict toxicity from in vitro profiling data (ToxCast/Tox21). (DISCLAIMER: The views expressed in this presentation are those of the presenter and do not necessarily reflect the views or policies of the US EPA). This was an invited seminar presentation to the National Institute for Public H
Howell, Brett A; Chauhan, Anuj
2010-08-01
Physiologically based pharmacokinetic (PBPK) models were developed for design and optimization of liposome therapy for treatment of overdoses of tricyclic antidepressants and local anesthetics. In vitro drug-binding data for pegylated, anionic liposomes and published mechanistic equations for partition coefficients were used to develop the models. The models were proven reliable through comparisons to intravenous data. The liposomes were predicted to be highly effective at treating amitriptyline overdoses, with reductions in the area under the concentration versus time curves (AUC) of 64% for the heart and brain. Peak heart and brain drug concentrations were predicted to drop by 20%. Bupivacaine AUC and peak concentration reductions were lower at 15.4% and 17.3%, respectively, for the heart and brain. The predicted pharmacokinetic profiles following liposome administration agreed well with data from clinical studies where protein fragments were administered to patients for overdose treatment. Published data on local cardiac function were used to relate the predicted concentrations in the body to local pharmacodynamic effects in the heart. While the results offer encouragement for future liposome therapies geared toward overdose, it is imperative to point out that animal experiments and phase I clinical trials are the next steps to ensuring the efficacy of the treatment. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
The attention schema theory: a mechanistic account of subjective awareness
Graziano, Michael S. A.; Webb, Taylor W.
2015-01-01
We recently proposed the attention schema theory, a novel way to explain the brain basis of subjective awareness in a mechanistic and scientifically testable manner. The theory begins with attention, the process by which signals compete for the brain’s limited computing resources. This internal signal competition is partly under a bottom–up influence and partly under top–down control. We propose that the top–down control of attention is improved when the brain has access to a simplified model of attention itself. The brain therefore constructs a schematic model of the process of attention, the ‘attention schema,’ in much the same way that it constructs a schematic model of the body, the ‘body schema.’ The content of this internal model leads a brain to conclude that it has a subjective experience. One advantage of this theory is that it explains how awareness and attention can sometimes become dissociated; the brain’s internal models are never perfect, and sometimes a model becomes dissociated from the object being modeled. A second advantage of this theory is that it explains how we can be aware of both internal and external events. The brain can apply attention to many types of information including external sensory information and internal information about emotions and cognitive states. If awareness is a model of attention, then this model should pertain to the same domains of information to which attention pertains. A third advantage of this theory is that it provides testable predictions. If awareness is the internal model of attention, used to help control attention, then without awareness, attention should still be possible but should suffer deficits in control. In this article, we review the existing literature on the relationship between attention and awareness, and suggest that at least some of the predictions of the theory are borne out by the evidence. PMID:25954242
Computational modeling of neurostimulation in brain diseases.
Wang, Yujiang; Hutchings, Frances; Kaiser, Marcus
2015-01-01
Neurostimulation as a therapeutic tool has been developed and used for a range of different diseases such as Parkinson's disease, epilepsy, and migraine. However, it is not known why the efficacy of the stimulation varies dramatically across patients or why some patients suffer from severe side effects. This is largely due to the lack of mechanistic understanding of neurostimulation. Hence, theoretical computational approaches to address this issue are in demand. This chapter provides a review of mechanistic computational modeling of brain stimulation. In particular, we will focus on brain diseases, where mechanistic models (e.g., neural population models or detailed neuronal models) have been used to bridge the gap between cellular-level processes of affected neural circuits and the symptomatic expression of disease dynamics. We show how such models have been, and can be, used to investigate the effects of neurostimulation in the diseased brain. We argue that these models are crucial for the mechanistic understanding of the effect of stimulation, allowing for a rational design of stimulation protocols. Based on mechanistic models, we argue that the development of closed-loop stimulation is essential in order to avoid interference with healthy ongoing brain activity. Furthermore, patient-specific data, such as neuroanatomic information and connectivity profiles obtainable from neuroimaging, can be readily incorporated to address the clinical issue of variability in efficacy between subjects. We conclude that mechanistic computational models can and should play a key role in the rational design of effective, fully integrated, patient-specific therapeutic brain stimulation. © 2015 Elsevier B.V. All rights reserved.
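As a toy example of the neural population models discussed above, the sketch below integrates Wilson–Cowan-style excitatory/inhibitory rate equations with an added stimulation current; all coupling constants and the stimulation waveform are illustrative assumptions rather than parameters from any study reviewed in the chapter.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(stim_amplitude, t_max=2.0, dt=1e-3):
    """Wilson-Cowan-style E/I population rates with a periodic stimulation input."""
    # Assumed coupling weights and time constants (dimensionless toy values).
    w_ee, w_ei, w_ie, w_ii = 12.0, 10.0, 10.0, 2.0
    tau_e, tau_i = 0.01, 0.02
    e, i = 0.1, 0.1
    rates = []
    for step in range(int(t_max / dt)):
        t = step * dt
        stim = stim_amplitude * (np.sin(2 * np.pi * 10.0 * t) > 0)  # 10 Hz pulse train
        de = (-e + sigmoid(w_ee * e - w_ei * i + stim - 2.0)) / tau_e
        di = (-i + sigmoid(w_ie * e - w_ii * i - 4.0)) / tau_i
        e, i = e + dt * de, i + dt * di
        rates.append(e)
    return np.array(rates)

baseline = simulate(stim_amplitude=0.0)
stimulated = simulate(stim_amplitude=3.0)
print(baseline.mean(), stimulated.mean())
```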
Morin, Xavier; Thuiller, Wilfried
2009-05-01
Obtaining reliable predictions of species range shifts under climate change is a crucial challenge for ecologists and stakeholders. At the continental scale, niche-based models have been widely used in the last 10 years to predict the potential impacts of climate change on species distributions all over the world, although these models do not include any mechanistic relationships. In contrast, species-specific, process-based predictions remain scarce at the continental scale. This is regrettable because to secure relevant and accurate predictions it is always desirable to compare predictions derived from different kinds of models applied independently to the same set of species and using the same raw data. Here we compare predictions of range shifts under climate change scenarios for 2100 derived from niche-based models with those of a process-based model for 15 North American boreal and temperate tree species. A general pattern emerged from our comparisons: niche-based models tend to predict a stronger level of extinction and a greater proportion of colonization than the process-based model. This result likely arises because niche-based models do not take phenotypic plasticity and local adaptation into account. Nevertheless, as the two kinds of models rely on different assumptions, their complementarity is revealed by common findings. Both modeling approaches highlight a major potential limitation on species tracking their climatic niche because of migration constraints and identify similar zones where species extirpation is likely. Such convergent predictions from models built on very different principles provide a useful way to offset uncertainties at the continental scale. This study shows that the use in concert of both approaches with their own caveats and advantages is crucial to obtain more robust results and that comparisons among models are needed in the near future to gain accuracy regarding predictions of range shifts under climate change.
Brillant, Nathalie; Elmasry, Mohamed; Burton, Neal C; Rodriguez, Josep Monne; Sharkey, Jack W; Fenwick, Stephen; Poptani, Harish; Kitteringham, Neil R; Goldring, Christopher E; Kipar, Anja; Park, B Kevin; Antoine, Daniel J
2017-10-01
The prediction and understanding of acetaminophen (APAP)-induced liver injury (APAP-ILI) and the response to therapeutic interventions is complex. This is due in part to sensitivity and specificity limitations of currently used assessment techniques. Here we sought to determine the utility of integrating translational non-invasive photoacoustic imaging of liver function with mechanistic circulating biomarkers of hepatotoxicity and histological assessment to facilitate a more accurate and precise characterization of APAP-ILI and the efficacy of therapeutic intervention. Perturbation of liver function and cellular viability was assessed in C57BL/6J male mice by indocyanine green (ICG) clearance (multispectral optoacoustic tomography, MSOT) and by measurement of mechanistic (miR-122, HMGB1) and established (ALT, bilirubin) circulating biomarkers in response to acetaminophen and its treatment with acetylcysteine (NAC) in vivo. We utilised a 60% partial hepatectomy model, a situation of defined loss of hepatic functional mass, against which to compare acetaminophen-induced changes. Integration of these mechanistic markers correlated with histological features of APAP hepatotoxicity in a time-dependent manner. They accurately reflected the onset of and recovery from hepatotoxicity compared with traditional biomarkers and also reported the efficacy of NAC with high sensitivity. ICG clearance kinetics correlated with histological scores for acute liver damage for APAP (i.e. 3 h timepoint; r=0.90, P<0.0001) and with elevations in both of the mechanistic biomarkers, miR-122 (e.g. 6 h timepoint; r=0.70, P=0.005) and HMGB1 (e.g. 6 h timepoint; r=0.56, P=0.04). For the first time we report the utility of this non-invasive longitudinal imaging approach to provide direct visualisation of liver function coupled with mechanistic biomarkers, in the same animal, allowing the investigation of the toxicological and pharmacological aspects of APAP-ILI and hepatic regeneration. Copyright © 2017. Published by Elsevier Inc.
A century of transitions in New York City's measles dynamics.
Hempel, Karsten; Earn, David J D
2015-05-06
Infectious diseases spreading in a human population occasionally exhibit sudden transitions in their qualitative dynamics. Previous work has successfully predicted such transitions in New York City's historical measles incidence using the seasonally forced susceptible-infectious-recovered (SIR) model. This work relied on a dataset spanning 45 years (1928-1973), which we have extended to 93 years (1891-1984). We identify additional dynamical transitions in the longer dataset and successfully explain them by analysing attractors and transients of the same mechanistic epidemiological model. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
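For context, a minimal seasonally forced SIR model of the kind used in that work can be integrated directly; the transmission parameters, seasonality amplitude, and demographic rates below are generic textbook-style values, not those fitted to the New York City data.

```python
import numpy as np
from scipy.integrate import solve_ivp

def forced_sir(t, y, beta0=1250.0, amplitude=0.25, gamma=73.0, mu=0.02):
    """Seasonally forced SIR with births and deaths; all rates are per year (assumed)."""
    s, i = y
    beta = beta0 * (1.0 + amplitude * np.cos(2.0 * np.pi * t))  # school-year forcing
    ds = mu - beta * s * i - mu * s
    di = beta * s * i - gamma * i - mu * i
    return [ds, di]

sol = solve_ivp(forced_sir, t_span=(0.0, 50.0), y0=[0.06, 1e-4],
                max_step=1e-3, dense_output=True)
years = np.linspace(40.0, 50.0, 1000)   # inspect the attractor after transients decay
prevalence = sol.sol(years)[1]
print(f"Peak prevalence in final decade: {prevalence.max():.5f}")
```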
van Dijk, C; de Levie, R
1985-01-01
The continuum and single jump treatments of ion transport through black lipid membranes predict experimentally distinguishable results, even when the same mechanistic assumptions are made and the same potential-distance profile is used. On the basis of steady-state current-voltage curves for nonactin-mediated transport of potassium ions, we find that the continuum model describes the data accurately, whereas the single jump model fails to do so, for all cases investigated in which capacitance measurements indicate that the membrane thickness varies little with applied potential. PMID:3839420
Exact computation of the maximum-entropy potential of spiking neural-network models.
Cofré, R; Cessac, B
2014-05-01
Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The maximum-entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuromimetic models) provide a probabilistic mapping between the stimulus, network architecture, and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuromimetic and maximum-entropy models.
NASA Astrophysics Data System (ADS)
McCoy, D.; Burrows, S. M.; Elliott, S.; Frossard, A. A.; Russell, L. M.; Liu, X.; Ogunro, O. O.; Easter, R. C.; Rasch, P. J.
2014-12-01
Remote marine clouds, such as those over the Southern Ocean, are particularly sensitive to variations in the concentration and chemical composition of aerosols that serve as cloud condensation nuclei (CCN). Observational evidence indicates that the organic content of fine marine aerosol is greatly increased during the biologically active season near strong phytoplankton blooms in certain locations, while being nearly constant in other locations. We have recently developed a novel modeling framework that mechanistically links the organic fraction of submicron sea spray to ocean biogeochemistry (Burrows et al., in discussion, ACPD, 2014; Elliott et al., ERL, 2014). Because of its combination of large phytoplankton blooms and high wind speeds, the Southern Ocean is an ideal location for testing our understanding of the processes driving the enrichment of organics in sea spray aerosol. Comparison of the simulated OM fraction with satellite observations shows that OM fraction is a statistically significant predictor of cloud droplet number concentration over the Southern Ocean. This presentation will focus on predictions from our modeling framework for the Southern Ocean, specifically, the predicted geographic gradients and seasonal cycles in the aerosol organic matter and its functional group composition. The timing and location of a Southern Ocean field campaign will determine its utility in observing the effects of highly localized and seasonal phytoplankton blooms on aerosol composition and clouds. Reference cited: Burrows, S. M., Ogunro, O., Frossard, A. A., Russell, L. M., Rasch, P. J., and Elliott, S.: A physically-based framework for modelling the organic fractionation of sea spray aerosol from bubble film Langmuir equilibria, Atmos. Chem. Phys. Discuss., 14, 5375-5443, doi:10.5194/acpd-14-5375-2014, 2014. Elliott, S., Burrows, S. M., Deal, C., Liu, X., Long, M., Ogunro, O., Russell, L. M., and Wingenter O.. "Prospects for simulating macromolecular surfactant chemistry at the ocean-atmosphere boundary." Environmental Research Letters 9, no. 6 (2014): 064012.
MECHANISTIC DOSIMETRY MODELS OF NANOMATERIAL DEPOSITION IN THE RESPIRATORY TRACT
Accurate health risk assessments of inhalation exposure to nanomaterials will require dosimetry models that account for interspecies differences in dose delivered to the respiratory tract. Mechanistic models offer the advantage to interspecies extrapolation that physicochemica...
Mechanistic modeling of destratification in cryogenic storage tanks using ultrasonics.
Jagannathan, T K; Mohanan, Srijith; Nagarajan, R
2014-01-01
Stratification is one of the main causes of vaporization of cryogens and of the increase of tank pressure during cryogenic storage. It leads to subsequent problems such as cavitation in cryo-pumps and reduced storage time. Hence, it is vital to prevent stratification to improve the cost efficiency of storage systems. If stratified layers exist inside the tank, they have to be removed by suitable methods without venting the vapor. Sonication is one such method capable of keeping fluid layers mixed. In the present work, a mechanistic model for ultrasonic destratification is proposed and validated with destratification experiments done in water. The same model is then used to predict the destratification characteristics of cryogenic liquids such as liquid nitrogen (LN₂), liquid hydrogen (LH₂) and liquid ammonia (LNH₃). The destratification parameters are analysed for different frequencies of ultrasound and storage pressures, considering continuous and pulsed modes of ultrasonic operation. From the results, it is determined that high-frequency ultrasound (low-power/continuous; high-power/pulsing) and low-frequency ultrasound (continuous operation with moderate power) can both be effective in removing stratification. Copyright © 2013 Elsevier B.V. All rights reserved.
2014-01-01
Background Protein sites evolve at different rates due to functional and biophysical constraints. It is usually considered that the main structural determinant of a site's rate of evolution is its Relative Solvent Accessibility (RSA). However, a recent comparative study has shown that the main structural determinant is the site's Local Packing Density (LPD). LPD is related to dynamical flexibility, which has also been shown to correlate with sequence variability. Our purpose is to investigate the mechanism that connects a site's LPD with its rate of evolution. Results We consider two models: an empirical Flexibility Model and a mechanistic Stress Model. The Flexibility Model postulates a linear increase of site-specific rate of evolution with dynamical flexibility. The Stress Model, introduced here, models mutations as random perturbations of the protein's potential energy landscape, for which we use simple Elastic Network Models (ENMs). To account for natural selection we assume a single active conformation and use basic statistical physics to derive a linear relationship between site-specific evolutionary rates and the local stress of the mutant's active conformation. We compare both models on a large and diverse dataset of enzymes. In a protein-by-protein study we found that the Stress Model outperforms the Flexibility Model for most proteins. Pooling all proteins together, we show that the Stress Model is strongly supported by the total weight of evidence. Moreover, it accounts for the observed nonlinear dependence of sequence variability on flexibility. Finally, when mutational stress is controlled for, there is very little remaining correlation between sequence variability and dynamical flexibility. Conclusions We developed a mechanistic Stress Model of evolution according to which the rate of evolution of a site is predicted to depend linearly on the local mutational stress of the active conformation. Such local stress is proportional to LPD, so that this model explains the relationship between LPD and evolutionary rate. Moreover, the model also accounts for the nonlinear dependence between evolutionary rate and dynamical flexibility. PMID:24716445
A Mechanistic Model of Human Recall of Social Network Structure and Relationship Affect.
Omodei, Elisa; Brashears, Matthew E; Arenas, Alex
2017-12-07
The social brain hypothesis argues that the need to deal with social challenges was key to our evolution of high intelligence. Research with non-human primates as well as experimental and fMRI studies in humans produce results consistent with this claim, leading to an estimate that human primary groups should consist of roughly 150 individuals. Gaps between this prediction and empirical observations can be partially accounted for using "compression heuristics", or schemata that simplify the encoding and recall of social information. However, little is known about the specific algorithmic processes used by humans to store and recall social information. We describe a mechanistic model of human network recall and demonstrate its sufficiency for capturing human recall behavior observed in experimental contexts. We find that human recall is predicated on accurate recall of a small number of high degree network nodes and the application of heuristics for both structural and affective information. This provides new insight into human memory, social network evolution, and demonstrates a novel approach to uncovering human cognitive operations.
Impact of excipient interactions on solid dosage form stability.
Narang, Ajit S; Desai, Divyakant; Badawy, Sherif
2012-10-01
Drug-excipient interactions in solid dosage forms can affect drug product stability in physical aspects such as organoleptic changes and dissolution slowdown, or chemically by causing drug degradation. Recent research has allowed the distinction in chemical instability resulting from direct drug-excipient interactions and from drug interactions with excipient impurities. A review of chemical instability in solid dosage forms highlights common mechanistic themes applicable to multiple degradation pathways. These common themes include the role of water and microenvironmental pH. In addition, special aspects of solid-state reactions with excipients and/or excipient impurities add to the complexity in understanding and modeling reaction pathways. This paper discusses mechanistic basis of known drug-excipient interactions with case studies and provides an overview of common underlying themes. Recent developments in the understanding of degradation pathways further impact methodologies used in the pharmaceutical industry for prospective stability assessment. This paper discusses these emerging aspects in terms of limitations of drug-excipient compatibility studies, emerging paradigms in accelerated stability testing, and application of mathematical modeling for prediction of drug product stability.
Durairaj, Chandrasekar; Shen, Jie; Cherukury, Madhu
2014-08-01
To develop a mechanism-based translational pharmacokinetic-pharmacodynamic (PKPD) model in preclinical species and to predict the intraocular pressure (IOP) following drug treatment in patients with glaucoma or ocular hypertension (OHT). Baseline diurnal IOP of normotensive albino rabbits, beagle dogs and patients with glaucoma or OHT was collected from the literature. In addition, diurnal IOP of patients treated with brimonidine or Xalatan® was also obtained from the literature. Healthy normotensive New Zealand rabbits were topically treated with a single drop of 0.15% brimonidine tartrate, and normotensive beagle dogs were treated with a single drop of Xalatan®. At pre-determined time intervals, IOP was measured and aqueous humor samples were obtained from a satellite group of animals. Population-based PKPD modeling was performed to describe the IOP data, and the chosen model was extended to predict IOP in patients. Baseline IOP clearly depicts a distinctive circadian rhythm in rabbits versus humans. An aqueous humor dynamics-based physiological model was developed to describe the baseline diurnal IOP across species. The model was extended to incorporate the effect of drug administration on baseline IOP in rabbits and dogs. The translational model with substituted human aqueous humor dynamic parameters predicted IOP in patients following drug treatment. A physiology-based mechanistic PKPD model was developed to describe the baseline and post-treatment IOP in animals. The preclinical PKPD model was successfully translated to predict IOP in patients with glaucoma or OHT and can be applied in assisting dose and treatment selection and in predicting outcomes of glaucoma clinical trials.
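A highly simplified sketch of the aqueous-humor-dynamics idea: IOP is treated as a turnover variable whose inflow (aqueous production) is inhibited by drug concentration following topical dosing. The rate constants, EC50, and dose kinetics are illustrative assumptions, not the fitted rabbit, dog, or human parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

def iop_model(t, y, dose_kel=0.8, imax=0.35, ec50=5.0, k_in=3.0, k_out=0.2):
    """Turnover (indirect-response) model: drug inhibits aqueous humor inflow."""
    conc, iop = y
    inhibition = imax * conc / (ec50 + conc)
    dconc = -dose_kel * conc                          # first-order loss from the eye (1/h)
    diop = k_in * (1.0 - inhibition) - k_out * iop    # production vs outflow (mmHg/h)
    return [dconc, diop]

baseline_iop = 3.0 / 0.2   # k_in / k_out = 15 mmHg at steady state
sol = solve_ivp(iop_model, (0.0, 24.0), y0=[50.0, baseline_iop], max_step=0.05)
print(f"Minimum predicted IOP over 24 h: {sol.y[1].min():.1f} mmHg")
```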
NASA Astrophysics Data System (ADS)
Müller, M. F.; Thompson, S. E.
2015-09-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by a strong wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are strongly favored over statistical models.
NASA Astrophysics Data System (ADS)
Müller, M. F.; Thompson, S. E.
2016-02-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
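To make the flow duration curve concept concrete, the sketch below computes an empirical FDC (flow versus exceedance probability) from a synthetic daily streamflow series; the lognormal flow generator is purely illustrative and stands in for either the stochastic or the statistical model discussed above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for modeled or observed daily streamflow (m^3/s), lognormally distributed.
flows = rng.lognormal(mean=1.0, sigma=0.9, size=3650)

def flow_duration_curve(q):
    """Return flows sorted high-to-low and their empirical exceedance probabilities."""
    q_sorted = np.sort(q)[::-1]
    exceedance = np.arange(1, q_sorted.size + 1) / (q_sorted.size + 1)  # Weibull plotting position
    return exceedance, q_sorted

p, q = flow_duration_curve(flows)
for target in (0.05, 0.50, 0.95):
    print(f"Q{int(target * 100):02d} (exceeded {target:.0%} of the time): "
          f"{q[np.searchsorted(p, target)]:.2f} m^3/s")
```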
Mathematical modeling of PDC bit drilling process based on a single-cutter mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wojtanowicz, A.K.; Kuru, E.
1993-12-01
An analytical development of a new mechanistic drilling model for polycrystalline diamond compact (PDC) bits is presented. The derivation accounts for the static balance of forces acting on a single PDC cutter and is based on an assumed similarity between bit and cutter. The model is fully explicit, with physical meanings given to all constants and functions. Three equations constitute the mathematical model: torque, drilling rate, and bit life. The equations comprise the cutter's geometry, rock properties, drilling parameters, and four empirical constants. The constants are used to match the model to a PDC drilling process. Also presented are qualitative and predictive verifications of the model. Qualitative verification shows that the model's response to drilling process variables is similar to the behavior of full-size PDC bits. However, the accuracy of the model's predictions of PDC bit performance is limited primarily by the imprecision of bit-dull evaluation. The verification study is based upon reported laboratory and field drilling tests as well as field data collected by the authors.
A comparative evaluation of models to predict human intestinal metabolism from nonclinical data
Yau, Estelle; Petersson, Carl; Dolgos, Hugues
2017-01-01
Abstract Extensive gut metabolism is often associated with the risk of low and variable bioavailability. The prediction of the fraction of drug escaping gut wall metabolism as well as transporter-mediated secretion (Fg) has been challenged by the lack of appropriate preclinical models. The purpose of this study is to compare the performance of models that are widely employed in the pharmaceutical industry today to estimate Fg and, based on the outcome, to provide recommendations for the prediction of human Fg during drug discovery and early drug development. The use of in vitro intrinsic clearance from human liver microsomes (HLM) in three mechanistic models – the ADAM, Qgut and Competing Rates – was evaluated for drugs whose metabolism is dominated by CYP450s, assuming that the effect of transporters is negligible. The utility of rat as a model for human Fg was also explored. The ADAM, Qgut and Competing Rates models had comparable prediction success (70%, 74%, 69%, respectively) and bias (AFE = 1.26, 0.74 and 0.81, respectively). However, the ADAM model showed better accuracy compared with the Qgut and Competing Rates models (RMSE = 0.20 vs 0.30 and 0.25, respectively). Rat is not a good model (prediction success = 32%, RMSE = 0.48 and AFE = 0.44) as it seems systematically to under-predict human Fg. Hence, we would recommend the use of rat to identify the need for Fg assessment, followed by the use of HLM in simple models to predict human Fg. © 2017 Merck KGaA. Biopharmaceutics & Drug Disposition Published by John Wiley & Sons, Ltd. PMID:28152562
A comparative evaluation of models to predict human intestinal metabolism from nonclinical data.
Yau, Estelle; Petersson, Carl; Dolgos, Hugues; Peters, Sheila Annie
2017-04-01
Extensive gut metabolism is often associated with the risk of low and variable bioavailability. The prediction of the fraction of drug escaping gut wall metabolism as well as transporter-mediated secretion (Fg) has been challenged by the lack of appropriate preclinical models. The purpose of this study is to compare the performance of models that are widely employed in the pharmaceutical industry today to estimate Fg and, based on the outcome, to provide recommendations for the prediction of human Fg during drug discovery and early drug development. The use of in vitro intrinsic clearance from human liver microsomes (HLM) in three mechanistic models - the ADAM, Qgut and Competing Rates - was evaluated for drugs whose metabolism is dominated by CYP450s, assuming that the effect of transporters is negligible. The utility of rat as a model for human Fg was also explored. The ADAM, Qgut and Competing Rates models had comparable prediction success (70%, 74%, 69%, respectively) and bias (AFE = 1.26, 0.74 and 0.81, respectively). However, the ADAM model showed better accuracy compared with the Qgut and Competing Rates models (RMSE = 0.20 vs 0.30 and 0.25, respectively). Rat is not a good model (prediction success = 32%, RMSE = 0.48 and AFE = 0.44) as it seems systematically to under-predict human Fg. Hence, we would recommend the use of rat to identify the need for Fg assessment, followed by the use of HLM in simple models to predict human Fg. © 2017 Merck KGaA. Biopharmaceutics & Drug Disposition Published by John Wiley & Sons, Ltd.
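For orientation, the sketch below implements the general form of two of the models compared: a "competing rates" estimate in which permeation and gut-wall metabolism compete, and a Qgut-style estimate in which an effective flow term (a hybrid of villous blood flow and permeability clearance) competes with unbound intrinsic clearance. The equations follow the commonly published forms as I understand them, and all numerical inputs are illustrative placeholders, not values from the study.

```python
def fg_competing_rates(cl_perm, cl_int_gut):
    """Fraction escaping gut metabolism when permeation and metabolism compete."""
    return cl_perm / (cl_perm + cl_int_gut)

def fg_qgut(q_villi, cl_perm, fu_gut, cl_int_gut):
    """Qgut-style estimate: hybrid flow term competing with unbound gut metabolism."""
    q_gut = q_villi * cl_perm / (q_villi + cl_perm)
    return q_gut / (q_gut + fu_gut * cl_int_gut)

# Illustrative inputs (L/h); not measured values for any specific compound.
cl_perm = 20.0      # permeability clearance across the enterocyte
cl_int_gut = 30.0   # intrinsic metabolic clearance in the gut wall
q_villi = 18.0      # villous blood flow
fu_gut = 1.0        # unbound fraction in the enterocyte (often assumed to be 1)

print(f"Competing Rates Fg: {fg_competing_rates(cl_perm, cl_int_gut):.2f}")
print(f"Qgut Fg:            {fg_qgut(q_villi, cl_perm, fu_gut, cl_int_gut):.2f}")
```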
Thermodynamics-based models of transcriptional regulation with gene sequence.
Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing
2015-12-01
Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or on heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity, which is computed from statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
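A minimal sketch of a thermodynamics-based regulatory model in this spirit: the promoter occupancy of a transcription factor follows an equilibrium binding probability derived from its concentration and site affinity, and occupancy drives a differential equation for mRNA levels. The binding constant, concentrations, and rate constants below are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

def occupancy(tf_conc, kd):
    """Equilibrium probability that the site is bound (two-state thermodynamic model)."""
    return (tf_conc / kd) / (1.0 + tf_conc / kd)

def mrna_dynamics(t, m, kd=50.0, alpha_max=10.0, delta=0.5):
    """dm/dt = alpha_max * P(bound) - delta * m, driven by a time-varying TF pulse."""
    tf_conc = 100.0 * np.exp(-((t - 5.0) ** 2) / 4.0)  # assumed TF concentration pulse
    return alpha_max * occupancy(tf_conc, kd) - delta * m

sol = solve_ivp(mrna_dynamics, (0.0, 20.0), y0=[0.0], max_step=0.05)
print(f"Peak mRNA level: {sol.y[0].max():.2f} (arbitrary units)")
```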
Gering, Kevin L
2013-08-27
A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine performance fade characteristics of the electrochemical cell and analyzes the mechanistic level model to estimate performance fade characteristics over the aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model is also based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing a second exchange current density.
NASA Astrophysics Data System (ADS)
Hong, Yoon-Seok; Rosen, Michael R.
2002-03-01
An urban fractured-rock aquifer system, where storm water is disposed of via 'soak holes' drilled directly into the top of fractured basalt, is highly dynamic, and the theories or knowledge needed to generate a model are still incomplete and insufficient. Formulating an accurate mechanistic model, usually based on first principles (physical and chemical laws, mass balance, diffusion and transport, etc.), therefore requires time- and money-consuming tasks. Instead of a human developing the mechanistic model, this paper presents an approach to automatic model evolution with genetic programming (GP) to model the dynamic behaviour of groundwater level fluctuations affected by storm water infiltration. GP automatically evolves mathematical models with an understandable structure, represented as function trees, by means of natural selection ('survival of the fittest') applied through genetic operators (reproduction, crossover, and mutation). The simulation results show that GP is not only capable of predicting the groundwater level fluctuation due to storm water infiltration but also provides insight into the dynamic behaviour of a partially known urban fractured-rock aquifer system by allowing knowledge extraction from the evolved models. Our results show that GP can work as a cost-effective modelling tool, enabling us to create prototype models quickly and inexpensively and assisting us in developing accurate models in less time, even with limited experience and incomplete knowledge of an urban fractured-rock aquifer system affected by storm water infiltration.
Canadian Field Soils IV: Modeling Thermal Conductivity at Dryness and Saturation
NASA Astrophysics Data System (ADS)
Tarnawski, V. R.; McCombie, M. L.; Leong, W. H.; Coppa, P.; Corasaniti, S.; Bovesecchi, G.
2018-03-01
The thermal conductivity data of 40 Canadian soils at dryness (λ_dry) and at full saturation (λ_sat) were used to verify 13 predictive models, i.e., four mechanistic, four semi-empirical and five empirical equations. The performance of each model, for λ_dry and λ_sat, was evaluated using a standard deviation (SD) formula. Among the mechanistic models applied to dry soils, the closest λ_dry estimates were obtained by MaxRTCM (SD = ±0.018 W·m⁻¹·K⁻¹), followed by the de Vries and series-parallel (S-∥) models. Among the semi-empirical equations (deVries-ave, Advanced Geometric Mean Model (A-GMM), Chaudhary and Bhandari (C-B) and Chen's equation), the closest λ_dry estimates were obtained by the C-B model (±0.022 W·m⁻¹·K⁻¹). Among the empirical equations, the best λ_dry estimates were given by CDry-40 (±0.021 W·m⁻¹·K⁻¹ and ±0.018 W·m⁻¹·K⁻¹ for the 18 coarse and 22 fine soils, respectively). In addition, the λ_dry and λ_sat models were applied to the λ_sat database of 21 other soils. Of all the models tested, only the MaxRTCM and CDry-40 models provided the closest λ_dry estimates for both the 40 Canadian soils and the 21 additional soils. The best λ_sat estimates for the 40 Canadian soils and the 21 soils were given by the A-GMM and the S-∥ model.
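As one concrete example of the model families compared, a geometric mean model estimates saturated thermal conductivity from porosity and the conductivities of the solid and water phases; the sketch below uses commonly quoted component conductivities, which are assumptions here rather than values taken from the paper.

```python
def solids_conductivity(quartz_fraction, k_quartz=7.7, k_other=2.0):
    """Geometric mean of mineral conductivities (W/(m K)); a common assumption."""
    return (k_quartz ** quartz_fraction) * (k_other ** (1.0 - quartz_fraction))

def k_sat_geometric_mean(porosity, quartz_fraction, k_water=0.594):
    """Saturated soil conductivity as a geometric mean of solids and pore water."""
    k_solids = solids_conductivity(quartz_fraction)
    return (k_solids ** (1.0 - porosity)) * (k_water ** porosity)

# Illustrative coarse (sandy, quartz-rich) versus fine (clayey) soils.
print(f"coarse: {k_sat_geometric_mean(porosity=0.40, quartz_fraction=0.70):.2f} W/(m K)")
print(f"fine:   {k_sat_geometric_mean(porosity=0.50, quartz_fraction=0.20):.2f} W/(m K)")
```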
Identifying gnostic predictors of the vaccine response.
Haining, W Nicholas; Pulendran, Bali
2012-06-01
Molecular predictors of the response to vaccination could transform vaccine development. They would allow larger numbers of vaccine candidates to be rapidly screened, shortening the development time for new vaccines. Gene-expression based predictors of vaccine response have shown early promise. However, a limitation of gene-expression based predictors is that they often fail to reveal the mechanistic basis of their ability to classify response. Linking predictive signatures to the function of their component genes would advance basic understanding of vaccine immunity and also improve the robustness of vaccine prediction. New analytic tools now allow more biological meaning to be extracted from predictive signatures. Functional genomic approaches to perturb gene expression in mammalian cells permit the function of predictive genes to be surveyed in highly parallel experiments. The challenge for vaccinologists is therefore to use these tools to embed mechanistic insights into predictors of vaccine response. Copyright © 2012 Elsevier Ltd. All rights reserved.
Agent-based models in translational systems biology
An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram
2013-01-01
Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989
González-Domínguez, Elisa; Caffi, Tito; Ciliberti, Nicola; Rossi, Vittorio
2015-01-01
A mechanistic model for Botrytis cinerea on grapevine was developed. The model, which accounts for conidia production on various inoculum sources and for multiple infection pathways, considers two infection periods. During the first period (“inflorescences clearly visible” to “berries groat-sized”), the model calculates: i) infection severity on inflorescences and young clusters caused by conidia (SEV1). During the second period (“majority of berries touching” to “berries ripe for harvest”), the model calculates: ii) infection severity of ripening berries by conidia (SEV2); and iii) severity of berry-to-berry infection caused by mycelium (SEV3). The model was validated in 21 epidemics (vineyard × year combinations) between 2009 and 2014 in Italy and France. A discriminant function analysis (DFA) was used to: i) evaluate the ability of the model to predict mild, intermediate, and severe epidemics; and ii) assess how SEV1, SEV2, and SEV3 contribute to epidemics. The model correctly classified the severity of 17 of 21 epidemics. Results from DFA were also used to calculate the daily probabilities that an ongoing epidemic would be mild, intermediate, or severe. SEV1 was the most influential variable in discriminating between mild and intermediate epidemics, whereas SEV2 and SEV3 were relevant for discriminating between intermediate and severe epidemics. The model represents an improvement of previous B. cinerea models in viticulture and could be useful for making decisions about Botrytis bunch rot control. PMID:26457808
Fisher's geometrical model emerges as a property of complex integrated phenotypic networks.
Martin, Guillaume
2014-05-01
Models relating phenotype space to fitness (phenotype-fitness landscapes) have seen important developments recently. They can roughly be divided into mechanistic models (e.g., metabolic networks) and more heuristic models like Fisher's geometrical model. Each has its own drawbacks, but both yield testable predictions on how the context (genomic background or environment) affects the distribution of mutation effects on fitness and thus adaptation. Both have received some empirical validation. This article aims at bridging the gap between these approaches. A derivation of the Fisher model "from first principles" is proposed, where the basic assumptions emerge from a more general model, inspired by mechanistic networks. I start from a general phenotypic network relating unspecified phenotypic traits and fitness. A limited set of qualitative assumptions is then imposed, mostly corresponding to known features of phenotypic networks: a large set of traits is pleiotropically affected by mutations and determines a much smaller set of traits under optimizing selection. Otherwise, the model remains fairly general regarding the phenotypic processes involved or the distribution of mutation effects affecting the network. A statistical treatment and a local approximation close to a fitness optimum yield a landscape that is effectively the isotropic Fisher model or its extension with a single dominant phenotypic direction. The fit of the resulting alternative distributions is illustrated in an empirical data set. These results bear implications on the validity of Fisher's model's assumptions and on which features of mutation fitness effects may vary (or not) across genomic or environmental contexts.
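To illustrate the heuristic end of that spectrum, the sketch below samples the distribution of mutation fitness effects in an isotropic Fisher geometrical model: random phenotypic displacements in n dimensions are applied to a parent sitting at some distance from a Gaussian fitness optimum. Dimensionality, mutation size, and the parent's degree of maladaptation are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(z):
    """Isotropic Gaussian fitness peak at the origin of phenotype space."""
    return np.exp(-0.5 * np.sum(z ** 2, axis=-1))

n_traits = 10        # dimensionality of phenotype space (assumed)
sigma_mut = 0.1      # per-trait mutational standard deviation (assumed)
parent = np.zeros(n_traits)
parent[0] = 1.0      # parent displaced from the optimum along one axis

mutations = rng.normal(0.0, sigma_mut, size=(100_000, n_traits))
s = fitness(parent + mutations) / fitness(parent) - 1.0  # selection coefficients

print(f"Mean effect: {s.mean():.4f}, fraction beneficial: {(s > 0).mean():.3f}")
```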
Stomatal control and hydraulic conductance, with special reference to tall trees.
Franks, Peter J
2004-08-01
A better understanding of the mechanistic basis of stomatal control is necessary to understand why modes of stomatal response differ among individual trees, and to improve the theoretical foundation for predictive models and manipulative experiments. Current understanding of the mechanistic basis of stomatal control is reviewed here and discussed in relation to the plant hydraulic system. Analysis focused on: (1) the relative role of hydraulic conductance in the vicinity of the stomatal apparatus versus whole-plant hydraulic conductance; (2) the influence of guard cell inflation characteristics and the mechanical interaction between guard cells and epidermal cells; and (3) the system requirements for moderate versus dramatic reductions in stomatal conductance with increasing evaporation potential. Special consideration was given to the potential effect of changes in hydraulic properties as trees grow taller. Stomatal control of leaf gas exchange is coupled to the entire plant hydraulic system and the basis of this coupling is the interdependence of guard cell water potential and transpiration rate. This hydraulic feedback loop is always present, but its dynamic properties may be altered by growth or cavitation-induced changes in hydraulic conductance, and may vary with genetically related differences in hydraulic conductances. Mechanistic models should include this feedback loop. Plants vary in their ability to control transpiration rate sufficiently to maintain constant leaf water potential. Limited control may be achieved through the hydraulic feedback loop alone, but for tighter control, an additional element linking transpiration rate to guard cell osmotic pressure may be needed.
Multiscale mechanistic modeling in pharmaceutical research and development.
Kuepfer, Lars; Lippert, Jörg; Eissing, Thomas
2012-01-01
Discontinuation of drug development projects due to lack of efficacy or adverse events is one of the main cost drivers in pharmaceutical research and development (R&D). Investments have to be written off and contribute to the total costs of a successful drug candidate receiving marketing authorization and allowing a return on investment. A major risk for pharmaceutical innovator companies is late-stage clinical failure, since costs for individual clinical trials may exceed the one billion Euro threshold. To guide investment decisions and to safeguard maximum medical benefit and safety for patients recruited in clinical trials, it is therefore essential to understand the clinical consequences of all information and data generated. The complexity of the physiological and pathophysiological processes and the sheer amount of information available exceed the mental capacity of any human being and prevent reliable prediction of success in clinical development. A rigorous integration of knowledge, assumptions, and experimental data into computational models promises a significant improvement in the rationalization of decision making in the pharmaceutical industry. We here give an overview of the current status of modeling and simulation in pharmaceutical R&D and outline the perspectives of more recent developments in mechanistic modeling. Specific modeling approaches for different biological scales, ranging from intracellular processes to whole-organism physiology, are introduced, and an example of integrative multiscale modeling of therapeutic efficiency in clinical oncology trials is showcased.
A global scale mechanistic model of photosynthetic capacity (LUNA V1.0)
Ali, Ashehad A.; Xu, Chonggang; Rogers, Alistair; ...
2016-02-12
Although plant photosynthetic capacity as determined by the maximum carboxylation rate (i.e., Vc,max25) and the maximum electron transport rate (i.e., Jmax25) at a reference temperature (generally 25 °C) is known to vary considerably in space and time in response to environmental conditions, it is typically parameterized in Earth system models (ESMs) with tabulated values associated with plant functional types. In this study, we have developed a mechanistic model of leaf utilization of nitrogen for assimilation (LUNA) to predict photosynthetic capacity at the global scale under different environmental conditions. We adopt an optimality hypothesis for nitrogen allocation among light capture, electron transport, carboxylation and respiration. The LUNA model is able to reasonably capture the measured spatial and temporal patterns of photosynthetic capacity, as it explains ~55 % of the global variation in observed values of Vc,max25 and ~65 % of the variation in the observed values of Jmax25. Model simulations with LUNA under current and future climate conditions demonstrate that modeled values of Vc,max25 are most affected in high-latitude regions under future climates. In conclusion, ESMs that relate the values of Vc,max25 or Jmax25 to plant functional types only are likely to substantially overestimate future global photosynthesis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Lei; Shi, Zhenqing; Lu, Yang
Understanding the kinetics of toxic ion reactions with ferrihydrite is crucial for predicting the dynamic behavior of contaminants in soil environments. In this study, the kinetics of As(V), Cr(VI), Cu, and Pb adsorption and desorption on ferrihydrite were investigated with a combination of laboratory macroscopic experiments, microscopic investigation and mechanistic modeling. The rates of As(V), Cr(VI), Cu, and Pb adsorption and desorption on ferrihydrite, systematically studied using a stirred-flow method, were highly dependent on the reaction pH and metal concentrations and varied significantly among the four metals. Spherical aberration-corrected scanning transmission electron microscopy (Cs-STEM) showed that, at sub-nano scales, all four metals were distributed homogeneously within the ferrihydrite particle aggregates after adsorption reactions, with no evidence of surface diffusion-controlled processes. Based on the experimental results, we developed a unifying kinetics model for both cation and oxyanion adsorption/desorption on ferrihydrite built on the mechanism-based equilibrium model CD-MUSIC. Overall, the model described the kinetic results well, and we quantitatively demonstrated how the equilibrium properties of cation and oxyanion binding to various ferrihydrite sites affected the adsorption and desorption rates. Our results provide a unifying quantitative modeling method for the kinetics of both cation and oxyanion adsorption/desorption on iron minerals.
Xu, Xiangtao; Medvigy, David; Powers, Jennifer S; Becknell, Justin M; Guan, Kaiyu
2016-10-01
We assessed whether diversity in plant hydraulic traits can explain the observed diversity in plant responses to water stress in seasonally dry tropical forests (SDTFs). The Ecosystem Demography model 2 (ED2) was updated with a trait-driven mechanistic plant hydraulic module, as well as novel drought-phenology and plant water stress schemes. Four plant functional types were parameterized on the basis of meta-analysis of plant hydraulic traits. Simulations from both the original and the updated ED2 were evaluated against 5 yr of field data from a Costa Rican SDTF site and remote-sensing data over Central America. The updated model generated realistic plant hydraulic dynamics, such as leaf water potential and stem sap flow. Compared with the original ED2, predictions from our novel trait-driven model matched better with observed growth, phenology and their variations among functional groups. Most notably, the original ED2 produced unrealistically small leaf area index (LAI) and underestimated cumulative leaf litter. Both of these biases were corrected by the updated model. The updated model was also better able to simulate spatial patterns of LAI dynamics in Central America. Plant hydraulic traits are intercorrelated in SDTFs. Mechanistic incorporation of plant hydraulic traits is necessary for the simulation of spatiotemporal patterns of vegetation dynamics in SDTFs in vegetation models. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.