DOUBLE SHELL TANK (DST) HYDROXIDE DEPLETION MODEL FOR CARBON DIOXIDE ABSORPTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
OGDEN DM; KIRCH NW
2007-10-31
This document develops a supernatant hydroxide ion depletion model based on mechanistic principles. The mechanistic model of carbon dioxide absorption is developed in this report and benchmarked against historical tank supernatant hydroxide data and vapor space carbon dioxide data. The newly generated mechanistic model is also compared with the previously applied empirical hydroxide depletion equations.
Bridging paradigms: hybrid mechanistic-discriminative predictive models.
Doyle, Orla M; Tsaneva-Atanasova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa
2013-03-01
Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.
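One common hybrid pattern, a mechanistic core plus a data-driven residual correction, can be sketched in a few lines. The decay model, rate constants, and synthetic "observations" below are illustrative assumptions, not taken from the paper:

```python
import math

# Mechanistic core: first-order decay with an assumed rate constant.
def mechanistic(t, k=0.5):
    return math.exp(-k * t)

# Synthetic "observations" generated here for illustration: the true
# process decays slightly faster than the mechanistic model assumes.
times = [0.5 * i for i in range(10)]
observed = [math.exp(-0.6 * t) for t in times]

# Data-driven component: least-squares fit of a linear-in-time correction
# to the mechanistic model's residuals.
residuals = [obs - mechanistic(t) for t, obs in zip(times, observed)]
slope = sum(t * r for t, r in zip(times, residuals)) / sum(t * t for t in times)

def hybrid(t):
    # Hybrid prediction: mechanistic structure plus learned correction.
    return mechanistic(t) + slope * t

true_value = math.exp(-0.6 * 3.0)
```

The same pattern scales up: the decay model becomes an ODE system and the linear correction becomes, for example, a neural network trained on residuals.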
Safaie, Ammar; Wendzel, Aaron; Ge, Zhongfu; Nevers, Meredith; Whitman, Richard L.; Corsi, Steven R.; Phanikumar, Mantha S.
2016-01-01
Statistical and mechanistic models are popular tools for predicting the levels of indicator bacteria at recreational beaches. Researchers tend to use one class of model or the other, and it is difficult to generalize statements about their relative performance due to differences in how the models are developed, tested, and used. We describe a cooperative modeling approach for freshwater beaches impacted by point sources in which insights derived from mechanistic modeling were used to further improve the statistical models and vice versa. The statistical models provided a basis for assessing the mechanistic models which were further improved using probability distributions to generate high-resolution time series data at the source, long-term “tracer” transport modeling based on observed electrical conductivity, better assimilation of meteorological data, and the use of unstructured-grids to better resolve nearshore features. This approach resulted in improved models of comparable performance for both classes including a parsimonious statistical model suitable for real-time predictions based on an easily measurable environmental variable (turbidity). The modeling approach outlined here can be used at other sites impacted by point sources and has the potential to improve water quality predictions resulting in more accurate estimates of beach closures.
Combining correlative and mechanistic habitat suitability models to improve ecological compensation.
Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud
2015-02-01
Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to inappropriate methods used to forecast required compensation in environmental impact assessments. These assessments are mostly descriptive and only valid at limited spatial and temporal scales. However, habitat suitability models developed to predict the impacts of environmental changes on potential species' distributions should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or synoptically to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. On the contrary, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation. Moreover, among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data compared to correlative models. Hence we present two approaches which combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be advantageously used to assess ecological compensation requirement efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.
Zelić, B; Bolf, N; Vasić-Racki, D
2006-06-01
Three different models: the unstructured mechanistic black-box model, the input-output neural network-based model and the externally recurrent neural network model were used to describe the pyruvate production process from glucose and acetate using the genetically modified Escherichia coli YYC202 ldhA::Kan strain. The experimental data were used from the recently described batch and fed-batch experiments [ Zelić B, Study of the process development for Escherichia coli-based pyruvate production. PhD Thesis, University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb, Croatia, July 2003. (In English); Zelić et al. Bioproc Biosyst Eng 26:249-258 (2004); Zelić et al. Eng Life Sci 3:299-305 (2003); Zelić et al Biotechnol Bioeng 85:638-646 (2004)]. The neural networks were built out of the experimental data obtained in the fed-batch pyruvate production experiments with the constant glucose feed rate. The model validation was performed using the experimental results obtained from the batch and fed-batch pyruvate production experiments with the constant acetate feed rate. Dynamics of the substrate and product concentration changes was estimated using two neural network-based models for biomass and pyruvate. It was shown that neural networks could be used for the modeling of complex microbial fermentation processes, even in conditions in which mechanistic unstructured models cannot be applied.
Mechanistic species distribution modelling as a link between physiology and conservation.
Evans, Tyler G; Diamond, Sarah E; Kelly, Morgan W
2015-01-01
Climate change conservation planning relies heavily on correlative species distribution models that estimate future areas of occupancy based on environmental conditions encountered in present-day ranges. The approach benefits from rapid assessment of vulnerability over a large number of organisms, but can have poor predictive power when transposed to novel environments and reveals little in the way of causal mechanisms that define changes in species distribution or abundance. Having conservation planning rely largely on this single approach also increases the risk of policy failure. Mechanistic models that are parameterized with physiological information are expected to be more robust when extrapolating distributions to future environmental conditions and can identify the physiological processes that set range boundaries. Implementation of mechanistic species distribution models requires knowledge of how environmental change influences physiological performance, and because this information is currently restricted to a comparatively small number of well-studied organisms, use of mechanistic modelling in the context of climate change conservation is limited. In this review, we propose that the need to develop mechanistic models that incorporate physiological data presents an opportunity for physiologists to contribute more directly to climate change conservation and advance the field of conservation physiology. We begin by describing the prevalence of species distribution modelling in climate change conservation, highlighting the benefits and drawbacks of both mechanistic and correlative approaches. Next, we emphasize the need to expand mechanistic models and discuss potential metrics of physiological performance suitable for integration into mechanistic models. We conclude by summarizing other factors, such as the need to consider demography, that limit the broader application of mechanistic models in climate change conservation.
Ideally, modellers, physiologists and conservation practitioners would work collaboratively to build models, interpret results and consider conservation management options, and articulating this need here may help to stimulate collaboration.
Bird Migration Under Climate Change - A Mechanistic Approach Using Remote Sensing
NASA Technical Reports Server (NTRS)
Smith, James A.; Blattner, Tim; Messmer, Peter
2010-01-01
The broad-scale reductions and shifts that may be expected under climate change in the availability and quality of stopover habitat for long-distance migrants are an area of increasing concern for conservation biologists. Researchers generally have taken two broad approaches to the modeling of migration behaviour to understand the impact of these changes on migratory bird populations. These include models based on causal processes and their response to environmental stimulation, "mechanistic models", or models that primarily are based on observed animal distribution patterns and the correlation of these patterns with environmental variables, i.e. "data driven" models. Investigators have applied the latter technique to forecast changes in migration patterns with changes in the environment, for example as might be expected under climate change, by forecasting how the underlying environmental data layers upon which the relationships are built will change over time. The learned geostatistical correlations are then applied to the modified data layers. However, this is problematic. Even if the projections of how the underlying data layers will change are correct, it is not evident that the statistical relationships will remain the same, i.e. that the organism will not adapt its behaviour to the changing conditions. Mechanistic models that explicitly take into account the physical, biological, and behavioural responses of an organism, as well as the underlying changes in the landscape, offer an alternative to address these shortcomings. The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, enables the application of mechanistic models to predict how continental bird migration patterns may change in response to environmental change.
In earlier work, we simulated the impact of wetland loss and inter-annual variability on the fitness of migratory shorebirds in the central flyways of North America. We demonstrated the phenotypic plasticity of a migratory population of Pectoral Sandpipers, consisting of an ensemble of 10,000 individual birds, in response to changes in stopover locations, using an individual-based migration model driven by remotely sensed land surface data, climate data and biological field data. With the advent of new computing capabilities enabled by recent GPGPU computing paradigms and commodity hardware, it is now possible both to simulate larger ensemble populations and to incorporate more realistic mechanistic factors into migration models. Here, we take our first steps in using these tools to study the impact of long-term drought variability on shorebird survival.
A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.
Revell, Christopher; Somveille, Marius
2017-08-29
In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analyses of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model captured remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird species and other highly mobile species, improving our understanding of the relative importance of the various factors driving migration and making predictions that could be useful for conservation.
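The general idea of movement within a potential landscape using statistical-mechanics-style decision rules can be sketched as a one-dimensional random walk with Boltzmann weights. The landscape values and the parameter beta below are invented for illustration; this is not the authors' two-parameter implementation:

```python
import math
import random

random.seed(0)

# Hypothetical 1-D potential landscape: lower values mean more
# favourable environmental conditions; the minimum is at index 6.
potential = [5.0, 4.0, 3.5, 3.0, 2.5, 1.0, 0.5, 1.5]

def step(pos, beta):
    """Choose the next cell among {pos-1, pos, pos+1} with Boltzmann
    weights exp(-beta * potential): low potential is favoured, and beta
    controls how deterministic the movement is."""
    moves = [p for p in (pos - 1, pos, pos + 1) if 0 <= p < len(potential)]
    weights = [math.exp(-beta * potential[p]) for p in moves]
    return random.choices(moves, weights=weights)[0]

def simulate(start, beta, n_steps=50):
    pos = start
    for _ in range(n_steps):
        pos = step(pos, beta)
    return pos

# With a strong bias toward low potential, the simulated bird drifts
# from its starting location toward the potential minimum.
final = simulate(0, beta=5.0)
```

In two dimensions, with the potential derived from environmental data layers, the same rule produces population-level movement patterns that can be compared against tracking data.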
Tsamandouras, Nikolaos; Rostami-Hodjegan, Amin; Aarons, Leon
2015-01-01
Pharmacokinetic models range from being entirely exploratory and empirical, to semi-mechanistic and ultimately complex physiologically based pharmacokinetic (PBPK) models. This choice is conditional on the modelling purpose as well as the amount and quality of the available data. The main advantage of PBPK models is that they can be used to extrapolate outside the studied population and experimental conditions. The trade-off for this advantage is a complex system of differential equations with a considerable number of model parameters. When these parameters cannot be informed from in vitro or in silico experiments they are usually optimized with respect to observed clinical data. Parameter estimation in complex models is a challenging task associated with many methodological issues which are discussed here with specific recommendations. Concepts such as structural and practical identifiability are described with regards to PBPK modelling and the value of experimental design and sensitivity analyses is sketched out. Parameter estimation approaches are discussed, while we also highlight the importance of not neglecting the covariance structure between model parameters and the uncertainty and population variability that is associated with them. Finally the possibility of using model order reduction techniques and minimal semi-mechanistic models that retain the physiological-mechanistic nature only in the parts of the model which are relevant to the desired modelling purpose is emphasized. Careful attention to all the above issues allows us to integrate successfully information from in vitro or in silico experiments together with information deriving from observed clinical data and develop mechanistically sound models with clinical relevance. PMID:24033787
Assmus, Frauke; Houston, J Brian; Galetin, Aleksandra
2017-11-15
The prediction of tissue-to-plasma water partition coefficients (Kpu) from in vitro and in silico data using the tissue-composition based model (Rodgers & Rowland, J Pharm Sci. 2005, 94(6):1237-48) is well established. However, distribution of basic drugs, in particular into lysosome-rich lung tissue, tends to be under-predicted by this approach. The aim of this study was to develop an extended mechanistic model for the prediction of Kpu which accounts for lysosomal sequestration and the contribution of different cell types in the tissue of interest. The extended model is based on compound-specific physicochemical properties and tissue composition data to describe drug ionization, distribution into tissue water, and drug binding to neutral lipids, neutral phospholipids and acidic phospholipids in tissues, including lysosomes. Physiological data on the types of cells contributing to lung, kidney and liver, their lysosomal content and lysosomal pH were collated from the literature. The predictive power of the extended mechanistic model was evaluated using a dataset of 28 basic drugs (pKa ≥ 7.8; 17 β-blockers, 11 structurally diverse drugs) for which experimentally determined Kpu data in rat tissue have been reported. Accounting for lysosomal sequestration in the extended mechanistic model improved the accuracy of Kpu predictions in lung compared to the original Rodgers model (56% of drugs within 2-fold and 88% within 3-fold of observed values). Reduction in the extent of Kpu under-prediction was also evident in liver and kidney. However, consideration of lysosomal sequestration increased the occurrence of over-predictions, yielding overall comparable model performance for kidney and liver, with 68% and 54% of Kpu values within 2-fold error, respectively.
High lysosomal concentration ratios relative to cytosol (>1000-fold) were predicted for the drugs investigated; the extent differed depending on the lysosomal pH and concentration of acidic phospholipids among cell types. Despite this extensive lysosomal sequestration in the individual cell types, the maximal change in the overall predicted tissue Kpu was <3-fold for the lysosome-rich tissues investigated here. Accounting for the variability in cellular physiological model input parameters, in particular lysosomal pH and the fraction of the cellular volume occupied by the lysosomes, only partially explained discrepancies between observed and predicted Kpu data in the lung. Improved understanding of the system properties, e.g., cell/organelle composition, is required to support further development of mechanistic equations for the prediction of drug tissue distribution. Application of this revised mechanistic model is recommended for prediction of Kpu in lysosome-rich tissue to facilitate the advancement of physiologically-based prediction of volume of distribution and drug exposure in the tissues. Copyright © 2017 Elsevier B.V. All rights reserved.
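The pH-partitioning core of lysosomal trapping can be illustrated with the classical Henderson-Hasselbalch ion-trapping ratio for a monoprotic base. This simplified sketch omits the binding terms of the extended model, and the compartment pH values are assumed:

```python
def lysosome_to_cytosol_ratio(pka, ph_lys=4.5, ph_cyt=7.0):
    """pH-partitioning (ion-trapping) ratio for a monoprotic base,
    assuming only the neutral species crosses the lysosomal membrane and
    both compartments are at steady state. This is a minimal sketch, not
    the full extended model, which also accounts for binding to neutral
    lipids, neutral phospholipids and acidic phospholipids."""
    # Total (ionized + un-ionized) concentration is proportional to
    # 1 + 10**(pKa - pH) when the neutral species equilibrates freely.
    return (1 + 10 ** (pka - ph_lys)) / (1 + 10 ** (pka - ph_cyt))

# For a strong base (pKa 9.0), pH partitioning alone already predicts a
# several-hundred-fold lysosome:cytosol concentration ratio.
ratio = lysosome_to_cytosol_ratio(9.0)
```

Adding phospholipid binding inside the acidic lysosome is what pushes such ratios past the >1000-fold values quoted above.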
Dose selection based on physiologically based pharmacokinetic (PBPK) approaches.
Jones, Hannah M; Mayawala, Kapil; Poulin, Patrick
2013-04-01
Physiologically based pharmacokinetic (PBPK) models are built using differential equations to describe the physiology/anatomy of different biological systems. Readily available in vitro and in vivo preclinical data can be incorporated into these models to not only estimate pharmacokinetic (PK) parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. They provide a mechanistic framework to understand and extrapolate PK and dose across in vitro and in vivo systems and across different species, populations and disease states. Using small molecule and large molecule examples from the literature and our own company, we have shown how PBPK techniques can be utilised for human PK and dose prediction. Such approaches have the potential to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising, however some limitations need to be addressed to realise its application and utility more broadly.
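As a deliberately tiny illustration of the differential-equation structure described above, here is a sketch of a perfusion-limited system with one plasma and one tissue compartment. All parameter values are invented for illustration; real PBPK models use many tissue compartments with measured physiological values:

```python
# Illustrative parameters (assumed, not for any real compound):
Q = 20.0    # tissue blood flow (L/h)
V_P = 5.0   # plasma volume (L)
V_T = 40.0  # tissue volume (L)
KP = 3.0    # tissue:plasma partition coefficient
CL = 10.0   # linear clearance from plasma (L/h)

def simulate_iv_bolus(dose_mg, t_end_h=12.0, dt_h=0.001):
    """Euler integration of a minimal perfusion-limited PBPK sketch:
    dCp/dt = (Q*(Ct/KP - Cp) - CL*Cp) / V_P
    dCt/dt =  Q*(Cp - Ct/KP) / V_T"""
    c_p, c_t = dose_mg / V_P, 0.0   # IV bolus into plasma
    for _ in range(int(t_end_h / dt_h)):
        dc_p = (Q * (c_t / KP - c_p) - CL * c_p) / V_P
        dc_t = Q * (c_p - c_t / KP) / V_T
        c_p += dc_p * dt_h
        c_t += dc_t * dt_h
    return c_p, c_t

c_plasma, c_tissue = simulate_iv_bolus(100.0)
```

The mass balance (amount eliminated only via CL acting on plasma) and the terminal tissue:plasma gradient fall out of the equations rather than being fitted, which is what makes extrapolation across species or populations plausible.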
Ground-Based Gas-Liquid Flow Research in Microgravity Conditions: State of Knowledge
NASA Technical Reports Server (NTRS)
McQuillen, J.; Colin, C.; Fabre, J.
1999-01-01
During the last decade, ground-based microgravity facilities have been utilized to obtain predictions for spacecraft system designers and to further the fundamental understanding of two-phase flow. Although flow regime, pressure drop and heat transfer coefficient data have been obtained for straight tubes and a limited number of fittings, measurements of the void fraction, film thickness, wall shear stress, local velocity and void information are also required in order to develop general mechanistic models that can be utilized to ascertain the effects of fluid properties, tube geometry and acceleration levels. A review of this research is presented and includes both empirical data and mechanistic models of the flow behavior.
Mourning dove hunting regulation strategy based on annual harvest statistics and banding data
Otis, D.L.
2006-01-01
Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for hunting regulation decisions based on the population growth rates derived from these estimates. I present a statistically rigorous approach to regulation decision-making using a hypothesis-testing framework and an assumed framework of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
When mechanism matters: Bayesian forecasting using models of ecological diffusion
Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.
2017-01-01
Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
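The deterministic core of ecological diffusion (growth plus spatial spread) can be sketched with an explicit finite-difference scheme. The grid, rates and initial focus below are illustrative, and the hierarchical Bayesian layer (priors, MCMC, observation model) is omitted:

```python
def forecast(u, d, r, dx, dt, n_steps):
    """Explicit finite-difference integration of u_t = d*u_xx + r*u on a
    1-D transect with absorbing (zero) boundaries; the scheme is stable
    while d*dt/dx**2 <= 0.5."""
    for _ in range(n_steps):
        nxt = u[:]
        for i in range(1, len(u) - 1):
            lap = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / dx ** 2
            nxt[i] = u[i] + dt * (d * lap + r * u[i])
        u = nxt
    return u

u0 = [0.0] * 41
u0[20] = 1.0  # initial disease focus at the centre of the transect
u12 = forecast(u0, d=0.5, r=0.1, dx=1.0, dt=0.2, n_steps=60)
```

In the hierarchical Bayesian setting, d and r become spatially varying parameters with priors, and posterior samples of the forecast quantify uncertainty rather than producing a single trajectory.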
NASA Astrophysics Data System (ADS)
Marçais, J.; Gupta, H. V.; De Dreuzy, J. R.; Troch, P. A. A.
2016-12-01
Geomorphological structure and geological heterogeneity of hillslopes are major controls on runoff responses. The diversity of hillslopes (morphological shapes and geological structures) on the one hand, and the highly nonlinear runoff response on the other, make it difficult to transfer what has been learnt at one specific hillslope to another. Making reliable predictions of runoff generation or river flow for a given hillslope is therefore a challenge. Applying classic model calibration (based on inverse-problem techniques) requires doing so for each specific hillslope and having data available for calibration; when applied to thousands of cases this is not always feasible. Here we propose a novel modeling framework that couples process-based models with a data-based approach. First, we develop a mechanistic model, based on the hillslope-storage Boussinesq equations (Troch et al. 2003), able to model nonlinear runoff responses to rainfall at the hillslope scale. Second, we set up a model database representing thousands of uncalibrated simulations. These simulations investigate different hillslope shapes (real ones obtained by analyzing a 5 m digital elevation model of Brittany, and synthetic ones), different hillslope geological structures (i.e. different parametrizations) and different hydrologic forcing terms (i.e. different infiltration chronicles). We then use this model library to train a machine learning model on the physically based database. The machine learning model's performance is assessed in a classic validation phase (testing it on new hillslopes and comparing machine learning outputs with mechanistic outputs). Finally, we use the machine learning model to learn which hillslope properties control runoff. This methodology will be further tested by combining synthetic datasets with real ones.
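A minimal sketch of this workflow (mechanistic model library, then a machine-learning surrogate) might look as follows. The linear-reservoir "hillslope", parameter range and linear surrogate are stand-ins for the Boussinesq simulations and the actual learner, which the abstract does not specify:

```python
def mechanistic_runoff(k, rain=10.0, n_steps=100, dt=0.1):
    """Toy stand-in for the hillslope model (a single linear reservoir,
    not the hillslope-storage Boussinesq equations): storage s receives a
    short rain pulse and drains at rate k; returns peak runoff k*s."""
    s, peak = 0.0, 0.0
    for step in range(n_steps):
        inflow = rain if step < 10 else 0.0  # short rain pulse
        s += dt * (inflow - k * s)
        peak = max(peak, k * s)
    return peak

# 1. Model library: many uncalibrated simulations across parameter values.
ks = [0.1 + 0.05 * i for i in range(30)]
peaks = [mechanistic_runoff(k) for k in ks]

# 2. Train a (deliberately simple) data-driven surrogate on the library.
n = len(ks)
mean_k, mean_p = sum(ks) / n, sum(peaks) / n
slope = (sum((k - mean_k) * (p - mean_p) for k, p in zip(ks, peaks))
         / sum((k - mean_k) ** 2 for k in ks))
intercept = mean_p - slope * mean_k

# 3. Validate the surrogate on a hillslope not in the library.
k_new = 0.72
predicted = slope * k_new + intercept
actual = mechanistic_runoff(k_new)
```

The surrogate then answers "what controls runoff" questions (here, trivially, the recession constant) far faster than re-running the mechanistic model.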
Mechanistic materials modeling for nuclear fuel performance
Tonks, Michael R.; Andersson, David; Phillpot, Simon R.; ...
2017-03-15
Fuel performance codes are critical tools for the design, certification, and safety analysis of nuclear reactors. However, their ability to predict fuel behavior under abnormal conditions is severely limited by their considerable reliance on empirical materials models correlated to burn-up (a measure of the number of fission events that have occurred, but not a unique measure of the history of the material). In this paper, we propose a different paradigm for fuel performance codes to employ mechanistic materials models that are based on the current state of the evolving microstructure rather than burn-up. In this approach, a series of state variables are stored at material points and define the current state of the microstructure. The evolution of these state variables is defined by mechanistic models that are functions of fuel conditions and other state variables. The material properties of the fuel and cladding are determined from microstructure/property relationships that are functions of the state variables and the current fuel conditions. Multiscale modeling and simulation is being used in conjunction with experimental data to inform the development of these models. Finally, this mechanistic, microstructure-based approach has the potential to provide a more predictive fuel performance capability, but will require a team of researchers to complete the required development and to validate the approach.
Pathak, Shriram M; Ruff, Aaron; Kostewicz, Edmund S; Patel, Nikunjkumar; Turner, David B; Jamei, Masoud
2017-12-04
Mechanistic modeling of in vitro data generated from metabolic enzyme systems (viz., liver microsomes, hepatocytes, rCYP enzymes, etc.) facilitates in vitro-in vivo extrapolation (IVIV_E) of metabolic clearance, which plays a key role in the successful prediction of clearance in vivo within physiologically-based pharmacokinetic (PBPK) modeling. A similar concept can be applied to solubility and dissolution experiments, whereby mechanistic modeling can be used to estimate intrinsic parameters required for mechanistic oral absorption simulation in vivo. However, this approach has not been widely applied within an integrated workflow. We present a stepwise modeling approach where relevant biopharmaceutics parameters for ketoconazole (KTZ) are determined and/or confirmed from the modeling of in vitro experiments before being directly used within a PBPK model. Modeling was applied to various in vitro experiments, namely: (a) aqueous solubility profiles to determine intrinsic solubility and salt limiting solubility factors and to verify pKa; (b) biorelevant solubility measurements to estimate bile-micelle partition coefficients; (c) fasted state simulated gastric fluid (FaSSGF) dissolution for formulation disintegration profiling; and (d) transfer experiments to estimate supersaturation and precipitation parameters. These parameters were then used within a PBPK model to predict the dissolved and total (i.e., including the precipitated fraction) concentrations of KTZ in the duodenum of a virtual population and compared against observed clinical data. The developed model well characterized the intraluminal dissolution, supersaturation, and precipitation behavior of KTZ. The mean simulated AUC0-t values of the total and dissolved concentrations of KTZ were comparable to (within 2-fold of) the corresponding observed profiles.
Moreover, the developed PBPK model of KTZ successfully described the impact of supersaturation and precipitation on the systemic plasma concentration profiles of KTZ for 200, 300, and 400 mg doses. These results demonstrate that IVIV_E applied to biopharmaceutical experiments can be used to understand and build confidence in the quality of the input parameters and mechanistic models used for mechanistic oral absorption simulations in vivo, thereby improving the prediction performance of PBPK models. Moreover, this approach can inform the selection and design of in vitro experiments, potentially eliminating redundant experiments and thus helping to reduce the cost and time of drug product development.
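The transfer-experiment step can be illustrated with a toy two-vessel simulation of gastric-to-intestinal transfer with first-order precipitation of supersaturated drug. The rate constants, volumes and solubility below are invented; the real workflow estimates such parameters by fitting models of this general shape to the measured in vitro profiles:

```python
def transfer_experiment(c_gastric=2.0, v_gastric=250.0, v_int=500.0,
                        k_transfer=0.03, c_sat=0.05, k_precip=0.02,
                        t_end=120.0, dt=0.1):
    """Sketch of a gastric-to-intestinal transfer experiment: dissolved
    drug is pumped into the intestinal vessel, where solubility is lower,
    and supersaturation above c_sat decays by first-order precipitation.
    All parameter values are illustrative (amounts in mg, time in min)."""
    a_gastric = c_gastric * v_gastric   # dissolved amount, gastric vessel
    a_dissolved = a_precip = 0.0
    t = 0.0
    while t < t_end:
        transfer = k_transfer * a_gastric              # mg/min pumped over
        c_int = a_dissolved / v_int
        precip = k_precip * max(0.0, c_int - c_sat) * v_int
        a_gastric -= transfer * dt
        a_dissolved += (transfer - precip) * dt
        a_precip += precip * dt
        t += dt
    return a_dissolved / v_int, a_precip

c_dissolved, amount_precipitated = transfer_experiment()
```

Fitting k_precip (and a supersaturation limit) to observed transfer data is what gives the PBPK model its precipitation behaviour in vivo.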
Li, Michael; Dushoff, Jonathan; Bolker, Benjamin M
2018-07-01
Simple mechanistic epidemic models are widely used for forecasting and parameter estimation of infectious diseases based on noisy case reporting data. Despite the widespread application of models to emerging infectious diseases, we know little about the comparative performance of standard computational-statistical frameworks in these contexts. Here we build a simple stochastic, discrete-time, discrete-state epidemic model with both process and observation error and use it to characterize the effectiveness of different flavours of Bayesian Markov chain Monte Carlo (MCMC) techniques. We use fits to simulated data, where parameters (and future behaviour) are known, to explore the limitations of different platforms and quantify parameter estimation accuracy, forecasting accuracy, and computational efficiency across combinations of modeling decisions (e.g. discrete vs. continuous latent states, levels of stochasticity) and computational platforms (JAGS, NIMBLE, Stan).
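A model of the kind described (discrete time, discrete state, with both process and observation error) can be sketched as a chain-binomial simulation. Parameter values are illustrative, and the Bayesian MCMC fitting in JAGS/NIMBLE/Stan is not shown:

```python
import random

random.seed(1)

def simulate_epidemic(n_pop=1000, i0=5, beta=0.3, p_rec=0.2,
                      report_prob=0.6, n_steps=30):
    """Discrete-time, discrete-state SIR sketch: process error comes from
    binomial infection and recovery draws, observation error from
    binomial under-reporting of new cases. Parameters are illustrative."""
    s, i = n_pop - i0, i0
    incidence, observed = [], []
    for _ in range(n_steps):
        # Process model (Reed-Frost step): each susceptible escapes
        # infection with probability (1 - beta/n_pop)**i.
        p_inf = 1.0 - (1.0 - beta / n_pop) ** i
        new_inf = sum(random.random() < p_inf for _ in range(s))
        recoveries = sum(random.random() < p_rec for _ in range(i))
        s -= new_inf
        i += new_inf - recoveries
        # Observation model: each new case is reported independently.
        reported = sum(random.random() < report_prob for _ in range(new_inf))
        incidence.append(new_inf)
        observed.append(reported)
    return incidence, observed

incidence, observed = simulate_epidemic()
```

Because the simulator's parameters are known, fits of the reported series alone reveal how well each computational platform recovers truth, which is exactly the comparison the study performs.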
Chen, Tao; Lian, Guoping; Kattou, Panayiotis
2016-07-01
The purpose was to develop a mechanistic mathematical model for predicting the pharmacokinetics of topically applied solutes penetrating through the skin and into the blood circulation. The model could be used to support the design of transdermal drug delivery systems and skin care products, and risk assessment of occupational or consumer exposure. A recently reported skin penetration model [Pharm Res 32 (2015) 1779] was integrated with the kinetic equations for dermis-to-capillary transport and systemic circulation. All model parameters were determined separately from the molecular, microscopic and physiological bases, without fitting to the in vivo data to be predicted. Published clinical studies of nicotine were used for model demonstration. The predicted plasma kinetics is in good agreement with observed clinical data. The simulated two-dimensional concentration profile in the stratum corneum vividly illustrates the local sub-cellular disposition kinetics, including tortuous lipid pathway for diffusion and the "reservoir" effect of the corneocytes. A mechanistic model for predicting transdermal and systemic kinetics was developed and demonstrated with published clinical data. The integrated mechanistic approach has significantly extended the applicability of a recently reported microscopic skin penetration model by providing prediction of solute concentration in the blood.
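The dermis-to-capillary and systemic-circulation steps can be caricatured as a skin depot feeding a one-compartment model. This is a sketch with assumed parameters, not the reported microscopic skin model:

```python
def transdermal_profile(dose_mg, k_perm=0.15, k_el=0.5, v_d=100.0,
                        t_end_h=24.0, dt_h=0.01):
    """Euler integration of a minimal sketch: a skin depot releases drug
    first-order (k_perm) into a one-compartment systemic model with
    first-order elimination (k_el). All parameter values are assumed."""
    a_skin, c_p = dose_mg, 0.0
    profile = []
    for _ in range(int(t_end_h / dt_h)):
        absorbed = k_perm * a_skin          # mg/h leaving the skin depot
        a_skin -= absorbed * dt_h
        c_p += (absorbed / v_d - k_el * c_p) * dt_h
        profile.append(c_p)
    return profile

profile = transdermal_profile(30.0)
cmax = max(profile)
t_max_h = profile.index(cmax) * 0.01
```

The full model replaces the single first-order depot with the microscopic stratum corneum description (lipid pathway plus corneocyte reservoir), but the rise-then-fall plasma profile has this same structure.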
Descriptive vs. mechanistic network models in plant development in the post-genomic era.
Davila-Velderrain, J; Martinez-Garcia, J C; Alvarez-Buylla, E R
2015-01-01
Network modeling is now a widespread practice in systems biology, as well as in integrative genomics, and it constitutes a rich and diverse scientific research field. A conceptually clear understanding of the reasoning behind the main existing modeling approaches, and their associated technical terminologies, is required to avoid confusion and to accelerate the transition towards an undeniably necessary, more quantitative, multidisciplinary approach to biology. Herein, we focus on two main network-based modeling approaches that are commonly used depending on the information available and the intended goals: inference-based methods and system dynamics approaches. As far as data-based network inference methods are concerned, they enable the discovery of potential functional influences among molecular components. On the other hand, experimentally grounded network dynamical models have been shown to be perfectly suited for the mechanistic study of developmental processes. How do these two perspectives relate to each other? In this chapter, we describe and compare both approaches and then apply them to a given specific developmental module. Along with the step-by-step practical implementation of each approach, we also discuss their respective goals, utility, assumptions, and associated limitations. We use the gene regulatory network (GRN) involved in Arabidopsis thaliana Root Stem Cell Niche patterning as our illustrative example. We show that descriptive models based on functional genomics data can provide important background information consistent with experimentally supported functional relationships integrated in mechanistic GRN models. The rationale of analysis and modeling can be applied to any other well-characterized functional developmental module in multicellular organisms, like plants and animals.
Johnson, Douglas H.; Cook, R.D.
2013-01-01
In her AAAS News & Notes piece "Can the Southwest manage its thirst?" (26 July, p. 362), K. Wren quotes Ajay Kalra, who advocates a particular method for predicting Colorado River streamflow "because it eschews complex physical climate models for a statistical data-driven modeling approach." A preference for data-driven models may be appropriate in this individual situation, but it is not so in general. Data-driven models often come with a warning against extrapolating beyond the range of the data used to develop them. When the future is like the past, data-driven models can work well for prediction, but it is easy to over-model local or transient phenomena, often leading to predictive inaccuracy (1). Mechanistic models are built on established knowledge of the process that connects the response variables with the predictors, using information obtained outside of an extant data set. One may shy away from a mechanistic approach when the underlying process is judged to be too complicated, but good predictive models can be constructed with statistical components that account for ingredients missing in the mechanistic analysis. Models with sound mechanistic components are more generally applicable and robust than data-driven models.
Computational modeling of neurostimulation in brain diseases.
Wang, Yujiang; Hutchings, Frances; Kaiser, Marcus
2015-01-01
Neurostimulation as a therapeutic tool has been developed and used for a range of different diseases such as Parkinson's disease, epilepsy, and migraine. However, it is not known why the efficacy of the stimulation varies dramatically across patients or why some patients suffer from severe side effects. This is largely due to the lack of mechanistic understanding of neurostimulation. Hence, theoretical computational approaches to address this issue are in demand. This chapter provides a review of mechanistic computational modeling of brain stimulation. In particular, we will focus on brain diseases, where mechanistic models (e.g., neural population models or detailed neuronal models) have been used to bridge the gap between cellular-level processes of affected neural circuits and the symptomatic expression of disease dynamics. We show how such models have been, and can be, used to investigate the effects of neurostimulation in the diseased brain. We argue that these models are crucial for the mechanistic understanding of the effect of stimulation, allowing for a rational design of stimulation protocols. Based on mechanistic models, we argue that the development of closed-loop stimulation is essential in order to avoid interference with healthy ongoing brain activity. Furthermore, patient-specific data, such as neuroanatomic information and connectivity profiles obtainable from neuroimaging, can be readily incorporated to address the clinical issue of variability in efficacy between subjects. We conclude that mechanistic computational models can and should play a key role in the rational design of effective, fully integrated, patient-specific therapeutic brain stimulation. © 2015 Elsevier B.V. All rights reserved.
A white-box model of S-shaped and double S-shaped single-species population growth
Kalmykov, Lev V.
2015-01-01
Complex systems may be mechanistically modelled by white-box modeling using logical deterministic individual-based cellular automata. Mathematical models of complex systems are of three types: black-box (phenomenological), white-box (mechanistic, based on first principles) and grey-box (mixtures of phenomenological and mechanistic models). Most basic ecological models are of the black-box type, including the Malthusian, Verhulst and Lotka–Volterra models. In black-box models, the individual-based (mechanistic) mechanisms of population dynamics remain hidden. Here we mechanistically model the S-shaped and double S-shaped population growth of vegetatively propagated rhizomatous lawn grasses. Using purely logical deterministic individual-based cellular automata we create a white-box model. From a general physical standpoint, the vegetative propagation of plants is an analogue of excitation propagation in excitable media. Using the Monte Carlo method, we investigate the role of different initial positioning of an individual in the habitat. We have investigated mechanisms of single-species population growth limited by habitat size, intraspecific competition, regeneration time and fecundity of individuals, under two types of boundary conditions and two types of fecundity. In addition, we have compared the S-shaped and J-shaped population growth. We consider this white-box modeling approach as a method of artificial intelligence which works as automatic hyper-logical inference from the first principles of the studied subject. This approach is promising for direct mechanistic insight into the nature of any complex system. PMID:26038717
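The logical deterministic cellular-automaton idea can be sketched in a few lines. The rules below (a single founder on a square lattice, colonization of the four von Neumann neighbours each step, closed boundaries) are illustrative assumptions, not the paper's exact transition rules:

```python
# Minimal white-box cellular automaton of single-species population growth.
def simulate(size=20, steps=60):
    """Logical deterministic CA: each occupied cell colonizes its four von
    Neumann neighbours every step (vegetative propagation analogue); growth
    is limited only by the closed habitat boundary."""
    occupied = {(size // 2, size // 2)}  # single founder individual
    counts = [len(occupied)]
    for _ in range(steps):
        frontier = set()
        for i, j in occupied:
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    frontier.add((ni, nj))
        occupied |= frontier
        counts.append(len(occupied))
    return counts

counts = simulate()  # occupied-cell count per time step
```

The count rises slowly from the single founder, accelerates as the colonization front widens, and saturates at the habitat size: the S-shape emerges from individual-level rules rather than from a fitted logistic equation.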
Learning to predict chemical reactions.
Kayala, Matthew A; Azencott, Chloé-Agathe; Chen, Jonathan H; Baldi, Pierre
2011-09-26
Being able to predict the course of arbitrary chemical reactions is essential to the theory and applications of organic chemistry. Approaches to the reaction prediction problem can be organized around three poles corresponding to: (1) physical laws; (2) rule-based expert systems; and (3) inductive machine learning. Previous approaches at these poles, respectively, are not high throughput, are not generalizable or scalable, and lack sufficient data and structure to be implemented. We propose a new approach to reaction prediction utilizing elements from each pole. Using a physically inspired conceptualization, we describe single mechanistic reactions as interactions between coarse approximations of molecular orbitals (MOs) and use topological and physicochemical attributes as descriptors. Using an existing rule-based system (Reaction Explorer), we derive a restricted chemistry data set consisting of 1630 full multistep reactions with 2358 distinct starting materials and intermediates, associated with 2989 productive mechanistic steps and 6.14 million unproductive mechanistic steps. And from machine learning, we pose identifying productive mechanistic steps as a statistical ranking, information retrieval problem: given a set of reactants and a description of conditions, learn a ranking model over potential filled-to-unfilled MO interactions such that the top-ranked mechanistic steps yield the major products. The machine learning implementation follows a two-stage approach, in which we first train atom-level reactivity filters to prune 94.00% of nonproductive reactions with a 0.01% error rate. Then, we train an ensemble of ranking models on pairs of interacting MOs to learn a relative productivity function over mechanistic steps in a given system. Without the use of explicit transformation patterns, the ensemble perfectly ranks the productive mechanism at the top 89.05% of the time, rising to 99.86% of the time when the top four are considered.
Furthermore, the system is generalizable, making reasonable predictions over reactants and conditions which the rule-based expert does not handle. A web interface to the machine learning based mechanistic reaction predictor is accessible through our chemoinformatics portal ( http://cdb.ics.uci.edu) under the Toolkits section.
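The statistical-ranking formulation can be illustrated with a toy pairwise learner. The three-dimensional descriptors and the hinge-style update below are hypothetical stand-ins for the topological and physicochemical MO attributes used in the paper:

```python
import random

random.seed(0)

# Hypothetical 3-feature descriptors for filled-to-unfilled MO interactions;
# productive steps are drawn with systematically higher feature values.
def sample(productive):
    base = 1.0 if productive else 0.0
    return [base + random.gauss(0.0, 0.3) for _ in range(3)]

productive = [sample(True) for _ in range(40)]
unproductive = [sample(False) for _ in range(200)]

# Pairwise ranking: learn weights w so that score(productive) > score(unproductive),
# updating only on pairs whose ranking margin is violated (hinge-style).
w = [0.0, 0.0, 0.0]
for _ in range(10):  # epochs
    for p in productive:
        for u in unproductive:
            margin = sum(wi * (pi - ui) for wi, pi, ui in zip(w, p, u))
            if margin < 1.0:
                w = [wi + 0.01 * (pi - ui) for wi, pi, ui in zip(w, p, u)]

def score(x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Fraction of productive steps ranked above every unproductive candidate
top1 = sum(score(p) > max(score(u) for u in unproductive)
           for p in productive) / len(productive)
```

Learning an ordering over candidate steps, rather than a yes/no classifier, mirrors how the paper ranks the productive mechanism at the top of each candidate list.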
Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models
NASA Astrophysics Data System (ADS)
Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea
2014-05-01
Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
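Calibration with an informal measure of goodness of fit can be sketched as rejection-style approximate Bayesian computation on a toy deterministic SIR model. The model, prior range, and distance below are illustrative assumptions, not the authors' cholera model:

```python
import random

random.seed(1)

def sir(beta, gamma=0.1, days=60, n=1000, i0=5):
    """Minimal deterministic SIR model; returns daily new-case counts."""
    s, i = n - i0, float(i0)
    cases = []
    for _ in range(days):
        new = beta * s * i / n
        s -= new
        i += new - gamma * i
        cases.append(new)
    return cases

observed = sir(0.35)  # synthetic "outbreak data" with known beta = 0.35

def distance(a, b):
    """Informal goodness-of-fit: mean absolute daily-case mismatch."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Keep the 5% of prior draws whose simulations fit the data best
draws = [random.uniform(0.1, 0.6) for _ in range(500)]
best = sorted(draws, key=lambda b: distance(sir(b), observed))[:25]
posterior_mean = sum(best) / len(best)
```

The retained draws cluster around the true transmission parameter, and their spread gives an informal uncertainty estimate without ever writing down a formal likelihood function.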
MECHANISTIC-BASED DISINFECTION AND DISINFECTION BYPRODUCT MODELS
We propose developing a mechanistic-based numerical model for chlorine decay and regulated DBP (THM and HAA) formation derived from (free) chlorination; the model framework will allow future modifications for other DBPs and chloramination. Predicted chlorine residual and DBP r...
A comprehensive mechanistic model for upward two-phase flow in wellbores
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sylvester, N.D.; Sarica, C.; Shoham, O.
1994-05-01
A comprehensive model is formulated to predict the flow behavior for upward two-phase flow. This model is composed of a model for flow-pattern prediction and a set of independent mechanistic models for predicting such flow characteristics as holdup and pressure drop in bubble, slug, and annular flow. The comprehensive model is evaluated by using a well data bank made up of 1,712 well cases covering a wide variety of field data. Model performance is also compared with six commonly used empirical correlations and the Hasan-Kabir mechanistic model. Overall model performance is in good agreement with the data. In comparison with other methods, the comprehensive model performed the best.
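As a point of reference for what flow-pattern-specific mechanistic models improve upon, the simplest treatment is the homogeneous no-slip mixture, sketched below. The fluid densities are placeholder values, and friction and acceleration terms are omitted:

```python
def gravity_gradient(v_sl, v_sg, rho_l=1000.0, rho_g=50.0, g=9.81):
    """Homogeneous no-slip baseline: the void fraction equals the gas share
    of the total superficial velocity, and only the gravitational component
    of the pressure gradient is returned (Pa per metre of elevation)."""
    alpha = v_sg / (v_sl + v_sg)                  # no-slip void fraction
    rho_m = alpha * rho_g + (1.0 - alpha) * rho_l  # mixture density
    return rho_m * g

grad = gravity_gradient(v_sl=1.0, v_sg=1.0)  # equal superficial velocities
```

Mechanistic models replace the no-slip holdup with pattern-specific closures (e.g., slip between Taylor bubbles and liquid slugs), which is why they track field data far better than this baseline.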
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenquist, Ian; Tonks, Michael
2016-10-01
Light water reactor fuel pellets are fabricated using sintering to final densities of 95% or greater. During reactor operation, the porosity remaining in the fuel after fabrication decreases further due to irradiation-assisted densification. While empirical models have been developed to describe this densification process, a mechanistic model is needed as part of the ongoing work by the NEAMS program to develop a more predictive fuel performance code. In this work we will develop a phase field model of sintering of UO2 in the MARMOT code, and validate it by comparing to published sintering data. We will then add the capability to capture irradiation effects into the model, and use it to develop a mechanistic model of densification that will go into the BISON code and add another essential piece to the microstructure-based materials models. The final step will be to add the effects of applied fields, to model field-assisted sintering of UO2. The results of the phase field model will be validated by comparing to data from field-assisted sintering. Tasks over three years: (1) develop a sintering model for UO2 in MARMOT; (2) expand the model to account for irradiation effects; (3) develop a mechanistic macroscale model of densification for BISON.
Four decades of modeling methane cycling in terrestrial ecosystems: Where we are heading?
NASA Astrophysics Data System (ADS)
Xu, X.; Yuan, F.; Hanson, P. J.; Wullschleger, S. D.; Thornton, P. E.; Tian, H.; Riley, W. J.; Song, X.; Graham, D. E.; Song, C.
2015-12-01
A modeling approach to methane (CH4) is widely used to quantify the budget, investigate spatial and temporal variabilities, and understand the mechanistic processes and environmental controls on CH4 fluxes across spatial and temporal scales. Moreover, CH4 models are an important tool for integrating CH4 data from multiple sources, such as laboratory-based incubation and molecular analysis, field observational experiments, remote sensing, and aircraft-based measurements across a variety of terrestrial ecosystems. We reviewed 39 terrestrial CH4 models to characterize their strengths and weaknesses and to design a roadmap for future model improvement and application. We found that: (1) the focus of CH4 models has shifted from theoretical studies to site- and regional-level applications over the past four decades, expressed as a dramatic increase in CH4 model development for regional budget quantification; (2) large discrepancies exist among models in terms of representing CH4 processes and their environmental controls; (3) significant data-model and model-model mismatches are partially attributed to different representations of wetland characterization and inundation dynamics. Special attention should be paid to three efforts in the future improvement and application of fully mechanistic CH4 models: (1) CH4 models should be improved to represent the mechanisms underlying land-atmosphere CH4 exchange, with emphasis on improving and validating individual CH4 processes over depth and horizontal space; (2) models should be developed that are capable of simulating CH4 fluxes across space and time (particularly hot moments and hot spots); (3) efforts should be invested to develop model benchmarking frameworks that can easily be used for model improvement, evaluation, and integration with data from molecular to global scales.
A newly developed microbial functional group-based CH4 model (CLM-Microbe) was further used to demonstrate the features of mechanistic representation and integration with multiple sources of observational data.
NASA Astrophysics Data System (ADS)
Mohanty, Subhasish; Soppet, William K.; Majumdar, Saurindranath; Natesan, Krishnamurti
2016-05-01
Argonne National Laboratory (ANL), under the sponsorship of the Department of Energy's Light Water Reactor Sustainability (LWRS) program, is trying to develop a mechanistic approach for more accurate life estimation of LWR components. In this context, ANL has conducted many fatigue experiments under different test and environmental conditions on type 316 stainless steel (316 SS), a material widely used in US reactors. Contrary to the conventional S-N-curve-based empirical fatigue life estimation approach, the aim of the present DOE-sponsored work is to develop a more mechanistic understanding of material ageing issues (e.g., time-dependent hardening and softening) under different test and environmental conditions. Better mechanistic understanding will help develop computer-based advanced modeling tools to better extrapolate the stress-strain evolution of reactor components under multi-axial stress states and hence help predict their fatigue life more accurately. Mechanics-based modeling of fatigue, such as with finite element (FE) tools, requires time/cycle-dependent material hardening properties. Presently such time-dependent material hardening properties are hardly available in the fatigue modeling literature even under in-air conditions; obtaining them under PWR environments is even harder. Through this work we made a preliminary attempt to generate time/cycle-dependent stress-strain data both under in-air and PWR water conditions for further study, such as the possible development of material models and constitutive relations for FE model implementation. Although there are open-ended possibilities to further improve the discussed test methods and related material-property estimation techniques, we anticipate that the data presented in this paper will help the metal fatigue research community, particularly researchers dealing with mechanistic modeling of metal fatigue using FE tools. This paper discusses the fatigue experiments under different test and environmental conditions and the related stress-strain results for 316 SS.
NASA Astrophysics Data System (ADS)
Yamana, Teresa K.; Eltahir, Elfatih A. B.
2011-02-01
This paper describes the use of satellite-based estimates of rainfall to force the Hydrology, Entomology and Malaria Transmission Simulator (HYDREMATS), a hydrology-based mechanistic model of malaria transmission. We first examined the temporal resolution of rainfall input required by HYDREMATS. Simulations conducted over Banizoumbou village in Niger showed that for reasonably accurate simulation of mosquito populations, the model requires rainfall data with at least 1 h resolution. We then investigated whether HYDREMATS could be effectively forced by satellite-based estimates of rainfall instead of ground-based observations. The Climate Prediction Center morphing technique (CMORPH) precipitation estimates distributed by the National Oceanic and Atmospheric Administration are available at a 30 min temporal resolution and 8 km spatial resolution. We compared mosquito populations simulated by HYDREMATS when the model is forced by adjusted CMORPH estimates and by ground observations. The results demonstrate that adjusted rainfall estimates from satellites can be used with a mechanistic model to accurately simulate the dynamics of mosquito populations.
A Systems Biology Approach to Toxicology Research with Small Fish Models
Increasing use of mechanistically-based molecular and biochemical endpoints and in vitro assays is being advocated as a more efficient and cost-effective approach for generating chemical hazard data. However, development of effective assays and application of the resulting data i...
How to make a tree ring: Coupling stem water flow and cambial activity in mature Alpine conifers
NASA Astrophysics Data System (ADS)
Peters, Richard L.; Frank, David C.; Treydte, Kerstin; Steppe, Kathy; Kahmen, Ansgar; Fonti, Patrick
2017-04-01
Inter-annual tree-ring measurements are used to understand tree-growth responses to climatic variability and reconstruct past climate conditions. In parallel, mechanistic models use experimentally defined plant-atmosphere interactions to explain past growth responses and predict future environmental impact on forest productivity. Yet, substantial inconsistencies within mechanistic model ensembles and mismatches with empirical data indicate that significant progress is still needed to understand the processes occurring at an intra-annual resolution that drive annual growth. However, challenges arise due to i) few datasets describing climatic responses of high-resolution physiological processes over longer time-scales, ii) uncertainties on the main mechanistic process limiting radial stem growth and iii) complex interactions between multiple environmental factors which obscure detection of the main stem growth driver, generating a gap between our understanding of intra- and inter-annual growth mechanisms. We attempt to bridge the gap between inter-annual tree-ring width and sub-daily radial stem-growth and provide a mechanistic perspective on how environmental conditions affect physiological processes that shape tree rings in conifers. We combine sub-hourly sap flow and point dendrometer measurements performed on mature Alpine conifers (Larix decidua) into an individual-based mechanistic tree-growth model to simulate sub-hourly cambial activity. The monitored trees are located along a high elevational transect in the Swiss Alps (Lötschental) to analyse the effect of increasing temperature. The model quantifies internal tree hydraulic pathways that regulate the turgidity within the cambial zone and induce cell enlargement for radial growth. The simulations are validated against intra-annual growth patterns derived from xylogenesis data and anatomical analyses. 
Our efforts advance the process-based understanding of how climate shapes the annual tree-ring structures and could potentially improve our ability to reconstruct the climate of the past and predict future growth under changing climate.
Modeling food matrix effects on chemical reactivity: Challenges and perspectives.
Capuano, Edoardo; Oliviero, Teresa; van Boekel, Martinus A J S
2017-06-29
The same chemical reaction may be different in terms of its position of the equilibrium (i.e., thermodynamics) and its kinetics when studied in different foods. The diversity in the chemical composition of food and in its structural organization at macro-, meso-, and microscopic levels, that is, the food matrix, is responsible for this difference. In this viewpoint paper, the multiple, interconnected ways in which the food matrix can affect chemical reactivity are summarized. Moreover, mechanistic and empirical approaches to explain and predict the effect of the food matrix on chemical reactivity are described. Mechanistic models aim to quantify the effect of the food matrix based on a detailed understanding of the chemical and physical phenomena occurring in food. Their applicability is limited at the moment to very simple food systems. Empirical modeling based on machine learning combined with data-mining techniques may represent an alternative, useful option to predict the effect of the food matrix on chemical reactivity and to identify chemical and physical properties to be further tested. In such a way the mechanistic understanding of the effect of the food matrix on chemical reactions can be improved.
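A minimal sketch of the mechanistic side of such modeling is first-order loss with Arrhenius temperature dependence, where matrix effects would enter as modifiers on the rate parameters. All parameter values below are illustrative assumptions:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def remaining_fraction(t, T, k_ref=0.02, T_ref=298.15, Ea=80e3):
    """C(t)/C0 for first-order degradation, C = C0*exp(-k(T)*t), with the
    rate constant rescaled from T_ref to T by the Arrhenius relation."""
    k = k_ref * math.exp(-(Ea / R) * (1.0 / T - 1.0 / T_ref))
    return math.exp(-k * t)

room = remaining_fraction(t=60.0, T=298.15)  # e.g. 60 min at 25 degrees C
warm = remaining_fraction(t=60.0, T=308.15)  # same time at 35 degrees C
```

An empirical/ML alternative would instead regress measured rates directly on matrix descriptors (water activity, pH, fat content) without positing a rate law, which is the trade-off the paper discusses.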
Simulating the Risk of Liver Fluke Infection using a Mechanistic Hydro-epidemiological Model
NASA Astrophysics Data System (ADS)
Beltrame, Ludovica; Dunne, Toby; Rose, Hannah; Walker, Josephine; Morgan, Eric; Vickerman, Peter; Wagener, Thorsten
2016-04-01
Liver Fluke (Fasciola hepatica) is a common parasite found in livestock and responsible for considerable economic losses throughout the world. Risk of infection is strongly influenced by climatic and hydrological conditions, which characterise the host environment for parasite development and transmission. Despite on-going control efforts, increases in fluke outbreaks have been reported in recent years in the UK, and have been often attributed to climate change. Currently used fluke risk models are based on empirical relationships derived between historical climate and incidence data. However, hydro-climate conditions are becoming increasingly non-stationary due to climate change and direct anthropogenic impacts such as land use change, making empirical models unsuitable for simulating future risk. In this study we introduce a mechanistic hydro-epidemiological model for Liver Fluke, which explicitly simulates habitat suitability for disease development in space and time, representing the parasite life cycle in connection with key environmental conditions. The model is used to assess patterns of Liver Fluke risk for two catchments in the UK under current and potential future climate conditions. Comparisons are made with a widely used empirical model employing different datasets, including data from regional veterinary laboratories. Results suggest that mechanistic models can achieve adequate predictive ability and support adaptive fluke control strategies under climate change scenarios.
Tokunaga, Taisuke; Yatabe, Takeshi; Matsumoto, Takahiro; Ando, Tatsuya; Yoon, Ki-Seok; Ogo, Seiji
2017-01-01
We report the mechanistic investigation of catalytic H2 evolution from formic acid in water using a formate-bridged dinuclear Ru complex as a formate hydrogen lyase model. The mechanistic study is based on isotope-labeling experiments involving a hydrogen isotope exchange reaction.
Computational Modeling of Inflammation and Wound Healing
Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram
2013-01-01
Objective: Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach: To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histopathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation: We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results: A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions: The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362
Duan, J; Kesisoglou, F; Novakovic, J; Amidon, GL; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R
2017-01-01
On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled “Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation.” The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and the drug product information, into a systemic mathematical whole-body framework. PMID:28571121
Testing mechanistic models of growth in insects.
Maino, James L; Kearney, Michael R
2015-11-22
Insects are typified by their small size, large numbers, impressive reproductive output and rapid growth. However, insect growth is not simply rapid; rather, insects follow a qualitatively distinct trajectory from that of many other animals. Here we present a mechanistic growth model for insects and show that increasing specific assimilation during the growth phase can explain the near-exponential growth trajectory of insects. The presented model is tested against growth data on 50 insects, and compared against other mechanistic growth models. Unlike the other mechanistic models, our growth model predicts energy reserves per biomass to increase with age, which implies a higher production efficiency and energy density of biomass in later instars. These predictions are tested against data compiled from the literature, whereby it is confirmed that insects increase their production efficiency (by 24 percentage points) and energy density (by 4 J mg⁻¹) between hatching and the attainment of full size. The model suggests that insects achieve greater production efficiencies and enhanced growth rates by increasing specific assimilation and increasing energy reserves per biomass, which are less costly to maintain than structural biomass. Our findings illustrate how the explanatory and predictive power of mechanistic growth models comes from their grounding in underlying biological processes. © 2015 The Author(s).
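The qualitative claim, that ramping specific assimilation pushes a surface-limited growth law towards faster, near-exponential trajectories, can be checked with a toy Euler integration. The 2/3 surface scaling and all rate constants below are generic textbook assumptions, not the authors' parameterization:

```python
def grow(days=30.0, dt=0.01, a0=0.3, ramp=0.0, k=0.1):
    """Euler integration of dm/dt = a(t)*m^(2/3) - k*m, with specific
    assimilation a(t) = a0 + ramp*t increasing through the growth phase."""
    m, t = 1.0, 0.0
    while t < days:
        a = a0 + ramp * t
        m += (a * m ** (2.0 / 3.0) - k * m) * dt
        t += dt
    return m

fixed = grow(ramp=0.0)    # constant assimilation: growth decelerates
rising = grow(ramp=0.02)  # increasing assimilation: growth stays fast
```

The constant-assimilation run flattens towards a von Bertalanffy-style asymptote, while the ramped run keeps gaining mass over the same window, qualitatively matching the near-exponential insect trajectories the paper describes.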
Vodovotz, Yoram; Xia, Ashley; Read, Elizabeth L; Bassaganya-Riera, Josep; Hafler, David A; Sontag, Eduardo; Wang, Jin; Tsang, John S; Day, Judy D; Kleinstein, Steven H; Butte, Atul J; Altman, Matthew C; Hammond, Ross; Sealfon, Stuart C
2017-02-01
Emergent responses of the immune system result from the integration of molecular and cellular networks over time and across multiple organs. High-content and high-throughput analysis technologies, concomitantly with data-driven and mechanistic modeling, hold promise for the systematic interrogation of these complex pathways. However, connecting genetic variation and molecular mechanisms to individual phenotypes and health outcomes has proven elusive. Gaps remain in data, and disagreements persist about the value of mechanistic modeling for immunology. Here, we present the perspectives that emerged from the National Institute of Allergy and Infectious Disease (NIAID) workshop 'Complex Systems Science, Modeling and Immunity' and subsequent discussions regarding the potential synergy of high-throughput data acquisition, data-driven modeling, and mechanistic modeling to define new mechanisms of immunological disease and to accelerate the translation of these insights into therapies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Monte Carlo modeling of atomic oxygen attack of polymers with protective coatings on LDEF
NASA Technical Reports Server (NTRS)
Banks, Bruce A.; Degroh, Kim K.; Sechkar, Edward A.
1992-01-01
Characterization of the behavior of atomic oxygen interaction with materials on the Long Duration Exposure Facility (LDEF) will assist in understanding the mechanisms involved, and will lead to improved reliability in predicting in-space durability of materials based on ground laboratory testing. A computational simulation of atomic oxygen interaction with protected polymers was developed using Monte Carlo techniques. Through the use of assumed mechanistic behavior of atomic oxygen and results of both ground laboratory and LDEF data, a predictive Monte Carlo model was developed which simulates the oxidation processes that occur on polymers with applied protective coatings that have defects. The use of high atomic oxygen fluence-directed ram LDEF results has enabled mechanistic implications to be made by adjusting Monte Carlo modeling assumptions to match observed results based on scanning electron microscopy. Modeling assumptions, implications, and predictions are presented, along with comparison of observed ground laboratory and LDEF results.
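The style of simulation described can be sketched with a toy 2-D Monte Carlo of undercutting at a single coating defect. The lattice size, reaction probability, and scattering rules are illustrative assumptions rather than the calibrated, LDEF-matched values:

```python
import random

random.seed(2)

WIDTH, DEPTH = 21, 12
DEFECT = WIDTH // 2
# True = intact polymer beneath the protective coating; the coating itself is
# modelled only as an impermeable top boundary with one defect column.
polymer = [[True] * WIDTH for _ in range(DEPTH)]
polymer[0][DEFECT] = False  # polymer exposed through the coating defect

def attack(p_react=0.3, max_steps=300):
    """One atomic-oxygen arrival: random walk through the eroded cavity,
    reacting (and removing polymer) with probability p_react on contact."""
    x, y = DEFECT, 0
    for _ in range(max_steps):
        dx, dy = random.choice([(-1, 0), (1, 0), (0, 1), (0, -1)])
        nx, ny = x + dx, y + dy
        if not (0 <= nx < WIDTH and 0 <= ny < DEPTH):
            return  # escaped back out through the defect or left the domain
        if polymer[ny][nx]:
            if random.random() < p_react:
                polymer[ny][nx] = False  # oxidized and carried away
                return
            # scattered off the polymer surface without reacting; stay put
        else:
            x, y = nx, ny

for _ in range(2000):
    attack()

eroded = sum(not cell for row in polymer for cell in row)
undercut = any(not polymer[y][x] for y in range(DEPTH)
               for x in range(WIDTH) if x != DEFECT)
```

Even this toy version reproduces the key observation from the LDEF samples: the eroded cavity spreads laterally beneath the coating, so the damage footprint is wider than the defect that admitted the atoms.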
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-10-01
The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.
Towards predictive models of the human gut microbiome
2014-01-01
The intestinal microbiota is an ecosystem susceptible to external perturbations such as dietary changes and antibiotic therapies. Mathematical models of microbial communities could be of great value in the rational design of microbiota-tailoring diets and therapies. Here, we discuss how advances in another field, engineering of microbial communities for wastewater treatment bioreactors, could inspire development of mechanistic mathematical models of the gut microbiota. We review the current state-of-the-art in bioreactor modeling and current efforts in modeling the intestinal microbiota. Mathematical modeling could benefit greatly from the deluge of data emerging from metagenomic studies, but data-driven approaches such as network inference that aim to predict microbiome dynamics without explicit mechanistic knowledge seem better suited to model these data. Finally, we discuss how the integration of microbiome shotgun sequencing and metabolic modeling approaches such as flux balance analysis may fulfill the promise of a mechanistic model of the intestinal microbiota. PMID:24727124
Musther, Helen; Harwood, Matthew D; Yang, Jiansong; Turner, David B; Rostami-Hodjegan, Amin; Jamei, Masoud
2017-09-01
The use of in vitro-in vivo extrapolation (IVIVE) techniques, mechanistically incorporated within physiologically based pharmacokinetic (PBPK) models, can harness in vitro drug data and enhance understanding of in vivo pharmacokinetics. This study's objective was to develop a user-friendly rat (250 g, male Sprague-Dawley) IVIVE-linked PBPK model. A 13-compartment PBPK model including mechanistic absorption models was developed, with required system data (anatomical, physiological, and relevant IVIVE scaling factors) collated from literature and analyzed. Overall, 178 system parameter values for the model are provided. This study also highlights gaps in available system data required for strain-specific rat PBPK model development. The model's functionality and performance were assessed using previous literature-sourced in vitro properties for diazepam, metoprolol, and midazolam. The results of simulations were compared against observed pharmacokinetic rat data. Predicted and observed concentration profiles in 10 tissues for diazepam after a single intravenous (i.v.) dose, making use of either observed i.v. clearance (CLiv) or in vitro hepatocyte intrinsic clearance (CLint) for simulations, generally agreed well across tissue compartments. Overall, all i.v. plasma concentration profiles were successfully predicted. However, there were challenges in predicting oral plasma concentration profiles for metoprolol and midazolam, and the potential reasons and corresponding solutions are discussed.
Putting the psychology back into psychological models: mechanistic versus rational approaches.
Sakamoto, Yasuaki; Jones, Matt; Love, Bradley C
2008-09-01
Two basic approaches to explaining the nature of the mind are the rational and the mechanistic approaches. Rational analyses attempt to characterize the environment and the behavioral outcomes that humans seek to optimize, whereas mechanistic models attempt to simulate human behavior using processes and representations analogous to those used by humans. We compared these approaches with regard to their accounts of how humans learn the variability of categories. The mechanistic model departs in subtle ways from rational principles. In particular, the mechanistic model incrementally updates its estimates of category means and variances through error-driven learning, based on discrepancies between new category members and the current representation of each category. The model yields a prediction, which we verify, regarding the effects of order manipulations that the rational approach does not anticipate. Although both rational and mechanistic models can successfully postdict known findings, we suggest that psychological advances are driven primarily by consideration of process and representation and that rational accounts trail these breakthroughs.
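The incremental, error-driven updating described here can be sketched in a few lines; the learning rate and update form below are generic delta-rule assumptions, not the authors' exact model. Because each update depends on the current estimate, presentation order changes the final estimates, which is the kind of order effect a rational (batch) estimator does not show:

```python
def update(mu, var, x, lr=0.1):
    """Error-driven update of a category's mean and variance estimates.
    Each new exemplar x nudges the estimates toward itself by rate lr."""
    err = x - mu
    mu = mu + lr * err                 # move the mean toward x
    var = var + lr * (err ** 2 - var)  # move the variance toward err^2
    return mu, var

# Same exemplars, two presentation orders, different final means:
mu1, var1 = 0.0, 1.0
for x in [1, 2, 3, 10]:
    mu1, var1 = update(mu1, var1, x)
mu2, var2 = 0.0, 1.0
for x in [10, 3, 2, 1]:
    mu2, var2 = update(mu2, var2, x)
print(round(mu1, 3), round(mu2, 3))  # → 1.505 1.252
```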
Gene arrays for elucidating mechanistic data from models of male infertility and chemical exposure in mice, rats and humans
John C. Rockett and David J. Dix
Gamete and Early Embryo Biology Branch, Reproductive Toxicology Division, National Health and Environmental Effects ...
Douglas Allen; William Dietrich; Peter Baker; Frank Ligon; Bruce Orr
2007-01-01
We describe a mechanistically based stream model, BasinTemp, which assumes that direct shortwave radiation, moderated by riparian and topographic shading, controls stream temperatures during the hottest part of the year. The model was developed to support a temperature TMDL for the South Fork Eel basin in Northern California and couples a GIS and a 1-D energy balance...
Causality, mediation and time: a dynamic viewpoint
Aalen, Odd O; Røysland, Kjetil; Gran, Jon Michael; Ledergerber, Bruno
2012-01-01
Time dynamics are often ignored in causal modelling. Clearly, causality must operate in time and we show how this corresponds to a mechanistic, or system, understanding of causality. The established counterfactual definitions of direct and indirect effects depend on an ability to manipulate the mediator which may not hold in practice, and we argue that a mechanistic view may be better. Graphical representations based on local independence graphs and dynamic path analysis are used to facilitate communication as well as providing an overview of the dynamic relations ‘at a glance’. The relationship between causality as understood in a mechanistic and in an interventionist sense is discussed. An example using data from the Swiss HIV Cohort Study is presented. PMID:23193356
CCl4 is a common environmental contaminant in water and superfund sites, and a model liver toxicant. One application of PBPK models used in risk assessment is simulation of internal dose for the metric involved with toxicity, particularly for different routes of exposure. Time-co...
Learning to Predict Chemical Reactions
Kayala, Matthew A.; Azencott, Chloé-Agathe; Chen, Jonathan H.
2011-01-01
Being able to predict the course of arbitrary chemical reactions is essential to the theory and applications of organic chemistry. Approaches to the reaction prediction problem can be organized around three poles corresponding to: (1) physical laws; (2) rule-based expert systems; and (3) inductive machine learning. Previous approaches at these poles, respectively, are not high-throughput, are not generalizable or scalable, or lack sufficient data and structure to be implemented. We propose a new approach to reaction prediction utilizing elements from each pole. Using a physically inspired conceptualization, we describe single mechanistic reactions as interactions between coarse approximations of molecular orbitals (MOs) and use topological and physicochemical attributes as descriptors. Using an existing rule-based system (Reaction Explorer), we derive a restricted chemistry dataset consisting of 1630 full multi-step reactions with 2358 distinct starting materials and intermediates, associated with 2989 productive mechanistic steps and 6.14 million unproductive mechanistic steps. From machine learning, we pose identifying productive mechanistic steps as a statistical ranking (information retrieval) problem: given a set of reactants and a description of conditions, learn a ranking model over potential filled-to-unfilled MO interactions such that the top-ranked mechanistic steps yield the major products. The machine learning implementation follows a two-stage approach, in which we first train atom-level reactivity filters to prune 94.00% of non-productive reactions with a 0.01% error rate. Then, we train an ensemble of ranking models on pairs of interacting MOs to learn a relative productivity function over mechanistic steps in a given system. Without the use of explicit transformation patterns, the ensemble perfectly ranks the productive mechanism at the top 89.05% of the time, rising to 99.86% of the time when the top four are considered.
Furthermore, the system is generalizable, making reasonable predictions over reactants and conditions which the rule-based expert does not handle. A web interface to the machine learning based mechanistic reaction predictor is accessible through our chemoinformatics portal (http://cdb.ics.uci.edu) under the Toolkits section. PMID:21819139
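The second stage, ranking filled-to-unfilled MO interactions so that productive steps score highest, can be illustrated with a minimal pairwise ranking perceptron. The feature descriptors and values below are invented; the actual system uses ensembles of richer models:

```python
def train_ranker(pairs, n_features, epochs=20):
    """Pairwise ranking: learn weights w so each productive step's score
    exceeds its paired unproductive step's score (perceptron-style)."""
    w = [0.0] * n_features

    def score(f):
        return sum(wi * fi for wi, fi in zip(w, f))

    for _ in range(epochs):
        for prod, unprod in pairs:
            if score(prod) <= score(unprod):     # ranking constraint violated
                for i in range(n_features):
                    w[i] += prod[i] - unprod[i]  # push the scores apart
    return w

# Invented descriptors for candidate mechanistic steps:
# [MO energy-gap match, electrophilicity, steric accessibility]
productive = [0.9, 0.8, 0.7]
unproductive = [[0.2, 0.9, 0.1], [0.5, 0.1, 0.3]]
w = train_ranker([(productive, u) for u in unproductive], 3)

def final_score(f):
    return sum(wi * fi for wi, fi in zip(w, f))

print([round(final_score(f), 2) for f in [productive] + unproductive])
```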
LASSIM-A network inference toolbox for genome-wide mechanistic modeling.
Magnusson, Rasmus; Mariotti, Guido Pio; Köpsén, Mattias; Lövfors, William; Gawel, Danuta R; Jörnsten, Rebecka; Linde, Jörg; Nordling, Torbjörn E M; Nyman, Elin; Schulze, Sylvie; Nestor, Colm E; Zhang, Huan; Cedersund, Gunnar; Benson, Mikael; Tjärnberg, Andreas; Gustafsson, Mika
2017-06-01
Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM), which is a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODE) for gene regulatory networks (GRNs). LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady state and dynamic response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps, where it first infers a non-linear ODE system of the pre-specified core gene expression. Second, LASSIM in parallel optimizes the parameters that model the regulation of peripheral genes by core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2-specific bindings and time-series data together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could monitor those data significantly better than comparable models that used the same Th2 bindings.
In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models with truly systems-level data. We demonstrate the power of this approach by inferring a mechanistically motivated, genome-wide model of the Th2 transcription regulatory system, which plays an important role in several immune related diseases.
Corrosion fatigue crack propagation in metals
NASA Technical Reports Server (NTRS)
Gangloff, Richard P.
1990-01-01
This review assesses fracture mechanics data and mechanistic models for corrosion fatigue crack propagation in structural alloys exposed to ambient temperature gases and electrolytes. Extensive stress intensity-crack growth rate data exist for ferrous, aluminum and nickel based alloys in a variety of environments. Interactive variables (viz., stress intensity range, mean stress, alloy composition and microstructure, loading frequency, temperature, gas pressure and electrode potential) strongly affect crack growth kinetics and complicate fatigue control. Mechanistic models to predict crack growth rates were formulated by coupling crack tip mechanics with occluded crack chemistry, and from both the hydrogen embrittlement and anodic dissolution/film rupture perspectives. Research is required to better define: (1) environmental effects near threshold and on crack closure; (2) damage tolerant life prediction codes and the validity of similitude; (3) the behavior of microcracks; (4) probes and improved models of crack tip damage; and (5) the cracking performance of advanced alloys and composites.
Zhang, X; Duan, J; Kesisoglou, F; Novakovic, J; Amidon, G L; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R
2017-08-01
On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled "Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation." The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating system, active pharmaceutical ingredient (API), and drug product information into a systemic mathematical whole-body framework.
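As a small illustration of one such mechanistic component, dissolution is often described (in general PBPK practice, not necessarily in the workshop material) by a Noyes-Whitney-type rate law, sketched here with invented parameters:

```python
def dissolve(c0=0.0, cs=0.05, k=0.02, dt=1.0, t_end=300):
    """Euler integration of Noyes-Whitney-type dissolution:
        dC/dt = k * (Cs - C), with Cs the solubility limit.
    k lumps diffusivity, surface area, boundary-layer thickness and
    volume; all values here are invented for illustration."""
    c, t, profile = c0, 0.0, []
    while t <= t_end:
        profile.append((t, c))
        c += dt * k * (cs - c)   # concentration rises toward Cs
        t += dt
    return profile

profile = dissolve()
print(round(profile[-1][1], 4))  # → 0.0499 (approaching Cs = 0.05)
```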
Dallmann, André; Ince, Ibrahim; Meyer, Michaela; Willmann, Stefan; Eissing, Thomas; Hempel, Georg
2017-11-01
In recent years, several repositories for anatomical and physiological parameters required for physiologically based pharmacokinetic modeling in pregnant women have been published. While providing a good basis, some important aspects can be further detailed. For example, they did not account for the variability associated with parameters or were lacking key parameters necessary for developing more detailed mechanistic pregnancy physiologically based pharmacokinetic models, such as the composition of pregnancy-specific tissues. The aim of this meta-analysis was to provide an updated and extended database of anatomical and physiological parameters in healthy pregnant women that also accounts for changes in the variability of a parameter throughout gestation and for the composition of pregnancy-specific tissues. A systematic literature search was carried out to collect study data on pregnancy-related changes of anatomical and physiological parameters. For each parameter, a set of mathematical functions was fitted to the data and to the standard deviation observed among the data. The best performing functions were selected based on numerical and visual diagnostics as well as based on physiological plausibility. The literature search yielded 473 studies, 302 of which met the criteria to be further analyzed and compiled in a database. In total, the database encompassed 7729 data points. Although the availability of quantitative data for some parameters remained limited, mathematical functions could be generated for many important parameters. Gaps were filled based on qualitative knowledge and based on physiologically plausible assumptions. The presented results facilitate the integration of pregnancy-dependent changes in anatomy and physiology into mechanistic population physiologically based pharmacokinetic models.
Such models can ultimately provide a valuable tool to investigate the pharmacokinetics during pregnancy in silico and support informed decision making regarding optimal dosing regimens in this vulnerable special population.
Optimal Chemotherapy for Leukemia: A Model-Based Strategy for Individualized Treatment
Jayachandran, Devaraj; Rundell, Ann E.; Hannemann, Robert E.; Vik, Terry A.; Ramkrishna, Doraiswami
2014-01-01
Acute Lymphoblastic Leukemia, commonly known as ALL, is a predominant form of cancer during childhood. With the advent of modern healthcare, the 5-year survival rate has improved markedly. However, long-term ALL survivors face several treatment-related medical and socio-economic complications due to the excessive chemotherapy doses received during treatment. In this work, we present a model-based approach to personalize 6-Mercaptopurine (6-MP) treatment for childhood ALL with a provision for incorporating the pharmacogenomic variations among patients. Semi-mechanistic mathematical models were developed and validated for i) 6-MP metabolism, ii) red blood cell mean corpuscular volume (MCV) dynamics, a surrogate marker for treatment efficacy, and iii) leukopenia, a major side-effect. With the constraint of limited data from clinics, a global sensitivity analysis-based model reduction technique was employed to reduce the parameter space arising from the semi-mechanistic models. The reduced, sensitive parameters were used to individualize the average patient model to a specific patient so as to minimize the model uncertainty. The models fit the data well and mimic the diverse behavior observed among patients with a minimal number of parameters. The model was validated with real patient data obtained from the literature and Riley Hospital for Children in Indianapolis. Patient models were used to optimize the dose for an individual patient through nonlinear model predictive control. The implementation of our approach in clinical practice is realizable with routinely measured complete blood counts (CBC) and a few additional metabolite measurements. The proposed approach promises to achieve model-based individualized treatment for a specific patient, as opposed to a standard-dose-for-all, and to prescribe an optimal dose for a desired outcome with minimum side-effects. PMID:25310465
Developing the next generation of forest ecosystem models
Christopher R. Schwalm; Alan R. Ek
2002-01-01
Forest ecology and management are model-rich areas for research. Models are often cast as either empirical or mechanistic. With evolving climate change, hybrid models gain new relevance because of their ability to integrate existing mechanistic knowledge with empiricism based on causal thinking. The utility of hybrid platforms results in the combination of...
Theil, P K; Flummer, C; Hurley, W L; Kristensen, N B; Labouriau, R L; Sørensen, M T
2014-12-01
The aims of the present study were to quantify colostrum intake (CI) of piglets using the D2O dilution technique, to develop a mechanistic model to predict CI, to compare these data with CI predicted by a previous empirical predictive model developed for bottle-fed piglets, and to study how composition of diets fed to gestating sows affected piglet CI, sow colostrum yield (CY), and colostrum composition. In total, 240 piglets from 40 litters were enriched with D2O. The CI measured by D2O from birth until 24 h after the birth of the first-born piglet was on average 443 g (SD 151). Based on measured CI, a mechanistic model to predict CI was developed using piglet characteristics (24-h weight gain [WG; g], BW at birth [BWB; kg], and duration of CI [D; min]): CI (g) = -106 + 2.26 WG + 200 BWB + 0.111 D - 1,414 WG/D + 0.0182 WG/BWB (R² = 0.944). This model was used to predict the CI for all colostrum suckling piglets within the 40 litters (n=500, mean=437 g, SD=153 g) and was compared with the CI predicted by a previous empirical predictive model (mean=305 g, SD=140 g). The previous empirical model underestimated the CI by 30% compared with that obtained by the new mechanistic model. The sows were fed 1 of 4 gestation diets (n=10 per diet) based on different fiber sources (low fiber [17%] or potato pulp, pectin residue, or sugarbeet pulp [32 to 40%]) from mating until d 108 of gestation. From d 108 of gestation until parturition, sows were fed 1 of 5 prefarrowing diets (n=8 per diet) varying in supplemented fat (3% animal fat, 8% coconut oil, 8% sunflower oil, 8% fish oil, or 4% fish oil+4% octanoic acid). Sows fed diets with pectin residue or sugarbeet pulp during gestation produced colostrum with lower protein, fat, DM, and energy concentrations and higher lactose concentrations, and their piglets had greater CI as compared with sows fed potato pulp or the low-fiber diet (P<0.05), and sows fed pectin residue had a greater CY than potato pulp-fed sows (P<0.05).
Prefarrowing diets affected neither CI nor CY, but the prefarrowing diet with coconut oil decreased lactose and increased DM concentrations of colostrum compared with other prefarrowing diets (P<0.05). In conclusion, the new mechanistic predictive model for CI suggests that the previous empirical predictive model underestimates CI of sow-reared piglets by 30%. It was also concluded that nutrition of sows during gestation affected CY and colostrum composition.
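The fitted model reported in the abstract translates directly into a function; the example piglet below is illustrative, not study data:

```python
def colostrum_intake(wg, bwb, d):
    """Predicted colostrum intake (g) from the fitted mechanistic model:
    wg  = 24-h weight gain (g)
    bwb = body weight at birth (kg)
    d   = duration of colostrum intake (min)"""
    return (-106 + 2.26 * wg + 200 * bwb + 0.111 * d
            - 1414 * wg / d + 0.0182 * wg / bwb)

# Illustrative piglet: 100 g gain, 1.4 kg at birth, suckling for 1200 min
print(round(colostrum_intake(100, 1.4, 1200), 1))  # → 416.7
```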
Sridharan, D M; Asaithamby, A; Bailey, S M; Costes, S V; Doetsch, P W; Dynan, W S; Kronenberg, A; Rithidech, K N; Saha, J; Snijders, A M; Werner, E; Wiese, C; Cucinotta, F A; Pluth, J M
2015-01-01
During space travel, astronauts are exposed to a variety of radiations, including galactic cosmic rays composed of high-energy protons and high-energy charged (HZE) nuclei, and solar particle events containing low- to medium-energy protons. Risks from these exposures include carcinogenesis, central nervous system damage and degenerative tissue effects. Currently, career radiation limits are based on estimates of fatal cancer risks calculated using a model that incorporates human epidemiological data from exposed populations, estimates of relative biological effectiveness and dose-response data from relevant mammalian experimental models. A major goal of space radiation risk assessment is to link mechanistic data from biological studies at the NASA Space Radiation Laboratory and other particle accelerators with risk models. Early phenotypes of HZE exposure, such as the induction of reactive oxygen species, DNA damage signaling and inflammation, are sensitive to HZE damage complexity. This review summarizes our current understanding of critical areas within the DNA damage and oxidative stress arena and provides insight into their mechanistic interdependence and their usefulness in accurately modeling cancer and other risks in astronauts exposed to space radiation. Our ultimate goals are to examine potential links and crosstalk between early response modules activated by charged particle exposure, to identify critical areas that require further research and to use these data to reduce uncertainties in modeling cancer risk for astronauts. A clearer understanding of the links between early mechanistic aspects of high-LET response and later surrogate cancer end points could reveal key nodes that can be therapeutically targeted to mitigate the health effects from charged particle exposures.
USDA-ARS?s Scientific Manuscript database
Although empirical models have been developed previously, a mechanistic model is needed for estimating electrical conductivity (EC) using time domain reflectometry (TDR) with variable lengths of coaxial cable. The goals of this study are to: (1) derive a mechanistic model based on multisection tra...
Pathophysiology of white-nose syndrome in bats: a mechanistic model linking wing damage to mortality
Warnecke, Lisa; Turner, James M.; Bollinger, Trent K.; Misra, Vikram; Cryan, Paul M.; Blehert, David S.; Wibbelt, Gudrun; Willis, Craig K.R.
2013-01-01
White-nose syndrome is devastating North American bat populations but we lack basic information on disease mechanisms. Altered blood physiology owing to epidermal invasion by the fungal pathogen Geomyces destructans (Gd) has been hypothesized as a cause of disrupted torpor patterns of affected hibernating bats, leading to mortality. Here, we present data on blood electrolyte concentration, haematology and acid–base balance of hibernating little brown bats, Myotis lucifugus, following experimental inoculation with Gd. Compared with controls, infected bats showed electrolyte depletion (i.e. lower plasma sodium), changes in haematology (i.e. increased haematocrit and decreased glucose) and disrupted acid–base balance (i.e. lower CO2 partial pressure and bicarbonate). These findings indicate hypotonic dehydration, hypovolaemia and metabolic acidosis. We propose a mechanistic model linking tissue damage to altered homeostasis and morbidity/mortality.
Song, Ling; Zhang, Yi; Jiang, Ji; Ren, Shuang; Chen, Li; Liu, Dongyang; Chen, Xijing; Hu, Pei
2018-04-06
The objective of this study was to develop a physiologically based pharmacokinetic (PBPK) model for sinogliatin (HMS-5552, dorzagliatin) by integrating allometric scaling (AS), in vitro to in vivo exploration (IVIVE), and steady-state concentration-mean residence time (Css-MRT) methods and to provide mechanistic insight into its pharmacokinetic properties in humans. Major human pharmacokinetic parameters were analyzed using AS, IVIVE, and Css-MRT methods with available preclinical in vitro and in vivo data to understand sinogliatin drug metabolism and pharmacokinetic (DMPK) characteristics and underlying mechanisms. On this basis, an initial mechanistic PBPK model of sinogliatin was developed. The initial PBPK model was verified using observed data from a single ascending dose (SAD) study and further optimized with various strategies. The final model was validated by simulating sinogliatin pharmacokinetics under a fed condition. The validated model was applied to support a clinical drug-drug interaction (DDI) study design and to evaluate the effects of intrinsic (hepatic cirrhosis, genetic) factors on drug exposure. The two-species scaling method using rat and dog data (TSrat,dog) was the best AS method in predicting human systemic clearance in the central compartment (CL). The IVIVE method confirmed that sinogliatin was predominantly metabolized by cytochrome P450 (CYP) 3A4. The Css-MRT method suggested dog pharmacokinetic profiles were more similar to human pharmacokinetic profiles. The estimated CL using the AS and IVIVE approaches was within 1.5-fold of that observed. The Css-MRT method in dogs also provided acceptable prediction of human pharmacokinetic characteristics.
For the PBPK approach, the 90% confidence intervals (CIs) of the simulated maximum concentration (Cmax), CL, and area under the plasma concentration-time curve (AUC) of sinogliatin were within those observed, and the 90% CI of the simulated time to Cmax (tmax) was close to that observed for a dose range of 5-50 mg in the SAD study. The final PBPK model was validated by simulating sinogliatin pharmacokinetics with food. The 90% CIs of the simulated Cmax, CL, and AUC values for sinogliatin were within those observed, and the 90% CI of the simulated tmax was partially within that observed for the dose range of 25-200 mg in the multiple ascending dose (MAD) study. This PBPK model was used to select a final clinical DDI study design with itraconazole from four potential designs and also to evaluate the effects of intrinsic (hepatic cirrhosis, genetic) factors on drug exposure. Sinogliatin pharmacokinetic properties were mechanistically understood by integrating all four methods, and a mechanistic PBPK model was successfully developed and validated using clinical data. This PBPK model was applied to support the development of sinogliatin.
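As a sketch of the AS step, two-species scaling fits CL = a · BW^b on a log-log scale through the rat and dog points and extrapolates to human body weight. The clearance values below are invented, not sinogliatin data:

```python
import math

def allometric_cl(bw_cl_pairs, bw_target):
    """Fit CL = a * BW^b on a log-log scale from two species and
    extrapolate to a target body weight (two-species scaling)."""
    (bw1, cl1), (bw2, cl2) = bw_cl_pairs
    b = math.log(cl2 / cl1) / math.log(bw2 / bw1)  # allometric exponent
    a = cl1 / bw1 ** b                             # allometric coefficient
    return a * bw_target ** b, b

# Invented preclinical clearances: rat 0.25 kg, dog 10 kg (mL/min)
cl_human, b = allometric_cl([(0.25, 10.0), (10.0, 150.0)], 70.0)
print(round(b, 3), round(cl_human, 1))
```

With these invented numbers the fitted exponent lands near the textbook value of about 0.75 for clearance.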
Comparing two-zone models of dust exposure.
Jones, Rachael M; Simmons, Catherine E; Boelter, Fred W
2011-09-01
The selection and application of mathematical models to work tasks is challenging. Previously, we developed and evaluated a semi-empirical two-zone model that predicts time-weighted average (TWA) concentrations (Ctwa) of dust emitted during the sanding of drywall joint compound. Here, we fit the emission rate and random air speed variables of a mechanistic two-zone model to testing event data and apply and evaluate the model using data from two field studies. We found that the fitted random air speed values and emission rate were sensitive to (i) the size of the near-field and (ii) the objective function used for fitting, but this did not substantially impact predicted dust Ctwa. The mechanistic model predictions were lower than the semi-empirical model predictions and measured respirable dust Ctwa at Site A but were within an acceptable range. At Site B, a 10.5 m3 room, the mechanistic model did not capture the observed difference between PBZ and area Ctwa. The model predicted uniform mixing and predicted dust Ctwa up to an order of magnitude greater than was measured. We suggest that applications of the mechanistic model be limited to contexts where the near-field volume is very small relative to the far-field volume.
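The mechanistic two-zone model referred to above balances contaminant mass between a near-field box around the source and the far-field room, coupled by an interzonal airflow beta. A minimal Euler-integration sketch with invented parameters (not the paper's fitted values):

```python
def two_zone(g=1.0, q=2.0, beta=5.0, v_n=1.0, v_f=50.0,
             dt=0.01, t_end=500.0):
    """Euler integration of the two-zone (near-field/far-field) model:
        V_N dC_N/dt = G - beta * (C_N - C_F)
        V_F dC_F/dt = beta * (C_N - C_F) - Q * C_F
    G emission rate (mg/min), Q room ventilation (m3/min),
    beta interzonal airflow (m3/min), volumes in m3. Invented values."""
    c_n = c_f = 0.0
    t = 0.0
    while t < t_end:
        ex = beta * (c_n - c_f)          # near-field -> far-field exchange
        c_n += dt * (g - ex) / v_n
        c_f += dt * (ex - q * c_f) / v_f
        t += dt
    return c_n, c_f

c_n, c_f = two_zone()
# Steady state: C_F -> G/Q and C_N -> G/Q + G/beta
print(round(c_n, 3), round(c_f, 3))  # → 0.7 0.5
```

The near-field excess G/beta over the room concentration is why the model can only reproduce a PBZ-versus-area difference when the near-field volume is small relative to the far field, consistent with the limitation the authors note.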
Biomechanics meets the ecological niche: the importance of temporal data resolution.
Kearney, Michael R; Matzelle, Allison; Helmuth, Brian
2012-03-15
The emerging field of mechanistic niche modelling aims to link the functional traits of organisms to their environments to predict survival, reproduction, distribution and abundance. This approach has great potential to increase our understanding of the impacts of environmental change on individuals, populations and communities by providing functional connections between physiological and ecological response to increasingly available spatial environmental data. By their nature, such mechanistic models are more data intensive in comparison with the more widely applied correlative approaches but can potentially provide more spatially and temporally explicit predictions, which are often needed by decision makers. A poorly explored issue in this context is the appropriate level of temporal resolution of input data required for these models, and specifically the error in predictions that can be incurred through the use of temporally averaged data. Here, we review how biomechanical principles from heat-transfer and metabolic theory are currently being used as foundations for mechanistic niche models and consider the consequences of different temporal resolutions of environmental data for modelling the niche of a behaviourally thermoregulating terrestrial lizard. We show that fine-scale temporal resolution (daily) data can be crucial for unbiased inference of climatic impacts on survival, growth and reproduction. This is especially so for species with little capacity for behavioural buffering, because of behavioural or habitat constraints, and for detecting temporal trends. However, coarser-resolution data (long-term monthly averages) can be appropriate for mechanistic studies of climatic constraints on distribution and abundance limits in thermoregulating species at broad spatial scales.
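The bias from temporally averaged inputs follows from the nonlinearity of thermal performance curves: performance evaluated at the mean temperature is not the mean of performance. A toy Gaussian curve makes the point; the temperatures and curve parameters below are invented:

```python
import math

def tpc(temp, t_opt=25.0, breadth=4.0):
    """Toy Gaussian thermal performance curve (peak = 1 at t_opt)."""
    return math.exp(-((temp - t_opt) / breadth) ** 2)

daily = [20.0, 30.0] * 15            # fluctuating daily temperatures
monthly_mean = sum(daily) / len(daily)

perf_from_daily = sum(tpc(t) for t in daily) / len(daily)
perf_from_mean = tpc(monthly_mean)   # what coarse data would imply

print(round(perf_from_daily, 3), round(perf_from_mean, 3))  # → 0.21 1.0
```

Here the monthly average sits exactly at the thermal optimum, so coarse data would predict near-perfect performance while the daily fluctuations actually keep the animal far from its optimum most of the time.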
Mechanistic modelling of the inhibitory effect of pH on microbial growth.
Akkermans, Simen; Van Impe, Jan F
2018-06-01
Modelling and simulation of microbial dynamics as a function of processing, transportation and storage conditions is a useful tool to improve microbial food safety and quality. The goal of this research is to improve an existing methodology for building mechanistic predictive models based on the environmental conditions. The effect of environmental conditions on microbial dynamics is often described by combining the separate effects in a multiplicative way (the gamma concept). This idea was extended further in this work by including the effects of the lag and stationary growth phases on microbial growth rate as independent gamma factors. A mechanistic description of the stationary phase as a function of pH was included, based on a novel class of models that consider product inhibition. Experimental results on Escherichia coli growth dynamics indicated that the parameters of the product inhibition equations can also be modelled with the gamma approach. This work has extended a modelling methodology, resulting in predictive models that are (i) mechanistically inspired, (ii) easily identifiable with a limited workload and (iii) easily extended to additional environmental conditions. Copyright © 2017. Published by Elsevier Ltd.
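A minimal sketch of the gamma concept, using a simple parabolic cardinal-pH factor; the cardinal values and optimal rate below are illustrative assumptions, not the paper's fitted parameters:

```python
# Gamma concept: each environmental factor independently scales the
# optimal growth rate. A simple parabolic cardinal-pH factor is used;
# pH_min/opt/max and mu_opt are illustrative, not fitted values.
def gamma_pH(pH, pH_min=4.0, pH_opt=7.0, pH_max=10.0):
    if not (pH_min < pH < pH_max):
        return 0.0  # no growth outside the cardinal pH range
    return ((pH - pH_min) * (pH_max - pH)) / (
        (pH_opt - pH_min) * (pH_max - pH_opt)
    )

def growth_rate(mu_opt, gammas):
    rate = mu_opt
    for g in gammas:       # multiplicative combination of gamma factors
        rate *= g
    return rate

mu = growth_rate(2.0, [gamma_pH(6.0)])  # h^-1, illustrative
```

Additional factors (temperature, water activity, or the lag/stationary-phase terms discussed above) would simply be appended to the list of gamma factors.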
DEVELOPMENT AND VALIDATION OF A MECHANISTIC GROUND SPRAYER MODEL
In the last ten years the Spray Drift Task Force (SDTF), U.S. Environmental Protection Agency (EPA), USDA Agricultural Research Service, and USDA Forest Service cooperated in the refinement and evaluation of a mechanistically-based aerial spray model (contained within AGDISP and ...
Quantifying fat, oil, and grease deposit formation kinetics.
Iasmin, Mahbuba; Dean, Lisa O; Ducoste, Joel J
2016-01-01
Fat, oil, and grease (FOG) deposits formed in sanitary sewers are calcium-based saponified solids that are responsible for a significant number of sanitary sewer overflows (SSOs) across the United States. In the current study, the formation kinetics of lab-based saponified solids were determined, to understand the kinetics of FOG deposit formation in sewers, for two types of fat (Canola and Beef Tallow) and two calcium sources (calcium chloride and calcium sulfate) under three pH (7 ± 0.5, 10 ± 0.5, and ≈14) and two temperature conditions (22 ± 0.5 and 45 ± 0.5 °C). The results showed that a fraction of the fats reacted quickly with calcium ions to form calcium-based saponified solids. Results further showed that the palmitic fatty acid content of the source fats, the magnitude of the pH, and temperature significantly affect FOG deposit formation and saponification rates. The experimental kinetic data were compared with two empirical models, (a) the Cotte saponification model and (b) the Foubert crystallization model, and with a mass-action based mechanistic model that included alkali-driven hydrolysis of triglycerides. The mass-action based mechanistic model predicted changes in the rate of formation of saponified solids under the different experimental conditions better than either empirical model. The mass-action based saponification model also revealed that the hydrolysis of Beef Tallow was slower than that of liquid Canola fat, resulting in smaller quantities of saponified solids. This mechanistic saponification model, with its ability to track the chemical precursors of saponified solids, may provide an initial framework to predict the spatial formation of FOG deposits in municipal sewers using system-wide sewer collection modeling software. Copyright © 2015 Elsevier Ltd. All rights reserved.
A mechanistic modeling and data assimilation framework for Mojave Desert ecohydrology
Ng, Gene-Hua Crystal.; Bedford, David; Miller, David
2014-01-01
This study demonstrates and addresses challenges in coupled ecohydrological modeling in deserts, which arise due to unique plant adaptations, marginal growing conditions, slow net primary production rates, and highly variable rainfall. We consider model uncertainty from both structural and parameter errors and present a mechanistic model for the shrub Larrea tridentata (creosote bush) under conditions found in the Mojave National Preserve in southeastern California (USA). Desert-specific plant and soil features are incorporated into the CLM-CN model by Oleson et al. (2010). We then develop a data assimilation framework using the ensemble Kalman filter (EnKF) to estimate model parameters based on soil moisture and leaf-area index observations. A new implementation procedure, the “multisite loop EnKF,” tackles parameter estimation difficulties found to affect desert ecohydrological applications. Specifically, the procedure iterates through data from various observation sites to alleviate adverse filter impacts from non-Gaussianity in small desert vegetation state values. It also readjusts inconsistent parameters and states through a model spin-up step that accounts for longer dynamical time scales due to infrequent rainfall in deserts. Observation error variance inflation may also be needed to help prevent divergence of estimates from true values. Synthetic test results highlight the importance of adequate observations for reducing model uncertainty, which can be achieved through data quality or quantity.
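The core EnKF parameter update used in such frameworks can be sketched in a few lines; the toy linear observation operator and all values here are illustrative, not from the Mojave application:

```python
import numpy as np

# Minimal ensemble Kalman filter (EnKF) update of a single parameter
# observed indirectly through a model output. Dimensions, the toy
# observation operator y = 2 * param, and all values are illustrative.
rng = np.random.default_rng(0)
n_ens = 200
truth_param = 0.7

# Prior parameter ensemble and the ensemble of predicted observations.
params = rng.normal(0.5, 0.2, n_ens)
predicted_obs = 2.0 * params

obs = 2.0 * truth_param
obs_err = 0.05
perturbed_obs = obs + rng.normal(0.0, obs_err, n_ens)

# Kalman gain from ensemble (co)variances.
cov_py = np.cov(params, predicted_obs)[0, 1]
var_y = predicted_obs.var(ddof=1) + obs_err**2
gain = cov_py / var_y

# Posterior ensemble; its mean should move toward truth_param and its
# spread should shrink relative to the prior.
updated = params + gain * (perturbed_obs - predicted_obs)
```

The "multisite loop" idea described above amounts to cycling updates of this kind through data from each observation site in turn, with spin-up in between.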
NASA Astrophysics Data System (ADS)
Abdul-Aziz, O. I.; Ishtiaq, K. S.
2015-12-01
We present a user-friendly modeling tool in MS Excel to predict greenhouse gas (GHG) fluxes and estimate potential carbon sequestration in coastal wetlands. The dominant controls of wetland GHG fluxes and their relative mechanistic linkages with various hydro-climatic, sea level, biogeochemical and ecological drivers were first determined by employing a systematic data-analytics method, including a Pearson correlation matrix, principal component and factor analyses, and exploratory partial least squares regressions. The mechanistic knowledge and understanding were then utilized to develop parsimonious non-linear (power-law) models to predict wetland carbon dioxide (CO2) and methane (CH4) fluxes from a sub-set of climatic, hydrologic and environmental drivers such as photosynthetically active radiation, soil temperature, water depth, and soil salinity. The models were tested with field data for multiple sites and seasons (2012-13) collected from Waquoit Bay, MA. The model estimated annual wetland carbon storage by up-scaling the instantaneous predicted fluxes to an extended growing season (e.g., May-October) and by accounting for the net annual lateral carbon fluxes between the wetlands and the estuary. The Excel spreadsheet model is a simple ecological engineering tool for managing coastal carbon and incorporating it into a potential carbon market under a changing climate, sea level and environment. Specifically, the model can help to determine appropriate GHG offset protocols and monitoring plans for projects that focus on tidal wetland restoration and maintenance.
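A parsimonious power-law flux model of the kind described can be sketched as follows; the functional form mirrors the text, but the coefficient and exponents are hypothetical:

```python
# Power-law GHG flux sketch: flux = a * PAR^b * T^c. The coefficient a
# and exponents b, c are hypothetical, not the study's fitted values.
def co2_flux(par, soil_temp, a=0.8, b=0.45, c=1.2):
    """par: photosynthetically active radiation; soil_temp: soil
    temperature. Returns a modeled flux in arbitrary units."""
    return a * (par ** b) * (soil_temp ** c)

flux = co2_flux(par=1200.0, soil_temp=18.0)
```

Annual carbon storage would then follow by integrating such instantaneous fluxes over the growing season, as the abstract describes.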
Harfoot, Michael B J; Newbold, Tim; Tittensor, Derek P; Emmott, Stephen; Hutton, Jon; Lyutsarev, Vassily; Smith, Matthew J; Scharlemann, Jörn P W; Purves, Drew W
2014-04-01
Anthropogenic activities are causing widespread degradation of ecosystems worldwide, threatening the ecosystem services upon which all human life depends. Improved understanding of this degradation is urgently needed to improve avoidance and mitigation measures. One tool to assist these efforts is predictive models of ecosystem structure and function that are mechanistic: based on fundamental ecological principles. Here we present the first mechanistic General Ecosystem Model (GEM) of ecosystem structure and function that is both global and applies in all terrestrial and marine environments. Functional forms and parameter values were derived from the theoretical and empirical literature where possible. Simulations of the fate of all organisms with body masses between 10 µg and 150,000 kg (a range of 14 orders of magnitude) across the globe led to emergent properties at individual (e.g., growth rate), community (e.g., biomass turnover rates), ecosystem (e.g., trophic pyramids), and macroecological scales (e.g., global patterns of trophic structure) that are in general agreement with current data and theory. These properties emerged from our encoding of the biology of, and interactions among, individual organisms without any direct constraints on the properties themselves. Our results indicate that ecologists have gathered sufficient information to begin to build realistic, global, and mechanistic models of ecosystems, capable of predicting a diverse range of ecosystem properties and their response to human pressures. PMID:24756001
NASA Astrophysics Data System (ADS)
Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward
2018-04-01
A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of a machine learning technique known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
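A minimal sketch of the hybrid idea: train a reservoir-computing readout on both the reservoir state and an imperfect knowledge-based model's prediction. The toy target (a sine wave), the phase-shifted "imperfect model", and all sizes are illustrative:

```python
import numpy as np

# Echo-state-network style sketch of a hybrid forecaster: the ridge
# readout sees both the reservoir state and an imperfect model's
# prediction. The toy system and the "imperfect model" are illustrative.
rng = np.random.default_rng(1)
n_res, n_steps = 100, 500

t = np.arange(n_steps)
series = np.sin(0.1 * t)
imperfect_model = np.sin(0.1 * t + 0.05)  # stand-in knowledge-based model

W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

# Drive the reservoir with the time series.
states = np.zeros((n_steps, n_res))
r = np.zeros(n_res)
for k in range(n_steps - 1):
    r = np.tanh(W @ r + W_in * series[k])
    states[k + 1] = r

# Ridge-regression readout on [reservoir state, model prediction],
# trained to predict the next value of the series.
X = np.hstack([states[1:], imperfect_model[1:, None]])
y = series[1:]
beta = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)
rmse = np.sqrt(np.mean((X @ beta - y) ** 2))
```

Because the readout sees both information sources, it can correct systematic model error while retaining the model's structure, which is the essence of the hybrid scheme described above.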
Fjodorova, Natalja; Novič, Marjana
2012-01-01
The knowledge-based Toxtree expert system (SAR approach) was integrated with the statistically based counter propagation artificial neural network (CP ANN) model (QSAR approach) to contribute to a better mechanistic understanding of a carcinogenicity model for non-congeneric chemicals, using Dragon descriptors and carcinogenic potency for rats as a response. The transparency of the CP ANN algorithm was demonstrated using an intrinsic mapping technique, specifically Kohonen maps. Chemical structures were represented by Dragon descriptors that express structural and electronic features of molecules, such as their shape and electronic environment, which relate to molecular reactivity. It was illustrated how the descriptors are correlated with particular structural alerts (SAs) for carcinogenicity with a recognized mechanistic link to carcinogenic activity. Moreover, the Kohonen mapping technique enables one to examine the separation of carcinogens and non-carcinogens (for rats) within a family of chemicals with a particular SA for carcinogenicity. The mechanistic interpretation of models is important for the evaluation of the safety of chemicals. PMID:24688639
Klinke, David J; Wang, Qing
2016-01-01
A major barrier for broadening the efficacy of immunotherapies for cancer is identifying key mechanisms that limit the efficacy of tumor infiltrating lymphocytes. Yet, identifying these mechanisms using human samples and mouse models for cancer remains a challenge. While interactions between cancer and the immune system are dynamic and non-linear, identifying the relative roles that biological components play in regulating anti-tumor immunity commonly relies on human intuition alone, which can be limited by cognitive biases. To assist natural intuition, modeling and simulation play an emerging role in identifying therapeutic mechanisms. To illustrate the approach, we developed a multi-scale mechanistic model to describe the control of tumor growth by a primary response of CD8+ T cells against defined tumor antigens using the B16 C57Bl/6 mouse model for malignant melanoma. The mechanistic model was calibrated to data obtained following adenovirus-based immunization and validated to data obtained following adoptive transfer of transgenic CD8+ T cells. More importantly, we use simulation to test whether the postulated network topology, that is the modeled biological components and their associated interactions, is sufficient to capture the observed anti-tumor immune response. Given the available data, the simulation results also provided a statistical basis for quantifying the relative importance of different mechanisms that underpin CD8+ T cell control of B16F10 growth. By identifying conditions where the postulated network topology is incomplete, we illustrate how this approach can be used as part of an iterative design-build-test cycle to expand the predictive power of the model.
There is an increased interest in utilizing mechanistic data in support of the cancer risk assessment process for ionizing radiation and environmental chemical exposures. In this regard the use of biologically based dose-response models is particularly advocated. The aim is to pr...
INCORPORATION OF MECHANISTIC INFORMATION IN THE ARSENIC PBPK MODEL DEVELOPMENT PROCESS
INCORPORATING MECHANISTIC INSIGHTS IN A PBPK MODEL FOR ARSENIC
Elaina M. Kenyon, Michael F. Hughes, Marina V. Evans, David J. Thomas, U.S. EPA; Miroslav Styblo, University of North Carolina; Michael Easterling, Analytical Sciences, Inc.
A physiologically based phar...
Modeling the risk of water pollution by pesticides from imbalanced data.
Trajanov, Aneta; Kuzmanovski, Vladimir; Real, Benoit; Perreau, Jonathan Marks; Džeroski, Sašo; Debeljak, Marko
2018-04-30
The pollution of ground and surface waters with pesticides is a serious ecological issue that requires adequate treatment. Most of the existing water pollution models are mechanistic mathematical models. While they have made a significant contribution to understanding the transfer processes, they face the problem of validation because of their complexity, the user subjectivity in their parameterization, and the lack of empirical data for validation. In addition, the data describing water pollution with pesticides are, in most cases, very imbalanced. This is due to strict regulations for pesticide applications, which lead to only a few pollution events. In this study, we propose the use of data mining to build models for assessing the risk of water pollution by pesticides in field-drained outflow water. Unlike the mechanistic models, the models generated by data mining are based on easily obtainable empirical data, while the parameterization of the models is not influenced by the subjectivity of ecological modelers. We used empirical data from field trials at the La Jaillière experimental site in France and applied the random forests algorithm to build predictive models that predict "risky" and "not-risky" pesticide application events. To address the problems of the imbalanced classes in the data, cost-sensitive learning and different measures of predictive performance were used. Despite the high imbalance between risky and not-risky application events, we managed to build predictive models that make reliable predictions. The proposed modeling approach can be easily applied to other ecological modeling problems where we encounter empirical data with highly imbalanced classes.
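The cost-sensitive element can be illustrated without committing to any particular learning algorithm: misclassification costs shift the decision threshold away from 0.5, raising recall on the rare "risky" class. The scores, costs, and class imbalance below are synthetic:

```python
import numpy as np

# Cost-sensitive decision sketch for imbalanced classes: the operating
# threshold is set from misclassification costs rather than at 0.5.
# Labels, scores, and costs are synthetic and illustrative.
rng = np.random.default_rng(2)

# ~5% "risky" events (label 1), 95% "not-risky" (label 0).
labels = (rng.random(1000) < 0.05).astype(int)
# A weak score that is higher, on average, for risky events.
scores = rng.normal(labels * 1.5, 1.0)
probs = 1.0 / (1.0 + np.exp(-(scores - 0.75)))  # crude calibration

# Missing a risky event costs 10x a false alarm; the cost-minimizing
# rule classifies "risky" when p > c_fp / (c_fp + c_fn).
c_fp, c_fn = 1.0, 10.0
threshold = c_fp / (c_fp + c_fn)

pred_default = (probs > 0.5).astype(int)
pred_costaware = (probs > threshold).astype(int)

def recall(pred, y):
    return (pred[y == 1] == 1).mean()
```

Lowering the threshold trades false alarms for fewer missed pollution events, which is the intended behavior when risky applications are rare but costly.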
Development of traffic data input resources for the mechanistic empirical pavement design process.
DOT National Transportation Integrated Search
2011-12-12
The Mechanistic-Empirical Pavement Design Guide (MEPDG) for New and Rehabilitated Pavement Structures uses nationally based traffic data inputs and recommends that state DOTs develop their own site-specific and regional values. To support the MEP...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barua, Bipul; Mohanty, Subhasish; Listwan, Joseph T.
2017-12-05
In this paper, a cyclic-plasticity based, fully mechanistic fatigue modeling approach is presented. It is based on the time-dependent stress-strain evolution of the material over the entire fatigue life, rather than only on the end-of-life information typically used in empirical S-N curve based fatigue evaluation approaches. Previously we presented constant amplitude fatigue test based material models for 316 SS base, 508 LAS base and 316 SS-316 SS weld metals, which are used in nuclear reactor components such as pressure vessels, nozzles, and surge line pipes. However, we found that models based on constant amplitude fatigue data have limitations in capturing the stress-strain evolution under arbitrary fatigue loading. To address this limitation, in this paper we present a more advanced approach that can be used to model the cyclic stress-strain evolution and fatigue life not only under constant amplitude but also under arbitrary (random/variable) fatigue loading. The related material model and analytical model results are presented for 316 SS base metal. Two methodologies (based either on time/cycle or on accumulated plastic strain energy) to track the material parameters at a given time/cycle are discussed and associated analytical model results are presented. From the material model and analytical cyclic plasticity model results, it is found that the proposed cyclic plasticity model can predict all the important stages of material behavior during the entire fatigue life of the specimens with more than 90% accuracy.
Butler Ellis, M Clare; Kennedy, Marc C; Kuster, Christian J; Alanis, Rafael; Tuck, Clive R
2018-05-28
The Bystander and Resident Exposure Assessment Model (BREAM; Kennedy et al., Comput Electron Agric 2012;88:63-71) for bystander and resident exposure to spray drift from boom sprayers has recently been incorporated into the European Food Safety Authority (EFSA) guidance for determining non-dietary human exposures to plant protection products. The component of BREAM that relates airborne spray concentrations to bystander and resident dermal exposure has been reviewed to identify whether it is possible to improve it and its description of the variability captured in the model. Two approaches were explored: a more rigorous statistical analysis of the empirical data, and a semi-mechanistic model based on established studies combined with new data obtained in a wind tunnel. A statistical comparison between field data and model outputs was used to determine which approach gave the better prediction of exposures. The semi-mechanistic approach gave the better prediction of the experimental data and resulted in a reduction in the proposed regulatory values for the 75th and 95th percentiles of the exposure distribution.
Mechanistic model for catalytic recombination during aerobraking maneuvers
NASA Technical Reports Server (NTRS)
Willey, Ronald J.
1989-01-01
Several mechanistic models are developed to predict recombination coefficients for use in heat shield design for reusable surface insulation (RSI) on aerobraking vehicles such as space shuttles. The models are applied over a temperature range of 300 to 1800 K and a stagnation pressure range of 0 to 3,000 Pa. A four-parameter model in temperature was found to work best; however, several models (including those with atom concentrations at the surface) were also investigated. Mechanistic models developed with atom concentration terms may be applicable when sufficient data become available. A need is shown for recombination experiments in the 300 to 1000 K and 1500 to 1850 K temperature ranges, with deliberate variation of atom concentrations.
NASA Astrophysics Data System (ADS)
Grein, M.; Roth-Nebelsick, A.; Konrad, W.
2006-12-01
A mechanistic model (Konrad & Roth-Nebelsick a, in prep.) was applied for the reconstruction of atmospheric carbon dioxide using stomatal densities and photosynthesis parameters of extant and fossil Fagaceae. The model is based on an approach that couples diffusion and the biochemical process of photosynthesis. Atmospheric CO2 is calculated on the basis of stomatal diffusion and photosynthesis parameters of the considered taxa. The considered species include the castanoid Castanea sativa, two quercoids, Quercus petraea and Quercus rhenana, and an intermediate species, Eotrigonobalanus furcinervis. In the case of Quercus petraea, literature data were used. Stomatal data of Eotrigonobalanus furcinervis, Quercus rhenana and Castanea sativa were determined by the authors. Data for the extant Castanea sativa were collected by applying a peeling method and counting stomatal densities on digitized images of the peels. Additionally, isotope data from leaf samples of Castanea sativa were determined to estimate the ratio of intercellular to ambient carbon dioxide. The CO2 values calculated by the model (on the basis of stomatal data and measured or estimated biochemical parameters) are in good agreement with literature data, with the exception of the Late Eocene. The results thus demonstrate that the applied approach is principally suitable for reconstructing palaeoatmospheric CO2.
Using energy budgets to combine ecology and toxicology in a mammalian sentinel species
NASA Astrophysics Data System (ADS)
Desforges, Jean-Pierre W.; Sonne, Christian; Dietz, Rune
2017-04-01
Process-driven modelling approaches can resolve many of the shortcomings of traditional descriptive and non-mechanistic toxicology. We developed a simple dynamic energy budget (DEB) model for the mink (Mustela vison), a sentinel species in mammalian toxicology, which coupled animal physiology, ecology and toxicology, in order to mechanistically investigate the accumulation and adverse effects of lifelong dietary exposure to persistent environmental toxicants, most notably polychlorinated biphenyls (PCBs). Our novel mammalian DEB model accurately predicted, based on energy allocations to the interconnected metabolic processes of growth, development, maintenance and reproduction, lifelong patterns in mink growth, reproductive performance and dietary accumulation of PCBs as reported in the literature. Our model results were consistent with empirical data from captive and free-ranging studies in mink and other wildlife and suggest that PCB exposure can have significant population-level impacts resulting from targeted effects on fetal toxicity, kit mortality and growth and development. Our approach provides a simple and cross-species framework to explore the mechanistic interactions of physiological processes and ecotoxicology, thus allowing for a deeper understanding and interpretation of stressor-induced adverse effects at all levels of biological organization.
Farrell, Tracy L; Poquet, Laure; Dew, Tristan P; Barber, Stuart; Williamson, Gary
2012-02-01
There is a considerable need to rationalize the membrane permeability and mechanism of transport for potential nutraceuticals. The aim of this investigation was to develop a theoretical permeability equation, based on a reported descriptive absorption model, enabling calculation of the transcellular component of absorption across Caco-2 monolayers. Published data for Caco-2 permeability of 30 drugs transported by the transcellular route were correlated with the descriptors 1-octanol/water distribution coefficient (log D, pH 7.4) and size, based on molecular mass. Nonlinear regression analysis was used to derive a set of model parameters a', β', and b' with an integrated molecular mass function. The new theoretical transcellular permeability (TTP) model obtained a good fit of the published data (R² = 0.93) and predicted reasonably well (R² = 0.86) the experimental apparent permeability coefficient (P(app)) for nine non-training set compounds reportedly transported by the transcellular route. For the first time, the TTP model was used to predict the absorption characteristics of six phenolic acids, and this original investigation was supported by in vitro Caco-2 cell mechanistic studies, which suggested that deviation of the P(app) value from the predicted transcellular permeability (P(app)(trans)) may be attributed to involvement of active uptake, efflux transporters, or paracellular flux.
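The shape of such a transcellular permeability model can be sketched as a function of lipophilicity and molecular mass; the linear-in-log form and the constants below are hypothetical stand-ins, not the published a', β', b' parameters:

```python
# Sketch of a theoretical transcellular permeability (TTP) style model:
# permeability rises with lipophilicity (log D) and falls with
# molecular mass. The functional form and the constants a, beta, b are
# hypothetical illustrations, not the published parameters.
def log_papp(log_d, mol_mass, a=-4.5, beta=0.5, b=0.005):
    """Return a modeled log apparent permeability (log cm/s)."""
    return a + beta * log_d - b * mol_mass

# Illustrative comparison of two hypothetical compounds.
small_lipophilic = log_papp(log_d=2.0, mol_mass=180.0)
large_hydrophilic = log_papp(log_d=-1.0, mol_mass=450.0)
```

Compounds whose measured Papp deviates strongly from such a prediction are, as the abstract notes, candidates for active uptake, efflux, or paracellular transport.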
MacLeod, Miles; Nersessian, Nancy J
2015-02-01
In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.
Technical note: Bayesian calibration of dynamic ruminant nutrition models.
Reed, K F; Arhonditsis, G B; France, J; Kebreab, E
2016-08-01
Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
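The workflow described (a posterior over parameters, then a posterior predictive distribution combining parameter uncertainty and residual variability) can be sketched with a minimal Metropolis sampler; the one-parameter model and the data are synthetic:

```python
import numpy as np

# Minimal Metropolis sampler illustrating Bayesian calibration of a
# one-parameter model y = theta * x with Gaussian noise; the data,
# flat prior, and step size are synthetic and illustrative.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 20)
true_theta, sigma = 2.0, 0.1
y = true_theta * x + rng.normal(0.0, sigma, x.size)

def log_post(theta):
    # Flat prior on theta; Gaussian likelihood with known sigma.
    resid = y - theta * x
    return -0.5 * np.sum(resid**2) / sigma**2

theta, samples = 0.0, []
lp = log_post(theta)
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[1000:])  # discard burn-in
# Posterior predictive at x = 0.5: parameter AND residual uncertainty.
pred = post * 0.5 + rng.normal(0.0, sigma, post.size)
```

The predictive draws in `pred` form a distribution rather than a point estimate, which is the advantage over deterministic calibration highlighted above.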
Keane, R E; Ryan, K C; Running, S W
1996-03-01
A mechanistic, biogeochemical succession model, FIRE-BGC, was used to investigate the role of fire on long-term landscape dynamics in northern Rocky Mountain coniferous forests of Glacier National Park, Montana, USA. FIRE-BGC is an individual-tree model, created by merging the gap-phase process-based model FIRESUM with the mechanistic ecosystem biogeochemical model FOREST-BGC, that has mixed spatial and temporal resolution in its simulation architecture. Ecological processes that act at a landscape level, such as fire and seed dispersal, are simulated annually from stand and topographic information. Stand-level processes, such as tree establishment, growth and mortality, organic matter accumulation and decomposition, and undergrowth plant dynamics are simulated both daily and annually. Tree growth is mechanistically modeled based on the ecosystem process approach of FOREST-BGC where carbon is fixed daily by forest canopy photosynthesis at the stand level. Carbon allocated to the tree stem at the end of the year generates the corresponding diameter and height growth. The model also explicitly simulates fire behavior and effects on landscape characteristics. We simulated the effects of fire on ecosystem characteristics of net primary productivity, evapotranspiration, standing crop biomass, nitrogen cycling and leaf area index over 200 years for the 50,000-ha McDonald Drainage in Glacier National Park. Results show increases in net primary productivity and available nitrogen when fires are included in the simulation. Standing crop biomass and evapotranspiration decrease under a fire regime. Shade-intolerant species dominate the landscape when fires are excluded. Model tree increment predictions compared well with field data.
NASA Astrophysics Data System (ADS)
Thomas, Stephanie Margarete; Beierkuhnlein, Carl
2013-05-01
The occurrence of ectotherm disease vectors outside of their previous distribution area and the emergence of vector-borne diseases can be increasingly observed at a global scale and are accompanied by a growing number of studies which investigate the vast range of determining factors and their causal links. Consequently, a broad span of scientific disciplines is involved in tackling these complex phenomena. First, we evaluate the citation behaviour of relevant scientific literature in order to clarify the question "do scientists consider results of other disciplines to extend their expertise?" We then highlight emerging tools and concepts useful for risk assessment. Correlative models (regression-based, machine-learning and profile techniques), mechanistic models (basic reproduction number R0) and methods of spatial regression, interaction and interpolation are described. We discuss further steps towards multidisciplinary approaches regarding new tools and emerging concepts to combine existing approaches such as Bayesian geostatistical modelling, mechanistic models which avoid the need for parameter fitting, joint correlative and mechanistic models, multi-criteria decision analysis and geographic profiling. We consider the quality both of occurrence data for vectors, hosts and disease cases and of the predictor variables, as both determine the accuracy of risk-area identification. Finally, we underline the importance of multidisciplinary research approaches. Even if the establishment of communication networks between scientific disciplines and the sharing of specific methods is time-consuming, it promises new insights for the surveillance and control of vector-borne diseases worldwide.
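The basic reproduction number R0 mentioned above is the canonical mechanistic risk metric for vector-borne disease. As a hedged illustration, the sketch below implements the classical Ross-Macdonald formulation for a mosquito-borne pathogen; all parameter values are hypothetical and chosen only to show the arithmetic.

```python
import math

def ross_macdonald_r0(m, a, b, c, mu, n, r):
    """Classical Ross-Macdonald basic reproduction number.

    m  : mosquitoes per human          a : bites per mosquito per day
    b  : mosquito->human transmission  c : human->mosquito transmission
    mu : mosquito death rate (1/day)   n : extrinsic incubation (days)
    r  : human recovery rate (1/day)
    """
    return (m * a**2 * b * c * math.exp(-mu * n)) / (r * mu)

# Hypothetical parameter set; transmission is sustained only if R0 > 1
r0 = ross_macdonald_r0(m=2.0, a=0.3, b=0.5, c=0.5, mu=0.1, n=10, r=0.01)
print(round(r0, 2))
```

Because R0 depends quadratically on the biting rate and exponentially on mosquito survival through incubation, small climate-driven changes in those parameters shift predicted risk areas strongly, which is why such mechanistic models complement the correlative ones.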
Thomas E. Dilts; Peter J. Weisberg; Camie M. Dencker; Jeanne C. Chambers
2015-01-01
We have three goals. (1) To develop a suite of functionally relevant climate variables for modelling vegetation distribution on arid and semi-arid landscapes of the Great Basin, USA. (2) To compare the predictive power of vegetation distribution models based on mechanistically proximate factors (water deficit variables) and factors that are more mechanistically removed...
Gering, Kevin L
2013-08-27
A system includes an electrochemical cell, monitoring hardware, and a computing system. The monitoring hardware periodically samples performance characteristics of the electrochemical cell. The computing system determines cell information from the performance characteristics of the electrochemical cell. The computing system also develops a mechanistic level model of the electrochemical cell to determine performance fade characteristics of the electrochemical cell and analyzes the mechanistic level model to estimate performance fade characteristics over aging of a similar electrochemical cell. The mechanistic level model uses first constant-current pulses applied to the electrochemical cell at a first aging period and at three or more current values bracketing a first exchange current density. The mechanistic level model also is based on second constant-current pulses applied to the electrochemical cell at a second aging period and at three or more current values bracketing the second exchange current density.
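The patent abstract does not disclose its governing equations. As a hedged illustration of why pulse currents bracketing the exchange current density are informative, the sketch below estimates an exchange current density by least-squares inversion of the standard Butler-Volmer relation; the kinetic parameters and pulse values are hypothetical.

```python
import numpy as np

F, R, T = 96485.0, 8.314, 298.15   # Faraday const, gas const, temperature
f = F / (R * T)

def estimate_i0(currents, overpotentials, alpha=0.5):
    """Estimate exchange current density from pulse data via the
    Butler-Volmer relation i = i0*(exp(alpha*f*eta) - exp(-(1-alpha)*f*eta)).
    With i = i0*g known at several pulses, i0 is the least-squares
    slope of i against g through the origin."""
    currents = np.asarray(currents, float)
    eta = np.asarray(overpotentials, float)
    g = np.exp(alpha * f * eta) - np.exp(-(1 - alpha) * f * eta)
    return float(np.dot(g, currents) / np.dot(g, g))

# Synthetic pulses bracketing a true i0 of 2.0 (arbitrary current units)
i0_true = 2.0
eta = np.array([-0.02, -0.005, 0.005, 0.02])     # overpotentials in V
i = i0_true * (np.exp(0.5 * f * eta) - np.exp(-0.5 * f * eta))
print(round(estimate_i0(i, eta), 3))
```

Tracking how the fitted exchange current density drifts between aging periods is one generic way a mechanistic-level model can separate kinetic fade from other loss mechanisms.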
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Zawadzki, S.A.
The primary physical/chemical models that form the basis of the FASTGRASS mechanistic computer model for calculating fission-product release from nuclear fuel are described. Calculated results are compared with test data and the major mechanisms affecting the transport of fission products during steady-state and accident conditions are identified.
Predicting subsurface uranium transport: Mechanistic modeling constrained by experimental data
NASA Astrophysics Data System (ADS)
Ottman, Michael; Schenkeveld, Walter D. C.; Kraemer, Stephan
2017-04-01
Depleted uranium (DU) munitions and their widespread use throughout conflict zones around the world pose a persistent health threat to the inhabitants of those areas long after the conclusion of active combat. However, little emphasis has been put on developing a comprehensive, quantitative tool for use in remediation and hazard avoidance planning in a wide range of environments. In this context, we report experimental data on U interaction with soils and sediments. Here, we strive to improve existing risk assessment modeling paradigms by incorporating a variety of experimental data into a mechanistic U transport model for subsurface environments. 20 different soils and sediments from a variety of environments were chosen to represent a range of geochemical parameters that are relevant to U transport. The parameters included pH, organic matter content, CaCO3, Fe content and speciation, and clay content. pH ranged from 3 to 10, organic matter content from 6 to 120 g kg-1, CaCO3 from 0 to 700 g kg-1, amorphous Fe content from 0.3 to 6 g kg-1 and clay content from 4 to 580 g kg-1. Sorption experiments were then performed, and linear isotherms were constructed. Sorption experiment results show that, among separate sets of sediments and soils, both soil pH and CaCO3 concentration correlate inversely with U sorptive affinity. The geological materials with the highest and lowest sorptive affinities for U differed in CaCO3 and organic matter concentrations, as well as clay content and pH. In a further step, we are testing if transport behavior in saturated porous media can be predicted based on adsorption isotherms and generic geochemical parameters, and comparing these modeling predictions with the results from column experiments. The comparison of these two data sets will examine if U transport can be effectively predicted from reactive transport modeling that incorporates the generic geochemical parameters.
This work will serve to show whether a more mechanistic approach offers an improvement over statistical regression-based risk assessment models.
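A linear sorption isotherm feeds into transport prediction through the retardation factor. The sketch below shows that standard bookkeeping on hypothetical batch data; the distribution coefficient, bulk density, and porosity values are illustrative, not the study's measurements.

```python
import numpy as np

def fit_kd(c_aq, s_sorbed):
    """Slope of a linear sorption isotherm s = Kd * c (least squares
    through the origin). c_aq in mg/L, s_sorbed in mg/kg -> Kd in L/kg."""
    c = np.asarray(c_aq, float)
    s = np.asarray(s_sorbed, float)
    return float(np.dot(c, s) / np.dot(c, c))

def retardation(kd, bulk_density=1.6, porosity=0.4):
    """Retardation factor R = 1 + (rho_b / theta) * Kd, the factor by
    which sorption slows a solute relative to water in saturated flow."""
    return 1.0 + (bulk_density / porosity) * kd

# Hypothetical batch data for a material with ~5 L/kg affinity for U
c = np.array([0.5, 1.0, 2.0, 4.0])
s = 5.0 * c
kd = fit_kd(c, s)
print(kd, retardation(kd))
```

Comparing column breakthrough times against those implied by batch-derived retardation factors is the essence of the isotherm-versus-column test the abstract describes.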
He, Xin; Samee, Md. Abul Hassan; Blatti, Charles; Sinha, Saurabh
2010-01-01
Quantitative models of cis-regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled, or heuristic approximations of the underlying regulatory mechanisms. We have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence, as a function of transcription factor concentrations and their DNA-binding specificities. It uses statistical thermodynamics theory to model not only protein-DNA interaction, but also the effect of DNA-bound activators and repressors on gene expression. In addition, the model incorporates mechanistic features such as synergistic effect of multiple activators, short range repression, and cooperativity in transcription factor-DNA binding, allowing us to systematically evaluate the significance of these features in the context of available expression data. Using this model on segmentation-related enhancers in Drosophila, we find that transcriptional synergy due to simultaneous action of multiple activators helps explain the data beyond what can be explained by cooperative DNA-binding alone. We find clear support for the phenomenon of short-range repression, where repressors do not directly interact with the basal transcriptional machinery. We also find that the binding sites contributing to an enhancer's function may not be conserved during evolution, and a noticeable fraction of these undergo lineage-specific changes. Our implementation of the model, called GEMSTAT, is the first publicly available program for simultaneously modeling the regulatory activities of a given set of sequences. PMID:20862354
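The statistical-thermodynamics idea behind models like GEMSTAT can be illustrated with a much-simplified sketch: enumerate all bound/unbound configurations of an enhancer, weight each by its Boltzmann factor, and let bound activators multiply the weight of polymerase recruitment. This toy version omits repression, cooperativity, and sequence-derived affinities, and its parameters are hypothetical.

```python
from itertools import product

def predicted_expression(sites, conc, alpha):
    """Thermodynamic sketch: enumerate all configurations of bound sites.
    sites : relative binding affinities K (dimensionless weight conc*K)
    conc  : transcription factor concentration
    alpha : interaction factor a bound activator contributes to the
            polymerase-bound partition sum.
    Expression = Z_on / (Z_on + Z_off)."""
    z_off = z_on = 0.0
    for config in product([0, 1], repeat=len(sites)):
        w = 1.0                  # statistical weight of this configuration
        a = 1.0                  # activator boost to the 'on' state
        for occupied, K in zip(config, sites):
            if occupied:
                w *= conc * K
                a *= alpha       # each bound activator multiplies recruitment
        z_off += w
        z_on += w * a
    return z_on / (z_on + z_off)

expr = predicted_expression(sites=[1.0, 1.0], conc=1.0, alpha=5.0)
print(round(expr, 3))
```

With two equal sites and alpha > 1, expression rises with concentration faster than a single site would allow, which is the transcriptional-synergy effect the study evaluates against cooperative binding alone.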
Mathewson, Paul D; Moyer-Horner, Lucas; Beever, Erik A; Briscoe, Natalie J; Kearney, Michael; Yahn, Jeremiah M; Porter, Warren P
2017-03-01
How climate constrains species' distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8-19% less habitat loss in response to annual temperature increases of ~3-5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. 
Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect: climate-imposed restrictions on activity. This more complete understanding is necessary to inform climate adaptation actions, management strategies, and conservation plans. © 2016 John Wiley & Sons Ltd.
Toxicokinetic and Dosimetry Modeling Tools for Exposure ...
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster and can provide mechanistic insights on chemical action. However, extrapolating from in vitro chemical concentrations to target tissue or blood concentrations in vivo is fraught with uncertainties, and modeling is dependent upon pharmacokinetic variables not measured in in vitro assays. To address this need, new tools have been created for characterizing, simulating, and evaluating chemical toxicokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissue microdosimetry PK models relate whole-body chemical exposures to cell-scale concentrations. These tools rely on high-throughput in vitro measurements, and successful methods exist for pharmaceutical compounds that determine PK from limited in vitro measurements and chemical structure-derived property predictions. These high throughput (HT) methods provide a more rapid and less resource-intensive alternative to traditional PK model development. We have augmented these in vitro data with chemical structure-based descriptors and mechanistic tissue partitioning models to construct HTPBPK models for over three hundred environmental and pharmaceutical compounds.
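A minimal sketch of the reverse-dosimetry arithmetic that such HT methods rest on: steady-state plasma concentration as dose rate divided by total clearance, with renal filtration of the unbound fraction plus well-stirred hepatic clearance. This is a generic textbook formulation, not the tool described above, and all parameter values are hypothetical.

```python
def steady_state_css(dose_rate, fub, clint, gfr=6.7, q_liver=90.0):
    """Css = dose rate / total clearance.
    dose_rate : mg/h (infusion-equivalent chronic exposure)
    fub       : fraction unbound in plasma
    clint     : intrinsic hepatic clearance, L/h
    gfr       : glomerular filtration rate, L/h
    q_liver   : hepatic blood flow, L/h
    Renal clearance = gfr * fub; hepatic clearance uses the
    well-stirred model. Returns Css in mg/L."""
    cl_hep = q_liver * fub * clint / (q_liver + fub * clint)
    return dose_rate / (gfr * fub + cl_hep)

css = steady_state_css(dose_rate=1.0, fub=0.1, clint=100.0)
print(round(css, 4))
```

Inverting this relation (solving for the dose rate that yields an in vitro bioactive concentration) is what links HTS assay results back to exposure estimates.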
Shorebird Migration Patterns in Response to Climate Change: A Modeling Approach
NASA Technical Reports Server (NTRS)
Smith, James A.
2010-01-01
The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies offer new opportunities for the application of mechanistic models to predict how continental scale bird migration patterns may change in response to environmental change. In earlier studies, we explored the phenotypic plasticity of a migratory population of Pectoral sandpipers by simulating the movement patterns of an ensemble of 10,000 individual birds in response to changes in stopover locations as an indicator of the impacts of wetland loss and inter-annual variability on the fitness of migratory shorebirds. We used an individual based, biophysical migration model, driven by remotely sensed land surface data, climate data, and biological field data. Mean stop-over durations and stop-over frequency with latitude predicted from our model for nominal cases were consistent with results reported in the literature and available field data. In this study, we take advantage of new computing capabilities enabled by recent GP-GPU computing paradigms (general-purpose computing on graphics processing units) and commodity hardware. Several aspects of our individual based (agent modeling) approach lend themselves well to GP-GPU computing. We have been able to allocate compute-intensive tasks to the graphics processing units, and now simulate ensembles of 400,000 birds at varying spatial resolutions along the central North American flyway. We are incorporating additional, species specific, mechanistic processes to better reflect the processes underlying bird phenotypic plasticity responses to different climate change scenarios in the central U.S.
NASA Astrophysics Data System (ADS)
Pagel, Holger; Kandeler, Ellen; Seifert, Jana; Camarinha-Silva, Amélia; Kügler, Philipp; Rennert, Thilo; Poll, Christian; Streck, Thilo
2016-04-01
Matter cycling in soils and associated soil functions are intrinsically controlled by microbial dynamics. It is therefore crucial to consider functional traits of microorganisms in biogeochemical models. Tremendous advances in 'omic' methods provide a plethora of data on physiology, metabolic capabilities and ecological life strategies of microorganisms in soil. Combined with isotopic techniques, biochemical pathways and transformations can be identified and quantified. Such data have been, however, rarely used to improve the mechanistic representation of microbial dynamics in soil organic matter models. It is the goal of the Young Investigator Group SoilReg to address this challenge. Our general approach is to tightly integrate experiments and biochemical modeling. NextGen sequencing will be applied to identify key functional groups. Active microbial groups will be quantified by measurements of functional genes and by stable isotope probing methods of DNA and proteins. Based on this information a biogeochemical model that couples a mechanistic representation of microbial dynamics with physicochemical processes will be set up and calibrated. Sensitivity and stability analyses of the model as well as scenario simulations will reveal the importance of intrinsic and extrinsic controls of organic matter turnover. We will demonstrate our concept and present first results of two case studies on pesticide degradation and methane oxidation.
Computational Modeling of Cobalt-Based Water Oxidation: Current Status and Future Challenges
Schilling, Mauro; Luber, Sandra
2018-01-01
A lot of effort is nowadays put into the development of novel water oxidation catalysts. In this context, mechanistic studies are crucial in order to elucidate the reaction mechanisms governing this complex process and to derive new design paradigms and strategies for improving the stability and efficiency of these catalysts. This review is focused on recent theoretical mechanistic studies in the field of homogeneous cobalt-based water oxidation catalysts. In the first part, computational methodologies and protocols are summarized and evaluated on the basis of their applicability toward real catalytic or smaller model systems, whereby special emphasis is laid on the choice of an appropriate model system. In the second part, an overview of mechanistic studies is presented, from which conceptual guidelines are drawn on how to approach novel studies of catalysts and how to further develop the field of computational modeling of water oxidation reactions. PMID:29721491
Kayala, Matthew A; Baldi, Pierre
2012-10-22
Proposing reasonable mechanisms and predicting the course of chemical reactions is important to the practice of organic chemistry. Approaches to reaction prediction have historically used obfuscating representations and manually encoded patterns or rules. Here we present ReactionPredictor, a machine learning approach to reaction prediction that models elementary, mechanistic reactions as interactions between approximate molecular orbitals (MOs). A training data set of productive reactions known to occur at reasonable rates and yields and verified by inclusion in the literature or textbooks is derived from an existing rule-based system and expanded upon with manual curation from graduate level textbooks. Using this training data set of complex polar, hypervalent, radical, and pericyclic reactions, a two-stage machine learning prediction framework is trained and validated. In the first stage, filtering models trained at the level of individual MOs are used to reduce the space of possible reactions to consider. In the second stage, ranking models over the filtered space of possible reactions are used to order the reactions such that the productive reactions are the top ranked. The resulting model, ReactionPredictor, perfectly ranks polar reactions 78.1% of the time and recovers all productive reactions 95.7% of the time when allowing for small numbers of errors. Pericyclic and radical reactions are perfectly ranked 85.8% and 77.0% of the time, respectively, rising to >93% recovery for both reaction types with a small number of allowed errors. Decisions about which of the polar, pericyclic, or radical reaction type ranking models to use can be made with >99% accuracy. Finally, for multistep reaction pathways, we implement the first mechanistic pathway predictor using constrained tree-search to discover a set of reasonable mechanistic steps from given reactants to given products. 
Webserver implementations of both the single step and pathway versions of ReactionPredictor are available via the chemoinformatics portal http://cdb.ics.uci.edu/.
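The two-stage filter-then-rank structure described above can be sketched generically: a cheap per-candidate score prunes the space of possible molecular-orbital interactions, then a second model orders the survivors. The features, weights, and scoring functions below are hypothetical stand-ins, not ReactionPredictor's trained models.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_reactions(mo_pairs, w_filter, w_rank, keep=0.3):
    """Two-stage sketch of filter-then-rank reaction prediction.
    mo_pairs : (n, d) feature rows, one per candidate filled-MO /
               empty-MO interaction (features here are hypothetical).
    Stage 1: a fast filter score prunes unlikely candidates.
    Stage 2: a ranking score orders the survivors best-first."""
    scores1 = sigmoid(mo_pairs @ w_filter)
    n_keep = max(1, int(keep * len(mo_pairs)))
    kept = np.argsort(scores1)[::-1][:n_keep]
    scores2 = mo_pairs[kept] @ w_rank
    return kept[np.argsort(scores2)[::-1]]      # best-first candidate indices

mo_pairs = np.array([[2.0, 1.0],
                     [0.1, 0.2],
                     [1.5, 2.0],
                     [0.0, 0.1]])
w_filter = np.array([1.0, 1.0])
w_rank = np.array([1.0, -0.5])
order = predict_reactions(mo_pairs, w_filter, w_rank, keep=0.5)
print(order)
```

Separating a permissive filter from a discriminating ranker keeps the expensive model off the vast majority of unproductive candidates, which is the design point the abstract's two-stage framework exploits.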
DOT National Transportation Integrated Search
2017-01-01
This report summarizes the local calibration of the distress models for the Northeast (NE) region of the United States and the development of new design tables for new flexible pavement structures. Design, performance, and traffic data collected on t...
Geerts, Hugo; Spiros, Athan; Roberts, Patrick; Twyman, Roy; Alphs, Larry; Grace, Anthony A.
2012-01-01
The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published ‘Quantitative Systems Pharmacology’ computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively the clinical outcome in a blinded fashion of two experimental antipsychotic drugs; JNJ37822681, a highly selective low-affinity dopamine D2 antagonist and ocaperidone, a very high affinity dopamine D2 antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects and can be a helpful tool for drug discovery and development. PMID:23251349
Vugmeyster, Yulia; Rohde, Cynthia; Perreault, Mylene; Gimeno, Ruth E; Singh, Pratap
2013-01-01
TAM-163, an agonist monoclonal antibody targeting tyrosine receptor kinase-B (TrkB), is currently being investigated as a potential body weight modulatory agent in humans. To support the selection of the dose range for the first-in-human (FIH) trial of TAM-163, we conducted a mechanistic analysis of the pharmacokinetic (PK) and pharmacodynamic (PD) data (e.g., body weight gain) obtained in lean cynomolgus and obese rhesus monkeys following single doses ranging from 0.3 to 60 mg/kg. A target-mediated drug disposition (TMDD) model was used to describe the observed nonlinear PK and an Emax approach was used to describe the observed dose-dependent PD effect. The TMDD model development was supported by the experimental determination of the binding affinity constant (9.4 nM) and internalization rate of the drug-target complex (2.08 h(-1)). These mechanistic analyses enabled linking of exposure, target (TrkB) coverage, and pharmacological activity (e.g., PD) in monkeys, and indicated that ≥ 38% target coverage (time-average) was required to achieve significant body weight gain in monkeys. Based on the scaling of the TMDD model from monkeys to humans and assuming similar relationship between the target coverage and pharmacological activity between monkey and humans, subcutaneous (SC) doses of 1 and 15 mg/kg in humans were projected to be the minimally and the fully pharmacologically active doses, respectively. Based on the minimal anticipated biological effect level (MABEL) approach for starting dose selection, the dose of 0.05 mg/kg (3 mg for a 60 kg human) SC was recommended as the starting dose for FIH trials, because at this dose level <10% target coverage was projected at Cmax (and all other time points). This study illustrates a rational mechanistic approach for the selection of FIH dose range for a therapeutic protein with a complex mode of action.
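The full TMDD model couples free drug, free target, and drug-target complex through binding, turnover, and internalization. The sketch below integrates that standard three-state system with a simple Euler scheme and reports time-averaged target coverage; all rate constants and doses are hypothetical, not the TAM-163 parameters.

```python
import numpy as np

def simulate_tmdd(dose, days=30.0, dt=0.001,
                  kel=0.1, kon=1.0, koff=0.2, kint=2.0,
                  ksyn=1.0, kdeg=0.5):
    """Full target-mediated drug disposition model: free drug C,
    free target R, complex RC (rates in 1/day, hypothetical values).
    Returns time-averaged target coverage RC / (R + RC)."""
    c, r, rc = dose, ksyn / kdeg, 0.0     # target starts at baseline
    cov = []
    for _ in range(int(days / dt)):
        bind = kon * c * r - koff * rc    # net binding flux
        dc = -kel * c - bind              # elimination + binding
        dr = ksyn - kdeg * r - bind       # target synthesis/degradation
        drc = bind - kint * rc            # complex internalization
        c, r, rc = c + dt * dc, r + dt * dr, rc + dt * drc
        cov.append(rc / (r + rc))
    return float(np.mean(cov))

# Higher doses should drive higher average target coverage
low, high = simulate_tmdd(dose=1.0), simulate_tmdd(dose=50.0)
print(round(low, 3), round(high, 3))
```

Mapping dose to average coverage in this way, and then locating the coverage threshold associated with a pharmacological effect, is the logic behind both the 38%-coverage efficacy bound and the <10%-coverage MABEL starting dose.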
A method to identify and analyze biological programs through automated reasoning
Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen
2016-01-01
Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine static interaction network models, which are descriptively rich, but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, in such approaches implicit assumptions are introduced as typically only one mechanism is considered, and exhaustively investigating all scenarios is impractical using simulation. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations to precise, predictive biological programs governing cell function. PMID:27668090
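The key idea above, characterizing the complete set of models consistent with observations instead of simulating one, can be shown in miniature by brute force. The sketch below enumerates every Boolean update rule for a two-gene network and keeps those reproducing the observed transitions; the paper's constraint-solving approach achieves the same exhaustiveness symbolically. The network, observations, and encoding are hypothetical.

```python
from itertools import product

def consistent_models(observations):
    """Enumerate all deterministic Boolean update rules for a 2-gene
    network and keep those reproducing every observed transition --
    a brute-force stand-in for formal-reasoning-based synthesis.
    Each rule maps a state (a, b) to the next state (a, b)."""
    states = list(product([0, 1], repeat=2))
    models = []
    for rule_out in product(states, repeat=4):   # an output per input state
        rule = dict(zip(states, rule_out))
        if all(rule[s] == t for s, t in observations):
            models.append(rule)
    return models

# Hypothetical observed transitions: A on -> B on; B alone -> all off
obs = [((1, 0), (0, 1)), ((0, 1), (0, 0))]
models = consistent_models(obs)
print(len(models))
```

A hypothesis can then be tested against all surviving models at once: if every consistent model agrees on a prediction, the prediction is forced by the data, which is exactly the style of reasoning the methodology automates at scale.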
Nøst, Therese Haugdahl; Breivik, Knut; Wania, Frank; Rylander, Charlotta; Odland, Jon Øyvind; Sandanger, Torkjel Manning
2016-03-01
Studies on the health effects of polychlorinated biphenyls (PCBs) call for an understanding of past and present human exposure. Time-resolved mechanistic models may supplement information on concentrations in individuals obtained from measurements and/or statistical approaches if they can be shown to reproduce empirical data. Here, we evaluated the capability of one such mechanistic model to reproduce measured PCB concentrations in individual Norwegian women. We also assessed individual life-course concentrations. Concentrations of four PCB congeners in pregnant (n = 310, sampled in 2007-2009) and postmenopausal (n = 244, 2005) women were compared with person-specific predictions obtained using CoZMoMAN, an emission-based environmental fate and human food-chain bioaccumulation model. Person-specific predictions were also made using statistical regression models including dietary and lifestyle variables and concentrations. CoZMoMAN accurately reproduced medians and ranges of measured concentrations in the two study groups. Furthermore, rank correlations between measurements and predictions from both CoZMoMAN and regression analyses were strong (Spearman's r > 0.67). Precision in quartile assignments from predictions was strong overall as evaluated by weighted Cohen's kappa (> 0.6). Simulations indicated large inter-individual differences in concentrations experienced in the past. The mechanistic model reproduced all measurements of PCB concentrations within a factor of 10, and subject ranking and quartile assignments were overall largely consistent, although they were weak within each study group. Contamination histories for individuals predicted by CoZMoMAN revealed variation between study subjects, particularly in the timing of peak concentrations. Mechanistic models can provide individual PCB exposure metrics that could serve as valuable supplements to measurements.
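The evaluation metrics named above, rank correlation and weighted Cohen's kappa on quartile assignments, can be reproduced on synthetic data. The sketch below bins values into within-sample quartiles and computes a linearly weighted kappa; the "measured" and "predicted" series are hypothetical stand-ins for the paired PCB concentrations.

```python
import numpy as np

def quartile_bins(x):
    """Assign each value to its quartile (0-3) within the sample."""
    ranks = np.argsort(np.argsort(x))
    return (4 * ranks // len(x)).clip(0, 3)

def weighted_kappa(a, b, k=4):
    """Cohen's kappa with linear weights for ordinal categories 0..k-1."""
    a, b = np.asarray(a), np.asarray(b)
    obs = np.zeros((k, k))
    for i, j in zip(a, b):
        obs[i, j] += 1
    obs /= obs.sum()
    exp = np.outer(obs.sum(1), obs.sum(0))      # chance-agreement table
    w = 1.0 - np.abs(np.subtract.outer(np.arange(k), np.arange(k))) / (k - 1)
    return float((np.sum(w * obs) - np.sum(w * exp)) / (1.0 - np.sum(w * exp)))

# Hypothetical measured vs model-predicted concentrations:
# the model reproduces measurements within roughly a factor of 2
rng = np.random.default_rng(2)
measured = rng.lognormal(0, 1, 200)
predicted = measured * rng.lognormal(0, 0.3, 200)
kappa = weighted_kappa(quartile_bins(measured), quartile_bins(predicted))
print(round(kappa, 2))
```

Weighted kappa credits near-miss quartile assignments, which suits exposure models whose purpose is ranking subjects rather than reproducing concentrations exactly.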
Rosenström, Tom; Fawcett, Tim W.; Higginson, Andrew D.; Metsä-Simola, Niina; Hagen, Edward H.; Houston, Alasdair I.; Martikainen, Pekka
2017-01-01
Divorce is associated with an increased probability of a depressive episode, but the causation of events remains unclear. Adaptive models of depression propose that depression is a social strategy in part, whereas non-adaptive models tend to propose a diathesis-stress mechanism. We compare an adaptive evolutionary model of depression to three alternative non-adaptive models with respect to their ability to explain the temporal pattern of depression around the time of divorce. Register-based data (304,112 individuals drawn from a random sample of 11% of Finnish people) on antidepressant purchases is used as a proxy for depression. This proxy affords an unprecedented temporal resolution (a 3-monthly prevalence estimates over 10 years) without any bias from non-compliance, and it can be linked with underlying episodes via a statistical model. The evolutionary-adaptation model (all time periods with risk of divorce are depressogenic) was the best quantitative description of the data. The non-adaptive stress-relief model (period before divorce is depressogenic and period afterwards is not) provided the second best quantitative description of the data. The peak-stress model (periods before and after divorce can be depressogenic) fit the data less well, and the stress-induction model (period following divorce is depressogenic and the preceding period is not) did not fit the data at all. The evolutionary model was the most detailed mechanistic description of the divorce-depression link among the models, and the best fit in terms of predicted curvature; thus, it offers most rigorous hypotheses for further study. The stress-relief model also fit very well and was the best model in a sensitivity analysis, encouraging development of more mechanistic models for that hypothesis. PMID:28614385
Pradeep, Prachi; Povinelli, Richard J; Merrill, Stephen J; Bozdag, Serdar; Sem, Daniel S
2015-04-01
The availability of large in vitro datasets enables better insight into the mode of action of chemicals and better identification of potential mechanism(s) of toxicity. Several studies have shown that not all in vitro assays can contribute as equal predictors of in vivo carcinogenicity for development of hybrid Quantitative Structure Activity Relationship (QSAR) models. We propose two novel approaches for the use of mechanistically relevant in vitro assay data in the identification of relevant biological descriptors and development of Quantitative Biological Activity Relationship (QBAR) models for carcinogenicity prediction. We demonstrate that in vitro assay data can be used to develop QBAR models for in vivo carcinogenicity prediction via two case studies corroborated with firm scientific rationale. The case studies demonstrate the similarities between QBAR and QSAR modeling in: (i) the selection of relevant descriptors to be used in the machine learning algorithm, and (ii) the development of a computational model that maps chemical or biological descriptors to a toxic endpoint. The results of both case studies show: (i) improved accuracy and sensitivity, which are especially desirable under regulatory requirements, and (ii) overall adherence with the OECD/REACH guidelines. Such mechanism-based models can be used along with QSAR models for prediction of mechanistically complex toxic endpoints. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Smith, P. J.; Beven, K.; Panziera, L.
2012-04-01
The issuing of timely flood alerts may be dependent upon the ability to predict future values of water level or discharge at locations where observations are available. Catchments at risk of flash flooding often have a rapid natural response time, typically less than the forecast lead time desired for issuing alerts. This work focuses on the provision of short-range (up to 6 hours lead time) predictions of discharge in small catchments based on utilising radar forecasts to drive a hydrological model. An example analysis based upon the Verzasca catchment (Ticino, Switzerland) is presented. Parsimonious time series models with a mechanistic interpretation (so-called Data-Based Mechanistic models) have been shown to provide reliable, accurate forecasts in many hydrological situations. In this study such a model is developed to predict the discharge at an observed location from observed precipitation data. The model is shown to capture the snow melt response at this site. Observed discharge data are assimilated to improve the forecasts of up to two hours' lead time that can be generated from observed precipitation. To generate forecasts with greater lead time, ensemble precipitation forecasts are utilised. In this study the Nowcasting ORographic precipitation in the Alps (NORA) product outlined in more detail elsewhere (Panziera et al. Q. J. R. Meteorol. Soc. 2011; DOI:10.1002/qj.878) is utilised. NORA precipitation forecasts are derived from historical analogues based on the radar field and upper atmospheric conditions. As such, they avoid the need to explicitly model the evolution of the rainfall field through, for example, Lagrangian diffusion. The uncertainty in the forecasts is represented by characterisation of the joint distribution of the observed discharge, the discharge forecast using the (in operational conditions unknown) future observed precipitation and that forecast utilising the NORA ensembles.
Constructing the joint distribution in this way allows the full historic record of data at the site to inform the predictive distribution. It is shown that, in part due to the limited availability of forecasts, the uncertainty in the relationship between the NORA based forecasts and other variates dominated the resulting predictive uncertainty.
Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V
2017-03-01
A mechanistic model-based soft sensor is developed and validated for 550L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor comprises a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves the current states of biomass, product, substrate, dissolved oxygen and mass, as well as other process parameters including kLa, viscosity and partial pressure of CO2. State estimation at this scale requires a robust mass model including evaporation, which is a factor not often considered at smaller scales of operation. The model is developed using a historical data set of 11 batches from the fermentation pilot plant (550L) at Novozymes A/S. The model is then implemented on-line in 550L fermentation processes operated at Novozymes A/S in order to validate the state estimator model on 14 new batches utilizing a new strain. The product concentration in the validation batches was predicted with an average root mean sum of squared error (RMSSE) of 16.6%. In addition, calculation of the Janus coefficient for the validation batches shows a suitably calibrated model. The robustness of the model prediction is assessed with respect to the accuracy of the input data. A parameter estimation uncertainty analysis is also carried out. The application of this on-line state estimator allows for on-line monitoring of pilot scale batches, including real-time estimates of multiple parameters that cannot otherwise be monitored on-line. Successful application of a soft sensor at this scale allows for improved process monitoring and opens up further possibilities for on-line control algorithms utilizing these on-line model outputs.
Biotechnol. Bioeng. 2017;114: 589-599. © 2016 Wiley Periodicals, Inc.
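The reported prediction error can be sketched as follows. The exact RMSSE normalization used in the study is not given in the abstract, so expressing the root-mean-squared error as a percentage of the mean observed value is an assumption made for illustration:

```python
import math

def rmsse_percent(observed, predicted):
    """Root-mean-squared error as a percentage of the mean observed value.
    This normalization is an assumption; the study's exact RMSSE
    definition is not stated in the abstract."""
    n = len(observed)
    rms = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    mean_obs = sum(observed) / n
    return 100.0 * rms / mean_obs
```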
Wang, Yi; Lee, Sui Mae; Dykes, Gary
2015-01-01
Bacterial attachment to abiotic surfaces can be explained as a physicochemical process. Mechanisms of the process have been widely studied but are not yet well understood due to their complexity. Physicochemical processes can be influenced by various interactions and factors in attachment systems, including, but not limited to, hydrophobic interactions, electrostatic interactions and substratum surface roughness. Mechanistic models and control strategies for bacterial attachment to abiotic surfaces have been established based on the current understanding of the attachment process and the interactions involved. Due to a lack of process control and standardization in the methodologies used to study the mechanisms of bacterial attachment, however, various challenges are apparent in the development of models and control strategies. In this review, the physicochemical mechanisms, interactions and factors affecting the process of bacterial attachment to abiotic surfaces are described. Mechanistic models established based on these parameters are discussed in terms of their limitations. Currently employed methods to study these parameters and bacterial attachment are critically compared. The roles of these parameters in the development of control strategies for bacterial attachment are reviewed, and the challenges that arise in developing mechanistic models and control strategies are assessed.
Pharmacometric Models for Characterizing the Pharmacokinetics of Orally Inhaled Drugs.
Borghardt, Jens Markus; Weber, Benjamin; Staab, Alexander; Kloft, Charlotte
2015-07-01
During the last decades, the importance of modeling and simulation in clinical drug development, with the goal to qualitatively and quantitatively assess and understand mechanisms of pharmacokinetic processes, has strongly increased. However, this increase could not equally be observed for orally inhaled drugs. The objectives of this review are to understand the reasons for this gap and to demonstrate the opportunities that mathematical modeling of pharmacokinetics of orally inhaled drugs offers. To achieve these objectives, this review (i) discusses pulmonary physiological processes and their impact on the pharmacokinetics after drug inhalation, (ii) provides a comprehensive overview of published pharmacokinetic models, (iii) categorizes these models into physiologically based pharmacokinetic (PBPK) and (clinical data-derived) empirical models, (iv) explores their (mechanistic) plausibility, and (v) addresses critical aspects of different pharmacometric approaches pertinent for drug inhalation. In summary, pulmonary deposition, dissolution, and absorption are highly complex processes and may represent the major challenge for modeling and simulation of PK after oral drug inhalation. Challenges in relating systemic pharmacokinetics with pulmonary efficacy may be another factor contributing to the limited number of existing pharmacokinetic models for orally inhaled drugs. Investigations comprising in vitro experiments, clinical studies, and more sophisticated mathematical approaches are considered to be necessary for elucidating these highly complex pulmonary processes. With this additional knowledge, the PBPK approach might gain additional attractiveness. Currently, (semi-)mechanistic modeling offers an alternative to generate and investigate hypotheses and to more mechanistically understand the pulmonary and systemic pharmacokinetics after oral drug inhalation including the impact of pulmonary diseases.
Dwivedi, Dipankar; Mohanty, Binayak P.; Lesikar, Bruce J.
2013-01-01
Microbes have been identified as a major contaminant of water resources. Escherichia coli (E. coli) is a commonly used indicator organism. It is well recognized that the fate of E. coli in surface water systems is governed by multiple physical, chemical, and biological factors. The aim of this work is to provide insight into the physical, chemical, and biological factors, along with their interactions, that are critical in the estimation of E. coli loads in surface streams. There are various models to predict E. coli loads in streams, but they tend to be system or site specific or overly complex without enhancing our understanding of these factors. Hence, based on available data, a Bayesian Neural Network (BNN) is presented for estimating E. coli loads based on physical, chemical, and biological factors in streams. The BNN has the dual advantage of overcoming the absence of quality data (with regard to consistency in data) and determination of mechanistic model parameters by employing a probabilistic framework. This study evaluates whether the BNN model can be an effective alternative tool to mechanistic models for E. coli loads estimation in streams. For this purpose, a comparison with a traditional model (LOADEST, USGS) is conducted. The models are compared for estimated E. coli loads based on available water quality data in Plum Creek, Texas. All the model efficiency measures suggest that overall E. coli loads estimations by the BNN model are better than the E. coli loads estimations by the LOADEST model on all three occasions (three-fold cross validation). Thirteen factors were used for estimating E. coli loads with the exhaustive feature selection technique, which indicated that six of the thirteen factors are important for estimating E. coli loads. Physical factors included temperature and dissolved oxygen; chemical factors included phosphate and ammonia; biological factors included suspended solids and chlorophyll.
The results highlight that the LOADEST model estimates E. coli loads better in the smaller ranges, whereas the BNN model estimates E. coli loads better in the higher ranges. Hence, the BNN model can be used to design targeted monitoring programs and implement regulatory standards through TMDL programs. PMID:24511166
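LOADEST's simplest rating-curve form regresses log load on log discharge. A minimal sketch of that model class follows; the production LOADEST adds retransformation bias corrections and several richer model forms, and the data used here are purely illustrative:

```python
import math

def fit_log_linear(q, load):
    """Ordinary least squares fit of ln(load) = a0 + a1*ln(Q),
    the simplest LOADEST-style rating curve (its Model 1 form)."""
    x = [math.log(v) for v in q]
    y = [math.log(v) for v in load]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    a0 = my - a1 * mx
    return a0, a1

def predict_load(a0, a1, q):
    """Back-transform to the load scale (no bias correction applied)."""
    return math.exp(a0 + a1 * math.log(q))
```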
Kavlock, R J
1997-01-01
During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them have pointed to the need for even more sophisticated modifications, for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade, combined with advances in computing power and computational models, should eventually enable these as yet hypothetical models to be brought into use.
Kawamura, Takahisa; Kasai, Hidefumi; Fermanelli, Valentina; Takahashi, Toshiaki; Sakata, Yukinori; Matsuoka, Toshiyuki; Ishii, Mika; Tanigawara, Yusuke
2018-06-22
Post-marketing surveillance is useful for collecting safety data in real-world clinical settings. In this study, we applied, for the first time, post-marketing real-world data to a mechanistic model analysis of the neutropenic profiles of eribulin in patients with recurrent or metastatic breast cancer (RBC/MBC). Demographic and safety data were collected using an active surveillance method from eribulin-treated RBC/MBC patients. Changes in neutrophil counts over time were analyzed using a mechanistic pharmacodynamic model. Pathophysiological factors that may affect the severity of neutropenia were investigated, and neutropenic patterns were simulated for different treatment schedules. Clinical and laboratory data were collected from 401 patients (5199 neutrophil count measurements) who had not received granulocyte colony stimulating factor and were eligible for pharmacodynamic analysis. The estimated mean parameters were: mean transit time = 104.5 h, neutrophil proliferation rate constant = 0.0377 h⁻¹, neutrophil elimination rate constant = 0.0295 h⁻¹, and linear coefficient of drug effect = 0.0413 mL/ng. Low serum albumin levels and low baseline neutrophil counts were associated with severe neutropenia. The probability of grade ≥3 neutropenia was predicted to be 69%, 27%, and 27% for patients on standard, biweekly, and triweekly treatment scenarios, respectively, based on virtual simulations using the developed pharmacodynamic model. In conclusion, this is the first application of post-marketing surveillance data to a model-based safety analysis. This analysis of safety data reflecting authentic clinical settings will provide useful information on the safe use and potential risk factors of eribulin. This article is protected by copyright. All rights reserved.
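The transit-compartment structure typical of such neutropenia models (a Friberg-type chain) can be sketched using the reported parameter estimates. The baseline count, feedback exponent, number of transit compartments, and dosing profile below are illustrative assumptions, not reported values, and the transit rate is tied to the proliferation rate so the drug-free system starts exactly at steady state (the reported mean transit time of 104.5 h would imply ktr = 4/104.5 ≈ 0.038 h⁻¹, close to the reported 0.0377 h⁻¹):

```python
KPROL = 0.0377   # proliferation rate constant (1/h) -- reported
KCIRC = 0.0295   # circulating-cell elimination rate (1/h) -- reported
SLOPE = 0.0413   # linear drug-effect coefficient (mL/ng) -- reported
GAMMA = 0.17     # rebound-feedback exponent -- assumed, not reported
CIRC0 = 3.0      # baseline neutrophil count (10^9/L) -- assumed
KTR = KPROL      # transit rate tied to kprol so baseline is a steady state

def simulate(conc, t_end, dt=0.1):
    """Euler-integrate a 3-transit-compartment chain; conc(t) gives the
    drug concentration (ng/mL). Returns (final, nadir) circulating counts."""
    t3 = KCIRC * CIRC0 / KTR          # steady-state transit amounts
    prol = t1 = t2 = t3
    circ = nadir = CIRC0
    t = 0.0
    while t < t_end:
        e_drug = min(SLOPE * conc(t), 1.0)        # fractional kill effect
        feedback = (CIRC0 / circ) ** GAMMA        # rebound feedback
        dprol = KPROL * prol * (1 - e_drug) * feedback - KTR * prol
        d1 = KTR * (prol - t1)
        d2 = KTR * (t1 - t2)
        d3 = KTR * (t2 - t3)
        dcirc = KTR * t3 - KCIRC * circ
        prol += dprol * dt; t1 += d1 * dt; t2 += d2 * dt
        t3 += d3 * dt; circ += dcirc * dt
        nadir = min(nadir, circ)
        t += dt
    return circ, nadir
```

With no drug the system stays at baseline; a hypothetical 24 h exposure pulse produces the delayed nadir characteristic of this model class.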
Body Fineness Ratio as a Predictor of Maximum Prolonged-Swimming Speed in Coral Reef Fishes
Walker, Jeffrey A.; Alfaro, Michael E.; Noble, Mae M.; Fulton, Christopher J.
2013-01-01
The ability to sustain high swimming speeds is believed to be an important factor affecting resource acquisition in fishes. While we have gained insights into how fin morphology and motion influences swimming performance in coral reef fishes, the role of other traits, such as body shape, remains poorly understood. We explore the ability of two mechanistic models of the causal relationship between body fineness ratio and endurance swimming performance to predict maximum prolonged-swimming speed (Umax) among 84 fish species from the Great Barrier Reef, Australia. A drag model, based on semi-empirical data on the drag of rigid, submerged bodies of revolution, was applied to species that employ pectoral-fin propulsion with a rigid body at Umax. An alternative model, based on the results of computer simulations of optimal shape in self-propelled undulating bodies, was applied to the species that swim by body-caudal-fin propulsion at Umax. For pectoral-fin swimmers, Umax increased with fineness, and the rate of increase decreased with fineness, as predicted by the drag model. While the mechanistic and statistical models of the relationship between fineness and Umax were very similar, the mechanistic (and statistical) model explained only a small fraction of the variance in Umax. For body-caudal-fin swimmers, we found a non-linear relationship between fineness and Umax, which was largely negative over most of the range of fineness. This pattern fails to support either predictions from the computational models or standard functional interpretations of body shape variation in fishes. Our results suggest that the widespread hypothesis that a more optimal fineness increases endurance-swimming performance via reduced drag should be limited to fishes that swim with rigid bodies. PMID:24204575
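The rigid-body drag model draws on semi-empirical drag data for bodies of revolution. One common Hoerner-type form factor is shown here purely to illustrate how drag falls as fineness ratio increases; this exact expression is an assumption and not necessarily the one used in the study:

```python
def hoerner_form_factor(fr):
    """Semi-empirical drag form factor for a streamlined body of
    revolution as a function of fineness ratio fr = length/diameter
    (Hoerner-type expression, shown for illustration only)."""
    return 1.0 + 1.5 / fr ** 1.5 + 7.0 / fr ** 3
```

The factor approaches 1 (flat-plate friction drag only) as the body becomes more slender, which is the qualitative behavior the drag model predicts for pectoral-fin swimmers.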
Mechanistic models versus machine learning, a fight worth fighting for the biological community?
Baker, Ruth E; Peña, Jose-Maria; Jayamohan, Jayaratnam; Jérusalem, Antoine
2018-05-01
Ninety per cent of the world's data have been generated in the last 5 years (Machine learning: the power and promise of computers that learn by example. Report no. DES4702, Royal Society, April 2017). A small fraction of these data is collected with the aim of validating specific hypotheses. These studies are led by the development of mechanistic models focused on the causality of input-output relationships. However, the vast majority is aimed at supporting statistical or correlation studies that bypass the need for causality and focus exclusively on prediction. Along these lines, there has been a vast increase in the use of machine learning models, in particular in the biomedical and clinical sciences, to try and keep pace with the rate of data generation. Recent successes now beg the question of whether mechanistic models are still relevant in this area. Put differently, why should we try to understand the mechanisms of disease progression when we can use machine learning tools to directly predict disease outcome? © 2018 The Author(s).
Dynamic Modeling of Cell-Free Biochemical Networks Using Effective Kinetic Models
2015-03-03
whether we could simultaneously estimate kinetic parameters and regulatory connectivity, in the absence of specific mechanistic knowledge, from synthetic...that manage metabolism. Of course, these issues are not independent; any description of enzyme activity regulation will be a function of system state...the absence of specific mechanistic knowledge, from synthetic experimental data. Toward these questions, we explored five hypothetical cell-free
A model for life predictions of nickel-base superalloys in high-temperature low cycle fatigue
NASA Technical Reports Server (NTRS)
Romanoski, Glenn R.; Pelloux, Regis M.; Antolovich, Stephen D.
1988-01-01
Extensive characterization of low-cycle fatigue damage mechanisms was performed on polycrystalline Rene 80 and IN100 tested in the temperature range from 871 to 1000 C. Low-cycle fatigue life was found to be dominated by propagation of microcracks to a critical size governed by the maximum tensile stress. A model was developed which incorporates a threshold stress for crack extension, a stress-based crack growth expression, and a failure criterion. The mathematical equivalence between this mechanistically based model and the strain-life low-cycle fatigue law was demonstrated using cyclic stress-strain relationships. The model was shown to correlate the high-temperature low-cycle fatigue data of the different nickel-base superalloys considered in this study.
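The model described above combines a threshold stress, a stress-based microcrack growth law, and a critical-size failure criterion. A hedged sketch of that structure follows; the separable growth law and all constants below are illustrative assumptions, not the published formulation:

```python
import math

def cycles_to_failure(sigma_max, sigma_th=200.0, a0=10e-6, a_crit=500e-6,
                      C=1e-9, m=2.0):
    """Illustrative stress-based microcrack growth law with a threshold:
    da/dN = C * (sigma_max - sigma_th)^m * a  for sigma_max > sigma_th.
    Stresses in MPa, crack lengths in m; all constants are hypothetical.
    Integrating the separable ODE from a0 to a_crit gives
    N_f = ln(a_crit / a0) / (C * (sigma_max - sigma_th)^m)."""
    if sigma_max <= sigma_th:
        return math.inf  # below the threshold stress: no crack extension
    return math.log(a_crit / a0) / (C * (sigma_max - sigma_th) ** m)
```

Taking logs of this expression yields a straight line of life against (sigma_max − sigma_th), which is how such a model can be made mathematically equivalent to a strain-life law via the cyclic stress-strain curve.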
Rougier, Thibaud; Lassalle, Géraldine; Drouineau, Hilaire; Dumoulin, Nicolas; Faure, Thierry; Deffuant, Guillaume; Rochard, Eric; Lambert, Patrick
2015-01-01
Species can respond to climate change by tracking appropriate environmental conditions in space, resulting in a range shift. Species Distribution Models (SDMs) can help forecast such range shift responses. Both correlative and mechanistic SDMs have been built for only a few species; allis shad (Alosa alosa), an endangered anadromous fish species, is one of them. The main purpose of this study was to provide a framework for joint analyses of correlative and mechanistic SDM projections in order to strengthen conservation measures for species of conservation concern. Guidelines for joint representation and subsequent interpretation of model outputs were defined and applied. The present joint analysis was based on the novel mechanistic model GR3D (Global Repositioning Dynamics of Diadromous fish Distribution), which was parameterized on allis shad and then used to predict its future distribution along the European Atlantic coast under different climate change scenarios (RCP 4.5 and RCP 8.5). We then used a correlative SDM for this species to forecast its distribution across the same geographic area and under the same climate change scenarios. First, projections from correlative and mechanistic models provided congruent trends in probability of habitat suitability and population dynamics. This agreement was preferentially interpreted as referring to the species' vulnerability to climate change. Climate change could accordingly not be listed as a major threat for allis shad. The congruence in predicted range limits between SDM projections was the next point of interest. Where the projections differed, the differences required us to deepen our understanding of the niche modelled by each approach. In this respect, the relative position of the northern range limit between the two methods strongly suggested that a key biological process related to intraspecific variability was potentially lacking in the mechanistic SDM.
Based on our knowledge, we hypothesized that local adaptations to cold temperatures deserved more attention, not only in modelling but also in conservation planning.
PMID:26426280
Mechanistic equivalent circuit modelling of a commercial polymer electrolyte membrane fuel cell
NASA Astrophysics Data System (ADS)
Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.
2018-03-01
Electrochemical impedance spectroscopy (EIS) has been widely used in the fuel cell field since it allows deconvolving the different physico-chemical processes that affect fuel cell performance. Typically, EIS spectra are modelled using electric equivalent circuits. In this work, EIS spectra of an individual cell of a commercial PEM fuel cell stack were obtained experimentally. The goal was to obtain a mechanistic electric equivalent circuit in order to model the experimental EIS spectra. A mechanistic electric equivalent circuit is a semiempirical modelling technique based on obtaining an equivalent circuit that not only correctly fits the experimental spectra, but whose elements have a mechanistic physical meaning. In order to obtain this electric equivalent circuit, 12 different models with defined physical meanings were proposed. These equivalent circuits were fitted to the obtained EIS spectra. A two-step selection process was performed. In the first step, a group of 4 circuits was preselected out of the initial list of 12, based on general fitting indicators such as the determination coefficient and the fitted parameter uncertainty. In the second step, one of the 4 preselected circuits was selected on account of the consistency of the fitted parameter values with the physical meaning of each parameter.
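Equivalent-circuit candidates of the kind screened above are evaluated by computing their complex impedance across frequency and fitting it to the measured spectrum. As an illustration of one simple member of that family (a Randles-type circuit; this is not necessarily the circuit selected in the study, and the parameter values are hypothetical):

```python
import math

def randles_impedance(freq_hz, r_ohm=0.01, r_ct=0.05, c_dl=0.1):
    """Complex impedance of a Randles-type circuit: series ohmic
    resistance r_ohm plus charge-transfer resistance r_ct in parallel
    with a double-layer capacitance c_dl. Units: ohm, ohm, farad.
    All parameter values are illustrative, not fitted values."""
    w = 2 * math.pi * freq_hz
    z_c = 1 / (1j * w * c_dl)               # capacitor impedance
    z_par = (r_ct * z_c) / (r_ct + z_c)     # R_ct || C_dl
    return r_ohm + z_par
```

At high frequency the capacitor shorts R_ct and the real part tends to r_ohm; at low frequency it tends to r_ohm + r_ct, the two intercepts of the familiar Nyquist semicircle.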
The US EPA ToxCast program aims to develop methods for mechanistically-based chemical prioritization using a suite of high throughput, in vitro assays that probe relevant biological pathways, and coupling them with statistical and machine learning methods that produce predictive ...
Mechanistic-empirical evaluation of the Mn/ROAD low volume road test sections.
DOT National Transportation Integrated Search
1998-05-01
The purpose of this study was to use Mn/ROAD mainline flexible pavement data to verify, refine, and modify the Illinois Department of Transportation (IDOT) Mechanistic-Empirical (M-E) based flexible pavement design procedures and concepts.
Minimum area requirements for an at-risk butterfly based on movement and demography.
Brown, Leone M; Crone, Elizabeth E
2016-02-01
Determining the minimum area required to sustain populations has a long history in theoretical and conservation biology. Correlative approaches are often used to estimate minimum area requirements (MARs) based on relationships between area and the population size required for persistence or between species' traits and distribution patterns across landscapes. Mechanistic approaches to estimating MAR facilitate prediction across space and time but are few. We used a mechanistic MAR model to determine the critical minimum patch size (CMP) for the Baltimore checkerspot butterfly (Euphydryas phaeton), a locally abundant species in decline along its southern range, and sister to several federally listed species. Our CMP is based on principles of diffusion, where individuals in smaller patches encounter edges and leave with higher probability than those in larger patches, potentially before reproducing. We estimated a CMP for the Baltimore checkerspot of 0.7-1.5 ha, in accordance with trait-based MAR estimates. The diffusion rate on which we based this CMP was broadly similar when estimated at the landscape scale (comparing flight path vs. capture-mark-recapture data), and the estimated population growth rate was consistent with observed site trends. Our mechanistic approach to estimating MAR is appropriate for species whose movement follows a correlated random walk and may be useful where landscape-scale distributions are difficult to assess, but demographic and movement data are obtainable from a single site or the literature. Just as simple estimates of lambda are often used to assess population viability, the principles of diffusion and CMP could provide a starting place for estimating MAR for conservation. © 2015 Society for Conservation Biology.
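The diffusion argument above has a classical closed form: for a one-dimensional patch with absorbing edges, a population diffusing at rate D and growing at intrinsic rate r persists only if the patch width exceeds the KISS scale, L_crit = π·sqrt(D/r). A sketch with hypothetical parameter values (not the study's estimates for the Baltimore checkerspot):

```python
import math

def critical_patch_width(D, r):
    """KISS critical width L_crit = pi * sqrt(D / r) for a 1-D patch
    with absorbing boundaries. D: diffusion coefficient (m^2/day),
    r: intrinsic growth rate (1/day). Values passed in are hypothetical."""
    return math.pi * math.sqrt(D / r)

def critical_area_ha(D, r):
    """Critical area of a square patch of that width, in hectares."""
    w = critical_patch_width(D, r)
    return w * w / 10_000.0
```

For example, D = 50 m²/day and r = 0.05/day (illustrative numbers) give a critical width near 99 m, i.e. a square patch of roughly 1 ha, the same order of magnitude as the 0.7-1.5 ha estimate reported above.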
Eric J. Gustafson
2013-01-01
Researchers and natural resource managers need predictions of how multiple global changes (e.g., climate change, rising levels of air pollutants, exotic invasions) will affect landscape composition and ecosystem function. Ecological predictive models used for this purpose are constructed using either a mechanistic (process-based) or a phenomenological (empirical)...
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large‐scale morphogenesis that match published data in the limb regeneration field. Major barriers preventing an algorithmic approach are the lack of formal descriptions for experimental regenerative information and of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user‐friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585
Hauschild, L; Lovatto, P A; Pomar, J; Pomar, C
2012-07-01
The objective of this study was to develop and evaluate a mathematical model used to estimate the daily amino acid requirements of individual growing-finishing pigs. The model includes empirical and mechanistic model components. The empirical component estimates daily feed intake (DFI), BW, and daily gain (DG) based on individual pig information collected in real time. Based on DFI, BW, and DG estimates, the mechanistic component uses classic factorial equations to estimate the optimal concentration of amino acids that must be offered to each pig to meet its requirements. The model was evaluated with data from a study that investigated the effect of feeding pigs with a 3-phase or daily multiphase system. The DFI and BW values measured in this study were compared with those estimated by the empirical component of the model. The coherence of the values estimated by the mechanistic component was evaluated by analyzing whether it followed a normal pattern of requirements. Lastly, the proposed model was evaluated by comparing its estimates with those generated by an existing growth model (InraPorc). The precision of the proposed model and InraPorc in estimating DFI and BW was evaluated through the mean absolute error. The empirical component results indicated that the DFI and BW trajectories of individual pigs fed ad libitum could be predicted 1 d (DFI) or 7 d (BW) ahead with an average mean absolute error of 12.45 and 1.85%, respectively. The average mean absolute error obtained with InraPorc for the average individual of the population was 14.72% for DFI and 5.38% for BW. Major differences were observed when estimates from InraPorc were compared with individual observations. The proposed model, however, was effective in tracking the change in DFI and BW for each individual pig. The mechanistic model component estimated the optimal standardized ileal digestible Lys to NE ratio with reasonable between-animal (average CV = 7%) and over-time (average CV = 14%) variation.
Thus, the amino acid requirements estimated by the model are animal- and time-dependent and follow, in real time, the individual DFI and BW growth patterns. The proposed model can follow the feed intake and BW trajectory of each individual pig in real time with good accuracy. Based on these trajectories and using classical factorial equations, the model makes it possible to estimate the AA requirements of each animal dynamically, taking into account the animal's intake and growth changes.
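As a rough illustration of the precision metric used above, the mean absolute error can be expressed as a percentage of each observed value and averaged; the numbers below are invented for the sketch, not taken from the study:

```python
def mean_absolute_error_pct(observed, predicted):
    """Mean absolute error, expressed as a percentage of the observed values."""
    if len(observed) != len(predicted) or not observed:
        raise ValueError("inputs must be equal-length, non-empty sequences")
    return 100.0 * sum(abs(o - p) / o for o, p in zip(observed, predicted)) / len(observed)

# Hypothetical daily feed intake (kg/d): observed vs. 1-day-ahead predictions
observed_dfi = [2.10, 2.25, 2.40, 2.35]
predicted_dfi = [2.00, 2.30, 2.30, 2.45]
err = mean_absolute_error_pct(observed_dfi, predicted_dfi)
```

The same function applies unchanged to BW trajectories predicted 7 d ahead.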
Xu, Dake; Li, Yingchao; Gu, Tingyue
2016-08-01
Biocorrosion is also known as microbiologically influenced corrosion (MIC). Most anaerobic MIC cases can be classified into two major types. Type I MIC involves non-oxygen oxidants such as sulfate and nitrate that require biocatalysis for their reduction in the cytoplasm of microbes such as sulfate reducing bacteria (SRB) and nitrate reducing bacteria (NRB). This means that the extracellular electrons from the oxidation of metal such as iron must be transported across cell walls into the cytoplasm. Type II MIC involves oxidants such as protons that are secreted by microbes such as acid producing bacteria (APB). The biofilms in this case supply the locally high concentrations of oxidants that are corrosive without biocatalysis. This work describes a mechanistic model that is based on the biocatalytic cathodic sulfate reduction (BCSR) theory. The model utilizes charge transfer and mass transfer concepts to describe the SRB biocorrosion process. The model also includes a mechanism to describe APB attack based on the local acidic pH at a pit bottom. A pitting prediction software package has been created based on the mechanisms. It predicts long-term pitting rates and worst-case scenarios after calibration using SRB short-term pit depth data. Various parameters can be investigated through computer simulation.
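The combination of charge transfer and mass transfer can be sketched with a standard mixed-control expression (a Tafel-type cathodic branch combined in Koutecky-Levich fashion). This is a generic electrochemistry sketch, not the BCSR package's actual implementation, and all numerical values are hypothetical:

```python
import math

def charge_transfer_current(i0, eta, beta=0.5, T=298.15):
    """Tafel-type cathodic charge-transfer current density (A/m^2)
    from a Butler-Volmer branch, for exchange current i0 and overpotential eta (V)."""
    F, R = 96485.0, 8.314  # Faraday constant, gas constant
    return i0 * math.exp(beta * F * eta / (R * T))

def mixed_control_current(i_ct, i_mt):
    """Mixed kinetic/mass-transfer control: 1/i = 1/i_ct + 1/i_mt."""
    return 1.0 / (1.0 / i_ct + 1.0 / i_mt)

# Hypothetical numbers: exchange current 1e-3 A/m^2, 0.1 V cathodic overpotential,
# mass-transfer limiting current 0.5 A/m^2 (e.g. sulfate diffusion through a biofilm)
i_ct = charge_transfer_current(1e-3, 0.1)
i = mixed_control_current(i_ct, 0.5)
```

The smaller of the two branch currents dominates, which is how a thick biofilm (low i_mt) can cap the pitting rate regardless of cathodic kinetics.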
The use and misuse of V(c,max) in Earth System Models.
Rogers, Alistair
2014-02-01
Earth System Models (ESMs) aim to project global change. Central to this aim is the need to accurately model global carbon fluxes. Photosynthetic carbon dioxide assimilation by the terrestrial biosphere is the largest of these fluxes, and in many ESMs is represented by the Farquhar, von Caemmerer and Berry (FvCB) model of photosynthesis. The maximum rate of carboxylation by the enzyme Rubisco, commonly termed Vc,max, is a key parameter in the FvCB model. This study investigated the derivation of the values of Vc,max used to represent different plant functional types (PFTs) in ESMs. Four methods for estimating Vc,max were identified: (1) an empirical or (2) a mechanistic relationship was used to relate Vc,max to leaf N content, (3) Vc,max was estimated using an approach based on the optimization of photosynthesis and respiration, or (4) a user-defined Vc,max was calibrated to obtain a target model output. Despite representing the same PFTs, the land model components of ESMs were parameterized with a wide range of values for Vc,max (-46 to +77% of the PFT mean). In many cases, parameterization was based on limited data sets and poorly defined coefficients that were used to adjust model parameters and set PFT-specific values for Vc,max. Examination of the models that linked leaf N mechanistically to Vc,max identified potential changes to fixed parameters that collectively would decrease Vc,max by 31% in C3 plants and 11% in C4 plants. Plant trait data bases are now available that offer an excellent opportunity to update the PFT-specific parameters used to estimate Vc,max. However, data for parameterizing some PFTs, particularly those in the Tropics and the Arctic, are either highly variable or largely absent.
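To see where Vc,max enters, the Rubisco-limited branch of the FvCB model can be sketched directly. The kinetic constants below are commonly cited 25 degC values (illustrative defaults, not the parameterization of any particular ESM):

```python
def rubisco_limited_assimilation(vcmax, ci, gamma_star=42.75,
                                 kc=404.9, ko=278.4, o=210.0, rd=1.0):
    """Rubisco-limited net CO2 assimilation A_c (umol m^-2 s^-1) in the FvCB model:
    A_c = Vc,max * (Ci - Gamma*) / (Ci + Kc * (1 + O/Ko)) - Rd.
    Ci, Gamma*, Kc in umol/mol; O, Ko in mmol/mol; Rd is day respiration."""
    return vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o / ko)) - rd

# Example: Vc,max = 50 umol m^-2 s^-1 at an intercellular CO2 of 250 umol/mol
a_c = rubisco_limited_assimilation(vcmax=50.0, ci=250.0)
```

Because A_c scales linearly with Vc,max at fixed Ci, the reported -46 to +77% spread in PFT parameterizations translates almost directly into the spread of modeled assimilation.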
Model for estimating enteric methane emissions from United States dairy and feedlot cattle.
Kebreab, E; Johnson, K A; Archibeque, S L; Pape, D; Wirth, T
2008-10-01
Methane production from enteric fermentation in cattle is one of the major sources of anthropogenic greenhouse gas emissions in the United States and worldwide. National estimates of methane emissions rely on mathematical models such as the one recommended by the Intergovernmental Panel on Climate Change (IPCC). Models used for prediction of methane emissions from cattle range from empirical to mechanistic, with varying input requirements. Two empirical and 2 mechanistic models (COWPOLL and MOLLY) were evaluated for their prediction ability using individual cattle measurements. Model selection was based on mean square prediction error (MSPE), concordance correlation coefficient, and residuals vs. predicted values analyses. In dairy cattle, COWPOLL had the lowest root MSPE and greatest accuracy and precision in predicting methane emissions (correlation coefficient estimate = 0.75). The model simulated differences in diet more accurately than the other models, and the residuals vs. predicted value analysis showed no mean bias (P = 0.71). In feedlot cattle, MOLLY had the lowest root MSPE, with almost all errors from random sources (correlation coefficient estimate = 0.69). The IPCC model also had good agreement with observed values, and no significant mean (P = 0.74) or linear bias (P = 0.11) was detected when residuals were plotted against predicted values. A fixed methane conversion factor (Ym) might be an easier alternative to a diet-dependent variable Ym. Based on the results, the 2 mechanistic models were used to simulate methane emissions from representative US diets and were compared with the IPCC model. The average Ym in dairy cows was 5.63% of GE (range 3.78 to 7.43%) compared with the 6.5 ± 1% recommended by IPCC. In feedlot cattle, the average Ym was 3.88% (range 3.36 to 4.56%) compared with the 3 ± 1% recommended by IPCC.
Based on our simulations, using the IPCC values can result in an overestimate of emissions by about 12.5% for dairy cattle and an underestimate by about 9.8% for feedlot cattle. In addition to providing improved estimates of emissions based on diets, mechanistic models can be used to assess mitigation options, such as changing the source of carbohydrate or adding fat to decrease methane, which is not possible with empirical models. We recommend that national inventories use diet-specific Ym values predicted by mechanistic models to estimate methane emissions from cattle.
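The model-selection statistics used above (root MSPE and the concordance correlation coefficient) are straightforward to compute; a minimal sketch with invented observation/prediction pairs, not data from the study:

```python
import math

def rmspe_pct(obs, pred):
    """Root mean square prediction error, as a percentage of the observed mean."""
    mspe = sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)
    return 100.0 * math.sqrt(mspe) / (sum(obs) / len(obs))

def concordance_ccc(obs, pred):
    """Lin's concordance correlation coefficient: agreement with the 1:1 line,
    combining precision (correlation) and accuracy (bias)."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    so2 = sum((o - mo) ** 2 for o in obs) / n
    sp2 = sum((p - mp) ** 2 for p in pred) / n
    sop = sum((o - mo) * (p - mp) for o, p in zip(obs, pred)) / n
    return 2.0 * sop / (so2 + sp2 + (mo - mp) ** 2)

# Hypothetical daily methane measurements (g/d) vs. model predictions
obs = [350.0, 420.0, 390.0, 460.0, 300.0]
pred = [340.0, 430.0, 400.0, 440.0, 320.0]
ccc = concordance_ccc(obs, pred)
```

Decomposing MSPE into mean-bias, slope-bias, and random components (as done for MOLLY above) follows from the same sums.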
DOT National Transportation Integrated Search
2013-06-01
This report summarizes a research project aimed at developing degradation models for bridge decks in the state of Michigan based on durability mechanics. A probabilistic framework to implement local-level mechanistic-based models for predicting the c...
Robust PBPK/PD-Based Model Predictive Control of Blood Glucose.
Schaller, Stephan; Lippert, Jorg; Schaupp, Lukas; Pieber, Thomas R; Schuppert, Andreas; Eissing, Thomas
2016-07-01
Automated glucose control (AGC) has not yet reached the point where it can be applied clinically [3]. Challenges are the accuracy of subcutaneous (SC) glucose sensors, physiological lag times, and both inter- and intraindividual variability. To address the above issues, we developed a novel scheme for MPC that can be applied to AGC. An individualizable generic whole-body physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) model of glucose, insulin, and glucagon metabolism has been used as the predictive kernel. The high level of mechanistic detail represented by the model takes full advantage of the potential of MPC and may make long-term prediction possible, as it captures at least some relevant sources of variability [4]. Robustness against uncertainties was increased by a control cascade relying on proportional-integral-derivative (PID)-based offset control. The performance of this AGC scheme was evaluated in silico and retrospectively using data from clinical trials. This analysis revealed that our approach handles sensor noise with a MARD of 10%-14%, as well as model uncertainties and disturbances. The results suggest that PBPK/PD models are well suited for MPC in a glucose control setting, and that their predictive power in combination with the integrated database-driven (a priori individualizable) model framework will help overcome current challenges in the development of AGC systems. This study provides a new, generic, and robust mechanistic approach to AGC using a PBPK platform with extensive a priori (database) knowledge for individualization.
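The offset-control stage of the cascade relies on a standard PID law; a generic discrete-time sketch (gains, sampling interval, and glucose values are illustrative, not the tuning used in the study):

```python
def make_pid(kp, ki, kd, dt):
    """Discrete PID controller returning a step(setpoint, measurement) closure."""
    state = {"integral": 0.0, "prev_err": None}
    def step(setpoint, measurement):
        err = setpoint - measurement
        state["integral"] += err * dt
        deriv = 0.0 if state["prev_err"] is None else (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# e.g. correcting a +20 mg/dL glucose offset against a 100 mg/dL target,
# 5-min sampling; negative output = reduce insulin-independent drive upward
pid = make_pid(kp=0.05, ki=0.01, kd=0.0, dt=5.0)
u = pid(100.0, 120.0)
```

In the cascade described above, such a loop corrects residual offsets that the mechanistic MPC kernel does not capture.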
Varma, Manthena V S; Lin, Jian; Bi, Yi-An; Rotter, Charles J; Fahmi, Odette A; Lam, Justine L; El-Kattan, Ayman F; Goosen, Theunis C; Lai, Yurong
2013-05-01
Repaglinide is mainly metabolized by cytochrome P450 enzymes CYP2C8 and CYP3A4, and it is also a substrate to a hepatic uptake transporter, organic anion transporting polypeptide (OATP)1B1. The purpose of this study is to predict the dosing time-dependent pharmacokinetic interactions of repaglinide with rifampicin, using mechanistic models. In vitro hepatic transport of repaglinide, characterized using sandwich-cultured human hepatocytes, and intrinsic metabolic parameters were used to build a dynamic whole-body physiologically-based pharmacokinetic (PBPK) model. The PBPK model adequately described repaglinide plasma concentration-time profiles and successfully predicted area under the plasma concentration-time curve ratios of repaglinide (within ± 25% error), dosed (staggered 0-24 hours) after rifampicin treatment when primarily considering induction of CYP3A4 and reversible inhibition of OATP1B1 by rifampicin. Further, a static mechanistic "extended net-effect" model incorporating transport and metabolic disposition parameters of repaglinide and interaction potency of rifampicin was devised. Predictions based on the static model are similar to those observed in the clinic (average error ∼19%) and to those based on the PBPK model. Both the models suggested that the combined effect of increased gut extraction and decreased hepatic uptake caused minimal repaglinide systemic exposure change when repaglinide is dosed simultaneously or 1 hour after the rifampicin dose. On the other hand, isolated induction effect as a result of temporal separation of the two drugs translated to an approximate 5-fold reduction in repaglinide systemic exposure. In conclusion, both dynamic and static mechanistic models are instrumental in delineating the quantitative contribution of transport and metabolism in the dosing time-dependent repaglinide-rifampicin interactions.
Chiu, Weihsueh A.; Guyton, Kathryn Z.; Martin, Matthew T.; Reif, David M.; Rusyn, Ivan
2017-01-01
Evidence regarding carcinogenic mechanisms serves a critical role in International Agency for Research on Cancer (IARC) Monograph evaluations. Three recent IARC Working Groups pioneered inclusion of the US Environmental Protection Agency (EPA) ToxCast program high-throughput screening (HTS) data to supplement other mechanistic evidence. In Monograph V110, HTS profiles were compared between perfluorooctanoic acid (PFOA) and prototypical activators across multiple nuclear receptors. For Monograph V112-113, HTS assays were mapped to 10 key characteristics of carcinogens identified by an IARC expert group, and systematically considered as an additional mechanistic data stream. Both individual assay results and ToxPi-based rankings informed mechanistic evaluations. Activation of multiple nuclear receptors in HTS assays showed that PFOA targets peroxisome proliferator activated and other receptors. ToxCast assays substantially covered 5 of 10 key characteristics, corroborating literature evidence of “induces oxidative stress” and “alters cell proliferation, cell death or nutrient supply” and filling gaps for “modulates receptor-mediated effects.” Thus, ToxCast HTS data were useful both in evaluating specific mechanistic hypotheses and in the overall evaluation of mechanistic evidence. However, additional HTS assays are needed to provide more comprehensive coverage of the 10 key characteristics of carcinogens that form the basis of current IARC mechanistic evaluations.
Emami Riedmaier, Arian; Lindley, David J; Hall, Jeffrey A; Castleberry, Steven; Slade, Russell T; Stuart, Patricia; Carr, Robert A; Borchardt, Thomas B; Bow, Daniel A J; Nijsen, Marjoleen
2018-01-01
Venetoclax, a selective B-cell lymphoma-2 inhibitor, is a biopharmaceutics classification system class IV compound. The aim of this study was to develop a physiologically based pharmacokinetic (PBPK) model to mechanistically describe absorption and disposition of an amorphous solid dispersion formulation of venetoclax in humans. A mechanistic PBPK model was developed incorporating measured amorphous solubility, dissolution, metabolism, and plasma protein binding. A middle-out approach was used to define permeability. Model predictions of oral venetoclax pharmacokinetics were verified against clinical studies of fed and fasted healthy volunteers, and clinical drug interaction studies with a strong CYP3A inhibitor (ketoconazole) and inducer (rifampicin). Model verification demonstrated accurate prediction of the observed food effect following a low-fat diet. Ratios of predicted versus observed Cmax and area under the curve of venetoclax were within 0.8- to 1.25-fold of observed ratios for strong CYP3A inhibitor and inducer interactions, indicating that the venetoclax elimination pathway was correctly specified. The verified venetoclax PBPK model is one of the first examples mechanistically capturing absorption, food effect, and exposure of an amorphous solid dispersion formulated compound. This model allows evaluation of untested drug-drug interactions, especially those primarily occurring in the intestine, and paves the way for future modeling of biopharmaceutics classification system class IV compounds.
Flood forecasting using non-stationarity in a river with tidal influence - a feasibility study
NASA Astrophysics Data System (ADS)
Killick, Rebecca; Kretzschmar, Ann; Ilic, Suzi; Tych, Wlodek
2017-04-01
Flooding is the most common natural hazard causing damage, disruption and loss of life worldwide. Despite improvements in modelling and forecasting of water levels and flood inundation (Kretzschmar et al., 2014; Hoitink and Jay, 2016), there are still large discrepancies between predictions and observations particularly during storm events when accurate predictions are most important. Many models exist for forecasting river levels (Smith et al., 2013; Leedal et al., 2013) however they commonly assume that the errors in the data are independent, stationary and normally distributed. This is generally not the case especially during storm events suggesting that existing models are not describing the drivers of river level in an appropriate fashion. Further challenges exist in the lower sections of a river influenced by both river and tidal flows and their interaction and there is scope for improvement in prediction. This paper investigates the use of a powerful statistical technique to adaptively forecast river levels by modelling the process as locally stationary. The proposed methodology takes information on both upstream and downstream river levels and incorporates meteorological information (rainfall forecasts) and tidal levels when required to forecast river levels at a specified location. Using this approach, a single model will be capable of predicting water levels in both tidal and non-tidal river reaches. In this pilot project, the methodology of Smith et al. (2013) using harmonic tidal analysis and data based mechanistic modelling is compared with the methodology developed by Killick et al. (2016) utilising data-driven wavelet decomposition to account for the information contained in the upstream and downstream river data to forecast a non-stationary time-series. Preliminary modelling has been carried out using the tidal stretch of the River Lune in North-west England and initial results are presented here. 
Future work includes expanding the methodology to forecast river levels at a network of locations simultaneously.
References:
Hoitink, A. J. F., and Jay, D. A. (2016). Tidal river dynamics: implications for deltas. Rev. Geophys., 54, 240-272.
Killick, R., Knight, M., Nason, G. P., and Eckley, I. A. (2016). The local partial autocorrelation function and its application to the forecasting of locally stationary time series. Submitted.
Kretzschmar, A., Tych, W., and Chappell, N. A. (2014). Reversing hydrology: estimation of sub-hourly rainfall time-series from streamflow. Environ. Modell. Softw., 60, 290-301.
Leedal, D., Weerts, A. H., Smith, P. J., and Beven, K. J. (2013). Application of data-based mechanistic modelling for flood forecasting at multiple locations in the Eden catchment in the National Flood Forecasting System (England and Wales). HESS, 17(1), 177-185.
Smith, P., Beven, K., Horsburgh, K., Hardaker, P., and Collier, C. (2013). Data-based mechanistic modelling of tidally affected river reaches for flood warning purposes: an example on the River Dee, UK. Q. J. R. Meteorol. Soc., 139(671), 340-349.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruzic, Jamie J.; Evans, T. Matthew; Greaney, P. Alex
The report describes the development of a discrete element method (DEM) based modeling approach to quantitatively predict deformation and failure of typical nickel-based superalloys. A series of experimental data, including microstructure and mechanical property characterization at 600°C, was collected for a relatively simple, model solid solution Ni-20Cr alloy (Nimonic 75) to determine inputs for the model and provide data for model validation. Nimonic 75 was considered ideal for this study because it is a certified tensile and creep reference material. A series of new DEM modeling approaches were developed to capture the complexity of metal deformation, including cubic elastic anisotropy and plastic deformation both with and without strain hardening. Our model approaches were implemented in a commercially available DEM code, PFC3D, that is commonly used by engineers. It is envisioned that once further developed, this new DEM modeling approach can be adapted to a wide range of engineering applications.
Helmlinger, Gabriel; Al-Huniti, Nidal; Aksenov, Sergey; Peskov, Kirill; Hallow, Karen M; Chu, Lulu; Boulton, David; Eriksson, Ulf; Hamrén, Bengt; Lambert, Craig; Masson, Eric; Tomkinson, Helen; Stanski, Donald
2017-11-15
Modeling & simulation (M&S) methodologies are established quantitative tools, which have proven to be useful in supporting the research, development (R&D), regulatory approval, and marketing of novel therapeutics. Applications of M&S help design efficient studies and interpret their results in context of all available data and knowledge to enable effective decision-making during the R&D process. In this mini-review, we focus on two sets of modeling approaches: population-based models, which are well-established within the pharmaceutical industry today, and fall under the discipline of clinical pharmacometrics (PMX); and systems dynamics models, which encompass a range of models of (patho-)physiology amenable to pharmacological intervention, of signaling pathways in biology, and of substance distribution in the body (today known as physiologically-based pharmacokinetic models) - which today may be collectively referred to as quantitative systems pharmacology models (QSP). We next describe the convergence - or rather selected integration - of PMX and QSP approaches into 'middle-out' drug-disease models, which retain selected mechanistic aspects, while remaining parsimonious, fit-for-purpose, and able to address variability and the testing of covariates. We further propose development opportunities for drug-disease systems models, to increase their utility and applicability throughout the preclinical and clinical spectrum of pharmaceutical R&D.
Larsen, Malte Selch; Keizer, Ron; Munro, Gordon; Mørk, Arne; Holm, René; Savic, Rada; Kreilgaard, Mads
2016-05-01
Gabapentin displays non-linear drug disposition, which complicates dosing for optimal therapeutic effect. Thus, the current study was performed to elucidate the pharmacokinetic/pharmacodynamic (PKPD) relationship of gabapentin's effect on mechanical hypersensitivity in a rat model of CFA-induced inflammatory hyperalgesia. A semi-mechanistic population-based PKPD model was developed using nonlinear mixed-effects modelling, based on gabapentin plasma and brain extracellular fluid (ECF) concentration-time data and measurements of CFA-evoked mechanical hyperalgesia following administration of a range of gabapentin doses (oral and intravenous). The plasma/brain ECF concentration-time profiles of gabapentin were adequately described by a two-compartment plasma model with saturable intestinal absorption rate (Km = 44.1 mg/kg, Vmax = 41.9 mg/h·kg) and dose-dependent oral bioavailability, linked to brain ECF concentration through a transit compartment. Brain ECF concentration was directly linked to a sigmoid Emax function describing reversal of hyperalgesia (EC50,plasma = 16.7 μg/mL, EC50,brain = 3.3 μg/mL). The proposed semi-mechanistic population-based PKPD model provides further insight into gabapentin's non-linear pharmacokinetics and the link between plasma/brain disposition and anti-hyperalgesic effects. The model suggests that intestinal absorption is the primary source of non-linearity and that the investigated rat model provides reasonable predictions of clinically effective plasma concentrations for gabapentin.
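The sigmoid Emax relation linking brain ECF concentration to effect has a standard form; a minimal sketch patterned on the reported EC50,brain of 3.3 μg/mL (the Emax value and Hill coefficient here are assumptions for illustration):

```python
def sigmoid_emax(conc, emax, ec50, hill=1.0):
    """Sigmoid Emax model: effect = Emax * C^h / (EC50^h + C^h)."""
    return emax * conc ** hill / (ec50 ** hill + conc ** hill)

# At C = EC50 the model returns half-maximal effect by construction;
# Emax = 100 (% reversal of hyperalgesia) is a hypothetical scale.
effect = sigmoid_emax(conc=3.3, emax=100.0, ec50=3.3)
```

Nonlinear mixed-effects fitting, as used in the study, estimates Emax, EC50, and the Hill coefficient together with their between-animal variability.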
Langenbucher, Frieder
2007-08-01
This paper discusses Excel applications related to the prediction of drug absorbability from physicochemical constants. PHDISSOC provides a generalized model for pH profiles of electrolytic dissociation, water solubility, and partition coefficient. SKMODEL predicts drug absorbability based on a log-log plot of water solubility and O/W partitioning, augmented by additional features such as electrolytic dissociation, melting point, and the dose administered. GIABS presents a mechanistic model of gastrointestinal drug absorption. BIODATCO presents a database compiling relevant drug data to be used for quantitative predictions.
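PHDISSOC-type pH profiles rest on the Henderson-Hasselbalch relation; a minimal sketch for a monoprotic weak acid (the pKa and pH values are generic examples, not drugs from the paper):

```python
def ionized_fraction_acid(ph, pka):
    """Fraction of a monoprotic weak acid in the ionized form at a given pH
    (Henderson-Hasselbalch)."""
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

def apparent_solubility_acid(s0, ph, pka):
    """pH-dependent apparent solubility of a weak acid:
    S = S0 * (1 + 10^(pH - pKa)), with S0 the intrinsic solubility."""
    return s0 * (1.0 + 10.0 ** (ph - pka))

# e.g. a carboxylic acid (pKa 4.4) at intestinal pH 7.4 is almost fully ionized
frac = ionized_fraction_acid(ph=7.4, pka=4.4)
```

Analogous expressions with the sign of (pH - pKa) flipped cover weak bases, which is how a single worksheet can generate full dissociation/solubility profiles across pH.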
Fuel thermal conductivity (FTHCON). Status report. [PWR; BWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagrman, D. L.
1979-02-01
An improvement of the fuel thermal conductivity subcode is described, which is part of the fuel rod behavior modeling task performed at EG&G Idaho, Inc. The original version was published in the Materials Properties (MATPRO) Handbook, Section A-2 (Fuel Thermal Conductivity). The improved version incorporates data which were not included in the previous work and omits some previously used data which are believed to come from cracked specimens. The models for the effect of porosity on thermal conductivity and for the electronic contribution to thermal conductivity have been completely revised in order to place these models on a more mechanistic basis. As a result of the modeling improvements, the standard error of the model with respect to its data base has been significantly reduced.
Cañete-Valdeón, José M; Wieringa, Roel; Smallbone, Kieran
2012-12-01
There is a growing interest in mathematical mechanistic modelling as a promising strategy for understanding tumour progression. This approach is accompanied by a methodological change in the way research is done, in which models actively help to generate hypotheses instead of waiting for general principles to become apparent once sufficient data are accumulated. This paper applies recent research from the philosophy of science to uncover three important problems of mechanistic modelling which may compromise its mainstream application, namely: the dilemma of formal and informal descriptions, the need to express degrees of confidence, and the need for an argumentation framework. We report experience and research on similar problems from software engineering and provide evidence that the solutions adopted there can be transferred to the biological domain. We hope this paper can provoke new opportunities for further and profitable interdisciplinary research in the field.
Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.
O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao
2017-07-01
Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.
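A concrete instance of the MaxEnt construction described above: constraining only the mean abundance over species yields an exponential (geometric-like) distribution P(n) proportional to exp(-lambda*n), with the Lagrange multiplier fixed by the constraint. A stdlib-only sketch using bisection (the state-variable values are invented for illustration):

```python
import math

def maxent_abundance_dist(n_max, mean_abundance, tol=1e-10):
    """MaxEnt distribution over abundances 1..n_max constrained to a given mean.
    Assumes 1 < mean_abundance < (n_max + 1) / 2 so the multiplier is positive."""
    def mean_for(lam):
        w = [math.exp(-lam * n) for n in range(1, n_max + 1)]
        z = sum(w)
        return sum(n * wn for n, wn in zip(range(1, n_max + 1), w)) / z
    lo, hi = 0.0, 50.0  # mean_for is decreasing in lam over this bracket
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_for(mid) > mean_abundance:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(-lam * n) for n in range(1, n_max + 1)]
    z = sum(w)
    return [wn / z for wn in w]

# e.g. total abundance 1000 spread over 100 species -> mean abundance 10
p = maxent_abundance_dist(n_max=1000, mean_abundance=10.0)
```

A mechanistic model's predicted abundance distribution can then be compared against this null, isolating what the mechanism adds beyond the constrained state variables.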
Dynamical properties of maps fitted to data in the noise-free limit
Lindström, Torsten
2013-01-01
We argue that any attempt to classify dynamical properties from nonlinear finite time-series data requires a mechanistic model fitting the data better than piecewise linear models according to standard model selection criteria. Such a procedure seems necessary but still not sufficient.
Testing the molecular clock using mechanistic models of fossil preservation and molecular evolution
2017-01-01
Molecular sequence data provide information about relative times only, and fossil-based age constraints are the ultimate source of information about absolute times in molecular clock dating analyses. Thus, fossil calibrations are critical to molecular clock dating, but competing methods are difficult to evaluate empirically because the true evolutionary time scale is never known. Here, we combine mechanistic models of fossil preservation and sequence evolution in simulations to evaluate different approaches to constructing fossil calibrations and their impact on Bayesian molecular clock dating, and the relative impact of fossil versus molecular sampling. We show that divergence time estimation is impacted by the model of fossil preservation, sampling intensity, and tree shape. The addition of sequence data may improve molecular clock estimates, but accuracy and precision are dominated by the quality of the fossil calibrations. Posterior means and medians are poor representatives of true divergence times; posterior intervals provide a much more accurate estimate of divergence times, though they may be wide and often do not have high coverage probability. Our results highlight the importance of increased fossil sampling and improved statistical approaches to generating calibrations, which should incorporate the non-uniform nature of ecological and temporal fossil species distributions.
Kolokotroni, Eleni; Dionysiou, Dimitra; Veith, Christian; Kim, Yoo-Jin; Franz, Astrid; Grgic, Aleksandar; Bohle, Rainer M.; Stamatakos, Georgios
2016-01-01
The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastasis and improve operability. In silico experimentation can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant for reliable predictions is the a priori estimation of the drugs' cytotoxic efficacy on cancer cells for a given treatment. In the present work a mechanistic model of cancer response to treatment is applied for the estimation of a plausible value range of the cell killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer-related mechanisms of uncontrolled proliferation, population heterogeneity, hypoxia, and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment. It takes into account the effect of tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied to the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine, or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling has been recruited to compensate for patient-specific uncertainties. In conclusion, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies.
Correlation studies of such estimates with the molecular profile of patients could serve as a basis for reliable personalized predictions.
DOT National Transportation Integrated Search
2017-02-08
The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...
Improving the forecast for biodiversity under climate change.
Urban, M C; Bocedi, G; Hendry, A P; Mihoub, J-B; Pe'er, G; Singer, A; Bridle, J R; Crozier, L G; De Meester, L; Godsoe, W; Gonzalez, A; Hellmann, J J; Holt, R D; Huth, A; Johst, K; Krug, C B; Leadley, P W; Palmer, S C F; Pantel, J H; Schmitz, A; Zollner, P A; Travis, J M J
2016-09-09
New biological models are incorporating the realistic processes underlying biological responses to climate change and other human-caused disturbances. However, these more realistic models require detailed information, which is lacking for most species on Earth. Current monitoring efforts mainly document changes in biodiversity, rather than collecting the mechanistic data needed to predict future changes. We describe and prioritize the biological information needed to inform more realistic projections of species' responses to climate change. We also highlight how trait-based approaches and adaptive modeling can leverage sparse data to make broader predictions. We outline a global effort to collect the data necessary to better understand, anticipate, and reduce the damaging effects of climate change on biodiversity.
Calibration and analysis of genome-based models for microbial ecology.
Louca, Stilianos; Doebeli, Michael
2015-10-16
Microbial ecosystem modeling is complicated by the large number of unknown parameters and the lack of appropriate calibration tools. Here we present a novel computational framework for modeling microbial ecosystems, which combines genome-based model construction with statistical analysis and calibration to experimental data. Using this framework, we examined the dynamics of a community of Escherichia coli strains that emerged in laboratory evolution experiments, during which an ancestral strain diversified into two coexisting ecotypes. We constructed a microbial community model comprising the ancestral and the evolved strains, which we calibrated using separate monoculture experiments. Simulations reproduced the successional dynamics in the evolution experiments, and pathway activation patterns observed in microarray transcript profiles. Our approach yielded detailed insights into the metabolic processes that drove bacterial diversification, involving acetate cross-feeding and competition for organic carbon and oxygen. Our framework provides a missing link towards a data-driven mechanistic microbial ecology.
Resolving Microzooplankton Functional Groups In A Size-Structured Planktonic Model
NASA Astrophysics Data System (ADS)
Taniguchi, D.; Dutkiewicz, S.; Follows, M. J.; Jahn, O.; Menden-Deuer, S.
2016-02-01
Microzooplankton are important marine grazers, often consuming a large fraction of primary productivity. They consist of a great diversity of organisms with different behaviors, characteristics, and rates. This functional diversity, and its consequences, are not currently reflected in large-scale ocean ecological simulations. How should these organisms be represented, and what are the implications for their biogeography? We develop a size-structured, trait-based model to characterize a diversity of microzooplankton functional groups. We compile and examine size-based laboratory data on the traits, revealing some patterns with size and functional group that we interpret with mechanistic theory. Fitting the model to the data provides parameterizations of key rates and properties, which we employ in a numerical ocean model. The diversity of grazing preference, rates, and trophic strategies enables the coexistence of different functional groups of micro-grazers under various environmental conditions, and the model produces testable predictions of the biogeography.
Mechanistic species distribution modeling reveals a niche shift during invasion.
Chapman, Daniel S; Scalone, Romain; Štefanić, Edita; Bullock, James M
2017-06-01
Niche shifts of nonnative plants can occur when they colonize novel climatic conditions. However, the mechanistic basis for niche shifts during invasion is poorly understood and has rarely been captured within species distribution models. We quantified the consequence of between-population variation in phenology for invasion of common ragweed (Ambrosia artemisiifolia L.) across Europe. Ragweed is of serious concern because of its harmful effects as a crop weed and because of its impact on public health as a major aeroallergen. We developed a forward mechanistic species distribution model based on responses of ragweed development rates to temperature and photoperiod. The model was parameterized and validated from the literature and by reanalyzing data from a reciprocal common garden experiment in which native and invasive populations were grown within and beyond the current invaded range. It could therefore accommodate between-population variation in the physiological requirements for flowering, and predict the potentially invaded ranges of individual populations. Northern-origin populations that were established outside the generally accepted climate envelope of the species had lower thermal requirements for bud development, suggesting local adaptation of phenology had occurred during the invasion. The model predicts that this will extend the potentially invaded range northward and increase the average suitability across Europe by 90% in the current climate and 20% in the future climate. Therefore, trait variation observed at the population scale can trigger a climatic niche shift at the biogeographic scale. For ragweed, earlier flowering phenology in established northern populations could allow the species to spread beyond its current invasive range, substantially increasing its risk to agriculture and public health. 
Mechanistic species distribution models offer the possibility to represent niche shifts by varying the traits and niche responses of individual populations. Ignoring such effects could substantially underestimate the extent and impact of invasions. © 2017 by the Ecological Society of America.
An, Gary; Christley, Scott
2012-01-01
Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer-trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological processes as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for addressing the translational dilemma.
This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge by the biomedical research community at large.
Progress toward an explicit mechanistic model for the light-driven pump, bacteriorhodopsin
NASA Technical Reports Server (NTRS)
Lanyi, J. K.
1999-01-01
Recent crystallographic information about the structure of bacteriorhodopsin and some of its photointermediates, together with a large amount of spectroscopic and mutational data, suggest a mechanistic model for how this protein couples light energy to the translocation of protons across the membrane. Now nearing completion, this detailed molecular model will describe the nature of the steric and electrostatic conflicts at the photoisomerized retinal, as well as the means by which it induces proton transfers in the two half-channels leading to the two membrane surfaces, thereby causing unidirectional, uphill transport.
Karabelas, Elias; Gsell, Matthias A. F.; Augustin, Christoph M.; Marx, Laura; Neic, Aurel; Prassl, Anton J.; Goubergrits, Leonid; Kuehne, Titus; Plank, Gernot
2018-01-01
Computational fluid dynamics (CFD) models of blood flow in the left ventricle (LV) and aorta are important tools for analyzing the mechanistic links between myocardial deformation and flow patterns. Typically, the use of image-based kinematic CFD models prevails in applications such as predicting the acute response to interventions which alter LV afterload conditions. However, such models are limited in their ability to analyze any impacts upon LV load or key biomarkers known to be implicated in driving remodeling processes, as LV function is not accounted for in a mechanistic sense. This study addresses these limitations by reporting on progress made toward a novel electro-mechano-fluidic (EMF) model that represents the entire physics of LV electromechanics (EM) based on first principles. A biophysically detailed finite element (FE) model of LV EM was coupled with a FE-based CFD solver for moving domains using an arbitrary Eulerian-Lagrangian (ALE) formulation. Two clinical cases of patients suffering from aortic coarctations (CoA) were built and parameterized based on clinical data under pre-treatment conditions. For one patient case, simulations under post-treatment conditions, after geometric repair of the CoA by a virtual stenting procedure, were compared against pre-treatment results. Numerical stability of the approach was demonstrated by analyzing mesh quality and solver performance under the significantly large deformations of the LV blood pool. Further, computational tractability and compatibility with clinical time scales were investigated by performing strong scaling benchmarks up to 1536 compute cores. The overall cost of the entire workflow for building, fitting and executing EMF simulations was comparable to those reported for image-based kinematic models, suggesting that EMF models show potential of evolving into a viable clinical research tool. PMID:29892227
Constales, Denis; Yablonsky, Gregory S.; Wang, Lucun; ...
2017-04-25
This paper presents a straightforward and user-friendly procedure for extracting a reactivity characterization of catalytic reactions on solid materials under non-steady-state conditions, particularly in temporal analysis of products (TAP) experiments. The kinetic parameters derived by this procedure can help with the development of detailed mechanistic understanding. The procedure consists of the following two major steps: 1) three “Laplace reactivities” are first determined based on the moments of the exit flow pulse response data; 2) depending on the selected kinetic model, kinetic constants of elementary reaction steps can then be expressed as a function of reactivities and determined accordingly. In particular, we distinguish two calculation methods based on the availability and reliability of reactant and product data. The theoretical results are illustrated using a reverse example with given parameters as well as an experimental example of CO oxidation over a supported Au/SiO2 catalyst. The procedure presented here provides an efficient tool for kinetic characterization of many complex chemical reactions.
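As a minimal sketch of step 1, the time moments of an exit-flow pulse response can be computed by quadrature. The first-order decay pulse and rate constant below are invented for illustration; this is not the full TAP reactivity formalism.

```python
import numpy as np

def pulse_moments(t, flow, n_max=2):
    """Time moments M_n = integral of t^n * F(t) dt of an exit-flow pulse
    response, via the trapezoidal rule on a uniform time grid."""
    dt = t[1] - t[0]
    moments = []
    for n in range(n_max + 1):
        y = flow * t ** n
        moments.append(float(np.sum(y[1:] + y[:-1]) * dt / 2.0))
    return moments

# synthetic exit pulse: first-order decay F(t) = k * exp(-k*t), k = 2 s^-1
t = np.linspace(0.0, 20.0, 4001)
k = 2.0
F = k * np.exp(-k * t)

M0, M1, M2 = pulse_moments(t, F)
mean_residence_time = M1 / M0   # equals 1/k for this synthetic pulse
```

For this toy pulse the zeroth moment integrates to 1 (complete recovery) and the mean residence time recovers 1/k, which is the kind of moment-derived quantity the reactivity expressions are built from.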
Ganusov, Vitaly V.; De Boer, Rob J.
2013-01-01
Bromodeoxyuridine (BrdU) is widely used in immunology to detect cell division, and several mathematical models have been proposed to estimate proliferation and death rates of lymphocytes from BrdU labelling and de-labelling curves. One problem in interpreting BrdU data is explaining the de-labelling curves. Because shortly after label withdrawal BrdU+ cells are expected to divide into BrdU+ daughter cells, one would expect a flat down-slope. Because, for many cell types, the fraction of BrdU+ cells nevertheless decreases during de-labelling, previous mathematical models had to make debatable assumptions to account for the data. We develop a mechanistic model tracking the number of divisions that each cell has undergone in the presence and absence of BrdU, and allow cells to accumulate and dilute their BrdU content. From the same mechanistic model, one can naturally derive expressions for the mean BrdU content (MBC) of all cells, or the MBC of the BrdU+ subset, which is related to the mean fluorescence intensity of BrdU that can be measured in experiments. The model is extended to include subpopulations with different rates of division and death (i.e. kinetic heterogeneity). We fit the extended model to previously published BrdU data from memory T lymphocytes in simian immunodeficiency virus-infected and uninfected macaques, and find that the model describes the data with at least the same quality as previous models. Because the same model predicts a modest decline in the MBC of BrdU+ cells, which is consistent with experimental observations, BrdU dilution seems a natural explanation for the observed down-slopes in self-renewing populations. PMID:23034350
Wren, S A C; Alhusban, F; Barry, A R; Hughes, L P
2017-08-30
The impact of varying Sodium Starch Glycolate (SSG) grade and wet granulation intensity on the mechanism of disintegration and dissolution of mannitol-based Immediate Release (IR) placebo tablets was investigated. MRI and 1H NMR provided mechanistic insight, and revealed a four-fold range in both tablet disintegration and dissolution rates. MRI was used to quantify the rates of change in tablet volumes, and the data were fitted to a hydration/erosion model. Reduced levels of cross-linking change SSG from a swelling to a gelling matrix. The tablet hydration and dissolution rates are related to the viscosity at the tablet-solution interface, with high viscosities limiting mass transport. Copyright © 2017 Elsevier B.V. All rights reserved.
Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M
2014-01-01
Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using empirical data of 1 yr on commercial spring-calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities, as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating a poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions.
Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat tariff. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
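The validation statistics quoted above can be reproduced with short helper functions. This sketch assumes the usual definitions: RPE as root mean square prediction error expressed as a percentage of the observed mean, and the standard split of MSPE into mean bias, line (slope) bias, and random variation; the consumption figures are made up.

```python
import numpy as np

def rpe(actual, predicted):
    """Relative prediction error: RMSPE as a percentage of the observed mean."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.sqrt(np.mean((a - p) ** 2)) / a.mean()

def mspe_fractions(actual, predicted):
    """Split MSPE into mean bias, line (slope) bias, and random variation,
    returned as fractions of MSPE (the three fractions sum to 1)."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    mspe = np.mean((a - p) ** 2)
    cov = np.cov(a, p, bias=True)[0, 1]          # population covariance
    mean_bias = (p.mean() - a.mean()) ** 2
    slope = cov / np.var(p)                       # slope of actual on predicted
    line_bias = (1.0 - slope) ** 2 * np.var(p)
    r2 = cov ** 2 / (np.var(a) * np.var(p))
    random_var = (1.0 - r2) * np.var(a)
    return mean_bias / mspe, line_bias / mspe, random_var / mspe

# hypothetical monthly electricity consumption (kWh): observed vs simulated
obs = np.array([900.0, 1100.0, 1300.0, 1250.0, 1000.0, 950.0])
sim = np.array([950.0, 1050.0, 1280.0, 1200.0, 1020.0, 1000.0])

error_pct = rpe(obs, sim)
fractions = mspe_fractions(obs, sim)   # (mean bias, line bias, random variation)
```

A finding such as "more than 87% of the MSPE was random variation" corresponds to the third fraction exceeding 0.87.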
NASA Astrophysics Data System (ADS)
Ebrahimi, Ali; Or, Dani
2017-05-01
The sensitivity of polar regions to rising global temperatures is reflected in rapidly changing hydrological processes associated with pronounced seasonal thawing of permafrost soil and increased biological activity. Of particular concern is the potential release of large amounts of soil carbon and stimulation of other soil-borne greenhouse gas emissions such as methane. Soil methanotrophic and methanogenic microbial communities rapidly adjust their activity and spatial organization in response to permafrost thawing and other environmental factors. Soil structural elements such as aggregates and layering affect oxygen and nutrient diffusion processes, thereby contributing to methanogenic activity within transient anoxic niches (hot spots). We developed a mechanistic individual-based model to quantify microbial activity dynamics in soil pore networks, considering transport processes and enzymatic activity associated with methane production in soil. The model was upscaled from single aggregates to the soil profile, where freezing/thawing provides macroscopic boundary conditions for microbial activity at different soil depths. The model distinguishes microbial activity in aerated bulk soil from that in aggregates (or a submerged profile) for resolving methane production and oxidation rates. Methane transport pathways by diffusion and ebullition of bubbles vary with hydration dynamics. The model links seasonal thermal and hydrologic dynamics with the evolution of microbial community composition and function affecting net methane emissions, in good agreement with experimental data. The mechanistic model enables systematic evaluation of key controlling factors in thawing permafrost and microbial response (e.g., nutrient availability and enzyme activity) on long-term methane emissions and carbon decomposition rates in the rapidly changing polar regions.
Equation-free mechanistic ecosystem forecasting using empirical dynamic modeling
Ye, Hao; Beamish, Richard J.; Glaser, Sarah M.; Grant, Sue C. H.; Hsieh, Chih-hao; Richards, Laura J.; Schnute, Jon T.; Sugihara, George
2015-01-01
It is well known that current equilibrium-based models fall short as predictive descriptions of natural ecosystems, and particularly of fisheries systems that exhibit nonlinear dynamics. For example, model parameters assumed to be fixed constants may actually vary in time, models may fit well to existing data but lack out-of-sample predictive skill, and key driving variables may be misidentified due to transient (mirage) correlations that are common in nonlinear systems. With these frailties, it is somewhat surprising that static equilibrium models continue to be widely used. Here, we examine empirical dynamic modeling (EDM) as an alternative to imposed model equations, one that accommodates both nonequilibrium dynamics and nonlinearity. Using time series from nine stocks of sockeye salmon (Oncorhynchus nerka) from the Fraser River system in British Columbia, Canada, we perform, for the first time to our knowledge, a real-data comparison of contemporary fisheries models with equivalent EDM formulations that explicitly use spawning stock and environmental variables to forecast recruitment. We find that EDM models produce more accurate and precise forecasts, and unlike extensions of the classic Ricker spawner–recruit equation, they show significant improvements when environmental factors are included. Our analysis demonstrates the strategic utility of EDM for incorporating environmental influences into fisheries forecasts and, more generally, for providing insight into how environmental factors can operate in forecast models, thus paving the way for equation-free mechanistic forecasting to be applied in management contexts. PMID:25733874
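A minimal, self-contained version of the EDM idea (simplex projection on a delay embedding) can be sketched as follows. The chaotic logistic-map series stands in for real stock time series, and the embedding dimension, prediction horizon, and exponential weighting are standard textbook choices, not the paper's settings.

```python
import numpy as np

def simplex_forecast(ts, E=2, tp=1):
    """Leave-one-out simplex projection: forecast ts[t+tp] from the E+1
    nearest neighbours of each point in an E-dimensional delay embedding."""
    ts = np.asarray(ts, float)
    idx = np.arange(E - 1, len(ts) - tp)
    emb = np.column_stack([ts[idx - j] for j in range(E)])  # delay coordinates
    target = ts[idx + tp]
    preds = np.empty(len(idx))
    for i in range(len(idx)):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                        # exclude the point itself
        nn = np.argsort(d)[: E + 1]          # E+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds[i] = np.sum(w * target[nn]) / np.sum(w)
    return preds, target

# chaotic logistic-map series as a stand-in for a recruitment time series
x = np.empty(300)
x[0] = 0.4
for n in range(299):
    x[n + 1] = 3.8 * x[n] * (1.0 - x[n])

pred, obs = simplex_forecast(x, E=2)
rho = np.corrcoef(pred, obs)[0, 1]   # forecast skill of the equation-free model
```

On deterministic nonlinear series like this one, the neighbour-based forecast attains high skill without any imposed model equation, which is the core of the EDM argument.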
Mechanistic modeling of developmental defects through computational embryology (WC10th)
Abstract: An important consideration for 3Rs is to identify developmental hazards utilizing mechanism-based in vitro assays (e.g., ToxCast) and in silico predictive models. Steady progress has been made with agent-based models that recapitulate morphogenetic drivers for angiogen...
Refined pipe theory for mechanistic modeling of wood development.
Deckmyn, Gaby; Evans, Sam P; Randle, Tim J
2006-06-01
We present a mechanistic model of wood tissue development in response to changes in competition, management and climate. The model is based on a refinement of the pipe theory, where the constant ratio between sapwood and leaf area (pipe theory) is replaced by a ratio between pipe conductivity and leaf area. Simulated pipe conductivity changes with age, stand density and climate in response to changes in allocation or pipe radius, or both. The central equation of the model, which calculates the ratio of carbon (C) allocated to leaves and pipes, can be parameterized to describe the contrasting stem conductivity behavior of different tree species: from constant stem conductivity (functional homeostasis hypothesis) to height-related reduction in stem conductivity with age (hydraulic limitation hypothesis). The model simulates the daily growth of pipes (vessels or tracheids), fibers and parenchyma as well as vessel size and simulates the wood density profile and the earlywood to latewood ratio from these data. Initial runs indicate the model yields realistic seasonal changes in pipe radius (decreasing pipe radius from spring to autumn) and wood density, as well as realistic differences associated with the competitive status of trees (denser wood in suppressed trees).
Blakeley, Matthew P.; Ruiz, Federico; Cachau, Raul; Hazemann, Isabelle; Meilleur, Flora; Mitschler, Andre; Ginell, Stephan; Afonine, Pavel; Ventura, Oscar N.; Cousido-Siah, Alexandra; Haertlein, Michael; Joachimiak, Andrzej; Myles, Dean; Podjarny, Alberto
2008-01-01
We present results of combined studies of the enzyme human aldose reductase (h-AR, 36 kDa) using single-crystal x-ray data (0.66 Å, 100K; 0.80 Å, 15K; 1.75 Å, 293K), neutron Laue data (2.2 Å, 293K), and quantum mechanical modeling. These complementary techniques unveil the internal organization and mobility of the hydrogen bond network that defines the properties of the catalytic engine, explaining how this promiscuous enzyme overcomes the simultaneous requirements of efficiency and promiscuity offering a general mechanistic view for this class of enzymes. PMID:18250329
Forbes, Valery E; Salice, Chris J; Birnir, Bjorn; Bruins, Randy J F; Calow, Peter; Ducrot, Virginie; Galic, Nika; Garber, Kristina; Harvey, Bret C; Jager, Henriette; Kanarek, Andrew; Pastorok, Robert; Railsback, Steve F; Rebarber, Richard; Thorbek, Pernille
2017-04-01
Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. Environ Toxicol Chem 2017;36:845-859. © 2017 SETAC.
Nozaki, Sachiko; Yamaguchi, Masayuki; Lefèvre, Gilbert
2016-07-01
Rivastigmine is an inhibitor of acetylcholinesterases and butyrylcholinesterases for symptomatic treatment of Alzheimer disease and is available as oral and transdermal patch formulations. A dermal absorption pharmacokinetic (PK) model was developed to simulate the plasma concentration-time profile of rivastigmine and to answer questions about the efficacy and safety risks after misuse of the patch (e.g., application longer than 24 h, multiple patches applied at the same time, and so forth). The model comprised 2 compartments, combining a mechanistic dermal absorption model with a basic 1-compartment model. The initial values for the model were determined based on the physicochemical characteristics of rivastigmine and PK parameters after intravenous administration. The model was fitted to the clinical PK profiles after a single application of rivastigmine patch to obtain model parameters. The final model was validated by confirming that the simulated concentration-time curves and PK parameters (Cmax and area under the drug plasma concentration-time curve) conformed to the observed values, and then was used to simulate the PK profiles of rivastigmine. This work demonstrated that the mechanistic dermal PK model fitted the clinical data well and was able to simulate the PK profile after patch misuse. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Wang, Gang; Briskot, Till; Hahn, Tobias; Baumann, Pascal; Hubbuch, Jürgen
2017-03-03
Mechanistic modeling has been repeatedly and successfully applied in process development and control of protein chromatography. For each combination of adsorbate and adsorbent, the mechanistic models have to be calibrated. Some of the model parameters, such as system characteristics, can be determined reliably by applying well-established experimental methods, whereas others cannot be measured directly. In common practice of protein chromatography modeling, these parameters are identified by applying time-consuming methods such as frontal analysis combined with gradient experiments, curve-fitting, or the combined Yamamoto approach. For new components in the chromatographic system, these traditional calibration approaches have to be repeated. In the presented work, a novel method for the calibration of mechanistic models based on artificial neural network (ANN) modeling was applied. An in silico screening of possible model parameter combinations was performed to generate learning material for the ANN model. Once the ANN model was trained to recognize chromatograms and to respond with the corresponding model parameter set, it was used to calibrate the mechanistic model from measured chromatograms. The ANN model's capability of parameter estimation was tested by predicting gradient elution chromatograms. The time-consuming model parameter estimation process itself could be reduced to milliseconds. The functionality of the method was successfully demonstrated in a study with the calibration of the transport-dispersive model (TDM) and the stoichiometric displacement model (SDM) for a protein mixture. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
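The in silico screening step can be illustrated with a toy single-peak model. In this sketch a nearest-neighbour lookup over the simulated library stands in for the trained ANN, and the Gaussian peak model and parameter ranges are invented for the example; they are not the TDM/SDM of the study.

```python
import numpy as np

def chromatogram(k, sigma, t):
    """Toy single-Gaussian 'chromatogram': k sets the peak position
    (retention), sigma its width. A stand-in for a mechanistic column model."""
    return np.exp(-((t - k) ** 2) / (2.0 * sigma ** 2))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 200)

# in silico screening: simulate a library over sampled parameter combinations
params = np.column_stack([rng.uniform(2.0, 8.0, 500),    # peak position
                          rng.uniform(0.2, 1.0, 500)])   # peak width
library = np.array([chromatogram(k, s, t) for k, s in params])

# "measured" chromatogram generated with known parameters (5.0, 0.5) + noise
measured = chromatogram(5.0, 0.5, t) + rng.normal(0.0, 0.01, t.size)

# nearest-neighbour lookup in place of the paper's trained ANN
best = int(np.argmin(((library - measured) ** 2).sum(axis=1)))
k_est, sigma_est = params[best]
```

The lookup recovers parameters close to the true (5.0, 0.5); the ANN in the study plays the same chromatogram-to-parameters role but generalizes between library entries.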
Schneider, Martina; Goss, Kai-Uwe
2012-11-20
Volatilization of pesticides from the bare soil surface is drastically reduced when the soil is under dry conditions (i.e., water content lower than the permanent wilting point). This effect is caused by the hydrated mineral surfaces that become available as additional sorption sites under dry conditions. However, established volatilization models do not explicitly consider the hydrated mineral surfaces as an independent sorption compartment and cannot correctly cover the moisture effect on volatilization. Here we integrated the existing mechanistic understanding of sorption of organic compounds to mineral surfaces and its dependence on the hydration status into a simple volatilization model. The resulting model was tested with reported experimental data for two herbicides from a wind tunnel experiment under various well-defined humidity conditions. The required equilibrium sorption coefficients of triallate and trifluralin to the mineral surfaces, K(min/air), at 60% relative humidity were fitted to experimental data and extrapolated to other humidity conditions. The model captures the general trend of the volatilization in different humidity scenarios. The results reveal that it is essential to have high quality input data for K(min/air), the available specific surface area (SSA), the penetration depth of the applied pesticide solution, and the humidity conditions in the soil. The model approach presented here in combination with an improved description of the humidity conditions under dry conditions can be integrated into existing volatilization models that already work well for humid conditions but still lack the mechanistically based description of the volatilization process under dry conditions.
ERIC Educational Resources Information Center
Grover, Anita; Lam, Tai Ning; Hunt, C. Anthony
2008-01-01
We present a simulation tool to aid the study of basic pharmacology principles. By taking advantage of the properties of agent-based modeling, the tool facilitates taking a mechanistic approach to learning basic concepts, in contrast to the traditional empirical methods. Pharmacodynamics is a particular aspect of pharmacology that can benefit from…
Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T
2017-10-01
In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging.
NASA Astrophysics Data System (ADS)
Yamana, T. K.; Eltahir, E. A.
2009-12-01
The Hydrology, Entomology and Malaria Transmission Simulator (HYDREMATS) is a mechanistic model developed to assess malaria risk in areas where the disease is water-limited. The model relies on precipitation inputs as its primary forcing. Until now, applications of the model have used ground-based precipitation observations. However, rain gauge networks in the areas most affected by malaria are often sparse. The increasing availability of satellite-based rainfall estimates could greatly extend the range of the model. The minimum temporal resolution of precipitation data needed was determined to be one hour. The CPC Morphing technique (CMORPH) distributed by NOAA meets this criterion, as it provides 30-minute estimates at 8 km resolution. CMORPH data were compared to ground observations in four West African villages and calibrated to reduce overestimation and false-alarm biases. The calibrated CMORPH data were used to force HYDREMATS, resulting in outputs for mosquito populations, vectorial capacity and malaria transmission.
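The gauge-based calibration step described above can be sketched as a simple multiplicative bias correction. This is a minimal illustration, not the study's actual procedure: the scaling rule, station values and correction form are all illustrative assumptions.

```python
def bias_correct(satellite_mm, gauge_mm):
    """Scale satellite rainfall estimates so their total matches the gauge total."""
    sat_total = sum(satellite_mm)
    if sat_total == 0:
        return list(satellite_mm)
    factor = sum(gauge_mm) / sat_total
    return [v * factor for v in satellite_mm]

satellite = [4.0, 0.0, 10.0, 6.0]   # 30-minute CMORPH-style estimates (mm)
gauge     = [3.0, 0.0, 6.0, 1.0]    # coincident gauge observations (mm)

corrected = bias_correct(satellite, gauge)
print(corrected)  # -> [2.0, 0.0, 5.0, 3.0]; totals now match the gauges
```

A constant factor removes the overestimation bias in bulk; reducing false alarms (rain reported where gauges saw none) would need an additional occurrence threshold.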
Data Assimilation at FLUXNET to Improve Models towards Ecological Forecasting (Invited)
NASA Astrophysics Data System (ADS)
Luo, Y.
2009-12-01
Dramatically increased volumes of data from observational and experimental networks such as FLUXNET call for transformation of ecological research to increase its emphasis on quantitative forecasting. Ecological forecasting will also meet the societal need to develop better strategies for natural resource management in a world of ongoing global change. Traditionally, ecological forecasting has been based on process-based models, informed by data in largely ad hoc ways. Although most ecological models incorporate some representation of mechanistic processes, today’s ecological models are generally not adequate to quantify real-world dynamics and provide reliable forecasts with accompanying estimates of uncertainty. A key tool to improve ecological forecasting is data assimilation, which uses data to inform initial conditions and to help constrain a model during simulation to yield results that approximate reality as closely as possible. In an era with dramatically increased availability of data from observational and experimental networks, data assimilation is a key technique that helps convert raw data into ecologically meaningful products so as to accelerate our understanding of ecological processes, test ecological theory, forecast changes in ecological services, and better serve society. This talk will use examples to illustrate how data from FLUXNET have been assimilated with process-based models to improve estimates of model parameters and state variables; to quantify uncertainties in ecological forecasting arising from observations, models and their interactions; and to evaluate information contributions of data and model toward short- and long-term forecasting of ecosystem responses to global change.
Personalized glucose forecasting for type 2 diabetes using data assimilation
Albers, David J.; Gluckman, Bruce; Ginsberg, Henry; Hripcsak, George; Mamykina, Lena
2017-01-01
Type 2 diabetes leads to premature death and reduced quality of life for 8% of Americans. Nutrition management is critical to maintaining glycemic control, yet it is difficult to achieve due to the high individual differences in glycemic response to nutrition. Anticipating glycemic impact of different meals can be challenging not only for individuals with diabetes, but also for expert diabetes educators. Personalized computational models that can accurately forecast an impact of a given meal on an individual’s blood glucose levels can serve as the engine for a new generation of decision support tools for individuals with diabetes. However, to be useful in practice, these computational engines need to generate accurate forecasts based on limited datasets consistent with typical self-monitoring practices of individuals with type 2 diabetes. This paper uses three forecasting machines: (i) data assimilation, a technique borrowed from atmospheric physics and engineering that uses Bayesian modeling to infuse data with human knowledge represented in a mechanistic model, to generate real-time, personalized, adaptable glucose forecasts; (ii) model averaging of data assimilation output; and (iii) dynamical Gaussian process model regression. The proposed data assimilation machine, the primary focus of the paper, uses a modified dual unscented Kalman filter to estimate states and parameters, personalizing the mechanistic models. Model selection is used to make a personalized model selection for the individual and their measurement characteristics. The data assimilation forecasts are empirically evaluated against actual postprandial glucose measurements captured by individuals with type 2 diabetes, and against predictions generated by experienced diabetes educators after reviewing a set of historical nutritional records and glucose measurements for the same individual. 
The evaluation suggests that the data assimilation forecasts compare well with specific glucose measurements and match or exceed in accuracy expert forecasts. We conclude by examining ways to present predictions as forecast-derived range quantities and evaluate the comparative advantages of these ranges. PMID:28448498
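As a rough illustration of the filtering idea behind such glucose forecasts, here is a scalar linear Kalman filter. It is a deliberately simplified stand-in for the paper's dual unscented Kalman filter: the first-order relaxation toward a basal glucose level, and all numeric constants, are illustrative assumptions rather than the paper's model.

```python
def kalman_step(x, P, z, a=0.9, basal=90.0, Q=4.0, R=25.0):
    """One predict/update cycle of a scalar Kalman filter."""
    # Predict: glucose relaxes toward the basal level (assumed dynamics).
    x_pred = basal + a * (x - basal)
    P_pred = a * a * P + Q
    # Update with the new glucose measurement z (mg/dL).
    K = P_pred / (P_pred + R)
    return x_pred + K * (z - x_pred), (1 - K) * P_pred

x, P = 120.0, 100.0                       # initial estimate and its variance
for z in [140.0, 150.0, 135.0, 120.0]:    # postprandial readings (mg/dL)
    x, P = kalman_step(x, P, z)
print(round(x, 1), round(P, 1))           # variance shrinks as data arrive
```

The dual unscented filter in the paper additionally propagates sigma points through nonlinear dynamics and estimates model parameters alongside the state, which is what personalizes the mechanistic model.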
A Global Study of GPP focusing on Light Use Efficiency in a Random Forest Regression Model
NASA Astrophysics Data System (ADS)
Fang, W.; Wei, S.; Yi, C.; Hendrey, G. R.
2016-12-01
Light use efficiency (LUE) is at the core of mechanistic modeling of global gross primary production (GPP). However, most LUE estimates in global models are satellite-based and coarsely measured, with emphasis on environmental variables. Others come from eddy covariance towers, with much greater spatial and temporal data quality and emphasis on mechanistic processes, but at a limited number of sites. In this paper, we conducted a comprehensive global study of tower-based LUE from 237 FLUXNET towers, and scaled LUEs up from the in-situ tower level to the global biome level. We integrated key environmental and biological variables into the tower-based LUE estimates, at 0.5° × 0.5° grid-cell resolution, using a random forest regression (RFR) approach. We then developed an RFR-LUE-GPP model using the grid-cell LUE data, and compared it to a tower-LUE-GPP model built in the conventional way, treating LUE as a series of biome-specific constants. To calibrate the LUE models, we developed a data-driven RFR-GPP model using the random forest regression method. Our results showed that LUE varies strongly with latitude. We estimated a global area-weighted average LUE of 1.21 g C m-2 MJ-1 APAR, which led to an estimated global GPP of 102.9 Gt C/year from 2000 to 2005. The tower-LUE-GPP model tended to overestimate forest GPP in tropical and boreal regions. Large uncertainties exist in GPP estimates over sparsely vegetated areas covered by savannas and woody savannas at middle to low latitudes (e.g., 20°S to 40°S and 5°N to 15°N) due to a lack of available data. Model results were improved by incorporating Köppen climate types to represent climate/meteorological information in the machine learning modeling. This sheds new light on the recognized issue of the climate dependence of spring onset of photosynthesis and on the challenge of accurately modeling the biome GPP of evergreen broadleaf forests (EBF).
The divergent responses of GPP to temperature and precipitation at mid-high latitudes and at mid-low latitudes underscore the necessity of modeling GPP separately by latitude. This work provides a global distribution of LUE estimates and a comprehensive algorithm for modeling global terrestrial carbon at high spatial and temporal resolution.
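The conventional tower-LUE-GPP calculation that the RFR model is compared against reduces to GPP = LUE × APAR with a biome-specific constant LUE. A minimal sketch, where the biome names, LUE constants and APAR value are illustrative assumptions rather than the paper's estimates:

```python
# Illustrative biome-specific LUE constants (g C per MJ of APAR); not the
# paper's values.
BIOME_LUE = {"EBF": 1.5, "savanna": 0.8, "boreal": 1.0}

def gpp(biome, apar_mj_m2):
    """GPP in g C m^-2 for a biome and its absorbed PAR (MJ m^-2)."""
    return BIOME_LUE[biome] * apar_mj_m2

print(gpp("EBF", 10.0))  # -> 15.0 g C m^-2
```

The RFR approach in the paper replaces the constant lookup with a per-grid-cell LUE predicted from environmental and biological covariates, which is what corrects the tropical and boreal overestimates.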
A Systems' Biology Approach to Study MicroRNA-Mediated Gene Regulatory Networks
Kunz, Manfred; Vera, Julio; Wolkenhauer, Olaf
2013-01-01
MicroRNAs (miRNAs) are potent effectors in gene regulatory networks where aberrant miRNA expression can contribute to human diseases such as cancer. For a better understanding of the regulatory role of miRNAs in coordinating gene expression, we here present a systems biology approach combining data-driven modeling and model-driven experiments. Such an approach is characterized by an iterative process, including biological data acquisition and integration, network construction, mathematical modeling and experimental validation. To demonstrate the application of this approach, we adopt it to investigate mechanisms of collective repression on p21 by multiple miRNAs. We first construct a p21 regulatory network based on data from the literature and further expand it using algorithms that predict molecular interactions. Based on the network structure, a detailed mechanistic model is established and its parameter values are determined using data. Finally, the calibrated model is used to study the effect of different miRNA expression profiles and cooperative target regulation on p21 expression levels in different biological contexts. PMID:24350286
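The collective repression of p21 by multiple miRNAs can be caricatured as a single ODE integrated with forward Euler. The repression form (each miRNA adds to the denominator) and all rate constants are illustrative assumptions, not the calibrated model from the paper.

```python
def simulate_p21(mirnas, k_syn=1.0, k_deg=0.1, dt=0.01, t_end=100.0):
    """Integrate dp/dt = k_syn * repression - k_deg * p by forward Euler."""
    repression = 1.0 / (1.0 + sum(mirnas))  # more miRNA -> more repression
    p, t = 0.0, 0.0
    while t < t_end:
        p += dt * (k_syn * repression - k_deg * p)
        t += dt
    return p  # approaches the steady state k_syn * repression / k_deg

print(round(simulate_p21([]), 2))           # no miRNAs: ~10.0
print(round(simulate_p21([1.0, 1.0]), 2))   # two miRNAs: ~3.33
```

The mechanistic model in the paper distinguishes cooperative from independent target-site binding; here both collapse into the single summed repression term.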
Rational and mechanistic perspectives on reinforcement learning.
Chater, Nick
2009-12-01
This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: mechanistic and rational. Reinforcement learning is often viewed in mechanistic terms--as describing the operation of aspects of an agent's cognitive and neural machinery. Yet it can also be viewed as a rational level of description, specifically, as describing a class of methods for learning from experience, using minimal background knowledge. This paper considers how rational and mechanistic perspectives differ, and what types of evidence distinguish between them. Reinforcement learning research in the cognitive and brain sciences is often implicitly committed to the mechanistic interpretation. Here the opposite view is put forward: that accounts of reinforcement learning should apply at the rational level, unless there is strong evidence for a mechanistic interpretation. Implications of this viewpoint for reinforcement-based theories in the cognitive and brain sciences are discussed.
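The reward-prediction-error update at the heart of the reinforcement-learning accounts discussed above is the same piece of mathematics under either reading; whether it describes neural machinery (mechanistic) or an abstract learning method (rational) is exactly the paper's question. The learning rate and trial structure below are illustrative.

```python
def td_update(value, reward, alpha=0.1):
    """One delta-rule step: move the value estimate toward the received reward."""
    return value + alpha * (reward - value)

v = 0.0
for _ in range(50):        # repeated rewarded trials
    v = td_update(v, 1.0)
print(round(v, 3))         # converges toward 1.0
```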
Assessing first-order emulator inference for physical parameters in nonlinear mechanistic models
Hooten, Mevin B.; Leeds, William B.; Fiechter, Jerome; Wikle, Christopher K.
2011-01-01
We present an approach for estimating physical parameters in nonlinear models that relies on an approximation to the mechanistic model itself for computational efficiency. The proposed methodology is validated and applied in two different modeling scenarios: (a) a simulation study and (b) a lower trophic level ocean ecosystem model. The approach we develop relies on the ability to predict right singular vectors (resulting from a decomposition of computer-model experimental output) from the computer-model input and an experimental set of parameters. Critically, we model the right singular vectors in terms of the model parameters via a nonlinear statistical model. Specifically, we focus our attention on first-order models of these right singular vectors rather than on the second-order (covariance) structure.
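The decompose-then-regress idea can be sketched in a few lines: run the computer model at several parameter settings, take an SVD of the stacked outputs, and fit a first-order model predicting the singular-vector coefficients from the parameters. The one-parameter toy "computer model" below is an illustrative stand-in, and a plain linear fit takes the place of the paper's statistical model.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0.5, 2.0, size=(20, 1))                  # experimental parameters
# Toy computer model: each run produces three outputs, sin(theta * t).
outputs = np.hstack([np.sin(theta * t) for t in (1.0, 2.0, 3.0)])

U, s, Vt = np.linalg.svd(outputs, full_matrices=False)
coeffs = outputs @ Vt.T                                      # projection onto the modes

# First-order emulator: linear regression of each coefficient on theta.
X = np.hstack([np.ones_like(theta), theta])
beta, *_ = np.linalg.lstsq(X, coeffs, rcond=None)

theta_new = np.array([[1.0]])
pred = (np.hstack([np.ones_like(theta_new), theta_new]) @ beta) @ Vt
print(pred.shape)   # emulated model output at the new parameter value
```

Inverting this emulator (finding the theta whose emulated output best matches data) is what makes the parameter estimation cheap relative to running the mechanistic model inside the fitting loop.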
Modeling Physiological Processes That Relate Toxicant Exposure and Bacterial Population Dynamics
Klanjscek, Tin; Nisbet, Roger M.; Priester, John H.; Holden, Patricia A.
2012-01-01
Quantifying effects of toxicant exposure on metabolic processes is crucial to predicting microbial growth patterns in different environments. Mechanistic models, such as those based on Dynamic Energy Budget (DEB) theory, can link physiological processes to microbial growth. Here we expand the DEB framework to include explicit consideration of the role of reactive oxygen species (ROS). Extensions considered are: (i) additional terms in the equation for the “hazard rate” that quantifies mortality risk; (ii) a variable representing environmental degradation; (iii) a mechanistic description of toxic effects linked to increase in ROS production and aging acceleration, and to non-competitive inhibition of transport channels; (iv) a new representation of the “lag time” based on energy required for acclimation. We estimate model parameters using calibrated Pseudomonas aeruginosa optical density growth data for seven levels of cadmium exposure. The model reproduces growth patterns for all treatments with a single common parameter set, and bacterial growth for treatments of up to 150 mg(Cd)/L can be predicted reasonably well using parameters estimated from cadmium treatments of 20 mg(Cd)/L and lower. Our approach is an important step towards connecting levels of biological organization in ecotoxicology. The presented model reveals possible connections between processes that are not obvious from purely empirical considerations, enables validation and hypothesis testing by creating testable predictions, and identifies research required to further develop the theory. PMID:22328915
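A much-simplified cousin of the DEB-based model above: logistic microbial growth with a hazard rate that rises with toxicant exposure. The rates and the linear cadmium effect term are illustrative assumptions, not the paper's calibrated parameters.

```python
def grow(cd_mg_l, r=0.5, K=1.0, h0=0.01, k_tox=0.002, dt=0.1, t_end=48.0):
    """Forward-Euler logistic growth with an exposure-dependent hazard rate."""
    n, t = 0.01, 0.0                       # small inoculum (optical-density units)
    while t < t_end:
        hazard = h0 + k_tox * cd_mg_l      # mortality risk rises with cadmium
        n += dt * (r * n * (1 - n / K) - hazard * n)
        t += dt
    return n

print(round(grow(0.0), 3), round(grow(150.0), 3))  # cadmium suppresses final density
```

The DEB framework replaces this single equation with an energy budget (reserves, structure, acclimation costs), which is what lets one parameter set reproduce all exposure levels at once.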
Huang, Ruili; Xia, Menghang; Sakamuru, Srilatha; Zhao, Jinghua; Shahane, Sampada A.; Attene-Ramos, Matias; Zhao, Tongan; Austin, Christopher P.; Simeonov, Anton
2016-01-01
Target-specific, mechanism-oriented in vitro assays pose a promising alternative to traditional animal toxicology studies. Here we report the first comprehensive analysis of the Tox21 effort, a large-scale in vitro toxicity screening of chemicals. We test ∼10,000 chemicals in triplicate at 15 concentrations against a panel of nuclear receptor and stress response pathway assays, producing more than 50 million data points. Compound clustering by structure similarity and activity profile similarity across the assays reveals structure–activity relationships that are useful for the generation of mechanistic hypotheses. We apply structural information and activity data to build predictive models for 72 in vivo toxicity end points using a cluster-based approach. Models based on in vitro assay data perform better in predicting human toxicity end points than animal toxicity, while a combination of structural and activity data results in better models than structure or activity data alone. Our results suggest that in vitro activity profiles can be applied as signatures of compound mechanism of toxicity and used in prioritization for more in-depth toxicological testing. PMID:26811972
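Activity-profile similarity of the kind used for the Tox21 clustering can be sketched with a plain Pearson correlation across assays. The compound names and activity profiles below are illustrative, and a real analysis would use the full assay panel and a proper clustering algorithm.

```python
def pearson(a, b):
    """Pearson correlation of two equal-length activity profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

profiles = {                      # activity across four hypothetical assays
    "cmpd_A": [0.9, 0.1, 0.8, 0.0],
    "cmpd_B": [0.8, 0.2, 0.9, 0.1],
    "cmpd_C": [0.0, 0.9, 0.1, 0.8],
}
# A and B share a profile (candidate shared mechanism); C is anticorrelated.
print(round(pearson(profiles["cmpd_A"], profiles["cmpd_B"]), 2))
print(round(pearson(profiles["cmpd_A"], profiles["cmpd_C"]), 2))
```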
NASA Astrophysics Data System (ADS)
Jin, Biao; Rolle, Massimo
2016-04-01
Organic compounds are produced in vast quantities for industrial and agricultural use, as well as for human and animal healthcare [1]. These chemicals and their metabolites are frequently detected at trace levels in freshwater environments, where they undergo degradation via different reaction pathways. Compound-specific stable isotope analysis (CSIA) is a valuable tool to identify such degradation pathways in different environmental systems. Recent advances in analytical techniques have promoted the rapid development and implementation of multi-element CSIA. However, quantitative frameworks that evaluate multi-element stable isotope data and incorporate mechanistic information on the degradation processes [2,3] are still lacking. In this study we propose a mechanism-based modeling approach to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and thereby provides a mechanistic description of isotope fractionation occurring at different molecular positions. We validate the proposed approach with the concentration and multi-element isotope data of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model precisely captures the dual-element isotope trends characteristic of different reaction pathways and their range of variation, consistent with observed multi-element (C, N) bulk isotope fractionation. The proposed approach can also be used as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. [1] Schwarzenbach, R.P., Egli, T., Hofstetter, T.B., von Gunten, U., Wehrli, B., 2010. Global Water Pollution and Human Health. Annu. Rev. Environ. Resour. doi:10.1146/annurev-environ-100809-125342.
[2] Jin, B., Haderlein, S.B., Rolle, M., 2013. Integrated carbon and chlorine isotope modeling: Applications to chlorinated aliphatic hydrocarbons dechlorination. Environ. Sci. Technol. 47, 1443-1451. doi:10.1021/es304053h. [3] Jin, B., Rolle, M., 2014. Mechanistic approach to multi-element isotope modeling of organic contaminant degradation. Chemosphere 95, 131-139. doi:10.1016/j.chemosphere.2013.08.050.
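The bulk-isotope behavior that such models extend is the classical Rayleigh relation between the remaining fraction f of a degrading compound and its isotope delta value. A minimal sketch, with an illustrative enrichment factor; the paper's contribution is the position-specific bookkeeping that this bulk formula cannot capture.

```python
def delta_rayleigh(delta0, f, epsilon):
    """Delta value (permil) of the remaining compound once a fraction f survives."""
    return (delta0 + 1000.0) * f ** (epsilon / 1000.0) - 1000.0

d0 = -30.0      # initial delta-13C (permil), illustrative
eps = -5.0      # illustrative enrichment factor (permil)
for f in (1.0, 0.5, 0.1):
    print(round(delta_rayleigh(d0, f, eps), 2))  # residue grows isotopically heavier
```

Plotting one element's delta against another's across the degradation (a dual-element plot) gives the pathway-diagnostic slopes the abstract refers to.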
Emergence of tissue polarization from synergy of intracellular and extracellular auxin signaling
Wabnik, Krzysztof; Kleine-Vehn, Jürgen; Balla, Jozef; Sauer, Michael; Naramoto, Satoshi; Reinöhl, Vilém; Merks, Roeland M H; Govaerts, Willy; Friml, Jiří
2010-01-01
Plant development is exceptionally flexible as manifested by its potential for organogenesis and regeneration, which are processes involving rearrangements of tissue polarities. Fundamental questions concern how individual cells can polarize in a coordinated manner to integrate into the multicellular context. In canalization models, the signaling molecule auxin acts as a polarizing cue, and feedback on the intercellular auxin flow is key for synchronized polarity rearrangements. We provide a novel mechanistic framework for canalization, based on up-to-date experimental data and minimal, biologically plausible assumptions. Our model combines the intracellular auxin signaling for expression of PINFORMED (PIN) auxin transporters and the theoretical postulation of extracellular auxin signaling for modulation of PIN subcellular dynamics. Computer simulations faithfully and robustly recapitulated the experimentally observed patterns of tissue polarity and asymmetric auxin distribution during formation and regeneration of vascular systems and during the competitive regulation of shoot branching by apical dominance. Additionally, our model generated new predictions that could be experimentally validated, highlighting a mechanistically conceivable explanation for the PIN polarization and canalization of the auxin flow in plants. PMID:21179019
ERIC Educational Resources Information Center
Dickes, Amanda Catherine; Sengupta, Pratim; Farris, Amy Voss; Basu, Satabdi
2016-01-01
In this paper, we present a third-grade ecology learning environment that integrates two forms of modeling--embodied modeling and agent-based modeling (ABMs)--through the generation of mathematical representations that are common to both forms of modeling. The term "agent" in the context of ABMs indicates individual computational objects…
Varma, Manthena V; El-Kattan, Ayman F
2016-07-01
A large body of evidence suggests that hepatic uptake transporters, the organic anion-transporting polypeptides (OATPs), are of high clinical relevance in determining the pharmacokinetics of substrate drugs; on this basis, recent regulatory guidance to industry recommends appropriate assessment of investigational drugs for potential drug interactions. We recently proposed an extended clearance classification system (ECCS) framework in which the systemic clearance of class 1B and 3B drugs is likely determined by hepatic uptake. The ECCS framework therefore predicts the possibility of drug-drug interactions (DDIs) involving OATPs and the effects of genetic variants of SLCO1B1 early in discovery, and facilitates decision making in candidate selection and progression. Although OATP-mediated uptake is often the rate-determining process in the hepatic clearance of substrate drugs, metabolic and/or biliary components also contribute to the overall hepatic disposition and, more importantly, to liver exposure. Clinical evidence suggests that alteration in biliary efflux transport or metabolic enzymes associated with genetic polymorphism leads to changes in the pharmacodynamic response of statins, for which the pharmacological target resides in the liver. Perpetrator drugs may show inhibitory and/or induction effects on transporters and enzymes simultaneously. It is therefore important to adopt models that frame these multiple processes in a mechanistic sense for quantitative DDI predictions, and to deconvolute the effects of individual processes on plasma and hepatic exposure. In vitro data-informed mechanistic static and physiologically based pharmacokinetic models have proven useful in rationalizing and predicting transporter-mediated DDIs and the complex DDIs involving transporter-enzyme interplay. © 2016, The American College of Clinical Pharmacology.
Moore, Shannon R.; Saidel, Gerald M.; Knothe, Ulf; Knothe Tate, Melissa L.
2014-01-01
The link between mechanics and biology in the generation and the adaptation of bone has been well studied in the context of skeletal development and fracture healing. Yet the prediction of tissue genesis within postnatal defects, and of their spatiotemporal healing, necessitates a quantitative evaluation of mechano-biological interactions using experimental and clinical parameters. To address this gap in knowledge, this study aims to develop a mechanistic mathematical model of tissue genesis using bone morphogenetic protein (BMP) to represent a class of factors that may coordinate bone healing. Specifically, we developed a mechanistic, mathematical model to predict the dynamics of tissue genesis by periosteal progenitor cells within a long bone defect surrounded by periosteum and stabilized via an intramedullary nail. The emergent material properties and mechanical environment associated with nascent tissue genesis influence the strain stimulus sensed by progenitor cells within the periosteum. Using a mechanical finite element model, periosteal surface strains are predicted as a function of emergent, nascent tissue properties. Strains are then input to a mechanistic mathematical model, where mechanical regulation of BMP-2 production mediates rates of cellular proliferation, differentiation and tissue production, to predict healing outcomes. A parametric approach enables the spatial and temporal prediction of endochondral tissue regeneration, assessed as areas of cartilage and mineralized bone, as functions of radial distance from the periosteum and time. Comparing model results to histological outcomes from two previous studies of periosteum-mediated bone regeneration in a common ovine model, it was shown that mechanistic models incorporating mechanical feedback successfully predict patterns (spatial) and trends (temporal) of bone tissue regeneration.
The novel model framework presented here integrates a mechanistic feedback system based on the mechanosensitivity of periosteal progenitor cells, which allows for modeling and prediction of tissue regeneration on multiple length and time scales. Through combination of computational, physical and engineering science approaches, the model platform provides a means to test new hypotheses in silico and to elucidate conditions conducive to endogenous tissue genesis. Next generation models will serve to unravel intrinsic differences in bone genesis by endochondral and intramembranous mechanisms. PMID:24967742
Problems in mechanistic theoretical models for cell transformation by ionizing radiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, A.; Holley, W.R.
1991-10-01
A mechanistic model based on yields of double strand breaks has been developed to determine the dose response curves for cell transformation frequencies. At its present stage the model is applicable to immortal cell lines and to various qualities (X-rays, Neon and Iron) of ionizing radiation. Presently, we have considered four types of processes which can lead to activation phenomena: (1) point mutation events on a regulatory segment of selected oncogenes, (2) inactivation of suppressor genes, through point mutation, (3) deletion of a suppressor gene by a single track, and (4) deletion of a suppressor gene by two tracks.
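Double-strand-break models of this kind typically yield a linear-quadratic dose response: one-track events (e.g. single-track suppressor-gene deletion) contribute the linear term and two-track events the quadratic term. A hedged sketch with illustrative coefficients, not the model's fitted values:

```python
def transformation_frequency(dose_gy, alpha=1e-4, beta=2e-5):
    """Transformants per surviving cell at a given dose (Gy): alpha*D + beta*D^2."""
    return alpha * dose_gy + beta * dose_gy ** 2

for d in (0.0, 1.0, 5.0):
    print(transformation_frequency(d))   # quadratic term dominates at high dose
```

Radiation quality enters through the coefficients: densely ionizing particles such as iron push more of the response into the single-track (linear) term.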
Testing the molecular clock using mechanistic models of fossil preservation and molecular evolution.
Warnock, Rachel C M; Yang, Ziheng; Donoghue, Philip C J
2017-06-28
Molecular sequence data provide information about relative times only, and fossil-based age constraints are the ultimate source of information about absolute times in molecular clock dating analyses. Thus, fossil calibrations are critical to molecular clock dating, but competing methods are difficult to evaluate empirically because the true evolutionary time scale is never known. Here, we combine mechanistic models of fossil preservation and sequence evolution in simulations to evaluate different approaches to constructing fossil calibrations and their impact on Bayesian molecular clock dating, and the relative impact of fossil versus molecular sampling. We show that divergence time estimation is impacted by the model of fossil preservation, sampling intensity and tree shape. The addition of sequence data may improve molecular clock estimates, but accuracy and precision are dominated by the quality of the fossil calibrations. Posterior means and medians are poor representatives of true divergence times; posterior intervals provide a much more accurate estimate of divergence times, though they may be wide and often do not have high coverage probability. Our results highlight the importance of increased fossil sampling and improved statistical approaches to generating calibrations, which should incorporate the non-uniform nature of ecological and temporal fossil species distributions. © 2017 The Authors.
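One simple preservation model used in simulations of this kind treats fossil recovery as a constant-rate Poisson process along a lineage's true duration; the oldest recovered fossil then systematically underestimates the true origin, which is why calibrations need statistical treatment. The rate and duration below are illustrative assumptions.

```python
import random

def sample_fossils(duration_myr, rate_per_myr, seed=1):
    """Fossil ages (Myr before the lineage's end) under constant-rate preservation."""
    rng = random.Random(seed)
    ages, t = [], 0.0
    while True:
        t += rng.expovariate(rate_per_myr)   # waiting time to the next fossil
        if t > duration_myr:
            return ages
        ages.append(duration_myr - t)

fossils = sample_fossils(duration_myr=50.0, rate_per_myr=0.2)
oldest = max(fossils) if fossils else None
print(len(fossils), oldest)   # the oldest fossil is always younger than 50 Myr
```

The paper's point is that non-uniform (ecologically and temporally structured) preservation makes the gap between oldest fossil and true divergence even harder to characterize than in this constant-rate toy.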
NASA Astrophysics Data System (ADS)
Kafka, Orion L.; Yu, Cheng; Shakoor, Modesar; Liu, Zeliang; Wagner, Gregory J.; Liu, Wing Kam
2018-04-01
A data-driven mechanistic modeling technique is applied to a system representative of a broken-up inclusion ("stringer") within drawn nickel-titanium wire or tube, e.g., as used for arterial stents. The approach uses a decomposition of the problem into a training stage and a prediction stage. It is applied to compute the fatigue crack incubation life of a microstructure of interest under high-cycle fatigue. A parametric study of a matrix-inclusion-void microstructure is conducted. The results indicate that, within the range studied, a larger void between halves of the inclusion increases fatigue life, while larger inclusion diameter reduces fatigue life.
Mechanistic systems modeling to guide drug discovery and development
Schmidt, Brian J.; Papin, Jason A.; Musante, Cynthia J.
2013-01-01
A crucial question that must be addressed in the drug development process is whether the proposed therapeutic target will yield the desired effect in the clinical population. Pharmaceutical and biotechnology companies place a large investment on research and development, long before confirmatory data are available from human trials. Basic science has greatly expanded the computable knowledge of disease processes, both through the generation of large omics data sets and a compendium of studies assessing cellular and systemic responses to physiologic and pathophysiologic stimuli. Given inherent uncertainties in drug development, mechanistic systems models can better inform target selection and the decision process for advancing compounds through preclinical and clinical research. PMID:22999913
Bashir Surfraz, M; Fowkes, Adrian; Plante, Jeffrey P
2017-08-01
The need to find an alternative to costly animal studies for developmental and reproductive toxicity testing has shifted the focus considerably to the assessment of in vitro developmental toxicology models and the exploitation of pharmacological data for relevant molecular initiating events. We hereby demonstrate how automation can be applied successfully to handle heterogeneous oestrogen receptor data from ChEMBL. Applying expert-derived thresholds to specific bioactivities allowed an activity call to be attributed to each data entry. Human intervention further improved this mechanistic dataset which was mined to develop structure-activity relationship alerts and an expert model covering 45 chemical classes for the prediction of oestrogen receptor modulation. The evaluation of the model using FDA EDKB and Tox21 data was quite encouraging. This model can also provide a teratogenicity prediction along with the additional information it provides relevant to the query compound, all of which will require careful assessment of potential risk by experts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
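The automated activity-call step described above can be sketched as a lookup of expert-derived, bioactivity-specific thresholds. The endpoint names, cut-offs and records below are illustrative assumptions, not the curation rules actually applied to the ChEMBL data.

```python
# Illustrative expert-derived cut-offs (nM) per bioactivity type.
THRESHOLDS_NM = {"IC50": 1000.0, "EC50": 1000.0, "Ki": 500.0}

def activity_call(record):
    """Attribute an activity call to one heterogeneous bioactivity record."""
    cutoff = THRESHOLDS_NM.get(record["type"])
    if cutoff is None:
        return "no call"              # endpoint not covered by the rules
    return "active" if record["value_nm"] <= cutoff else "inactive"

records = [
    {"type": "IC50", "value_nm": 120.0},
    {"type": "Ki",   "value_nm": 800.0},
    {"type": "Kd",   "value_nm": 50.0},
]
print([activity_call(r) for r in records])  # -> ['active', 'inactive', 'no call']
```

In the workflow above, records the rules cannot resolve are exactly where the described human intervention improves the mechanistic dataset.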
2013-01-01
Background: High-throughput profiling of human tissues typically yields gene lists comprising a mix of relevant molecular entities and multiple false positives that obstruct the translation of such results into mechanistic hypotheses. From general probabilistic considerations, gene lists distilled to their mechanistically relevant components can be far more useful for subsequent experimental design or data interpretation. Results: The input candidate gene lists were processed into different tiers of evidence consistency established by enrichment analysis across subsets of the same experiments and across different experiments and platforms. The cut-offs were established empirically through ontological and semantic enrichment; the resultant shortened gene list was re-expanded by the Ingenuity Pathway Assistant tool. The resulting sub-networks provided the basis for generating mechanistic hypotheses that were partially validated by literature search. This approach differs from previous consistency-based studies in that the cut-off on the receiver operating characteristic of the true-false separation process is optimized by flexible selection of the consistency-building procedure. The gene list distilled by this analytic technique and its network representation were termed the Compact Disease Model (CDM). Here we present the CDM signature for the study of early-stage Alzheimer's disease. The integrated analysis of this gene signature allowed us to identify protein traffic vesicles as prominent players in the pathogenesis of Alzheimer's. Considering the distances and complexity of protein trafficking in neurons, it is plausible that spontaneous protein misfolding, along with a shortage of growth stimulation, results in neurodegeneration.
Several potentially overlapping scenarios of early-stage Alzheimer pathogenesis have been discussed, with an emphasis on the protective effects of AT-1 mediated antihypertensive response on cytoskeleton remodeling, along with neuronal activation of oncogenes, luteinizing hormone signaling, and insulin-related growth regulation, forming a pleiotropic model of its early stages. Alignment with emerging literature confirmed many predictions derived from the early-stage Alzheimer’s disease CDM. Conclusions Compact Disease Model generation, a flexible approach for high-throughput data analysis, allows extraction of meaningful, mechanism-centered gene sets compatible with instant translation of the results into testable hypotheses. PMID:24196233
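The cross-experiment consistency filtering and enrichment testing described above can be sketched generically; the minimum-support rule and the one-sided hypergeometric test below are standard stand-ins, not the paper's exact tiering procedure:

```python
from math import comb

def hypergeom_p(overlap, list_size, set_size, universe):
    """One-sided hypergeometric P(X >= overlap): chance of seeing at least
    `overlap` genes from a pathway of `set_size` in a candidate list of
    `list_size` drawn from a universe of `universe` genes."""
    return sum(
        comb(set_size, k) * comb(universe - set_size, list_size - k)
        for k in range(overlap, min(list_size, set_size) + 1)
    ) / comb(universe, list_size)

def consistent_genes(experiments, min_support=2):
    """Keep genes reported by at least `min_support` independent experiments,
    a simple stand-in for cross-experiment consistency tiers."""
    support = {}
    for genes in experiments:
        for g in set(genes):
            support[g] = support.get(g, 0) + 1
    return {g for g, n in support.items() if n >= min_support}
```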
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of the computational prediction models needed to support the next generation of chemical safety assessment.
NASA Astrophysics Data System (ADS)
Ebrahimi, Ali; Or, Dani
2017-04-01
The sensitivity of the Earth's polar regions to rising global temperatures is reflected in rapidly changing hydrological processes, with pronounced seasonal thawing of permafrost soil and increased biological activity. Of particular concern is the potential release of large amounts of soil carbon and the stimulation of other soil-borne GHG emissions such as methane. Soil methanotrophic and methanogenic microbial communities rapidly adjust their activity and spatial organization in response to permafrost thawing and a host of other environmental factors. Soil structural elements such as aggregates and layering, together with hydration status, affect oxygen and nutrient diffusion processes, thereby contributing to methanogenic activity within temporarily anoxic niches (hotspots or hot-layers). We developed a mechanistic individual-based model to quantify microbial activity dynamics within soil pore networks, considering hydration, temperature, transport processes, and enzymatic activity associated with methane production in soil. The model was then upscaled from single aggregates (or hotspots) to quantify emissions from soil profiles in which freezing/thawing processes provide macroscopic boundary conditions for microbial activity at different soil depths. The model distinguishes microbial activity in aerated bulk soil from that in aggregates (or submerged parts of the profile) to resolve methane production and oxidation rates. Methane transport pathways through soil by diffusion and ebullition of bubbles vary with hydration dynamics and affect emission patterns. The model links seasonal thermal and hydrologic dynamics with the evolution of microbial community composition and function affecting net methane emissions, in good agreement with experimental data. 
The mechanistic model enables systematic evaluation of the key controlling factors in thawing permafrost and microbial response (e.g., nutrient availability, enzyme activity, pH) on long-term methane emissions and carbon decomposition rates in the rapidly changing polar regions.
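A minimal sketch of the kind of local source term such a model resolves, assuming Michaelis-Menten kinetics with oxygen inhibition of methanogenesis; all rate constants are illustrative, not the model's calibrated values:

```python
def net_methane_rate(ch4, o2, substrate,
                     v_prod=1.0, km_s=0.5, k_i=0.1,
                     v_ox=2.0, km_c=0.3, km_o=0.2):
    """Net CH4 source term (arbitrary units): Michaelis-Menten production by
    methanogens, inhibited by oxygen, minus methanotrophic oxidation that
    requires both CH4 and O2. All constants are illustrative."""
    production = v_prod * substrate / (km_s + substrate) * k_i / (k_i + o2)
    oxidation = v_ox * ch4 / (km_c + ch4) * o2 / (km_o + o2)
    return production - oxidation
```

In an anoxic aggregate interior the oxidation term vanishes and the niche acts as a methane source; in aerated bulk soil oxidation dominates.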
Critical evaluation of mechanistic two-phase flow pipeline and well simulation models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dhulesia, H.; Lopez, D.
1996-12-31
Mechanistic steady state simulation models, rather than empirical correlations, are used for the design of multiphase production systems including wells, pipelines and downstream installations. Among the available models, PEPITE, WELLSIM, OLGA, TACITE and TUFFP are widely used for this purpose and, consequently, a critical evaluation of these models is needed. An extensive validation methodology is proposed which consists of two distinct steps: first, to validate the hydrodynamic point model using test loop data and, then, to validate the overall simulation model using data from real pipelines and wells. The test loop databank used in this analysis contains about 5,952 data sets originating from four different test loops, and a majority of these data were obtained at high pressures (up to 90 bars) with real hydrocarbon fluids. Before performing the model evaluation, physical analysis of the test loop data is required to eliminate non-coherent data. The evaluation of these point models demonstrates that the TACITE and OLGA models can be applied to any configuration of pipes. The TACITE model performs better than the OLGA model because it uses the most appropriate closure laws from the literature, validated on a large number of data. The comparison of predicted and measured pressure drop for various real pipelines and wells demonstrates that the TACITE model is a reliable tool.
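Benchmarking of this kind usually reduces to relative-error statistics on predicted pressure drop; a minimal sketch of two common metrics:

```python
def pressure_drop_errors(predicted, measured):
    """Mean relative error and the fraction of predictions within +/-20% of
    the measurement, two common multiphase-flow benchmarking metrics."""
    rel = [(p - m) / m for p, m in zip(predicted, measured)]
    mean_rel = sum(rel) / len(rel)
    within_20 = sum(abs(r) <= 0.2 for r in rel) / len(rel)
    return mean_rel, within_20
```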
Shuryak, Igor; Brenner, David J.; Ullrich, Robert L.
2011-01-01
Different types of ionizing radiation produce different dependences of cancer risk on radiation dose/dose rate. Sparsely ionizing radiation (e.g. γ-rays) generally produces linear or upwardly curving dose responses at low doses, and the risk decreases when the dose rate is reduced (direct dose rate effect). Densely ionizing radiation (e.g. neutrons) often produces downwardly curving dose responses, where the risk initially grows with dose, but eventually stabilizes or decreases. When the dose rate is reduced, the risk increases (inverse dose rate effect). These qualitative differences suggest qualitative differences in carcinogenesis mechanisms. We hypothesize that the dominant mechanism for induction of many solid cancers by sparsely ionizing radiation is initiation of stem cells to a pre-malignant state, but for densely ionizing radiation the dominant mechanism is radiation-bystander-effect mediated promotion of already pre-malignant cell clone growth. Here we present a mathematical model based on these assumptions and test it using data on the incidence of dysplastic growths and tumors in the mammary glands of mice exposed to high or low dose rates of γ-rays and neutrons, either with or without pre-treatment with the chemical carcinogen 7,12-dimethylbenz-alpha-anthracene (DMBA). The model provides a mechanistic and quantitative explanation which is consistent with the data and may provide useful insight into human carcinogenesis. PMID:22194850
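The downwardly curving dose response described above is commonly captured by a linear-quadratic initiation term scaled by exponential cell killing; the parameters below are hypothetical, not fitted to the mouse data:

```python
import math

def initiation_risk(dose, a=0.1, b=0.02, c=0.05):
    """Linear-quadratic initiation scaled by exponential cell killing: risk
    rises with dose, then flattens and declines as killing dominates.
    Parameters are hypothetical, chosen only to show the shape."""
    return (a * dose + b * dose ** 2) * math.exp(-c * dose)
```

With these values the response peaks at intermediate doses, the qualitative shape ascribed above to densely ionizing radiation.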
Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J
2013-01-01
Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, it has become possible to assess the spatial gradients caused by diffusion in vitro and in vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images, in combination with mechanistic models, enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties in the model parameters. We introduce a likelihood function for image-based measurements with log-normally distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data, as well as the proposed identifiability analysis approach, is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds than local approximation methods.
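The log-normal measurement likelihood can be written compactly; this sketch assumes independent scalar readouts per pixel/time point rather than the full PDE model:

```python
import math

def lognormal_nll(predictions, observations, sigma):
    """Negative log-likelihood for y = model * exp(eps), eps ~ N(0, sigma^2),
    i.e. multiplicative log-normal measurement noise. `predictions` are the
    model outputs mu_i, `observations` the measured intensities y_i."""
    nll = 0.0
    for mu, y in zip(predictions, observations):
        r = math.log(y) - math.log(mu)
        nll += 0.5 * (r / sigma) ** 2 + math.log(sigma * y * math.sqrt(2 * math.pi))
    return nll
```

Minimizing this over the model parameters (here hidden inside `predictions`) gives the maximum likelihood estimate; profiling it over one parameter at a time yields the profile likelihoods mentioned above.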
Bouhaddou, Mehdi; Koch, Rick J.; DiStefano, Matthew S.; Tan, Annie L.; Mertz, Alex E.
2018-01-01
Most cancer cells harbor multiple drivers whose epistasis and interactions with expression context clouds drug and drug combination sensitivity prediction. We constructed a mechanistic computational model that is context-tailored by omics data to capture regulation of stochastic proliferation and death by pan-cancer driver pathways. Simulations and experiments explore how the coordinated dynamics of RAF/MEK/ERK and PI-3K/AKT kinase activities in response to synergistic mitogen or drug combinations control cell fate in a specific cellular context. In this MCF10A cell context, simulations suggest that synergistic ERK and AKT inhibitor-induced death is likely mediated by BIM rather than BAD, which is supported by prior experimental studies. AKT dynamics explain S-phase entry synergy between EGF and insulin, but simulations suggest that stochastic ERK, and not AKT, dynamics seem to drive cell-to-cell proliferation variability, which in simulations is predictable from pre-stimulus fluctuations in C-Raf/B-Raf levels. Simulations suggest MEK alteration negligibly influences transformation, consistent with clinical data. Tailoring the model to an alternate cell expression and mutation context, a glioma cell line, allows prediction of increased sensitivity of cell death to AKT inhibition. Our model mechanistically interprets context-specific landscapes between driver pathways and cell fates, providing a framework for designing more rational cancer combination therapy. PMID:29579036
Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A
2018-05-01
Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop the QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
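The cross-validated Q² reported above is conventionally computed from out-of-fold predictions; a minimal sketch:

```python
def q_squared(y_true, y_oof):
    """Cross-validated Q^2 = 1 - PRESS / SS_tot, where y_oof holds the
    out-of-fold prediction for each chemical (each point predicted by a
    model that never saw it during training)."""
    mean_y = sum(y_true) / len(y_true)
    press = sum((t - p) ** 2 for t, p in zip(y_true, y_oof))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - press / ss_tot
```

A Q² of 0 corresponds to doing no better than predicting the training mean, which is why mean-based prediction is the natural baseline in the comparison above.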
Building the bridge between animal movement and population dynamics.
Morales, Juan M; Moorcroft, Paul R; Matthiopoulos, Jason; Frair, Jacqueline L; Kie, John G; Powell, Roger A; Merrill, Evelyn H; Haydon, Daniel T
2010-07-27
While the mechanistic links between animal movement and population dynamics are ecologically obvious, it is much less clear when knowledge of animal movement is a prerequisite for understanding and predicting population dynamics. GPS and other technologies enable detailed tracking of animal location concurrently with acquisition of landscape data and information on individual physiology. These tools can be used to refine our understanding of the mechanistic links between behaviour and individual condition through 'spatially informed' movement models where time allocation to different behaviours affects individual survival and reproduction. For some species, socially informed models that address the movements and average fitness of differently sized groups and how they are affected by fission-fusion processes at relevant temporal scales are required. Furthermore, as most animals revisit some places and avoid others based on their previous experiences, we foresee the incorporation of long-term memory and intention in movement models. The way animals move has important consequences for the degree of mixing that we expect to find both within a population and between individuals of different species. The mixing rate dictates the level of detail required by models to capture the influence of heterogeneity and the dynamics of intra- and interspecific interaction.
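A generic building block for such movement models is the correlated random walk, in which headings persist between steps; the persistence parameter below is illustrative and this is not a model from the paper:

```python
import math
import random

def correlated_random_walk(steps, rho=0.8, step_len=1.0, seed=1):
    """Simulate a 2-D correlated random walk: each new heading is the old
    heading plus a turning angle whose spread shrinks as rho -> 1
    (rho = 0 recovers a simple uncorrelated random walk)."""
    rng = random.Random(seed)
    x = y = 0.0
    heading = rng.uniform(-math.pi, math.pi)
    path = [(x, y)]
    for _ in range(steps):
        heading += (1.0 - rho) * rng.uniform(-math.pi, math.pi)
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path
```

'Spatially informed' extensions would make `rho` and `step_len` functions of local landscape covariates and behavioural state.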
Phosphodiester models for cleavage of nucleic acids
2018-01-01
Nucleic acids that store and transfer biological information are polymeric diesters of phosphoric acid. Cleavage of these phosphodiester linkages by protein enzymes, nucleases, underlies many fundamental biological processes. The remarkable catalytic efficiency of nucleases, together with the ability of ribonucleic acids to sometimes serve as nucleases themselves, has made the cleavage of phosphodiesters a subject of intensive mechanistic studies. In addition to studies of nucleases by pH-rate dependency, X-ray crystallography, amino acid/nucleotide substitution and computational approaches, experimental and theoretical studies with small molecular model compounds still play a role. With small molecules, the importance of various elementary processes, such as proton transfer and metal ion binding, for the stabilization of transition states may be elucidated, and systematic variation of the basicity of the entering or departing nucleophile enables determination of the position of the transition state on the reaction coordinate. Such data are important in analyzing enzyme mechanisms based on the synergistic participation of several catalytic entities. Many nucleases are metalloenzymes, and small molecular models offer an excellent tool for constructing models of their catalytic centers. The present review aims to provide an up-to-date summary of what has been achieved by mechanistic studies with small molecular phosphodiesters. PMID:29719577
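The leaving-group analysis mentioned above amounts to a Brønsted-type linear free energy relationship: fitting the slope of log k against the leaving-group pKa. A minimal least-squares sketch:

```python
def bronsted_slope(pkas, log_ks):
    """Least-squares Bronsted slope (beta_lg): the sensitivity of the
    cleavage rate (log k) to leaving-group basicity (pKa), used to place
    the transition state on the reaction coordinate. Assumes simple
    one-variable linear regression with no weighting."""
    n = len(pkas)
    mx = sum(pkas) / n
    my = sum(log_ks) / n
    num = sum((x - mx) * (y - my) for x, y in zip(pkas, log_ks))
    den = sum((x - mx) ** 2 for x in pkas)
    return num / den
```

A strongly negative slope indicates substantial leaving-group bond cleavage (a late transition state); a slope near zero indicates an early one.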
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohanty, Subhasish; Barua, Bipul; Soppet, William K.
This report provides an update of an earlier assessment of environmentally assisted fatigue for components in light water reactors. This report is a deliverable in September 2016 under the work package for environmentally assisted fatigue under DOE’s Light Water Reactor Sustainability program. In an April 2016 report, we presented a detailed thermal-mechanical stress analysis model for simulating the stress-strain state of a reactor pressure vessel and its nozzles under grid-load-following conditions. In this report, we provide stress-controlled fatigue test data for 508 LAS base metal alloy under different loading amplitudes (constant, variable, and random grid-load-following) and environmental conditions (in air or pressurized water reactor coolant water at 300°C). Also presented is a cyclic plasticity-based analytical model that can simultaneously capture the amplitude and time dependency of the component behavior under fatigue loading. Results related to both amplitude-dependent and amplitude-independent parameters are presented. The validation results for the analytical/mechanistic model are discussed. This report provides guidance for estimating time-dependent, amplitude-independent parameters related to material behavior under different service conditions. The developed mechanistic models and the reported material parameters can be used to conduct more accurate fatigue and ratcheting evaluation of reactor components.
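Amplitude-dependent behavior of the kind the analytical model captures is often summarized with a Ramberg-Osgood cyclic stress-strain curve; the constants below are generic steel-like values, not the 508 LAS parameters from the report:

```python
def strain_amplitude(stress_amp_mpa, e_mod=200e3, k_prime=1000.0, n_prime=0.15):
    """Ramberg-Osgood cyclic stress-strain curve: total strain amplitude as
    elastic plus plastic contributions. e_mod is Young's modulus (MPa),
    k_prime the cyclic strength coefficient (MPa), n_prime the cyclic
    strain-hardening exponent. Constants are generic illustrative values."""
    elastic = stress_amp_mpa / e_mod
    plastic = (stress_amp_mpa / k_prime) ** (1.0 / n_prime)
    return elastic + plastic
```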
Stochastic Human Exposure and Dose Simulation Model for Pesticides
SHEDS-Pesticides (Stochastic Human Exposure and Dose Simulation Model for Pesticides) is a physically-based stochastic model developed to quantify exposure and dose of humans to multimedia, multipathway pollutants. Probabilistic inputs are combined in physical/mechanistic algorithms...
Fernández, M. Paulina; Norero, Aldo; Vera, Jorge R.; Pérez, Eduardo
2011-01-01
Background and Aims Functional–structural models are interesting tools to relate environmental and management conditions with forest growth. Their three-dimensional images can reveal important characteristics of wood used for industrial products. Like virtual laboratories, they can be used to evaluate relationships among species, sites and management, and to support silvicultural design and decision processes. Our aim was to develop a functional–structural model for radiata pine (Pinus radiata), given its economic importance in many countries. Methods The plant model uses the L-system language. The structure of the model is based on operational units, which obey particular rules and execute photosynthesis, respiration and morphogenesis according to their particular characteristics. Plant allometry is adhered to so that harmonic growth and plant development are achieved. Environmental signals for morphogenesis are used. Dynamic turnover guides the normal evolution of the tree. Monthly steps allow for detailed information on wood characteristics. The model is independent of traditional forest inventory relationships and is conceived as a mechanistic model. For model parameterization, three databases that generated new information relating to P. radiata were analysed and incorporated. Key Results Simulations under different and contrasting environmental and management conditions were run and statistically tested. The model was validated against forest inventory data for the same sites and times, and against true crown architectural data. The performance of the model for 6-year-old trees was encouraging. Total height, diameter and lengths of growth units were adequately estimated. Branch diameters were slightly overestimated. Wood density values were not satisfactory, but the cyclical pattern and increase of growth rings were reasonably well modelled. 
Conclusions The model was able to reproduce the development and growth of the species based on mechanistic formulations. It may be valuable in assessing stand behaviour under different environmental and management conditions, assisting in decision-making with regard to management, and as a research tool to formulate hypotheses regarding forest tree growth and development. PMID:21987452
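The operational-unit carbon budget idea can be sketched as a monthly step of assimilation minus respiration; the light-use-efficiency form and all coefficients below are illustrative assumptions, not the model's actual formulation:

```python
def monthly_carbon_balance(leaf_area_m2, par_mj_m2, biomass_kg,
                           lue=0.0005, rm=0.002, growth_eff=0.75):
    """One monthly step of an operational unit's carbon budget: gross
    assimilation (light-use-efficiency form, kg C per MJ PAR per m^2 leaf)
    minus maintenance respiration proportional to standing biomass,
    converted to new biomass with a growth efficiency. All coefficients
    are illustrative placeholders."""
    gross = lue * par_mj_m2 * leaf_area_m2      # kg C assimilated this month
    maintenance = rm * biomass_kg               # kg C respired for maintenance
    return max(0.0, (gross - maintenance) * growth_eff)
```

In a full functional-structural model, the resulting increment would then be allocated among growth units according to allometric rules.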
Van Bockstal, Pieter-Jan; Mortier, Séverine Thérèse F C; De Meyer, Laurens; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2017-05-01
Conventional pharmaceutical freeze-drying is an inefficient and expensive batch-wise process, associated with several disadvantages leading to uncontrolled end product variability. The proposed continuous alternative, based on spinning the vials during freezing and on optimal energy supply during drying, strongly increases process efficiency and improves product quality (uniformity). The heat transfer during continuous drying of the spin frozen vials is provided via non-contact infrared (IR) radiation. The energy transfer to the spin frozen vials should be optimised to maximise the drying efficiency while avoiding cake collapse. Therefore, a mechanistic model was developed which allows computing the optimal, dynamic IR heater temperature as a function of the primary drying progress and which, hence, also allows predicting the primary drying endpoint based on the applied dynamic IR heater temperature. The model was validated by drying spin frozen vials containing the model formulation (3.9 mL in 10R vials) according to the computed IR heater temperature profile. In total, 6 validation experiments were conducted. The primary drying endpoint was experimentally determined via in-line near-infrared (NIR) spectroscopy and compared with the endpoint predicted by the model (50 min). The mean ratio of the experimental drying time to the predicted value was 0.91, indicating good agreement between the model predictions and the experimental data. The end product had an elegant product appearance (visual inspection) and an acceptable residual moisture content (Karl Fischer). Copyright © 2017 Elsevier B.V. All rights reserved.
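The core of such a model is a steady-state balance between radiative heat input and the sublimation heat sink; a rough sketch with an approximate ice vapour-pressure fit and an illustrative dried-layer resistance (not the paper's parameters):

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
DH_SUB = 2.84e6    # sublimation enthalpy of ice, J kg^-1

def ice_vapor_pressure(temp_k):
    """Approximate Clausius-Clapeyron-type fit for the vapour pressure of
    ice (Pa); adequate for rough estimates near typical drying temperatures."""
    return 3.59e12 * math.exp(-6145.0 / temp_k)

def required_heater_temp(t_sub_k, p_chamber_pa, rp, emissivity=0.95):
    """Solve the steady-state balance: radiative input per unit area equals
    the heat consumed by sublimation. rp is the dried-layer mass transfer
    resistance (Pa s m^2 kg^-1); its value here is illustrative."""
    flux = (ice_vapor_pressure(t_sub_k) - p_chamber_pa) / rp   # kg s^-1 m^-2
    q = DH_SUB * flux                                          # W m^-2
    return (q / (emissivity * SIGMA) + t_sub_k ** 4) ** 0.25
```

As the dried layer grows, `rp` increases and the required heater temperature drops, which is what makes the optimal IR heater profile dynamic.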
Woodward, Bill
2016-04-11
Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition.
Changes in Black-legged Tick Population in New England with Future Climate Change
NASA Astrophysics Data System (ADS)
Krishnan, S.; Huber, M.
2015-12-01
Lyme disease is one of the most frequently reported vector-borne diseases in the United States. In the Northeastern United States, vector transmission is maintained in a horizontal transmission cycle between the vector, the black-legged tick, and the vertebrate reservoir hosts, which include white-tailed deer, rodents and other medium to large sized mammals. Predicting how vector populations change with future climate change is critical to understanding disease spread in the future, and for developing suitable regional adaptation strategies. For the United States, these predictions have mostly been made using regressions based on field and lab studies, or using spatial suitability studies. However, the relation between tick populations at various life-cycle stages and climate variables is complex, necessitating a mechanistic approach. In this study, we present a framework for driving a mechanistic tick population model with high-resolution regional climate modeling projections. The goal is to estimate changes in black-legged tick populations in New England for the 21st century. The tick population model used is based on the mechanistic approach of Ogden et al. (2005) developed for Canada. Dynamically downscaled climate projections at 3-km resolution using the Weather Research and Forecasting (WRF) model are used to drive the tick population model.
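A stage-structured update with temperature-dependent development is the core of such tick models; the rates, survival, and fecundity below are illustrative placeholders, not Ogden et al.'s calibrated values:

```python
def step_tick_population(stages, temp_c):
    """One time-step of a toy stage-structured model with stages
    (eggs, larvae, nymphs, adults). The development fraction rises linearly
    with temperature above a ~4 C threshold; survival and fecundity are
    illustrative constants, not the published model's parameters."""
    eggs, larvae, nymphs, adults = stages
    dev = min(1.0, max(0.0, 0.02 * (temp_c - 4.0)))  # no development when cold
    surv = 0.95
    return (
        adults * 30.0 * dev + eggs * surv * (1.0 - dev),     # egg laying
        eggs * dev * surv + larvae * surv * (1.0 - dev),     # hatch to larvae
        larvae * dev * surv + nymphs * surv * (1.0 - dev),   # moult to nymphs
        nymphs * dev * surv + adults * surv * (1.0 - dev),   # moult to adults
    )
```

Driving `temp_c` with downscaled climate projections, as in the study, turns this per-step update into a seasonal population trajectory.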
A framework for predicting impacts on ecosystem services ...
Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. The framework introduced here represents an ongoing initiative supported by the National Institute of Mathematical and Biological Synthesis (NIMBioS; http://www.nimbi
Modeling process-structure-property relationships for additive manufacturing
NASA Astrophysics Data System (ADS)
Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam
2018-02-01
This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including using data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, moves to the meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analysis, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, which is motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance AM fundamental understanding and guide the monitoring and advanced diagnostics of AM processing.
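The offline stage of self-consistent clustering analysis groups material points with similar mechanical response so that each cluster becomes a single reduced degree of freedom; plain k-means on a scalar response serves as a stand-in sketch (the actual method clusters strain concentration tensors):

```python
import random

def kmeans_clusters(points, k, iters=20, seed=0):
    """Plain 1-D k-means: partition scalar response values into k clusters
    and return the sorted cluster centers. A stand-in for the clustering
    step of self-consistent clustering analysis, not its implementation."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: (p - centers[c]) ** 2)
            groups[nearest].append(p)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)
```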
Comparison of simplified models in the prediction of two phase flow in pipelines
NASA Astrophysics Data System (ADS)
Jerez-Carrizales, M.; Jaramillo, J. E.; Fuentes, D.
2014-06-01
Prediction of two phase flow in pipelines is a common task in engineering. It is a complex phenomenon, and many models have been developed to find an approximate solution to the problem. Some older models, such as the Hagedorn & Brown (HB) model, have been noted by many authors to perform very well. Furthermore, many modifications have been applied to this method to improve its predictions. In this work, two simplified empirical models (HB and Mukherjee & Brill, MB) are considered. A mechanistic model (AN), based on the physics of the phenomenon but still requiring correlations known as closure relations, is also used. Moreover, a steady-state, flow-pattern-dependent drift-flux model (HK) is implemented. The implementation of these methods was tested against data published in the scientific literature for vertical upward flows. Furthermore, a comparison of the predictive performance of the four models is made against a well from Campo Escuela Colorado. The differences among the four models are smaller than their deviation from the experimental data from the well in Campo Escuela Colorado.
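Drift-flux models of the kind mentioned above relate void fraction to the phase superficial velocities through the Zuber-Findlay relation; the distribution parameter and drift velocity below are typical bubbly-flow values, not those of the implemented HK model:

```python
def drift_flux_void_fraction(j_g, j_l, c0=1.2, v_d=0.24):
    """Zuber-Findlay drift-flux relation: alpha = j_g / (C0*(j_g + j_l) + v_d),
    with j_g, j_l the gas and liquid superficial velocities (m/s), C0 the
    distribution parameter and v_d the drift velocity (m/s). The defaults
    are typical bubbly-flow values; flow-pattern-dependent models switch
    C0 and v_d with the predicted regime."""
    return j_g / (c0 * (j_g + j_l) + v_d)
```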
Alimohammadi, Mona; Pichardo-Almarza, Cesar; Agu, Obiekezie; Díaz-Zuccarini, Vanessa
2016-01-01
Vascular calcification results in stiffening of the aorta and is associated with hypertension and atherosclerosis. Atherogenesis is a complex, multifactorial, and systemic process; the result of a number of factors, each operating simultaneously at several spatial and temporal scales. The ability to predict sites of atherogenesis would be of great use to clinicians in order to improve diagnostic and treatment planning. In this paper, we present a mathematical model as a tool to understand why atherosclerotic plaque and calcifications occur in specific locations. This model is then used to analyze vascular calcification and atherosclerotic areas in an aortic dissection patient using a mechanistic, multi-scale modeling approach, coupling patient-specific, fluid-structure interaction simulations with a model of endothelial mechanotransduction. A number of hemodynamic factors based on state-of-the-art literature are used as inputs to the endothelial permeability model, in order to investigate plaque and calcification distributions, which are compared with clinical imaging data. A significantly improved correlation between elevated hydraulic conductivity or volume flux and the presence of calcification and plaques was achieved by using a shear index comprising both mean and oscillatory shear components (HOLMES) and a non-Newtonian viscosity model as inputs, as compared to widely used hemodynamic indicators. The proposed approach shows promise as a predictive tool. The improvements obtained using the combined biomechanical/biochemical modeling approach highlight the benefits of mechanistic modeling as a powerful tool to understand complex phenomena and provides insight into the relative importance of key hemodynamic parameters. PMID:27445834
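The oscillatory shear index (OSI) and a HOLMES-style combination of mean shear and oscillation can be sketched for a scalar wall-shear signal over one cardiac cycle; in 3-D the shear is a vector, and the exact weighting used in the paper may differ from this sketch:

```python
def oscillatory_shear_index(tau_series):
    """OSI = 0.5 * (1 - |time-average of tau| / time-average of |tau|) for
    one wall point, using a signed scalar shear sampled over a cycle:
    0 means unidirectional flow, 0.5 purely oscillatory flow."""
    mean_signed = sum(tau_series) / len(tau_series)
    mean_mag = sum(abs(t) for t in tau_series) / len(tau_series)
    return 0.5 * (1.0 - abs(mean_signed) / mean_mag)

def holmes_index(tau_series):
    """HOLMES-style index: mean shear magnitude weighted by (0.5 - OSI),
    so it is small where shear is weak or highly oscillatory. The precise
    definition in the paper may differ from this illustrative form."""
    mean_mag = sum(abs(t) for t in tau_series) / len(tau_series)
    return mean_mag * (0.5 - oscillatory_shear_index(tau_series))
```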
NASA Astrophysics Data System (ADS)
Seidel, Sabine J.; Werisch, Stefan; Barfus, Klemens; Wagner, Michael; Schütze, Niels; Laber, Hermann
2014-05-01
The increasing worldwide water scarcity, costs and negative off-site effects of irrigation are leading to the necessity of developing methods of irrigation that increase water productivity. Various approaches are available for irrigation scheduling. Traditionally, schedules are derived from soil water balance (SWB) calculations using some measure of reference evaporation and empirical crop coefficients. These crop-specific coefficients are provided by the FAO but are also available for different regions (e.g. Germany). The approach is simple, but there are several inaccuracies due to simplifications and limitations such as poor transferability. Crop growth models - which simulate the main physiological plant processes through a set of assumptions and calibration parameters - are widely used to support decision making, but also for yield gap or scenario analyses. One major advantage of mechanistic models compared to empirical approaches is their spatial and temporal transferability. Irrigation scheduling can also be based on measurements of soil water tension, which is closely related to plant stress. Such measurements are precise, simple and can be automated, but it is difficult to decide where to place the probes, especially in heterogeneous soils. In this study, a two-year field experiment was used to extensively evaluate the three mentioned irrigation scheduling approaches regarding their efficiency of irrigation water application, with the aim of promoting better agronomic practices in irrigated horticulture. To evaluate the tested irrigation scheduling approaches, an extensive plant and soil water data collection was used to precisely calibrate the mechanistic crop model Daisy. The experiment was conducted with white cabbage (Brassica oleracea L.) on a sandy loamy field in 2012/13 near Dresden, Germany. 
Three irrigation scheduling approaches were tested: (i) two schedules were estimated based on SWB calculations using different crop coefficients, and (ii) one treatment was automatically drip irrigated using tensiometers (irrigation of 15 mm at a soil tension of -250 hPa at 30 cm soil depth). In treatment (iii), the irrigation schedule was estimated (using the same criteria as in the tension-based treatment) applying the model Daisy partially calibrated against data of 2012. Moreover, one control treatment was minimally irrigated. Measured yield was highest for the tension-based treatment with a low irrigation water input (8.5 DM t/ha, 120 mm). Both SWB treatments showed lower yields and higher irrigation water input (both 8.3 DM t/ha, 306 and 410 mm). The simulation-model-based treatment yielded less (7.5 DM t/ha, 106 mm), mainly due to drought stress caused by inaccurate simulation of the soil water dynamics and thus an overestimation of the soil moisture. The evaluation using the calibrated model indicated substantial deep percolation under both SWB treatments. Targeting the challenge to increase water productivity, soil water tension-based irrigation should be favoured. Irrigation scheduling based on SWB calculation requires accurate estimates of crop coefficients. A robust calibration of mechanistic crop models requires considerable effort and can be recommended to farmers only to some extent, but it enables comprehensive crop growth and site analyses.
Nøst, Therese Haugdahl; Breivik, Knut; Wania, Frank; Rylander, Charlotta; Odland, Jon Øyvind; Sandanger, Torkjel Manning
2015-01-01
Background Studies on the health effects of polychlorinated biphenyls (PCBs) call for an understanding of past and present human exposure. Time-resolved mechanistic models may supplement information on concentrations in individuals obtained from measurements and/or statistical approaches if they can be shown to reproduce empirical data. Objectives Here, we evaluated the capability of one such mechanistic model to reproduce measured PCB concentrations in individual Norwegian women. We also assessed individual life-course concentrations. Methods Concentrations of four PCB congeners in pregnant (n = 310, sampled in 2007–2009) and postmenopausal (n = 244, 2005) women were compared with person-specific predictions obtained using CoZMoMAN, an emission-based environmental fate and human food-chain bioaccumulation model. Person-specific predictions were also made using statistical regression models including dietary and lifestyle variables and concentrations. Results CoZMoMAN accurately reproduced medians and ranges of measured concentrations in the two study groups. Furthermore, rank correlations between measurements and predictions from both CoZMoMAN and regression analyses were strong (Spearman’s r > 0.67). Precision in quartile assignments from predictions was strong overall as evaluated by weighted Cohen’s kappa (> 0.6). Simulations indicated large inter-individual differences in concentrations experienced in the past. Conclusions The mechanistic model reproduced all measurements of PCB concentrations within a factor of 10, and subject ranking and quartile assignments were overall largely consistent, although they were weak within each study group. Contamination histories for individuals predicted by CoZMoMAN revealed variation between study subjects, particularly in the timing of peak concentrations. Mechanistic models can provide individual PCB exposure metrics that could serve as valuable supplements to measurements. 
Citation Nøst TH, Breivik K, Wania F, Rylander C, Odland JØ, Sandanger TM. 2016. Estimating time-varying PCB exposures using person-specific predictions to supplement measured values: a comparison of observed and predicted values in two cohorts of Norwegian women. Environ Health Perspect 124:299–305; http://dx.doi.org/10.1289/ehp.1409191 PMID:26186800
How adverse outcome pathways can aid the development and ...
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. The present manuscript reports on expert opinion and case studies that came out of a European Commission, Joint Research Centre-sponsored work
Isazadeh, Siavash; Feng, Min; Urbina Rivas, Luis Enrique; Frigon, Dominic
2014-04-15
Two pilot-scale activated sludge reactors were operated for 98 days to provide the necessary data to develop and validate a new mathematical model predicting the reduction of biosolids production by ozonation of the return activated sludge (RAS). Three ozone doses were tested during the study. In addition to the pilot-scale study, laboratory-scale experiments were conducted with mixed liquor suspended solids and with pure cultures to parameterize the biomass inactivation process during exposure to ozone. The experiments revealed that biomass inactivation occurred even at the lowest doses, but that it was not associated with extensive COD solubilization. For validation, the model was used to simulate the temporal dynamics of the pilot-scale operational data. Increasing the description accuracy of the inactivation process improved the precision of the model in predicting the operational data. Copyright © 2014 Elsevier B.V. All rights reserved.
Perception of mind and dehumanization: Human, animal, or machine?
Morera, María D; Quiles, María N; Correa, Ana D; Delgado, Naira; Leyens, Jacques-Philippe
2016-08-02
Dehumanization has been studied through several approaches, including the attribute-based model of mind perception and the metaphor-based model of dehumanization. We performed two studies to find different (de)humanized images for three targets: Professional people, Evil people, and Lowest of the low. In Study 1, we examined dimensions of mind, expecting the last two categories to be dehumanized through denial of agency (Lowest of the low) or experience (Evil people), compared with humanized targets (Professional people). Study 2 aimed to distinguish these targets using metaphors. We predicted that Evil and Lowest of the low targets would suffer mechanistic and animalistic dehumanization, respectively; our predictions were confirmed, but the metaphor-based model nuanced these results: animalistic and mechanistic dehumanization were shown to overlap rather than being independent. Evil persons were perceived as "killing machines" and "predators." Finally, Lowest of the low were not animalized but considered human beings. We discuss possible interpretations. © 2016 International Union of Psychological Science.
An, Gary C
2010-01-01
The greatest challenge facing the biomedical research community is the effective translation of basic mechanistic knowledge into clinically effective therapeutics. This challenge is most evident in attempts to understand and modulate "systems" processes/disorders, such as sepsis, cancer, and wound healing. Formulating an investigatory strategy for these issues requires the recognition that these are dynamic processes. Representation of the dynamic behavior of biological systems can aid in the investigation of complex pathophysiological processes by augmenting existing discovery procedures by integrating disparate information sources and knowledge. This approach is termed Translational Systems Biology. Focusing on the development of computational models capturing the behavior of mechanistic hypotheses provides a tool that bridges gaps in the understanding of a disease process by visualizing "thought experiments" to fill those gaps. Agent-based modeling is a computational method particularly well suited to the translation of mechanistic knowledge into a computational framework. Utilizing agent-based models as a means of dynamic hypothesis representation will be a vital means of describing, communicating, and integrating community-wide knowledge. The transparent representation of hypotheses in this dynamic fashion can form the basis of "knowledge ecologies," where selection between competing hypotheses will apply an evolutionary paradigm to the development of community knowledge.
Calculating Henry’s Constants of Charged Molecules Using SPARC
SPARC (SPARC Performs Automated Reasoning in Chemistry) is a computer program designed to model physical and chemical properties of molecules solely based on their chemical structure. SPARC uses a toolbox of mechanistic perturbation models to model intermolecular interactions. SPARC has ...
Estimating Cumulative Traffic Loads, Final Report for Phase 1
DOT National Transportation Integrated Search
2000-07-01
The knowledge of traffic loads is a prerequisite for the pavement analysis process, especially for the development of load-related distress prediction models. Furthermore, the emerging mechanistically based pavement performance models and pavement de...
Mechanisms of Developmental Change in Infant Categorization
ERIC Educational Resources Information Center
Westermann, Gert; Mareschal, Denis
2012-01-01
Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…
Modeling of the nearshore marine ecosystem with the AQUATOX model
Process-based models can be used to forecast the responses of coastal ecosystems to changes under future scenarios. However, most models applied to coastal systems do not include higher trophic levels, which are important providers of ecosystem services. AQUATOX is a mechanistic...
Assessing uncertainty in mechanistic models
Edwin J. Green; David W. MacFarlane; Harry T. Valentine
2000-01-01
Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...
CYP2E1 hydroxylation of aniline involves negative cooperativity.
Hartman, Jessica H; Knott, Katie; Miller, Grover P
2014-02-01
CYP2E1 plays a role in the metabolic activation and elimination of aniline, yet there are conflicting reports on its mechanism of action, and hence relevance, in aniline metabolism. Based on our work with similar compounds, we hypothesized that aniline binds two CYP2E1 sites during metabolism resulting in cooperative reaction kinetics and tested this hypothesis through rigorous in vitro studies. The kinetic profile for recombinant CYP2E1 demonstrated significant negative cooperativity based on a fit of data to the Hill equation (n=0.56). Mechanistically, the data were best explained through a two-binding site cooperative model in which aniline binds with high affinity (K(s)=30 μM) followed by a second weaker binding event (K(ss)=1100 μM) resulting in a threefold increase in the oxidation rate. Binding sites for aniline were confirmed by inhibition studies with 4-methylpyrazole. Inhibitor phenotyping experiments with human liver microsomes validated the central role for CYP2E1 in aniline hydroxylation and indicated minor roles for CYP2A6 and CYP2C9. Importantly, inhibition of minor metabolic pathways resulted in a kinetic profile for microsomal CYP2E1 that replicated the preferred mechanism and parameters observed with the recombinant enzyme. Scaled modeling of in vitro CYP2E1 metabolism of aniline to in vivo clearance, especially at low aniline levels, led to significant deviations from the traditional model based on non-cooperative, Michaelis-Menten kinetics. These findings provide a critical mechanistic perspective on the potential importance of CYP2E1 in the metabolic activation and elimination of aniline as well as the first experimental evidence of a negatively cooperative metabolic reaction catalyzed by CYP2E1. Copyright © 2013 Elsevier Inc. All rights reserved.
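A minimal sketch of such a two-site scheme, using a standard two-binding-site rate equation with the constants quoted above (Vmax normalized; the authors' exact rate law may differ), shows how a weaker second site that speeds turnover produces an apparent Hill coefficient below 1:

```python
import math

# Illustrative two-site kinetic model for CYP2E1-aniline, sketched from the
# parameters quoted in the abstract; the exact rate law fitted by the
# authors may differ. Binding constants in uM; occupancy of the second,
# weaker site triples the oxidation rate.
KS, KSS = 30.0, 1100.0   # high- and low-affinity binding constants (uM)
K1, K2 = 1.0, 3.0        # relative rates with one vs. two sites occupied

def velocity(s):
    """Oxidation rate at substrate concentration s (arbitrary Vmax units)."""
    num = K1 * s / KS + K2 * s * s / (KS * KSS)
    den = 1.0 + s / KS + s * s / (KS * KSS)
    return num / den

def conc_at_fraction(frac, v_max=K2, lo=1e-6, hi=1e9):
    """Bisection (in log space) for the concentration giving frac * v_max."""
    target = frac * v_max
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if velocity(mid) < target:
            lo = mid
        else:
            hi = mid
    return mid

# Apparent Hill coefficient from the 10%/90% saturation points:
# n_H = ln(81) / ln(S90 / S10); n_H < 1 indicates negative cooperativity.
s10 = conc_at_fraction(0.10)
s90 = conc_at_fraction(0.90)
n_hill = math.log(81.0) / math.log(s90 / s10)
```

The biphasic curve stretches the saturation range, so the apparent Hill coefficient falls below 1, consistent in direction with the reported n=0.56.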
The development of physiologically based toxicokinetic (PBTK) models for hydrophobic chemicals in fish requires: 1) an understanding of chemical efflux at fish gills; 2) knowledge of the factors that limit chemical exchange between blood and tissues; and, 3) a mechanistic descrip...
Effects of exercise on tumor physiology and metabolism.
Pedersen, Line; Christensen, Jesper Frank; Hojman, Pernille
2015-01-01
Exercise is a potent regulator of a range of physiological processes in most tissues. Solid epidemiological data show that exercise training can reduce disease risk and mortality for several cancer diagnoses, suggesting that exercise training may directly regulate tumor physiology and metabolism. Here, we review the body of literature describing exercise intervention studies performed in rodent tumor models and elaborate on potential mechanistic effects of exercise on tumor physiology. Exercise has been shown to reduce tumor incidence, tumor multiplicity, and tumor growth across numerous different transplantable, chemically induced or genetic tumor models. We propose 4 emerging mechanistic effects of exercise, including (1) vascularization and blood perfusion, (2) immune function, (3) tumor metabolism, and (4) muscle-to-cancer cross-talk, and discuss each in detail. In conclusion, exercise training has the potential to be a beneficial and integrated component of cancer management, but this potential has yet to be fully elucidated. Understanding the mechanistic effects of exercise on tumor physiology is warranted. Insight into these mechanistic effects is emerging, but experimental intervention studies are still needed to verify the cause-effect relationship between these mechanisms and the control of tumor growth.
Varma, Manthena V S; Lai, Yurong; Kimoto, Emi; Goosen, Theunis C; El-Kattan, Ayman F; Kumar, Vikas
2013-04-01
Quantitative prediction of complex drug-drug interactions (DDIs) is challenging. Repaglinide is mainly metabolized by cytochrome-P-450 (CYP)2C8 and CYP3A4, and is also a substrate of organic anion transporting polypeptide (OATP)1B1. The purpose of this study was to develop a physiologically based pharmacokinetic (PBPK) model to predict the pharmacokinetics and DDIs of repaglinide. In vitro hepatic transport of repaglinide, gemfibrozil and gemfibrozil 1-O-β-glucuronide was characterized using sandwich-culture human hepatocytes. A PBPK model, implemented in Simcyp (Sheffield, UK), was developed utilizing in vitro transport and metabolic clearance data. In vitro studies suggested significant active hepatic uptake of repaglinide. The mechanistic model adequately described repaglinide pharmacokinetics, and successfully predicted DDIs with several OATP1B1 and CYP3A4 inhibitors (<10% error). Furthermore, repaglinide-gemfibrozil interaction at therapeutic dose was closely predicted using in vitro fraction metabolism for CYP2C8 (0.71), when primarily considering reversible inhibition of OATP1B1 and mechanism-based inactivation of CYP2C8 by gemfibrozil and gemfibrozil 1-O-β-glucuronide. This study demonstrated that hepatic uptake is rate-determining in the systemic clearance of repaglinide. The model quantitatively predicted several repaglinide DDIs, including the complex interactions with gemfibrozil. Both OATP1B1 and CYP2C8 inhibition contribute significantly to the repaglinide-gemfibrozil interaction, and need to be considered for quantitative rationalization of DDIs with either drug.
Fast charging technique for high power LiFePO4 batteries: A mechanistic analysis of aging
NASA Astrophysics Data System (ADS)
Anseán, D.; Dubarry, M.; Devie, A.; Liaw, B. Y.; García, V. M.; Viera, J. C.; González, M.
2016-07-01
One of the major issues hampering the acceptance of electric vehicles (EVs) is the anxiety associated with long charging time. Hence, the ability to fast-charge lithium-ion battery (LIB) systems is gaining notable interest. However, fast charging is not tolerated by all LIB chemistries because it affects battery functionality and accelerates its aging processes. Here, we investigate the long-term effects of multistage fast charging on a commercial high power LiFePO4-based cell and compare it to another cell tested under standard charging. Coupling incremental capacity (IC) and IC peak area analysis together with mechanistic model simulations ('Alawa' toolbox with harvested half-cell data), we quantify the degradation modes that cause aging of the tested cells. The results show that the proposed fast charging technique caused similar aging effects as standard charging. The degradation is caused by a linear loss of lithium inventory, coupled with a lesser degree of linear loss of active material on the negative electrode. This study validates fast charging as a feasible means of operation for this particular LIB chemistry and cell architecture. It also illustrates the benefits of a mechanistic approach to understanding cell degradation in commercial cells.
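Incremental capacity analysis differentiates charged capacity with respect to voltage, converting the flat voltage plateaus of phase-transition chemistries into peaks whose position and area can be tracked over aging. A minimal sketch of the dQ/dV computation on a synthetic two-plateau charge curve (not measured LiFePO4 data):

```python
import math

# Sketch of incremental capacity (IC) analysis: IC = dQ/dV computed from a
# charge curve. The voltage curve below is synthetic (a smoothed
# two-plateau shape), not measured LiFePO4 data.
def voltage(q):
    """Synthetic cell voltage (V) vs. normalized capacity q in [0, 1]:
    two sigmoid steps standing in for phase-transition plateaus."""
    return (3.2 + 0.1 / (1 + math.exp(-(q - 0.3) / 0.02))
                + 0.1 / (1 + math.exp(-(q - 0.7) / 0.02)))

qs = [i / 1000.0 for i in range(1001)]
vs = [voltage(q) for q in qs]

# Central-difference dQ/dV; plateaus (dV/dQ ~ 0) show up as large IC values.
ic_q, ic = [], []
for i in range(1, len(qs) - 1):
    dv = vs[i + 1] - vs[i - 1]
    if abs(dv) > 1e-9:           # skip numerically flat segments
        ic_q.append(qs[i])
        ic.append((qs[i + 1] - qs[i - 1]) / dv)
```

On real data the IC peaks are plotted against voltage, and shifts in peak position or area are mapped to degradation modes such as loss of lithium inventory or of active material.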
Understanding the effect of carbon status on stem diameter variations
De Swaef, Tom; Driever, Steven M.; Van Meulebroek, Lieven; Vanhaecke, Lynn; Marcelis, Leo F. M.; Steppe, Kathy
2013-01-01
Background Carbon assimilation and leaf-to-fruit sugar transport are, along with plant water status, the driving mechanisms for fruit growth. An integrated comprehension of the plant water and carbon relationships is therefore essential to better understand water and dry matter accumulation. Variations in stem diameter result from an integrated response to plant water and carbon status and are as such a valuable source of information. Methods A mechanistic water flow and storage model was used to relate variations in stem diameter to phloem sugar loading and sugar concentration dynamics in tomato. The simulation results were compared with an independent model, simulating phloem sucrose loading at the leaf level based on photosynthesis and sugar metabolism kinetics and enabled a mechanistic interpretation of the ‘one common assimilate pool’ concept for tomato. Key Results Combining stem diameter variation measurements and mechanistic modelling allowed us to distinguish instantaneous dynamics in the plant water relations and gradual variations in plant carbon status. Additionally, the model combined with stem diameter measurements enabled prediction of dynamic variables which are difficult to measure in a continuous and non-destructive way, such as xylem water potential and phloem hydrostatic potential. Finally, dynamics in phloem sugar loading and sugar concentration were distilled from stem diameter variations. Conclusions Stem diameter variations, when used in mechanistic models, have great potential to continuously monitor and interpret plant water and carbon relations under natural growing conditions. PMID:23186836
Energy efficiency drives the global seasonal distribution of birds.
Somveille, Marius; Rodrigues, Ana S L; Manica, Andrea
2018-06-01
The uneven distribution of biodiversity on Earth is one of the most general and puzzling patterns in ecology. Many hypotheses have been proposed to explain it, based on evolutionary processes or on constraints related to geography and energy. However, previous studies investigating these hypotheses have been largely descriptive due to the logistical difficulties of conducting controlled experiments on such large geographical scales. Here, we use bird migration-the seasonal redistribution of approximately 15% of bird species across the world-as a natural experiment for testing the species-energy relationship, the hypothesis that animal diversity is driven by energetic constraints. We develop a mechanistic model of bird distributions across the world, and across seasons, based on simple ecological and energetic principles. Using this model, we show that bird species distributions optimize the balance between energy acquisition and energy expenditure while taking into account competition with other species. These findings support, and provide a mechanistic explanation for, the species-energy relationship. The findings also provide a general explanation of migration as a mechanism that allows birds to optimize their energy budget in the face of seasonality and competition. Finally, our mechanistic model provides a tool for predicting how ecosystems will respond to global anthropogenic change.
Building a mechanistic understanding of predation with GPS-based movement data.
Merrill, Evelyn; Sand, Håkan; Zimmermann, Barbara; McPhee, Heather; Webb, Nathan; Hebblewhite, Mark; Wabakken, Petter; Frair, Jacqueline L
2010-07-27
Quantifying kill rates and sources of variation in kill rates remains an important challenge in linking predators to their prey. We address current approaches to using global positioning system (GPS)-based movement data for quantifying key predation components of large carnivores. We review approaches to identify kill sites from GPS movement data as a means to estimate kill rates and address advantages of using GPS-based data over past approaches. Despite considerable progress, modelling the probability that a cluster of GPS points is a kill site is no substitute for field visits, but can guide our field efforts. Once kill sites are identified, time spent at a kill site (handling time) and time between kills (killing time) can be determined. We show how statistical models can be used to investigate the influence of factors such as animal characteristics (e.g. age, sex, group size) and landscape features on either handling time or killing efficiency. If we know the prey densities along paths to a kill, we can quantify the 'attack success' parameter in functional response models directly. Problems remain in incorporating the behavioural complexity derived from GPS movement paths into functional response models, particularly in multi-prey systems, but we believe that exploring the details of GPS movement data has put us on the right path.
Visible Machine Learning for Biomedicine.
Yu, Michael K; Ma, Jianzhu; Fisher, Jasmin; Kreisberg, Jason F; Raphael, Benjamin J; Ideker, Trey
2018-06-14
A major ambition of artificial intelligence lies in translating patient data to successful therapies. Machine learning models face particular challenges in biomedicine, however, including handling of extreme data heterogeneity and lack of mechanistic insight into predictions. Here, we argue for "visible" approaches that guide model structure with experimental biology. Copyright © 2018. Published by Elsevier Inc.
Golightly, Andrew; Wilkinson, Darren J.
2011-01-01
Computational systems biology is concerned with the development of detailed mechanistic models of biological processes. Such models are often stochastic and analytically intractable, containing uncertain parameters that must be estimated from time course data. In this article, we consider the task of inferring the parameters of a stochastic kinetic model defined as a Markov (jump) process. Inference for the parameters of complex nonlinear multivariate stochastic process models is a challenging problem, but we find here that algorithms based on particle Markov chain Monte Carlo turn out to be a very effective computationally intensive approach to the problem. Approximations to the inferential model based on stochastic differential equations (SDEs) are considered, as well as improvements to the inference scheme that exploit the SDE structure. We apply the methodology to a Lotka–Volterra system and a prokaryotic auto-regulatory network. PMID:23226583
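Particle MCMC schemes of this kind repeatedly simulate the underlying Markov jump process inside the inference loop. The forward simulator for the Lotka–Volterra example can be sketched with Gillespie's direct method; the rate constants and initial counts below are illustrative, not those of the article:

```python
import random

def gillespie_lv(prey, pred, c1=1.0, c2=0.005, c3=0.6, t_max=10.0,
                 max_events=50000, rng=None):
    """Exact stochastic simulation (Gillespie's direct method) of the
    Lotka-Volterra reaction network: prey birth, predation, predator death.
    Rate constants are illustrative; max_events bounds the run."""
    rng = rng or random.Random(0)
    t, times, states = 0.0, [0.0], [(prey, pred)]
    while t < t_max and len(times) < max_events and (prey > 0 or pred > 0):
        h1 = c1 * prey            # prey -> 2 prey
        h2 = c2 * prey * pred     # prey + predator -> 2 predators
        h3 = c3 * pred            # predator -> 0
        h0 = h1 + h2 + h3
        if h0 == 0.0:
            break
        t += rng.expovariate(h0)  # exponential waiting time to next event
        u = rng.random() * h0     # pick which reaction fires
        if u < h1:
            prey += 1
        elif u < h1 + h2:
            prey -= 1
            pred += 1
        else:
            pred -= 1
        times.append(t)
        states.append((prey, pred))
    return times, states

times, states = gillespie_lv(100, 100)
```

In a particle filter, many such trajectories are propagated between observation times and weighted by the observation likelihood; the SDE approximations mentioned above replace this exact simulator with a cheaper diffusion.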
A series RCL circuit theory for analyzing non-steady-state water uptake of maize plants.
Zhuang, Jie; Yu, Gui-Rui; Nakayama, Keiichi
2014-10-22
Understanding water uptake and transport through the soil-plant continuum is vital for ecosystem management and agricultural water use. Plant water uptake under natural conditions is a non-steady transient flow controlled by root distribution, plant configuration, soil hydraulics, and climatic conditions. Despite significant progress in model development, a mechanistic description of transient water uptake has not been developed or remains incomplete. Here, based on advanced electrical network theory (RLC circuit theory), we developed a non-steady state biophysical model to mechanistically analyze the fluctuations of uptake rates in response to water stress. We found that the non-steady-state model captures the nature of instantaneity and hysteresis of plant water uptake due to the considerations of water storage in plant xylem and coarse roots (capacitance effect), hydraulic architecture of leaf system (inductance effect), and soil-root contact (fuse effect). The model provides insights into the important role of plant configuration and hydraulic heterogeneity in helping plants survive an adverse environment. Our tests against field data suggest that the non-steady-state model has great potential for being used to interpret the smart water strategy of plants, which is intrinsically determined by stem size, leaf size/thickness and distribution, root system architecture, and the ratio of fine-to-coarse root lengths.
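The capacitance effect at the heart of the model can be illustrated with a first-order RC analog: a stem water store buffers the imbalance between transpiration demand and root uptake, so uptake lags demand. All parameters below are illustrative, and the article's full model additionally includes inductance (leaf hydraulics) and fuse (soil-root contact) elements:

```python
import math

# Minimal RC-circuit analog of non-steady-state water uptake: soil water
# potential drives flow through a root/xylem resistance R into a stem
# water "capacitor" C, while transpiration E(t) withdraws water.
R, C = 0.5, 2.0               # hydraulic resistance and capacitance (RC = 1 h)
PSI_SOIL = 0.0                # soil water potential (reference)
dt, steps = 0.01, 4800        # 48 "hours" at 0.01 h resolution

def demand(t):
    """Half-rectified sinusoid: daytime transpiration demand, zero at night."""
    return max(0.0, math.sin(2.0 * math.pi * (t % 24.0) / 24.0))

psi = PSI_SOIL                # stem water potential
uptake, E = [], []
for i in range(steps):
    t = i * dt
    e = demand(t)
    u = (PSI_SOIL - psi) / R  # root uptake driven by the potential difference
    psi += dt * (u - e) / C   # storage buffers the uptake-demand imbalance
    uptake.append(u)
    E.append(e)

# During the second day, peak uptake lags peak demand (hysteresis).
day2 = slice(2400, 4800)
lag = uptake[day2].index(max(uptake[day2])) - E[day2].index(max(E[day2]))
```

The lag is on the order of the RC time constant, which is the "instantaneity and hysteresis" signature the non-steady-state model captures and a purely resistive (steady-state) model cannot.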
Gupta, Pankaj; Friberg, Lena E; Karlsson, Mats O; Krishnaswami, Sriram; French, Jonathan
2010-06-01
CP-690,550, a selective inhibitor of the Janus kinase family, is being developed as an oral disease-modifying antirheumatic drug for the treatment of rheumatoid arthritis (RA). A semi-mechanistic model was developed to characterize the time course of drug-induced absolute neutrophil count (ANC) reduction in a phase 2a study. Data from 264 RA patients receiving 6-week treatment (placebo, 5, 15, 30 mg bid) followed by a 6-week off-treatment period were analyzed. The model included a progenitor cell pool, a maturation chain comprising transit compartments, a circulation pool, and a feedback mechanism. The model was parameterized by system parameters (BASE(h), ktr(h), gamma, and k(circ)), disease effect parameters (DIS), and drug effect parameters (k(off) and k(D)). The disease manifested as an increase in baseline ANC and reduced maturation time due to increased demand from the inflammation site. The drug restored the perturbed system parameters to their normal values via an indirect mechanism. ANC reduction due to a direct myelosuppressive drug effect was not supported. The final model successfully described the dose- and time-dependent changes in ANC and predicted the incidence of neutropenia at different doses reasonably well.
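The structure described (progenitor pool, transit-compartment maturation chain, circulating pool, feedback on proliferation) is that of the widely used Friberg-type myelosuppression model. A minimal Euler sketch of that standard structure, with illustrative parameter values rather than the fitted CP-690,550 estimates:

```python
# Illustrative Euler simulation of a Friberg-type transit-compartment
# model: progenitor pool -> three maturation compartments -> circulating
# pool, with feedback (Circ0/Circ)^gamma on proliferation. All values are
# illustrative, not the estimates reported in the article.
CIRC0 = 1.0          # baseline ANC (normalized)
MTT = 120.0          # mean transit time, h
KTR = 4.0 / MTT      # transit rate for 3 transit compartments + progenitor
GAMMA = 0.16         # feedback exponent
dt = 0.1             # h

def simulate(e_drug, t_on=1000.0, t_total=2000.0):
    """Circulating-neutrophil time course; drug effect e_drug (0..1)
    suppresses proliferation while t < t_on."""
    prol = t1 = t2 = t3 = circ = CIRC0   # start at steady state
    out, t = [], 0.0
    while t < t_total:
        e = e_drug if t < t_on else 0.0
        feedback = (CIRC0 / circ) ** GAMMA   # low counts boost proliferation
        d_prol = KTR * prol * (1.0 - e) * feedback - KTR * prol
        d_t1 = KTR * (prol - t1)
        d_t2 = KTR * (t1 - t2)
        d_t3 = KTR * (t2 - t3)
        d_circ = KTR * t3 - KTR * circ
        prol += dt * d_prol
        t1 += dt * d_t1
        t2 += dt * d_t2
        t3 += dt * d_t3
        circ += dt * d_circ
        out.append(circ)
        t += dt
    return out

anc = simulate(e_drug=0.2)
nadir = min(anc)
```

The transit chain delays the ANC nadir relative to drug exposure, and the feedback term drives recovery (with possible rebound) once the drug effect is removed, which is the qualitative behavior the fitted model describes.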
WholeCellSimDB: a hybrid relational/HDF database for whole-cell model predictions
Karr, Jonathan R.; Phillips, Nolan C.; Covert, Markus W.
2014-01-01
Mechanistic ‘whole-cell’ models are needed to develop a complete understanding of cell physiology. However, extracting biological insights from whole-cell models requires running and analyzing large numbers of simulations. We developed WholeCellSimDB, a database for organizing whole-cell simulations. WholeCellSimDB was designed to enable researchers to search simulation metadata to identify simulations for further analysis, and quickly slice and aggregate simulation results data. In addition, WholeCellSimDB enables users to share simulations with the broader research community. The database uses a hybrid relational/hierarchical data format architecture to efficiently store and retrieve both simulation setup metadata and results data. WholeCellSimDB provides a graphical Web-based interface to search, browse, plot and export simulations; a JavaScript Object Notation (JSON) Web service to retrieve data for Web-based visualizations; a command-line interface to deposit simulations; and a Python API to retrieve data for advanced analysis. Overall, we believe WholeCellSimDB will help researchers use whole-cell models to advance basic biological science and bioengineering. Database URL: http://www.wholecellsimdb.org Source code repository URL: http://github.com/CovertLab/WholeCellSimDB PMID:25231498
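The hybrid relational/hierarchical pattern can be sketched with the standard library: a relational table indexes searchable metadata, while bulk results live in a separate binary store keyed by simulation id. WholeCellSimDB uses HDF for the binary side; plain pickle files stand in for it here, and the schema below is hypothetical, not WholeCellSimDB's actual schema:

```python
import os
import pickle
import sqlite3
import tempfile

# Metadata goes in a relational index for fast search; bulk time-series
# results go in a separate binary store. (WholeCellSimDB uses HDF5 for the
# binary side; pickle stands in for it in this sketch.)
store_dir = tempfile.mkdtemp()
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE simulation (id INTEGER PRIMARY KEY,"
           " model TEXT, length_s REAL, label TEXT)")

def deposit(sim_id, model, length_s, label, results):
    db.execute("INSERT INTO simulation VALUES (?, ?, ?, ?)",
               (sim_id, model, length_s, label))
    with open(os.path.join(store_dir, f"{sim_id}.pkl"), "wb") as f:
        pickle.dump(results, f)   # bulk results, never scanned by queries

def search(min_length):
    """Metadata-only query; no bulk data is touched."""
    rows = db.execute("SELECT id FROM simulation WHERE length_s >= ?",
                      (min_length,))
    return [r[0] for r in rows]

def load_results(sim_id):
    with open(os.path.join(store_dir, f"{sim_id}.pkl"), "rb") as f:
        return pickle.load(f)

deposit(1, "wholecell", 30000.0, "wild type", {"growth": [0.1, 0.2, 0.3]})
deposit(2, "wholecell", 5000.0, "knockout", {"growth": [0.1, 0.1]})
hits = search(min_length=10000.0)
```

The design choice is that searches stay cheap because they never open the large results files; only simulations selected by the metadata query are loaded for slicing and aggregation.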
NASA Astrophysics Data System (ADS)
Evans, M. E.; Merow, C.; Record, S.; Menlove, J.; Gray, A.; Cundiff, J.; McMahon, S.; Enquist, B. J.
2013-12-01
Current attempts to forecast how species' distributions will change in response to climate change suffer under a fundamental trade-off: between modeling many species superficially vs. few species in detail (between correlative vs. mechanistic models). The goals of this talk are two-fold: first, we present a Bayesian multilevel modeling framework, dynamic range modeling (DRM), for building process-based forecasts of many species' distributions at a time, designed to address the trade-off between detail and number of distribution forecasts. In contrast to 'species distribution modeling' or 'niche modeling', which uses only species' occurrence data and environmental data, DRMs draw upon demographic data, abundance data, trait data, occurrence data, and GIS layers of climate in a single framework to account for two processes known to influence range dynamics - demography and dispersal. The vision is to use extensive databases on plant demography, distributions, and traits - in the Botanical Information and Ecology Network, the Forest Inventory and Analysis database (FIA), and the International Tree Ring Data Bank - to develop DRMs for North American trees. Second, we present preliminary results from building the core submodel of a DRM - an integral projection model (IPM) - for a sample of dominant tree species in western North America. IPMs are used to infer demographic niches - i.e., the set of environmental conditions under which population growth rate is positive - and project population dynamics through time. Based on >550,000 data points derived from FIA for nine tree species in western North America, we show IPM-based models of their current and future distributions, and discuss how IPMs can be used to forecast future forest productivity, mortality patterns, and inform efforts at assisted migration.
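The IPM core described above can be sketched as a discretized kernel K(y, x) = s(x)g(y, x) + f(x)c(y), combining survival s, size transition g, fecundity f, and offspring size distribution c, with the population growth rate λ obtained as the dominant eigenvalue. All vital-rate functions below are hypothetical, chosen only to illustrate the method, not fitted to FIA data:

```python
import math

# Minimal integral projection model (IPM) sketch with hypothetical
# vital-rate functions (not fitted to FIA data).
N = 80                       # mesh size
LO, HI = 0.0, 10.0           # size (e.g. log-diameter) range
h = (HI - LO) / N
mesh = [LO + h * (i + 0.5) for i in range(N)]  # midpoint rule

def norm_pdf(y, mu, sd):
    return math.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def survival(x):             # survival rises logistically with size
    return 1.0 / (1.0 + math.exp(-(x - 3.0)))

def growth(y, x):            # next-year size: deterministic trend + noise
    return norm_pdf(y, 0.9 * x + 0.8, 0.6)

def fecundity(x):            # recruits produced, increasing with size
    return 0.05 * x

def recruit_size(y):         # offspring size distribution
    return norm_pdf(y, 1.0, 0.5)

# Discretized kernel: K[i][j] maps density at size mesh[j] to mesh[i].
K = [[h * (survival(x) * growth(y, x) + fecundity(x) * recruit_size(y))
      for x in mesh] for y in mesh]

# Dominant eigenvalue (population growth rate lambda) by power iteration.
n = [1.0] * N
lam = 0.0
for _ in range(100):
    n_new = [sum(K[i][j] * n[j] for j in range(N)) for i in range(N)]
    lam = sum(n_new) / sum(n)
    n = [v / lam for v in n_new]
```

Making the vital-rate functions depend on climate covariates is what turns this into a demographic-niche tool: λ > 1 under a given climate marks conditions where population growth is positive.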
Model-based analysis of keratin intermediate filament assembly
NASA Astrophysics Data System (ADS)
Martin, Ines; Leitner, Anke; Walther, Paul; Herrmann, Harald; Marti, Othmar
2015-09-01
The cytoskeleton of epithelial cells consists of three types of filament systems: microtubules, actin filaments and intermediate filaments (IFs). Here, we took a closer look at type I and type II IF proteins, i.e. keratins. They are hallmark constituents of epithelial cells and are responsible for the generation of stiffness, the cellular response to mechanical stimuli and the integrity of entire cell layers. Keratin networks thereby constitute an important instrument for cells to adapt to their environment. In particular, we applied models to characterize the assembly of keratin K8 and K18 into elongated filaments as a means for network formation. For this purpose, we measured the length of in vitro assembled keratin K8/K18 filaments by transmission electron microscopy at different time points. We evaluated the experimental data of the longitudinal annealing reaction using two models from polymer chemistry: the Schulz-Zimm model and the condensation polymerization model. In both scenarios one has to make assumptions about the reaction process. We compare how well the models fit the measured data and thus determine which assumptions fit best. Based on mathematical modelling of experimental filament assembly data we define basic mechanistic properties of the elongation reaction process.
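In number form the Schulz-Zimm distribution is a gamma distribution with shape z + 1 and mean equal to the number-average length, which fixes the polydispersity at (z + 2)/(z + 1). A minimal sketch with illustrative parameters (not values fitted to the K8/K18 data):

```python
import math

# Schulz-Zimm length distribution as used for filament/polymer ensembles:
# a gamma distribution with shape z + 1 and mean L_n. Parameter values are
# illustrative, not fitted to the K8/K18 measurements.
Z = 2.0        # Schulz-Zimm parameter (breadth of the distribution)
L_N = 300.0    # number-average filament length, nm

def schulz_zimm(length):
    """Number-weighted probability density of filament length."""
    k = Z + 1.0                      # gamma shape
    theta = L_N / k                  # gamma scale
    return (length ** (k - 1.0) * math.exp(-length / theta)
            / (math.gamma(k) * theta ** k))

# Riemann-sum integration to check normalization and the two averages.
dx = 0.5
xs = [0.01 + i * dx for i in range(8000)]   # 0..4000 nm grid
ps = [schulz_zimm(x) for x in xs]
total = sum(ps) * dx
l_n = sum(x * p for x, p in zip(xs, ps)) * dx            # number average
l_w = sum(x * x * p for x, p in zip(xs, ps)) * dx / l_n  # weight average
pdi = l_w / l_n                                          # = (Z+2)/(Z+1)
```

Fitting this one-parameter shape (plus the mean) to the measured length histograms at each time point is what lets the assembly kinetics be summarized by the evolution of L_n and the polydispersity.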
NASA Astrophysics Data System (ADS)
Pokhotelov, Dimitry; Becker, Erich; Stober, Gunter; Chau, Jorge L.
2018-06-01
Thermal tides play an important role in global atmospheric dynamics and provide a key mechanism for the forcing of thermosphere-ionosphere dynamics from below. A method for extracting tidal contributions, based on adaptive filtering, is applied to analyse multi-year observations of mesospheric winds from ground-based meteor radars located in northern Germany and Norway. The observed seasonal variability of tides is compared to simulations with the Kühlungsborn Mechanistic Circulation Model (KMCM). It is demonstrated that the model provides a reasonable representation of the tidal amplitudes, though substantial differences from observations are also noticed. The limitations of applying a conventional coarse-resolution model in combination with a parametrisation of gravity waves are discussed. The work is aimed at the development of an ionospheric model driven by the dynamics of the KMCM.
Focks, Andreas; Klasmeier, Jörg; Matthies, Michael
2010-07-01
Sulfonamides (SA) are antibiotic compounds that are widely used as human and veterinary pharmaceuticals. They are not rapidly biodegradable and have been detected in various environmental compartments. Effects of sulfonamides on microbial endpoints in soil have been reported from laboratory incubation studies. Sulfonamides inhibit the growth of sensitive microorganisms by competitive binding to the dihydropteroate synthase (DHPS) enzyme of folic acid production. A mathematical model was developed that relates the extracellular SA concentration to the inhibition of the relative bacterial growth rate. Two factors, the anionic accumulation factor (AAF) and the cellular affinity factor (CAF), determine the effective concentration of an SA. The AAF describes the SA uptake into bacterial cells and varies with both the extra- and intracellular pH values and with the acidic pKa value of an SA. The CAF subsumes relevant cellular and enzyme properties, and is directly proportional to the DHPS affinity constant for an SA. Based on the model, a mechanistic dose-response relationship is developed and evaluated against previously published data, in which differences in the responses of Pseudomonas aeruginosa and Pantoea agglomerans toward changing medium pH values were found, most likely as a result of their differing pH regulation. The derived dose-response relationship explains the pH and pKa dependency of the mean effective concentration values (EC50) of eight SAs toward two soil bacteria based on AAF and CAF values. The mathematical model can be used to extrapolate sulfonamide effects to other pH values and to calculate the CAF as a pH-independent measure of SA effects on microbial growth. Copyright (c) 2010 SETAC.
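The pH/pKa dependence of the AAF follows the classical ion-trapping (pH-partition) argument: only the neutral species crosses the membrane, so the anion accumulates where the pH is higher. A minimal sketch of that relationship, with an illustrative pKa and an assumed intracellular pH of 7.5 (not values taken from the paper):

```python
def aaf(pka, ph_out, ph_in=7.5):
    """Anionic accumulation factor for a weak acid, assuming only the
    neutral species crosses the membrane (pH-partition hypothesis)."""
    return (1 + 10 ** (ph_in - pka)) / (1 + 10 ** (ph_out - pka))

def ec50_shift(ec50_ref, pka, ph_ref, ph_new):
    """Extrapolate an EC50 to a new medium pH by holding the effective
    internal dose AAF * EC50 constant (the CAF is pH-independent)."""
    return ec50_ref * aaf(pka, ph_ref) / aaf(pka, ph_new)

# Illustrative weak acid with pKa 6.5 (hypothetical, not a fitted value):
print(aaf(6.5, ph_out=6.0))   # acidic medium: strong accumulation inside
print(aaf(6.5, ph_out=8.0))   # basic medium: weak accumulation
print(ec50_shift(10.0, 6.5, ph_ref=6.0, ph_new=8.0))
```

The same two-line model is what lets measured EC50 values at one medium pH be extrapolated to another, as the abstract describes.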
Bechtel, William; Abrahamsen, Adele
2010-09-01
We consider computational modeling in two fields: chronobiology and cognitive science. In circadian rhythm models, variables generally correspond to properties of parts and operations of the responsible mechanism. A computational model of this complex mechanism is grounded in empirical discoveries and contributes a more refined understanding of the dynamics of its behavior. In cognitive science, on the other hand, computational modelers typically advance de novo proposals for mechanisms to account for behavior. They offer indirect evidence that a proposed mechanism is adequate to produce particular behavioral data, but typically there is no direct empirical evidence for the hypothesized parts and operations. Models in these two fields differ in the extent of their empirical grounding, but they share the goal of achieving dynamic mechanistic explanation. That is, they augment a proposed mechanistic explanation with a computational model that enables exploration of the mechanism's dynamics. Using exemplars from circadian rhythm research, we extract six specific contributions provided by computational models. We then examine cognitive science models to determine how well they make the same types of contributions. We suggest that the modeling approach used in circadian research may prove useful in cognitive science as researchers develop procedures for experimentally decomposing cognitive mechanisms into parts and operations and begin to understand their nonlinear interactions.
Scherrer, Stephen R; Rideout, Brendan P; Giorli, Giacomo; Nosal, Eva-Marie; Weng, Kevin C
2018-01-01
Passive acoustic telemetry using coded transmitter tags and stationary receivers is a popular method for tracking movements of aquatic animals. Understanding the performance of these systems is important in array design and in analysis. Close proximity detection interference (CPDI) is a condition in which receivers fail to reliably detect tag transmissions. CPDI generally occurs when the tag and receiver are near one another in acoustically reverberant settings. Here we confirm that CPDI is caused by transmission multipaths that reflect off the environment and arrive at a receiver with sufficient delay relative to the direct signal. We propose a ray-propagation-based model that estimates the arrival of energy via multipaths to predict CPDI occurrence, and we show how deeper deployments are particularly susceptible. A series of experiments was designed to develop and validate our model. Deep (300 m) and shallow (25 m) ranging experiments were conducted using Vemco V13 acoustic tags and VR2-W receivers. Probabilistic modeling of hourly detections was used to estimate the average distance at which a tag could be detected. A mechanistic model for predicting the arrival time of multipaths was developed using parameters from these experiments to calculate the direct and multipath path lengths. This model was retroactively applied to the previous ranging experiments to validate CPDI observations. Two additional experiments were designed to validate predictions of CPDI with respect to combinations of deployment depth and distance. Playback of recorded tags in a tank environment was used to confirm that multipaths arriving after the receiver's blanking interval cause CPDI effects. Analysis of empirical data estimated that the average maximum detection radius (AMDR), the farthest distance at which 95% of tag transmissions went undetected by receivers, was between 840 and 846 m for the deep ranging experiment across all factor permutations.
From these results, CPDI was estimated within a 276.5 m radius of the receiver. These empirical estimations were consistent with mechanistic model predictions. CPDI affected detection at distances closer than 259-326 m from receivers. AMDR determined from the shallow ranging experiment was between 278 and 290 m with CPDI neither predicted nor observed. Results of validation experiments were consistent with mechanistic model predictions. Finally, we were able to predict detection/nondetection with 95.7% accuracy using the mechanistic model's criterion when simulating transmissions with and without multipaths. Close proximity detection interference results from combinations of depth and distance that produce reflected signals arriving after a receiver's blanking interval has ended. Deployment scenarios resulting in CPDI can be predicted with the proposed mechanistic model. For deeper deployments, sea-surface reflections can produce CPDI conditions, resulting in transmission rejection, regardless of the reflective properties of the seafloor.
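The geometry behind CPDI is easy to sketch with the mirror-image method: the sea-surface reflection travels as if from the tag's image above the surface, and CPDI is expected when its extra delay exceeds the receiver's blanking interval. The sound speed and blanking-interval length below are assumed round numbers, not parameters measured in the study:

```python
import math

SPEED_OF_SOUND = 1500.0   # m/s, a nominal seawater value (assumed)

def multipath_delay(depth_tag, depth_rcv, horizontal_range):
    """Extra travel time of the sea-surface reflection relative to the
    direct path, via the mirror-image construction."""
    direct = math.hypot(horizontal_range, depth_rcv - depth_tag)
    surface = math.hypot(horizontal_range, depth_rcv + depth_tag)  # image path
    return (surface - direct) / SPEED_OF_SOUND

def cpdi_expected(depth_tag, depth_rcv, horizontal_range, blanking_s=0.260):
    """CPDI is expected when the reflection lands after the receiver's
    blanking interval (260 ms here is an assumed, not measured, value)."""
    return multipath_delay(depth_tag, depth_rcv, horizontal_range) > blanking_s

# Deep, close deployment: short direct path, long surface bounce.
print(cpdi_expected(290, 295, 50))
# Shallow deployment: the reflection arrives almost with the direct signal.
print(cpdi_expected(20, 22, 50))
```

The two example calls reproduce the qualitative finding of the abstract: deep, close deployments are susceptible to surface-reflection CPDI while shallow ones are not.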
Marmarelis, Vasilis Z.; Berger, Theodore W.
2009-01-01
Parametric and non-parametric modeling methods are combined to study the short-term plasticity (STP) of synapses in the central nervous system (CNS). The nonlinear dynamics of STP are modeled by means of: (1) previously proposed parametric models based on mechanistic hypotheses and/or specific dynamical processes, and (2) non-parametric models (in the form of Volterra kernels) that transform presynaptic signals into postsynaptic signals. In order to use the two approaches synergistically, we estimate the Volterra kernels of the parametric models of STP for four types of synapses using synthetic broadband input–output data. Results show that the non-parametric models accurately and efficiently replicate the input–output transformations of the parametric models. Volterra kernels thus provide a general and quantitative representation of STP. PMID:18506609
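A non-parametric model of the kind described, a discrete second-order Volterra series driven by broadband input, can be sketched directly; the kernel shapes below are arbitrary illustrative choices, not the STP kernels estimated in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete first- and second-order Volterra kernels with
# memory M. These shapes are illustrative only.
M = 20
lags_axis = np.arange(M)
k1 = np.exp(-lags_axis / 5.0)            # first-order (linear) kernel
k2 = -0.1 * np.outer(k1, k1)             # second-order kernel

def volterra_output(x):
    """y(n) = sum_i k1(i) x(n-i) + sum_{i,j} k2(i,j) x(n-i) x(n-j)."""
    y = np.zeros_like(x)
    for n in range(len(x)):
        lags = x[max(0, n - M + 1): n + 1][::-1]   # x(n), x(n-1), ...
        lags = np.pad(lags, (0, M - len(lags)))    # zero history before t=0
        y[n] = k1 @ lags + lags @ k2 @ lags
    return y

# Broadband (white) input, analogous to the synthetic input used to
# probe the parametric STP models in the paper.
x = rng.standard_normal(500)
y = volterra_output(x)
print(y[:3])
```

Kernel estimation then amounts to regressing recorded outputs on these lagged input products, which is what makes the representation general and quantitative.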
An, Gary; Kulkarni, Swati
2015-02-01
Inflammation plays a critical role in the development and progression of cancer, evident in multiple patient populations manifesting increased, non-resolving inflammation, such as inflammatory bowel disease, viral hepatitis and obesity. Given the complexity of both the inflammatory response and the process of oncogenesis, we utilize principles from the field of Translational Systems Biology to bridge the gap between basic mechanistic knowledge and clinical/epidemiologic data by integrating inflammation and oncogenesis within an agent-based model, the Inflammation and Cancer Agent-based Model (ICABM). The ICABM utilizes two previously published and clinically/epidemiologically validated mechanistic models to demonstrate the role of an increased inflammatory milieu on oncogenesis. Development of the ICABM required the creation of a generative hierarchy of the basic hallmarks of cancer to provide a foundation to ground the plethora of molecular and pathway components currently being studied. The ordering schema emphasizes the essential role of a fitness/selection frame shift to sub-organismal evolution as a basic property of cancer, where the generation of genetic instability as a negative effect for multicellular eukaryotic organisms represents the restoration of genetic plasticity used as an adaptive strategy by colonies of prokaryotic unicellular organisms. Simulations with the ICABM demonstrate that inflammation provides a functional environmental context that drives the shift to sub-organismal evolution, where increasingly inflammatory environments led to increasingly damaged genomes in microtumors (tumors below clinical detection size) and cancers. The flexibility of this platform readily facilitates tailoring the ICABM to specific cancers, their associated mechanisms and available epidemiological data. 
One clinical example of an epidemiological finding that could be investigated with this platform is the increased incidence of triple-negative breast cancers in the premenopausal African-American population, which has been identified as having up-regulated markers of inflammation. The fundamental nature of the ICABM suggests its usefulness as a base platform upon which additional molecular detail could be added as needed. Copyright © 2014 Elsevier Inc. All rights reserved.
A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...
Most predictions of the effect of climate change on species’ ranges are based on correlations between climate and current species’ distributions. These so-called envelope models may be a good first approximation, but we need demographically mechanistic models to incorporate the ...
Lambrechts, T; Papantoniou, I; Sonnaert, M; Schrooten, J; Aerts, J-M
2014-10-01
Online and non-invasive quantification of critical tissue engineering (TE) construct quality attributes in TE bioreactors is indispensable for the cost-effective up-scaling and automation of cellular construct manufacturing. However, appropriate monitoring techniques for cellular constructs in bioreactors are still lacking. This study presents a generic and robust approach to determine the cell number and metabolic activity of cell-based TE constructs in perfusion bioreactors based on single oxygen sensor data in dynamic perfusion conditions. A data-based mechanistic modeling technique was used that is able to correlate the number of cells within the scaffold (R² = 0.80) and the metabolic activity of the cells (R² = 0.82) to the dynamics of the oxygen response to step changes in the perfusion rate. This generic non-destructive measurement technique is effective for a large range of cells, from as low as 1.0 × 10⁵ cells to potentially multiple millions of cells, and can open up new possibilities for effective bioprocess monitoring. © 2014 Wiley Periodicals, Inc.
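The idea of identifying construct state from the oxygen response to a perfusion step can be sketched with a first-order model: the time constant and gain of the relaxation are identified from the sensor trace and then mapped to cell number through a calibration. Every number below (time constant, noise level, calibration slope) is synthetic, chosen only to demonstrate the identification step:

```python
import numpy as np

# Synthetic outlet-O2 trace after a step change in perfusion rate:
# first-order relaxation toward a new steady state (assumed dynamics).
t = np.linspace(0, 60, 121)                  # minutes
o2_start, o2_ss, tau = 20.0, 14.0, 8.0       # % air saturation; assumed
o2 = o2_ss + (o2_start - o2_ss) * np.exp(-t / tau)
o2_noisy = o2 + np.random.default_rng(1).normal(0, 0.05, t.size)

# Identify the time constant with a log-linear fit of the residual.
resid = o2_noisy - o2_ss
mask = resid > 0.2                           # skip the noise-dominated tail
slope, _ = np.polyfit(t[mask], np.log(resid[mask]), 1)
tau_hat = -1.0 / slope
print(f"estimated time constant: {tau_hat:.1f} min")

# A calibration (steady-state O2 drop vs. cell number) then maps the
# identified gain to cells; the slope here is entirely hypothetical.
cells_per_pct_drop = 2.0e5
print(f"estimated cells ~ {(o2_start - o2_ss) * cells_per_pct_drop:.2e}")
```

In the actual method, the identified transfer-function parameters, rather than an invented calibration slope, carry the correlation to cell number and metabolic activity.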
Quantifying root water extraction after drought recovery using sub-mm in situ empirical data
Dhiman, Indu; Bilheux, Hassina Z.; DeCarlo, Keito F.; ...
2017-09-09
Root-specific responses to stress are not well known, and have been largely based on indirect measurements of bulk soil water extraction, which limits mechanistic modeling of root function. Here, we used neutron radiography to examine in situ root-soil water dynamics of a previously droughted black cottonwood (Populus trichocarpa) seedling, contrasting water uptake by younger, thinner or older, thicker parts of the fine root system. The smaller diameter roots had greater water uptake capacity per unit surface area than the larger diameter roots, but they had less total surface area, leading to less total water extraction; rates ranged from 0.0027 to 0.0116 g cm⁻² hr⁻¹. The finest, most-active roots were not visible in the radiographs, indicating the need to include destructive sampling. Analysis based on bulk soil hydraulic properties indicated substantial redistribution of water via saturated/unsaturated flow, capillary wicking, and root hydraulic redistribution across the layers - suggesting water uptake dynamics following an infiltration event may be more complex than approximated by common soil hydraulic or root surface area modeling approaches. Lastly, our results highlight the need for continued exploration of root-trait-specific water uptake rates in situ, and of the impacts of roots on soil hydraulic properties - both critical components for mechanistic modeling of root function.
Public Databases Supporting Computational Toxicology
A major goal of the emerging field of computational toxicology is the development of screening-level models that predict potential toxicity of chemicals from a combination of mechanistic in vitro assay data and chemical structure descriptors. In order to build these models, resea...
Integrative models are needed to "decode the toxicological blueprint of active substances that interact with living systems" (Systems toxicology). Computational biology is uniquely positioned to capture this connectivity and help shift decision-making to mechanistic pre...
Kamp, Florian; Cabal, Gonzalo; Mairani, Andrea; Parodi, Katia; Wilkens, Jan J; Carlson, David J
2015-11-01
The physical and biological differences between heavy ions and photons have not been fully exploited and could improve treatment outcomes. In carbon ion therapy, treatment planning must account for physical properties, such as the absorbed dose and nuclear fragmentation, and for differences in the relative biological effectiveness (RBE) of ions compared with photons. We combined the mechanistic repair-misrepair-fixation (RMF) model with Monte Carlo-generated fragmentation spectra for biological optimization of carbon ion treatment plans. Relative changes in double-strand break yields and radiosensitivity parameters with particle type and energy were determined using the independently benchmarked Monte Carlo damage simulation and the RMF model to estimate the RBE values for primary carbon ions and secondary fragments. Depth-dependent energy spectra were generated with the Monte Carlo code FLUKA for clinically relevant initial carbon ion energies. The predicted trends in RBE were compared with the published experimental data. Biological optimization for carbon ions was implemented in a 3-dimensional research treatment planning tool. We compared the RBE and RBE-weighted dose (RWD) distributions of different carbon ion treatment scenarios with and without nuclear fragments. The inclusion of fragments in the simulations led to smaller RBE predictions. A validation of RMF against measured cell survival data reported in published studies showed reasonable agreement. We calculated and optimized the RWD distributions on patient data and compared the RMF predictions with those from other biological models. The RBE values in an astrocytoma tumor ranged from 2.2 to 4.9 (mean 2.8) for a RWD of 3 Gy(RBE) assuming (α/β)X = 2 Gy. These studies provide new information to quantify and assess uncertainties in the clinically relevant RBE values for carbon ion therapy based on biophysical mechanisms. 
We present results from the first biological optimization of carbon ion radiation therapy beams on patient data using a combined RMF and Monte Carlo damage simulation modeling approach. The presented method is advantageous for fast biological optimization. Copyright © 2015 Elsevier Inc. All rights reserved.
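The RBE bookkeeping referenced above (an RBE-weighted dose RWD = RBE × absorbed dose, with the photon reference response at (α/β)X = 2 Gy) can be sketched with the linear-quadratic model. The ion radiosensitivity parameters below are hypothetical stand-ins, not RMF-model outputs:

```python
import math

def photon_equivalent_dose(alpha_x, beta_x, log_surv):
    """Photon dose giving the same LQ survival: solve
    beta_x*D^2 + alpha_x*D + ln(S) = 0 for the positive root."""
    return (-alpha_x + math.sqrt(alpha_x**2 - 4 * beta_x * log_surv)) / (2 * beta_x)

def rbe(alpha_ion, beta_ion, alpha_x, beta_x, dose_ion):
    """RBE at the survival level reached by `dose_ion` of the ion beam."""
    log_surv = -(alpha_ion * dose_ion + beta_ion * dose_ion**2)
    return photon_equivalent_dose(alpha_x, beta_x, log_surv) / dose_ion

# Illustrative parameters only: the photon values give (alpha/beta)_x = 2 Gy
# as in the abstract; the carbon values are assumed, not RMF predictions.
alpha_x, beta_x = 0.10, 0.05     # Gy^-1, Gy^-2
alpha_c, beta_c = 0.45, 0.05     # hypothetical carbon-ion radiosensitivity
d = 1.0                          # Gy physical ion dose

r = rbe(alpha_c, beta_c, alpha_x, beta_x, d)
print(f"RBE = {r:.2f}, RBE-weighted dose = {r * d:.2f} Gy(RBE)")
```

In the paper's workflow, the α and β of each voxel come from the RMF model fed with Monte Carlo fragmentation spectra; this sketch only shows the final dose-conversion arithmetic.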
Social stress shortens lifespan in mice.
Razzoli, Maria; Nyuyki-Dufe, Kewir; Gurney, Allison; Erickson, Connor; McCallum, Jacob; Spielman, Nicholas; Marzullo, Marta; Patricelli, Jessica; Kurata, Morito; Pope, Emily A; Touma, Chadi; Palme, Rupert; Largaespada, David A; Allison, David B; Bartolomucci, Alessandro
2018-05-28
Stress and low socioeconomic status in humans confer increased vulnerability to morbidity and mortality. However, this association is not mechanistically understood, nor has its causation been explored in animal models thus far. Recently, cellular senescence has been suggested as a potential mechanism linking lifelong stress to age-related diseases and shorter life expectancy in humans. Here, we established a causal role for lifelong social stress in shortening lifespan and increasing the risk of cardiovascular disease in mice. Specifically, we developed a lifelong chronic psychosocial stress model in which male mouse aggressive behavior is used to study the impact of negative social confrontations on healthspan and lifespan. C57BL/6J mice identified through unbiased cluster analysis as receiving high levels of aggression while exhibiting low aggression themselves, or identified as subordinate based on an ethologic criterion, had lower median and maximal lifespans, and developed earlier onset of several organ pathologies in the presence of a cellular senescence signature. Critically, subordinate mice developed spontaneous early-stage atherosclerotic lesions of the aortic sinuses characterized by significant immune cell infiltration and sporadic rupture and calcification, none of which was found in dominant subjects. In conclusion, we present here the first rodent model to study and mechanistically dissect the impact of chronic stress on lifespan and diseases of aging. These data highlight a conserved role for social stress and low social status in shortening lifespan and increasing the risk of cardiovascular disease in mammals, and identify a potential mechanistic link for this complex phenomenon. © 2018 The Authors. Aging Cell published by the Anatomical Society and John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Haihua; Zou, Ling; Zhang, Hongbin
As part of the efforts to understand the unexpected “self-regulating” mode of the RCIC (Reactor Core Isolation Cooling) systems in the Fukushima accidents and to extend BWR RCIC and PWR AFW (Auxiliary Feed Water) operational range and flexibility, mechanistic models for the Terry turbine, based on Sandia's original work [1], have been developed and implemented in the RELAP-7 code to simulate the RCIC system. In 2016, our effort focused on normal working conditions of the RCIC system. More complex off-design conditions will be pursued in later years when more data are available. In the Sandia model, the turbine stator inlet velocity is provided according to a reduced-order model which was obtained from a large number of CFD (computational fluid dynamics) simulations. In this work, we propose an alternative method, using an under-expanded jet model to obtain the velocity and thermodynamic conditions for the turbine stator inlet. The models include both an adiabatic expansion process inside the nozzle and a free expansion process outside of the nozzle to ambient pressure. The combined models are able to predict the steam mass flow rate and the supersonic velocity at the Terry turbine bucket entrance, which are the necessary inputs for the Terry turbine rotor model. The analytical models for the nozzle were validated with experimental data and benchmarked with CFD simulations, and they generally agree well with both. The analytical models are suitable for implementation into a reactor system analysis code or severe accident code as part of mechanistic and dynamical models to understand RCIC behavior. The newly developed nozzle models and the turbine rotor model, modified from Sandia's original work, have been implemented into RELAP-7, along with the original Sandia Terry turbine model. A new pump model has also been developed and implemented to couple with the Terry turbine model.
An input model was developed to test the Terry turbine RCIC system, which generates reasonable results. Both the INL RCIC model and the Sandia RCIC model produce results matching major rated parameters, such as the rotational speed, pump torque, and turbine shaft work, for the normal operation condition. The Sandia model is more sensitive to the turbine outlet pressure than the INL model. The next step will be to further refine the Terry turbine models by including two-phase flow cases so that off-design conditions can be simulated. The pump model could also be enhanced with the use of homologous curves.
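The adiabatic-plus-free-expansion step can be illustrated with an ideal-gas sketch. Treating steam as a perfect gas with constant γ is a rough assumption (the actual RELAP-7 models use proper steam properties), and the inlet conditions below are merely RCIC-like round numbers, not plant data:

```python
import math

# Perfect-gas sketch of nozzle expansion to ambient pressure (assumed
# properties for superheated steam; a real model would use steam tables).
GAMMA = 1.3
R = 461.5     # J/(kg K), specific gas constant of water vapour

def exit_velocity(T0, p0, p_ambient):
    """Jet velocity after full expansion from stagnation (T0, p0) to
    ambient pressure, from enthalpy conservation: v = sqrt(2 cp (T0 - T))."""
    cp = GAMMA * R / (GAMMA - 1)
    T = T0 * (p_ambient / p0) ** ((GAMMA - 1) / GAMMA)   # isentropic T drop
    return math.sqrt(2 * cp * (T0 - T))

def mach(v, T):
    return v / math.sqrt(GAMMA * R * T)

# Hypothetical RCIC-like conditions: ~7 MPa steam expanding to ~0.1 MPa.
T0, p0, pa = 560.0, 7.0e6, 1.0e5
v = exit_velocity(T0, p0, pa)
T_exit = T0 * (pa / p0) ** ((GAMMA - 1) / GAMMA)
print(f"jet velocity ~ {v:.0f} m/s, Mach ~ {mach(v, T_exit):.1f}")
```

Under these assumptions the jet is strongly supersonic at the bucket entrance, which is the qualitative regime the combined nozzle models are built to capture.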
Mechanistic Understanding of Microbial Plugging for Improved Sweep Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steven Bryant; Larry Britton
2008-09-30
Microbial plugging has been proposed as an effective low-cost method of permeability reduction. Yet there is a dearth of information on the fundamental processes of microbial growth in porous media, and there are no suitable data to model the process of microbial plugging as it relates to sweep efficiency. To optimize field implementation, a better mechanistic and volumetric understanding of biofilm growth within a porous medium is needed. In particular, the engineering design hinges upon a quantitative relationship between the amount of nutrient consumption, the amount of growth, and the degree of permeability reduction. In this project, experiments were conducted to obtain new data to elucidate this relationship. Experiments in heterogeneous (layered) beadpacks showed that microbes could grow preferentially in the high-permeability layer. Ultimately this caused flow to be equally divided between the high- and low-permeability layers, precisely the behavior needed for MEOR. Remarkably, classical models of microbial nutrient uptake in batch experiments do not explain the nutrient consumption by the same microbes in flow experiments. We propose a simple extension of classical kinetics to account for the self-limiting consumption of nutrient observed in our experiments, and we outline a modeling approach based on the architecture and behavior of biofilms. Such a model would account for the changing trend of nutrient consumption by bacteria with increasing biomass and the onset of biofilm formation. However, no existing model can explain the microbial preference for growth in high-permeability regions, nor is there any obvious extension of the model to capture this observation. An attractive conjecture is that quorum sensing is involved in the heterogeneous bead packs.
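The proposed extension of classical kinetics can be caricatured in a few lines: standard Monod uptake, plus an uptake-capacity factor that decays as the biofilm matures, which makes consumption self-limiting. All parameter values here are invented for illustration, not fitted to the report's experiments:

```python
# Classical Monod batch kinetics vs. a schematic "self-limiting" variant
# in which uptake capacity decays as the biofilm matures.

def simulate(self_limiting, dt=0.1, t_end=100.0):
    mu_max, Ks, Y = 0.3, 0.5, 0.4      # 1/h, g/L, g biomass per g nutrient
    k_limit = 0.1                       # 1/h decay of uptake capacity
    S, X, f = 5.0, 0.05, 1.0            # nutrient, biomass, capacity factor
    for _ in range(int(t_end / dt)):
        rate = f * mu_max * S / (Ks + S) * X   # biomass growth rate
        S = max(S - dt * rate / Y, 0.0)        # nutrient consumed
        X += dt * rate
        if self_limiting:
            f *= 1.0 - k_limit * dt            # capacity shrinks over time
    return S

S_classic = simulate(False)
S_limited = simulate(True)
print(f"residual nutrient: classic {S_classic:.2f} g/L, "
      f"self-limiting {S_limited:.2f} g/L")
```

The classical run exhausts the nutrient, while the self-limiting run plateaus with substrate left over, which is the qualitative mismatch between batch and flow experiments that motivated the extension.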
Generating daily weather data for ecosystem modelling in the Congo River Basin
NASA Astrophysics Data System (ADS)
Petritsch, Richard; Pietsch, Stephan A.
2010-05-01
Daily weather data are an important constraint for diverse applications in ecosystem research. In particular, temperature and precipitation are the main drivers of forest ecosystem productivity. Mechanistic modelling theory relies heavily on daily values of minimum and maximum temperature, precipitation, incident solar radiation and vapour pressure deficit. Although the number of climate measurement stations has increased over the last centuries, there are still regions with limited climate data. For example, the WMO database contains only 16 stations in Gabon with daily weather measurements. Additionally, the available time series are heavily affected by measurement errors and missing values; in the WMO record for Gabon, on average every second day is missing. Monthly means are more robust and may be estimated over larger areas. A good alternative is therefore to interpolate monthly mean values using a sparse network of measurement stations and, based on these monthly data, to generate daily weather data with defined characteristics. The weather generator MarkSim was developed to produce climatological time series for crop modelling in the tropics. It provides daily values for maximum and minimum temperature, precipitation and solar radiation. The monthly means can either be derived from the internal climate surfaces or prescribed as additional inputs. We compared the generated outputs with observations from three climate stations in Gabon (Lastourville, Moanda and Mouilla) and found that maximum temperature and solar radiation were heavily overestimated during the long dry season. This is due to the internal dependency of the solar radiation estimates on precipitation: with no precipitation, a cloudless sky is assumed and thus high incident solar radiation and a large diurnal temperature range. However, in reality it is cloudy in the Congo River Basin during the long dry season.
Therefore, we applied a correction factor to solar radiation and temperature range based on the ratio of values on rainy days and days without rain, respectively. For assessing the impact of our correction, we simulated the ecosystem behaviour using the climate data from Lastourville, Moanda and Mouilla with the mechanistic ecosystem model Biome-BGC. Differences in terms of the carbon, nitrogen and water cycle were subsequently analysed and discussed.
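The correction described above amounts to rescaling generated dry-day radiation so that the rainy-to-dry ratio matches the observed one. A schematic with invented numbers (not Gabonese station data):

```python
import numpy as np

rng = np.random.default_rng(2)
rain = rng.random(365) < 0.4              # synthetic rainy-day indicator
srad = np.where(rain, 14.0, 24.0)         # generated solar radiation, MJ m-2 d-1

# Observed at the station: dry-season days stay cloudy, so radiation on
# dry days is only modestly above rainy-day values (illustrative ratio).
obs_ratio_rainy_to_dry = 0.85

# Scale dry-day values so the generated ratio matches the observed one.
target_dry_mean = srad[rain].mean() / obs_ratio_rainy_to_dry
factor = target_dry_mean / srad[~rain].mean()
srad_corrected = np.where(rain, srad, srad * factor)

new_ratio = srad_corrected[rain].mean() / srad_corrected[~rain].mean()
print(f"dry-day correction factor: {factor:.2f}, corrected ratio: {new_ratio:.2f}")
```

A factor below one shrinks the generator's spurious cloudless-sky radiation on dry days, mirroring the correction applied before driving Biome-BGC.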
Ufuk, Ayşe; Assmus, Frauke; Francis, Laura; Plumb, Jonathan; Damian, Valeriu; Gertz, Michael; Houston, J Brian; Galetin, Aleksandra
2017-04-03
Accumulation of respiratory drugs in human alveolar macrophages (AMs) has not been extensively studied in vitro and in silico despite its potential impact on therapeutic efficacy and/or occurrence of phospholipidosis. The current study aims to characterize the accumulation and subcellular distribution of drugs with respiratory indication in human AMs and to develop an in silico mechanistic AM model to predict lysosomal accumulation of investigated drugs. The data set included 9 drugs previously investigated in rat AM cell line NR8383. Cell-to-unbound medium concentration ratio (K p,cell ) of all drugs (5 μM) was determined to assess the magnitude of intracellular accumulation. The extent of lysosomal sequestration in freshly isolated human AMs from multiple donors (n = 5) was investigated for clarithromycin and imipramine (positive control) using an indirect in vitro method (±20 mM ammonium chloride, NH 4 Cl). The AM cell parameters and drug physicochemical data were collated to develop an in silico mechanistic AM model. Three in silico models differing in their description of drug membrane partitioning were evaluated; model (1) relied on octanol-water partitioning of drugs, model (2) used in vitro data to account for this process, and model (3) predicted membrane partitioning by incorporating AM phospholipid fractions. In vitro K p,cell ranged >200-fold for respiratory drugs, with the highest accumulation seen for clarithromycin. A good agreement in K p,cell was observed between human AMs and NR8383 (2.45-fold bias), highlighting NR8383 as a potentially useful in vitro surrogate tool to characterize drug accumulation in AMs. The mean K p,cell of clarithromycin (81, CV = 51%) and imipramine (963, CV = 54%) were reduced in the presence of NH 4 Cl by up to 67% and 81%, respectively, suggesting substantial contribution of lysosomal sequestration and intracellular binding in the accumulation of these drugs in human AMs. 
The in vitro data showed variability in drug accumulation between individual human AM donors due to possible differences in lysosomal abundance, volume, and phospholipid content, which may have important clinical implications. Consideration of drug-acidic phospholipid interactions significantly improved the performance of the in silico models; use of in vitro Kp,cell obtained in the presence of NH4Cl as a surrogate for membrane partitioning (model (2)) captured the variability in clarithromycin and imipramine Kp,cell observed in vitro and showed the best ability to correctly predict positive and negative lysosomotropic properties. The developed mechanistic AM model represents a useful in silico tool to predict lysosomal and cellular drug concentrations based on drug physicochemical data and system-specific properties, with potential application to other cell types.
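The indirect NH4Cl method described above reduces to simple ratios; a minimal sketch (with illustrative numbers only, not the study's raw data) of how Kp,cell and the lysosome-sensitive share of accumulation can be computed:

```python
def kp_cell(c_cell, c_medium_unbound):
    """Cell-to-unbound-medium concentration ratio (Kp,cell)."""
    return c_cell / c_medium_unbound

def lysosomal_contribution(kp_control, kp_nh4cl):
    """Fractional drop in Kp,cell when lysosomal trapping is abolished
    by NH4Cl (indirect method): the lysosome/binding-sensitive share."""
    return 1.0 - kp_nh4cl / kp_control

# Illustrative check: an ~81% reduction, as reported for imipramine
kp_ctrl = 963.0
kp_with_nh4cl = kp_ctrl * 0.19
print(round(lysosomal_contribution(kp_ctrl, kp_with_nh4cl), 2))  # 0.81
```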
Integrating Environmental Genomics and Biogeochemical Models: a Gene-centric Approach
NASA Astrophysics Data System (ADS)
Reed, D. C.; Algar, C. K.; Huber, J. A.; Dick, G.
2013-12-01
Rapid advances in molecular microbial ecology have yielded an unprecedented amount of data about the evolutionary relationships and functional traits of microbial communities that regulate global geochemical cycles. Biogeochemical models, however, are trailing in the wake of the environmental genomics revolution and such models rarely incorporate explicit representations of bacteria and archaea, nor are they compatible with nucleic acid or protein sequence data. Here, we present a functional gene-based framework for describing microbial communities in biogeochemical models that uses genomics data and provides predictions that are readily testable using cutting-edge molecular tools. To demonstrate the approach in practice, nitrogen cycling in the Arabian Sea oxygen minimum zone (OMZ) was modelled to examine key questions about cryptic sulphur cycling and dinitrogen production pathways in OMZs. By directly linking geochemical dynamics to the genetic composition of microbial communities, the method provides mechanistic insights into patterns and biogeochemical consequences of marine microbes. Such an approach is critical for informing our understanding of the key role microbes play in modulating Earth's biogeochemistry.
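As a toy illustration of the gene-centric idea (generic Michaelis-Menten forms with made-up parameters, not the Arabian Sea model's actual equations), a pathway's rate can be tied to the abundance of its marker functional gene, which in turn grows with the energy the pathway yields:

```python
def gene_centric_step(genes, substrate, v_max=0.5, k_m=2.0,
                      yield_coef=0.1, mortality=0.05, dt=1.0):
    """One explicit-Euler step of a gene-centric rate law: the pathway
    rate scales with the abundance of its marker functional gene under
    Michaelis-Menten substrate limitation; gene abundance grows with
    the energy gained and decays by first-order mortality."""
    rate = v_max * genes * substrate / (k_m + substrate)
    genes = genes + (yield_coef * rate - mortality * genes) * dt
    substrate = max(substrate - rate * dt, 0.0)
    return genes, substrate, rate
```

Because gene abundance is a state variable, predictions of this form can be compared directly against metagenomic gene counts.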
Big data, big knowledge: big data for personalized healthcare.
Viceconti, Marco; Hunter, Peter; Hose, Rod
2015-07-01
The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.
Modeling Bird Migration under Climate Change: A Mechanistic Approach
NASA Technical Reports Server (NTRS)
Smith, James A.
2009-01-01
How will migrating birds respond to changes in the environment under climate change? What are the implications for migratory success under the various accelerated climate change scenarios forecast by the Intergovernmental Panel on Climate Change? How will reductions or increased variability in the number or quality of wetland stop-over sites affect migratory bird species? The answers to these questions have important ramifications for conservation biology and wildlife management. Here, we describe the use of continental-scale simulation modeling to explore how spatio-temporal changes along migratory flyways affect en-route migration success. We use an individual-based, biophysical, mechanistic bird migration model to simulate the movement of shorebirds in North America as a tool to study how such factors as drought and wetland loss may impact migratory success and modify migration patterns. Our model is driven by remote sensing and climate data and incorporates important landscape variables. The energy budget components of the model include resting, foraging, and flight, but predation is presently ignored. We illustrate our model by studying the spring migration of sandpipers through the Great Plains to their Arctic breeding grounds. Why many species of shorebirds have shown significant declines remains a puzzle. Shorebirds are sensitive to stop-over quality and spacing because of their need for frequent refueling stops and their opportunistic feeding patterns. We predict bird "hydrographs", that is, stop-over frequency as a function of latitude, that are in agreement with the literature. Mean stop-over durations predicted by our model for nominal cases are also consistent with the limited available data. For the shorebird species simulated, our model predicts that shorebirds exhibit significant plasticity and are able to shift their migration patterns in response to changing drought conditions.
However, the question remains as to whether this behavior can be maintained over increasing and sustained environmental change. The problem is also much more complex than the current processes captured in our model describe. We have taken some important and interesting steps, and our model demonstrates how local-scale information about individual stop-over sites can be linked into the migratory flyway as a whole. We are incorporating additional, species-specific, mechanistic processes to better reflect different climate change scenarios.
Mechanistic modeling of insecticide risks to breeding birds in North American agroecosystems
Etterson, Matthew; Garber, Kristina; Odenkirchen, Edward
2017-01-01
Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. There is accumulating evidence that insecticides adversely affect non-target wildlife species, including birds, causing mortality, reproductive impairment, and indirect effects through loss of prey base; the type and magnitude of such effects differ by chemical class, or mode of action. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. Current USEPA risk assessments for pesticides generally rely on endpoints from laboratory-based toxicity studies focused on groups of individuals and do not directly assess population-level endpoints. In this paper, we present a mechanistic model that allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to forage in agricultural fields during their breeding season. This model relies on individual-based toxicity data and translates effects into endpoints meaningful at the population level (i.e., magnitude of mortality and reproductive impairment). The model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model was used to assess the relative risk of 12 insecticides applied via aerial spray to control corn pests on a suite of 31 avian species known to forage in cornfields in agroecosystems of the Midwest, USA. We found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (pyrethroids) posing the least risk.
Comparative sensitivity analysis across the 31 species showed that ecological trait parameters related to the timing of breeding and reproductive output per nest attempt offered the greatest explanatory power for predicting the magnitude of risk. An important advantage of TIM/MCnest is that it allows risk assessors to rationally combine both acute (lethal) and chronic (reproductive) effects into a single unified measure of risk. PMID:28467479
Maling, T; Diggle, A J; Thackray, D J; Siddique, K H M; Jones, R A C
2008-12-01
A hybrid mechanistic/statistical model was developed to predict vector activity and epidemics of vector-borne viruses spreading from external virus sources to an adjacent crop. The pathosystem tested was Bean yellow mosaic virus (BYMV) spreading from annually self-regenerating, legume-based pastures to adjacent crops of narrow-leafed lupin (Lupinus angustifolius) in the winter-spring growing season in a region with a Mediterranean-type environment where the virus persists over summer within dormant seed of annual clovers. The model uses a combination of daily rainfall and mean temperature during late summer and early fall to drive aphid population increase, migration of aphids from pasture to lupin crops, and the spread of BYMV. The model successfully predicted the time of arrival of aphid vectors and the resulting BYMV spread for seven of eight datasets from 2 years of field observations at four sites representing different rainfall and geographic zones of the southwestern Australian grainbelt. Sensitivity analysis was performed to determine the relative importance of the main parameters that describe the pathosystem. The hybrid mechanistic/statistical approach created a flexible analytical tool for vector-mediated plant pathosystems that made useful predictions even when field data were not available for some components of the system.
NASA Astrophysics Data System (ADS)
Beltrame, L.; Dunne, T.; Rose, H.; Walker, J.; Morgan, E.; Vickerman, P.; Wagener, T.
2016-12-01
Liver fluke is a flatworm parasite infecting grazing animals worldwide. In the UK, it causes considerable production losses to the cattle and sheep industries and costs farmers millions of pounds each year through reduced growth rates and lower milk yields. A large part of the parasite life-cycle takes place outside of the host, with its survival and development strongly controlled by climatic and hydrologic conditions. Evidence of climate-driven changes in the distribution and seasonality of fluke disease already exists, as the infection is increasingly expanding to new areas and becoming a year-round problem. It is therefore crucial to assess current and potential future impacts of climate variability on the disease to guide interventions at the farm scale and mitigate risk. Climate-based fluke risk models have been available since the 1950s; however, they are based on empirical relationships derived between historical climate and incidence data, and thus are unlikely to be robust for simulating risk under changing conditions. Moreover, they are not dynamic, but estimate risk over large regions in the UK based on monthly average climate conditions, so they cannot be used to investigate the effects of climate variability in support of farmers' decisions. In this study, we introduce a mechanistic model for fluke, which represents habitat suitability for disease development at 25 m resolution with a daily time step, explicitly linking the parasite life-cycle to key hydro-climatic conditions. The model is applied to a case study in the UK, and sensitivity analysis is performed to better understand the role of climate variability in the space-time dynamics of the disease while explicitly accounting for uncertainties. Comparisons are presented with expert knowledge and a widely used empirical model.
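A degree-day-style sketch of how such a habitat-suitability model might couple temperature and wetness at each grid cell (thresholds are illustrative assumptions, not the model's calibrated values):

```python
def daily_development(temp_c, soil_wetness, t_base=10.0, w_crit=0.3):
    """Daily development increment for the free-living fluke stages:
    degree-days accrue above a base temperature, but only when the
    ground is wet enough for the mud-snail intermediate host.
    Thresholds here are illustrative, not the model's fitted values."""
    if soil_wetness < w_crit:
        return 0.0
    return max(0.0, temp_c - t_base)

def seasonal_risk(daily_temps, daily_wetness):
    """Accumulate suitable degree-days over a season as a crude
    grid-cell risk index."""
    return sum(daily_development(t, w)
               for t, w in zip(daily_temps, daily_wetness))
```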
A semi-mechanistic model of dead fine fuel moisture for Temperate and Mediterranean ecosystems
NASA Astrophysics Data System (ADS)
Resco de Dios, Víctor; Fellows, Aaron; Boer, Matthias; Bradstock, Ross; Nolan, Rachel; Goulden, Michel
2014-05-01
Fire is a major disturbance in terrestrial ecosystems globally. It has an enormous economic and social cost, and in the worst cases leads to fatalities. The moisture content of the vegetation (fuel moisture) is one of the main determinants of fire risk. Predicting the moisture content of dead and fine fuel (< 2.5 cm in diameter) is particularly important, as this is often the most important component of the fuel complex for fire propagation. A variety of drought indices and empirical and mechanistic models have been proposed to model fuel moisture. A commonality across these different approaches is that they have been validated neither across large temporal datasets nor across broadly different vegetation types. Here, we present the results of a study performed at 6 locations in California, USA (5 sites) and New South Wales, Australia (1 site), where 10-hour fuel moisture content was measured every 30 minutes for one full year at each site. We observed that drought indices did not accurately predict fuel moisture, and that empirical and mechanistic models both needed site-specific calibrations, which hinders their global application as indices of fuel moisture. We developed a novel, single-equation, semi-mechanistic model based on atmospheric vapor-pressure deficit. Across sites and years, the mean absolute error (MAE) of predicted fuel moisture was 4.7%. MAE dropped below 1% in the critical range of fuel moisture below 10%. The simplicity, accuracy and precision of our model make it suitable for a wide range of applications, from operational purposes to global vegetation models.
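The abstract does not give the fitted coefficients, but the model class it describes, fuel moisture declining exponentially with vapor-pressure deficit, can be sketched as follows (coefficients are illustrative placeholders):

```python
import math

def fuel_moisture(vpd_kpa, fm0=5.0, fm1=53.0, m=0.6):
    """Semi-mechanistic form: dead fine fuel moisture (%) declines
    exponentially with vapor-pressure deficit D (kPa). Coefficients
    are illustrative and would be fit against field observations."""
    return fm0 + fm1 * math.exp(-m * vpd_kpa)

def mae(pred, obs):
    """Mean absolute error, the skill metric quoted in the abstract."""
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(obs)
```

A single equation of this kind needs only one driving variable (VPD), which is what makes it attractive for operational fire-danger systems.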
Mechanistic ecohydrological modeling with Tethys-Chloris: an attempt to unravel complexity
NASA Astrophysics Data System (ADS)
Fatichi, S.; Ivanov, V. Y.; Caporali, E.
2010-12-01
The role of vegetation in controlling and mediating hydrological states and fluxes at the level of individual processes has been explored at length, which has led to improvements in our understanding of mechanisms and patterns in ecohydrological systems. Nonetheless, relatively few efforts have been directed toward the development of continuous, complex, mechanistic ecohydrological models operating at the watershed scale. This study presents a novel ecohydrological model, Tethys-Chloris (T&C), and aims to discuss current limitations and perspectives of the mechanistic approach in ecohydrology. The model attempts to synthesize state-of-the-art knowledge on individual processes and mechanisms drawn from various disciplines such as hydrology, plant physiology, ecology, and biogeochemistry. The model reproduces all essential components of the hydrological cycle, resolving the mass and energy budgets at the hourly scale; it includes energy and mass exchanges in the atmospheric boundary layer; a module of saturated and unsaturated soil water dynamics; two layers of vegetation; and a module of snowpack evolution. The vegetation component parsimoniously parameterizes essential plant life-cycle processes, including photosynthesis, phenology, carbon allocation, tissue turnover, and soil biogeochemistry. Quantitative metrics of model performance are discussed and highlight the capabilities of T&C in reproducing ecohydrological dynamics. The simulated patterns mimic observed hydrological dynamics with high realism, given the uncertainty of imposed boundary conditions and limited data availability. Furthermore, highly satisfactory results are obtained without significant (e.g., automated) calibration efforts despite the large phase-space dimensionality of the model. Such desirable behavior is the result of a significant investment in model design and development.
This suggests that while using the presented tool for high-precision predictions can still be problematic, the mechanistic nature of the model can be extremely valuable for designing virtual experiments, testing hypotheses, and focusing questions of scientific inquiry.
The coefficient of restitution of pressurized balls: a mechanistic model
NASA Astrophysics Data System (ADS)
Georgallas, Alex; Landry, Gaëtan
2016-01-01
Pressurized, inflated balls used in professional sports are regulated so that their behaviour upon impact can be anticipated and allow the game to have its distinctive character. However, the dynamics governing the impacts of such balls, even on stationary hard surfaces, can be extremely complex. The energy transformations, which arise from the compression of the gas within the ball and from the shear forces associated with the deformation of the wall, are examined in this paper. We develop a simple mechanistic model of the dependence of the coefficient of restitution, e, upon both the gauge pressure, P_G, of the gas and the shear modulus, G, of the wall. The model is validated using the results from a simple series of experiments using three different sports balls. The fits to the data are extremely good for P_G > 25 kPa, and consistent values are obtained for the shear modulus G of the wall material. As far as the authors can tell, this simple, mechanistic model of the pressure dependence of the coefficient of restitution is the first in the literature. Keywords: coefficient of restitution, dynamics, inflated balls, pressure, impact model
Effects of septum and pericardium on heart-lung interactions in a cardiopulmonary simulation model.
Karamolegkos, Nikolaos; Albanese, Antonio; Chbat, Nicolas W
2017-07-01
Mechanical heart-lung interactions are often overlooked in clinical settings. However, their impact on cardiac function can be quite significant. Mechanistic physiology-based models can provide invaluable insights into such cardiorespiratory interactions, which occur not only under external mechanical ventilatory support but in normal physiology as well. In this work, we focus on the cardiac component of a previously developed mathematical model of the human cardiopulmonary system, aiming to improve the model's response to the intrathoracic pressure variations that are associated with the respiratory cycle. Interventricular septum and pericardial membrane are integrated into the existing model. Their effect on the overall cardiac response is explained by means of comparison against simulation results from the original model as well as experimental data from literature.
Breed, Greg A.; Golson, Emily A.; Tinker, M. Tim
2017-01-01
The home‐range concept is central in animal ecology and behavior, and numerous mechanistic models have been developed to understand home range formation and maintenance. These mechanistic models usually assume a single, contiguous home range. Here we describe and implement a simple home‐range model that can accommodate multiple home‐range centers, form complex shapes, allow discontinuities in use patterns, and infer how external and internal variables affect movement and use patterns. The model assumes individuals associate with two or more home‐range centers and move among them with some estimable probability. Movement in and around home‐range centers is governed by a two‐dimensional Ornstein‐Uhlenbeck process, while transitions between centers are modeled as a stochastic state‐switching process. We augmented this base model by introducing environmental and demographic covariates that modify transition probabilities between home‐range centers and can be estimated to provide insight into the movement process. We demonstrate the model using telemetry data from sea otters (Enhydra lutris) in California. The model was fit using a Bayesian Markov Chain Monte Carlo method, which estimated transition probabilities, as well as unique Ornstein‐Uhlenbeck diffusion and centralizing tendency parameters. Estimated parameters could then be used to simulate movement and space use that was virtually indistinguishable from real data. We used Deviance Information Criterion (DIC) scores to assess model fit and determined that both wind and reproductive status were predictive of transitions between home‐range centers. Females were less likely to move between home‐range centers on windy days, less likely to move between centers when tending pups, and much more likely to move between centers just after weaning a pup. 
These tendencies are predicted by theoretical movement rules but were not previously known and show that our model can extract meaningful behavioral insight from complex movement data.
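A minimal simulation of this model class, 2-D Ornstein-Uhlenbeck movement with stochastic switching among home-range centres, can be sketched as follows. Parameters are invented for illustration; the fitted otter model additionally makes switching probabilities depend on covariates such as wind and reproductive status:

```python
import math
import random

def simulate_track(centers, beta=0.5, sigma=0.3, p_switch=0.02,
                   dt=1.0, n_steps=500, seed=1):
    """Movement around the currently active home-range centre follows a
    2-D Ornstein-Uhlenbeck process (centralizing drift plus diffusion);
    the animal switches centres with a fixed per-step probability."""
    rng = random.Random(seed)
    k = 0                              # index of the active centre
    x, y = centers[k]
    track = []
    for _ in range(n_steps):
        if rng.random() < p_switch:
            k = (k + 1) % len(centers)  # transition between centres
        cx, cy = centers[k]
        # Euler-Maruyama step of the OU process toward the active centre
        x += beta * (cx - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        y += beta * (cy - y) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        track.append((x, y, k))
    return track
```

Simulated tracks from such a model can be compared against telemetry data, which is how the authors checked that estimated parameters reproduce realistic space use.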
Mathematical modeling of PDC bit drilling process based on a single-cutter mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wojtanowicz, A.K.; Kuru, E.
1993-12-01
An analytical development of a new mechanistic drilling model for polycrystalline diamond compact (PDC) bits is presented. The derivation accounts for the static balance of forces acting on a single PDC cutter and is based on an assumed similarity between bit and cutter. The model is fully explicit, with physical meanings given to all constants and functions. Three equations constitute the mathematical model: torque, drilling rate, and bit life. The equations comprise the cutter's geometry, rock properties, drilling parameters, and four empirical constants. The constants are used to match the model to a PDC drilling process. Also presented are qualitative and predictive verifications of the model. Qualitative verification shows that the model's response to drilling process variables is similar to the behavior of full-size PDC bits. However, the accuracy of the model's predictions of PDC bit performance is limited primarily by imprecision of bit-dull evaluation. The verification study is based upon reported laboratory and field drilling tests as well as field data collected by the authors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zapol, Peter; Bourg, Ian; Criscenti, Louise Jacqueline
2011-10-01
This report summarizes research performed for the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Subcontinuum and Upscaling Task. The work focused on developing a roadmap for including molecular-scale, mechanistic information in continuum-scale models of nuclear waste glass dissolution. This information is derived from molecular-scale modeling efforts that are validated through comparison with experimental data. In addition to developing a master plan to incorporate a subcontinuum mechanistic understanding of glass dissolution into continuum models, methods were developed to generate constitutive dissolution rate expressions from quantum calculations, force field models were selected to generate multicomponent glass structures and gel layers, classical molecular modeling was used to study diffusion through nanopores analogous to those in the interfacial gel layer, and a micro-continuum model (KμC) was developed to study coupled diffusion and reaction at the glass-gel-solution interface.
SHEDS-HT: An Integrated Probabilistic Exposure Model for ...
United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirec
Mao, Zhun; Saint-André, Laurent; Bourrier, Franck; Stokes, Alexia; Cordonnier, Thomas
2015-01-01
Background and Aims In mountain ecosystems, predicting root density in three dimensions (3-D) is highly challenging due to the spatial heterogeneity of forest communities. This study presents a simple and semi-mechanistic model, named ChaMRoots, that predicts root interception density (RID, number of roots m−2). ChaMRoots hypothesizes that RID at a given point is affected by the presence of roots from surrounding trees forming a polygon shape. Methods The model comprises three sub-models for predicting: (1) the spatial heterogeneity – RID of the finest roots in the top soil layer as a function of tree basal area at breast height and the distance between the tree and a given point; (2) the diameter spectrum – the distribution of RID as a function of root diameter, up to 50 mm; and (3) the vertical profile – the distribution of RID as a function of soil depth. The RID data used for fitting the model were measured in two uneven-aged mountain forest ecosystems in the French Alps. These sites differ in tree density and species composition. Key Results In general, the validation of each sub-model indicated that all sub-models of ChaMRoots had good fits. The model achieved a highly satisfactory compromise between the number of aerial input parameters and the fit to the observed data. Conclusions The semi-mechanistic ChaMRoots model focuses on the spatial distribution of root density at the tree-cluster scale, in contrast to the majority of published root models, which function at the level of the individual. Based on easy-to-measure characteristics, simple forest inventory protocols and three sub-models, it achieves a good compromise between the complexity of the case study area and that of the global model structure. ChaMRoots can be easily coupled with spatially explicit individual-based forest dynamics models and thus provides a highly transferable approach for modelling 3-D root spatial distribution in complex forest ecosystems. PMID:26173892
González-Domínguez, Elisa; Armengol, Josep; Rossi, Vittorio
2014-01-01
A mechanistic, dynamic model was developed to predict infection of loquat fruit by conidia of Fusicladium eriobotryae, the causal agent of loquat scab. The model simulates scab infection periods and their severity through the sub-processes of spore dispersal, infection, and latency (i.e., the state variables); change from one state to the following one depends on environmental conditions and on processes described by mathematical equations. Equations were developed using published data on F. eriobotryae mycelium growth, conidial germination, infection, and conidial dispersion pattern. The model was then validated by comparing model output with three independent data sets. The model accurately predicts the occurrence and severity of infection periods as well as the progress of loquat scab incidence on fruit (with concordance correlation coefficients >0.95). Model output agreed with expert assessment of the disease severity in seven loquat-growing seasons. Use of the model for scheduling fungicide applications in loquat orchards may help optimise scab management and reduce fungicide applications. PMID:25233340
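The abstract does not reproduce the fitted equations, but the flavor of such infection sub-models can be sketched with generic forms common in botanical epidemiology: a beta-type temperature response multiplied by a saturating wetness-duration response. All cardinal temperatures and the half-saturation constant below are illustrative, not the F. eriobotryae parameters:

```python
def infection_severity(wetness_hours, temp_c,
                       t_min=5.0, t_opt=20.0, t_max=30.0, w50=12.0):
    """Relative severity of an infection period: a beta-type temperature
    response (1 at t_opt, 0 at the cardinal limits) multiplied by a
    saturating wetness-duration response."""
    if not (t_min < temp_c < t_max):
        return 0.0
    shape = (t_max - t_opt) / (t_opt - t_min)
    f_t = ((temp_c - t_min) / (t_opt - t_min)) * \
          (((t_max - temp_c) / (t_max - t_opt)) ** shape)
    f_w = wetness_hours / (wetness_hours + w50)
    return f_t * f_w
```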
Secondary dispersal driven by overland flow in drylands: Review and mechanistic model development.
Thompson, Sally E; Assouline, Shmuel; Chen, Li; Trahktenbrot, Ana; Svoray, Tal; Katul, Gabriel G
2014-01-01
Seed dispersal alters gene flow, reproduction, migration and ultimately spatial organization of dryland ecosystems. Because many seeds in drylands lack adaptations for long-distance dispersal, seed transport by secondary processes such as tumbling in the wind or mobilization in overland flow plays a dominant role in determining where seeds ultimately germinate. Here, recent developments in modeling runoff generation in spatially complex dryland ecosystems are reviewed with the aim of proposing improvements to mechanistic modeling of seed dispersal processes. The objective is to develop a physically-based yet operational framework for determining seed dispersal due to surface runoff, a process that has gained recent experimental attention. A Buoyant OBject Coupled Eulerian - Lagrangian Closure model (BOB-CELC) is proposed to represent seed movement in shallow surface flows. The BOB-CELC is then employed to investigate the sensitivity of seed transport to landscape and storm properties and to the spatial configuration of vegetation patches interspersed within bare earth. The potential to simplify seed transport outcomes by considering the limiting behavior of multiple runoff events is briefly considered, as is the potential for developing highly mechanistic, spatially explicit models that link seed transport, vegetation structure and water movement across multiple generations of dryland plants.
DOT National Transportation Integrated Search
2018-01-01
This report explores the application of a discrete computational model for predicting the fracture behavior of asphalt mixtures at low temperatures based on the results of simple laboratory experiments. In this discrete element model, coarse aggregat...
Potter, W R; Henderson, B W; Bellnier, D A; Pandey, R K; Vaughan, L A; Weishaupt, K R; Dougherty, T J
1999-11-01
An open three-compartment pharmacokinetic model was applied to the in vivo quantitative structure-activity relationship (QSAR) data of a homologous series of pyropheophorbide photosensitizers for photodynamic therapy (PDT). The physical model was a lipid compartment sandwiched between two identical aqueous compartments. The first compartment was assumed to clear irreversibly at a rate K0. The measured octanol-water partition coefficients, P(i) (where i is the number of carbons in the alkyl chain) and the clearance rate K0 determined the clearance kinetics of the drugs. Solving the coupled differential equations of the three-compartment model produced clearance kinetics for each of the sensitizers in each of the compartments. The third compartment was found to contain the target of PDT. This series of compounds is quite lipophilic. Therefore these drugs are found mainly in the second compartment. The drug level in the third compartment represents a small fraction of the tissue level and is thus not accessible to direct measurement by extraction. The second compartment of the model accurately predicted the clearance from the serum of mice of the hexyl ether of pyropheophorbide a, one member of this series of compounds. The diffusion and clearance rate constants were those found by fitting the pharmacokinetics of the third compartment to the QSAR data. This result validated the magnitude and mechanistic significance of the rate constants used to model the QSAR data. The PDT response to dose theory was applied to the kinetic behavior of the target compartment drug concentration. This produced a pharmacokinetic-based function connecting PDT response to dose as a function of time postinjection. This mechanistic dose-response function was fitted to published, single time point QSAR data for the pheophorbides. As a result, the PDT target threshold dose together with the predicted QSAR as a function of time postinjection was found.
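The compartmental structure described above (two identical aqueous compartments sandwiching a lipid compartment, with irreversible clearance from the first) can be sketched as a set of coupled linear ODEs integrated by Euler stepping. The rate constants and partition coefficient below are illustrative, not the fitted values from the paper.

```python
def three_compartment(k0, kd, P, c0=1.0, dt=0.01, t_end=50.0):
    """Sketch of the open three-compartment model: aqueous compartment 1
    clears irreversibly at rate k0; kd is a diffusive exchange rate and
    P the octanol-water partition coefficient biasing transfer into the
    lipid compartment 2. Returns the concentration time course."""
    c1, c2, c3 = c0, 0.0, 0.0
    t = 0.0
    history = []
    while t < t_end:
        j12 = kd * (P * c1 - c2)   # aqueous 1 <-> lipid flux
        j23 = kd * (c2 - P * c3)   # lipid <-> aqueous 3 flux
        c1 += (-k0 * c1 - j12) * dt
        c2 += (j12 - j23) * dt
        c3 += j23 * dt
        t += dt
        history.append((t, c1, c2, c3))
    return history
```

Consistent with the abstract's observation for lipophilic compounds, a large P keeps most of the remaining drug in the second (lipid) compartment, while the third compartment carries only a small fraction of the total.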
Modeling Creep Effects in Advanced SiC/SiC Composites
NASA Technical Reports Server (NTRS)
Lang, Jerry; DiCarlo, James
2006-01-01
Because advanced SiC/SiC composites are projected to be used for aerospace components with large thermal gradients at high temperatures, efforts are ongoing at NASA Glenn to develop approaches for modeling the anticipated creep behavior of these materials and its subsequent effects on such key composite properties as internal residual stress, proportional limit stress, ultimate tensile strength, and rupture life. Based primarily on in-plane creep data for 2D panels, this presentation describes initial modeling progress at applied composite stresses below matrix cracking for some high performance SiC/SiC composite systems recently developed at NASA. Studies are described to develop creep and rupture models using empirical, mechanical analog, and mechanistic approaches, and to implement them into finite element codes for improved component design and life modeling.
Simulation of Plant Physiological Process Using Fuzzy Variables
Daniel L. Schmoldt
1991-01-01
Qualitative modelling can help us understand and project effects of multiple stresses on trees. It is not practical to collect and correlate empirical data for all combinations of plant/environments and human/climate stresses, especially for mature trees in natural settings. Therefore, a mechanistic model was developed to describe ecophysiological processes. This model...
Human Health Effects of Trichloroethylene: Key Findings and Scientific Issues
Jinot, Jennifer; Scott, Cheryl Siegel; Makris, Susan L.; Cooper, Glinda S.; Dzubow, Rebecca C.; Bale, Ambuja S.; Evans, Marina V.; Guyton, Kathryn Z.; Keshava, Nagalakshmi; Lipscomb, John C.; Barone, Stanley; Fox, John F.; Gwinn, Maureen R.; Schaum, John; Caldwell, Jane C.
2012-01-01
Background: In support of the Integrated Risk Information System (IRIS), the U.S. Environmental Protection Agency (EPA) completed a toxicological review of trichloroethylene (TCE) in September 2011, which was the result of an effort spanning > 20 years. Objectives: We summarized the key findings and scientific issues regarding the human health effects of TCE in the U.S. EPA’s toxicological review. Methods: In this assessment we synthesized and characterized thousands of epidemiologic, experimental animal, and mechanistic studies, and addressed several key scientific issues through modeling of TCE toxicokinetics, meta-analyses of epidemiologic studies, and analyses of mechanistic data. Discussion: Toxicokinetic modeling aided in characterizing the toxicological role of the complex metabolism and multiple metabolites of TCE. Meta-analyses of the epidemiologic data strongly supported the conclusions that TCE causes kidney cancer in humans and that TCE may also cause liver cancer and non-Hodgkin lymphoma. Mechanistic analyses support a key role for mutagenicity in TCE-induced kidney carcinogenicity. Recent evidence from studies in both humans and experimental animals points to the involvement of TCE exposure in autoimmune disease and hypersensitivity. Recent avian and in vitro mechanistic studies provided biological plausibility that TCE plays a role in developmental cardiac toxicity, the subject of substantial debate due to mixed results from epidemiologic and rodent studies. Conclusions: TCE is carcinogenic to humans by all routes of exposure and poses a potential human health hazard for noncancer toxicity to the central nervous system, kidney, liver, immune system, male reproductive system, and the developing embryo/fetus. PMID:23249866
A Decision Analytic Approach to Exposure-Based Chemical Prioritization
Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.
2013-01-01
The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
2013-01-01
Background While the majority of studies have focused on the association between sex hormones and dementia, emerging evidence supports the role of other hormone signals in increasing dementia risk. However, due to the lack of an integrated view on mechanistic interactions of hormone signaling pathways associated with dementia, molecular mechanisms through which hormones contribute to the increased risk of dementia has remained unclear and capacity of translating hormone signals to potential therapeutic and diagnostic applications in relation to dementia has been undervalued. Methods Using an integrative knowledge- and data-driven approach, a global hormone interaction network in the context of dementia was constructed, which was further filtered down to a model of convergent hormone signaling pathways. This model was evaluated for its biological and clinical relevance through pathway recovery test, evidence-based analysis, and biomarker-guided analysis. Translational validation of the model was performed using the proposed novel mechanism discovery approach based on ‘serendipitous off-target effects’. Results Our results reveal the existence of a well-connected hormone interaction network underlying dementia. Seven hormone signaling pathways converge at the core of the hormone interaction network, which are shown to be mechanistically linked to the risk of dementia. Amongst these pathways, estrogen signaling pathway takes the major part in the model and insulin signaling pathway is analyzed for its association to learning and memory functions. Validation of the model through serendipitous off-target effects suggests that hormone signaling pathways substantially contribute to the pathogenesis of dementia. Conclusions The integrated network model of hormone interactions underlying dementia may serve as an initial translational platform for identifying potential therapeutic targets and candidate biomarkers for dementia-spectrum disorders such as Alzheimer’s disease. 
PMID:23885764
Robinson, Joshua F; Theunissen, Peter T; van Dartel, Dorien A M; Pennings, Jeroen L; Faustman, Elaine M; Piersma, Aldert H
2011-09-01
Toxicogenomic evaluations may improve toxicity prediction of in vitro-based developmental models, such as whole embryo culture (WEC) and embryonic stem cells (ESC), by providing a robust mechanistic marker which can be linked with responses associated with developmental toxicity in vivo. While promising in theory, toxicogenomic comparisons between in vivo and in vitro models are complex due to inherent differences in model characteristics and experimental design. Determining factors which influence these global comparisons are critical in the identification of reliable mechanistic-based markers of developmental toxicity. In this study, we compared available toxicogenomic data assessing the impact of the known teratogen, methylmercury (MeHg) across a diverse set of in vitro and in vivo models to investigate the impact of experimental variables (i.e. model, dose, time) on our comparative assessments. We evaluated common and unique aspects at both the functional (Gene Ontology) and gene level of MeHg-induced response. At the functional level, we observed stronger similarity in MeHg-response between mouse embryos exposed in utero (2 studies), ESC, and WEC as compared to liver, brain and mouse embryonic fibroblast MeHg studies. These findings were strongly correlated to the presence of a MeHg-induced developmentally related gene signature. In addition, we identified specific MeHg-induced gene expression alterations associated with developmental signaling and heart development across WEC, ESC and in vivo systems. However, the significance of overlap between studies was highly dependent on traditional experimental variables (i.e. dose, time). In summary, we identify promising examples of unique gene expression responses which show in vitro-in vivo similarities supporting the relevance of in vitro developmental models for predicting in vivo developmental toxicity. Copyright © 2011 Elsevier Inc. All rights reserved.
SUMMARY: Mechanistic data should provide the Agency with a more accurate basis to estimate risk than do the Agency’s default assumptions (10x uncertainty factors, etc.), thereby improving risk assessment decisions. NTD is providing mechanistic data for toxicant effects on two maj...
Agent-Based Modeling in Systems Pharmacology.
Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M
2015-11-01
Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.
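The defining feature of ABM named in the tutorial, tracking individual components rather than a homogeneous pool, can be shown with a deliberately toy model: each agent carries its own state and interacts stochastically with random partners. This is a generic illustration of the ABM style, not one of the tutorial's case studies, and all probabilities are hypothetical.

```python
import random

def run_abm(n_agents=200, p_transmit=0.05, p_recover=0.03,
            n_steps=200, seed=1):
    """Toy agent-based model: every agent is an individual with state
    'S', 'I' or 'R'. Each infected agent contacts one randomly chosen
    agent per step (self-contacts are harmless) and may transmit or
    recover. Contrast with an ODE model, which would track only the
    three population fractions."""
    rng = random.Random(seed)
    states = ['I'] + ['S'] * (n_agents - 1)
    for _ in range(n_steps):
        for i in [k for k, s in enumerate(states) if s == 'I']:
            j = rng.randrange(n_agents)
            if states[j] == 'S' and rng.random() < p_transmit:
                states[j] = 'I'
            if rng.random() < p_recover:
                states[i] = 'R'
    return states.count('S'), states.count('I'), states.count('R')
```

Because each agent's history is explicit, heterogeneous behavior (e.g. agent-specific drug exposure or receptor counts, as in preclinical mechanistic modeling) can be added per agent without changing the model's overall structure.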
Knowledge-based vision and simple visual machines.
Cliff, D; Noble, J
1997-01-01
The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684
APPLICATION OF A MULTIROUTE HUMAN PBPK MODEL FOR BROMODICHLOROMETHANE (BDCM)
Due to its presence in water as a volatile disinfection byproduct, BDCM poses a risk for exposure via multiple routes. Mechanistic data suggest target tissue metabolism could be important for some types of BDCM-induced toxicity. Utilizing our refined PBPK model for BDCM, the impa...
NASA Astrophysics Data System (ADS)
Darmon, David
2018-03-01
In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
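The selection criterion described above can be caricatured in a few lines: embed the series in delay coordinates of trial dimension, predict held-out points, and score them by negative log-predictive likelihood. The sketch below uses a 1-nearest-neighbour predictor with a fixed-width Gaussian predictive density as a simplified stand-in for the paper's nonparametric estimator; the kernel width is an arbitrary choice.

```python
import math

def nlpl(series, dim, n_train, sigma=0.1):
    """Negative log-predictive likelihood for embedding dimension `dim`:
    predict x[t+1] from the delay vector (x[t-dim+1], ..., x[t]) via the
    nearest neighbour in a training segment, scoring held-out points
    under a Gaussian predictive density of width sigma."""
    def embed(t):
        return tuple(series[t - dim + 1:t + 1])
    train = [(embed(t), series[t + 1]) for t in range(dim - 1, n_train - 1)]
    total, count = 0.0, 0
    for t in range(n_train + dim - 1, len(series) - 1):
        q = embed(t)
        _, pred = min(train, key=lambda p: sum((a - b) ** 2
                                               for a, b in zip(p[0], q)))
        err = series[t + 1] - pred
        total += 0.5 * math.log(2 * math.pi * sigma ** 2) \
                 + err ** 2 / (2 * sigma ** 2)
        count += 1
    return total / count
```

The dimension minimizing this score is selected. For a sampled sinusoid, a one-dimensional embedding is ambiguous (rising and falling branches collide), so a two-dimensional embedding should score better.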
STOCHASTIC SIMULATION OF FIELD-SCALE PESTICIDE TRANSPORT USING OPUS AND GLEAMS
Incorporating variability in soil and chemical properties into root zone leaching models should provide a better representation of pollutant distribution in natural field conditions. Our objective was to determine if a more mechanistic rate-based model (Opus) would predict soil w...
An, Gary; Bartels, John; Vodovotz, Yoram
2011-03-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and -content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism.
Battista, C; Woodhead, JL; Stahl, SH; Mettetal, JT; Watkins, PB; Siler, SQ; Howell, BA
2017-01-01
Elevations in serum bilirubin during drug treatment may indicate global liver dysfunction and a high risk of liver failure. However, drugs also can increase serum bilirubin in the absence of hepatic injury by inhibiting specific enzymes/transporters. We constructed a mechanistic model of bilirubin disposition based on known functional polymorphisms in bilirubin metabolism/transport. Using physiologically based pharmacokinetic (PBPK) model‐predicted drug exposure and enzyme/transporter inhibition constants determined in vitro, our model correctly predicted indinavir‐mediated hyperbilirubinemia in humans and rats. Nelfinavir was predicted not to cause hyperbilirubinemia, consistent with clinical observations. We next examined a new drug candidate that caused both elevations in serum bilirubin and biochemical evidence of liver injury in rats. Simulations suggest that bilirubin elevation primarily resulted from inhibition of transporters rather than global liver dysfunction. We conclude that mechanistic modeling of bilirubin can help elucidate underlying mechanisms of drug‐induced hyperbilirubinemia, and thereby distinguish benign from clinically important elevations in serum bilirubin. PMID:28074467
Band, Leah R.; Fozard, John A.; Godin, Christophe; Jensen, Oliver E.; Pridmore, Tony; Bennett, Malcolm J.; King, John R.
2012-01-01
Over recent decades, we have gained detailed knowledge of many processes involved in root growth and development. However, with this knowledge come increasing complexity and an increasing need for mechanistic modeling to understand how those individual processes interact. One major challenge is in relating genotypes to phenotypes, requiring us to move beyond the network and cellular scales, to use multiscale modeling to predict emergent dynamics at the tissue and organ levels. In this review, we highlight recent developments in multiscale modeling, illustrating how these are generating new mechanistic insights into the regulation of root growth and development. We consider how these models are motivating new biological data analysis and explore directions for future research. This modeling progress will be crucial as we move from a qualitative to an increasingly quantitative understanding of root biology, generating predictive tools that accelerate the development of improved crop varieties. PMID:23110897
WholeCellSimDB: a hybrid relational/HDF database for whole-cell model predictions.
Karr, Jonathan R; Phillips, Nolan C; Covert, Markus W
2014-01-01
Mechanistic 'whole-cell' models are needed to develop a complete understanding of cell physiology. However, extracting biological insights from whole-cell models requires running and analyzing large numbers of simulations. We developed WholeCellSimDB, a database for organizing whole-cell simulations. WholeCellSimDB was designed to enable researchers to search simulation metadata to identify simulations for further analysis, and quickly slice and aggregate simulation results data. In addition, WholeCellSimDB enables users to share simulations with the broader research community. The database uses a hybrid relational/hierarchical data format architecture to efficiently store and retrieve both simulation setup metadata and results data. WholeCellSimDB provides a graphical Web-based interface to search, browse, plot and export simulations; a JavaScript Object Notation (JSON) Web service to retrieve data for Web-based visualizations; a command-line interface to deposit simulations; and a Python API to retrieve data for advanced analysis. Overall, we believe WholeCellSimDB will help researchers use whole-cell models to advance basic biological science and bioengineering. Availability: http://www.wholecellsimdb.org; source code repository: http://github.com/CovertLab/WholeCellSimDB. © The Author(s) 2014. Published by Oxford University Press.
Cotten, Cameron; Reed, Jennifer L
2013-01-30
Constraint-based modeling uses mass balances, flux capacity, and reaction directionality constraints to predict fluxes through metabolism. Although transcriptional regulation and thermodynamic constraints have been integrated into constraint-based modeling, kinetic rate laws have not been extensively used. In this study, an in vivo kinetic parameter estimation problem was formulated and solved using multi-omic data sets for Escherichia coli. To narrow the confidence intervals for kinetic parameters, a series of kinetic model simplifications were made, resulting in fewer kinetic parameters than the full kinetic model. These new parameter values are able to account for flux and concentration data from 20 different experimental conditions used in our training dataset. Concentration estimates from the simplified kinetic model were within one standard deviation for 92.7% of the 790 experimental measurements in the training set. Gibbs free energy changes of reaction were calculated to identify reactions that were often operating close to or far from equilibrium. In addition, enzymes whose activities were positively or negatively influenced by metabolite concentrations were also identified. The kinetic model was then used to calculate the maximum and minimum possible flux values for individual reactions from independent metabolite and enzyme concentration data that were not used to estimate parameter values. Incorporating these kinetically-derived flux limits into the constraint-based metabolic model improved predictions for uptake and secretion rates and intracellular fluxes in constraint-based models of central metabolism. This study has produced a method for in vivo kinetic parameter estimation and identified strategies and outcomes of kinetic model simplification. 
We also have illustrated how kinetic constraints can be used to improve constraint-based model predictions for intracellular fluxes and biomass yield and identify potential metabolic limitations through the integrated analysis of multi-omics datasets. PMID:23360254
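The core idea of turning kinetic information into flux limits can be sketched with a single Michaelis-Menten reaction: measured enzyme and metabolite concentrations plus uncertainty ranges on the kinetic parameters bound the possible flux, and those bounds can then replace looser defaults in a constraint-based (FBA) model. The numbers below are illustrative, not values from the E. coli study.

```python
def mm_flux_bounds(kcat_range, e_conc, s_conc, km_range):
    """Min/max possible Michaelis-Menten flux v = kcat*E*S/(Km + S),
    given measured enzyme (e_conc) and substrate (s_conc) levels and
    interval uncertainty on kcat and Km. The extremes occur at the
    slow-enzyme/weak-affinity and fast-enzyme/strong-affinity corners,
    because v is increasing in kcat and decreasing in Km."""
    v = lambda kcat, km: kcat * e_conc * s_conc / (km + s_conc)
    lo = v(kcat_range[0], km_range[1])  # slow enzyme, weak affinity
    hi = v(kcat_range[1], km_range[0])  # fast enzyme, strong affinity
    return lo, hi
```

In a constraint-based model, the returned pair would tighten the lower and upper bounds of the corresponding reaction before solving the flux-balance optimization.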
Rangarajan, Srinivas; Maravelias, Christos T.; Mavrikakis, Manos
2017-11-09
Here, we present a general optimization-based framework for (i) ab initio and experimental data-driven mechanistic modeling and (ii) optimal catalyst design of heterogeneous catalytic systems. Both cases are formulated as a nonlinear optimization problem that is subject to a mean-field microkinetic model and thermodynamic consistency requirements as constraints, for which we seek sparse solutions through a ridge (L2 regularization) penalty. The solution procedure involves an iterative sequence of forward simulation of the differential algebraic equations pertaining to the microkinetic model using a numerical tool capable of handling stiff systems, sensitivity calculations using linear algebra, and gradient-based nonlinear optimization. A multistart approach is used to explore the solution space, and a hierarchical clustering procedure is implemented for statistically classifying potentially competing solutions. An example of methanol synthesis through hydrogenation of CO and CO2 on a Cu-based catalyst is used to illustrate the framework. The framework is fast, is robust, and can be used to comprehensively explore the model solution and design space of any heterogeneous catalytic system.
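The shape of the regularized fitting problem (data misfit plus an L2 penalty on the rate constants, minimized by gradient-based optimization) can be shown on a drastically reduced surrogate: a one-step mean-field rate law fitted by finite-difference gradient descent. The real framework solves stiff microkinetic DAEs with sensitivity analysis and multistart; the rate law, starting point, and step size here are all hypothetical.

```python
def fit_rate_constants(data, lam=0.01, lr=0.01, n_iter=2000):
    """Fit k1, k2 of a toy mean-field rate law r = k1*p/(1 + k2*p)
    (rate vs. pressure with a site-blocking term) to (p, r) data by
    minimizing sum (r_model - r_obs)^2 + lam*(k1^2 + k2^2), the
    ridge-penalized objective, via finite-difference gradient descent."""
    def loss(k1, k2):
        sse = sum((k1 * p / (1 + k2 * p) - r) ** 2 for p, r in data)
        return sse + lam * (k1 * k1 + k2 * k2)
    k1, k2, h = 1.0, 1.0, 1e-6
    for _ in range(n_iter):
        g1 = (loss(k1 + h, k2) - loss(k1 - h, k2)) / (2 * h)
        g2 = (loss(k1, k2 + h) - loss(k1, k2 - h)) / (2 * h)
        k1 -= lr * g1
        k2 -= lr * g2
    return k1, k2
```

In the full framework the same objective is evaluated through forward simulation of the microkinetic model, and the penalty weight controls how aggressively redundant rate parameters are suppressed.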
Aoyama, T; Hirata, K; Yamamoto, Y; Yokota, H; Hayashi, H; Aoyama, Y; Matsumoto, Y
2016-08-01
Midazolam (MDZ) is commonly used for sedating critically ill patients. The daily dose required for adequate sedation increases in increments over 100 h after administration. The objectives of this study were to characterize the MDZ pharmacokinetics in critically ill patients and to describe the phenomenon of increasing daily dose by means of population pharmacokinetic analysis. Data were obtained from 30 patients treated in an intensive care unit. The patients received MDZ intravenously as a combination of bolus and continuous infusion. Serum MDZ concentration was assayed by high-performance liquid chromatography. Population pharmacokinetic analysis was performed using the NONMEM software package. The alteration of clearance unexplained by demographic factors and clinical laboratory data was described as an autoinduction of MDZ clearance using a semi-mechanistic pharmacokinetic-enzyme turnover model. The final population pharmacokinetic model was a one-compartment model estimated by incorporating a semi-mechanistic pharmacokinetic-enzyme turnover model for clearance, taking autoinduction into account. A significant covariate for MDZ clearance was total bilirubin. An increase in total bilirubin indicated a reduction in MDZ clearance. From simulation using the population pharmacokinetic parameters obtained in this study, MDZ clearance increased 2.3 times compared with pre-induced clearance 100 h after the start of 12.5 mg/h continuous infusion. Autoinduction and total bilirubin were significant predictors of the clearance of MDZ in this population. Step-by-step dosage adjustment using this population pharmacokinetic model may be useful for establishing a MDZ dosage regimen in critically ill patients. © 2016 John Wiley & Sons Ltd.
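The semi-mechanistic enzyme-turnover idea can be sketched as a one-compartment infusion model in which the drug concentration stimulates enzyme production and the (normalized) enzyme pool scales clearance, so clearance rises during a prolonged infusion. The parameter values below are illustrative only, not the NONMEM estimates from the study.

```python
def autoinduction_pk(rate_in, cl0, v, k_out=0.01, slope=0.5,
                     dt=0.1, t_end=200.0):
    """One-compartment PK with enzyme-turnover autoinduction (times in
    hours). c: drug concentration; e: enzyme pool, normalized to 1 at
    baseline; induced clearance is cl0 * e. Enzyme production rate
    k_out*(1 + slope*c) rises with drug exposure, while degradation
    stays first-order, so e drifts above baseline during the infusion."""
    c, e, t = 0.0, 1.0, 0.0
    while t < t_end:
        cl = cl0 * e                                  # induced clearance
        c += (rate_in / v - (cl / v) * c) * dt        # infusion in, clearance out
        e += (k_out * (1.0 + slope * c) - k_out * e) * dt  # enzyme turnover
        t += dt
    return c, e, cl0 * e
```

The toy model reproduces only the qualitative behavior reported in the abstract (clearance climbing well above its pre-induced value over the first hundreds of hours of continuous infusion), not the 2.3-fold estimate.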
Development of climate data input files for the Mechanistic-Empirical Pavement Design Guide (MEPDG).
DOT National Transportation Integrated Search
2011-06-30
Prior to this effort, Mississippi's MEPDG climate files were limited to 12 weather stations in only 10 counties, and only seven weather stations had over 8 years (100 months) of data. Hence, building MEPDG climate input datasets improves modeling accu...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohanty, Subhasish; Soppet, William K.; Majumdar, Saurindranath
Argonne National Laboratory (ANL), under the sponsorship of the Department of Energy's Light Water Reactor Sustainability (LWRS) program, is trying to develop a mechanistic approach for more accurate life estimation of LWR components. In this context, ANL has conducted many fatigue experiments under different test and environment conditions on type 316 stainless steel (316SS) material, which is widely used in US reactors. Contrary to the conventional S-N curve based empirical fatigue life estimation approach, the aim of the present DOE sponsored work is to develop an understanding of the material ageing issues more mechanistically (e.g. time dependent hardening and softening) under different test and environmental conditions. Better mechanistic understanding will help develop computer-based advanced modeling tools to better extrapolate stress-strain evolution of reactor components under multi-axial stress states and hence help predict their fatigue life more accurately. In this paper (part-I) the fatigue experiments under different test and environment conditions and related stress-strain results for 316SS are discussed. In a second paper (part-II) the related evolutionary cyclic plasticity material modeling techniques and results are discussed.
Fast integration-based prediction bands for ordinary differential equation models.
Hass, Helge; Kreutz, Clemens; Timmer, Jens; Kaschek, Daniel
2016-04-15
To gain a deeper understanding of biological processes and their relevance in disease, mathematical models are built upon experimental data. Uncertainty in the data leads to uncertainties of the model's parameters and in turn to uncertainties of predictions. Mechanistic dynamic models of biochemical networks are frequently based on nonlinear differential equation systems and feature a large number of parameters, sparse observations of the model components and lack of information in the available data. Due to the curse of dimensionality, classical and sampling approaches propagating parameter uncertainties to predictions are hardly feasible and insufficient. However, for experimental design and to discriminate between competing models, prediction and confidence bands are essential. To circumvent the hurdles of the former methods, an approach to calculate a profile likelihood on arbitrary observations for a specific time point has been introduced, which provides accurate confidence and prediction intervals for nonlinear models and is computationally feasible for high-dimensional models. In this article, reliable and smooth point-wise prediction and confidence bands to assess the model's uncertainty on the whole time-course are achieved via explicit integration with elaborate correction mechanisms. The corresponding system of ordinary differential equations is derived and tested on three established models for cellular signalling. An efficiency analysis is performed to illustrate the computational benefit compared with repeated profile likelihood calculations at multiple time points. The integration framework and the examples used in this article are provided with the software package Data2Dynamics, which is based on MATLAB and freely available at http://www.data2dynamics.org. Contact: helge.hass@fdm.uni-freiburg.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.
For Permissions, please e-mail: journals.permissions@oup.com.
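The core idea of a profile-likelihood prediction band can be sketched in a few lines: scan the parameter space, keep every parameter value whose chi-square lies within the confidence threshold of the minimum, and take the extremes of the model prediction over that accepted set. A minimal stdlib sketch for a one-parameter exponential decay (the data, noise level and grid are invented for illustration; the article's actual method integrates an ODE system instead of scanning a grid):

```python
import math

# Toy data: exponential decay y = exp(-k t) observed with known noise sigma.
t_obs = [0.5, 1.0, 1.5, 2.0, 3.0]
sigma = 0.05
k_true = 0.5
# Deterministic pseudo-noise so the example is reproducible.
y_obs = [math.exp(-k_true * t) + off
         for t, off in zip(t_obs, [0.01, -0.02, 0.015, -0.01, 0.005])]

def chi2(k):
    return sum(((y - math.exp(-k * t)) / sigma) ** 2 for t, y in zip(t_obs, y_obs))

# Profile over a parameter grid; parameters whose chi-square lies within
# 3.84 (95%, 1 dof) of the minimum form the confidence region.
grid = [0.1 + 0.001 * i for i in range(900)]
chi_vals = [chi2(k) for k in grid]
chi_min = min(chi_vals)
accepted = [k for k, c in zip(grid, chi_vals) if c <= chi_min + 3.84]

def prediction_band(t):
    """Point-wise prediction band: extremes of the prediction over the region."""
    preds = [math.exp(-k * t) for k in accepted]
    return min(preds), max(preds)

lo, hi = prediction_band(2.0)
print(f"95% band at t=2: [{lo:.3f}, {hi:.3f}]")
```

Repeating this at every time point is exactly what the article's integration scheme avoids recomputing from scratch.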
Franek, F; Jarlfors, A; Larsen, F; Holm, P; Steffansen, B
2015-09-18
Desvenlafaxine is a biopharmaceutics classification system (BCS) class 1 (high solubility, high permeability) and biopharmaceutical drug disposition classification system (BDDCS) class 3 (high solubility, poor metabolism, implying low permeability) compound. Thus the rate-limiting step for desvenlafaxine absorption (i.e. intestinal dissolution or permeation) is not fully clarified. The aim of this study was to investigate whether dissolution and/or intestinal permeability rate-limit desvenlafaxine absorption from an immediate-release formulation (IRF) and Pristiq®, an extended-release formulation (ERF). Semi-mechanistic models of desvenlafaxine were built (using SimCyp®) by combining in vitro data on dissolution and permeation (mechanistic part of model) with clinical data (obtained from literature) on distribution and clearance (non-mechanistic part of model). The model predictions of desvenlafaxine pharmacokinetics after IRF and ERF administration were compared with published clinical data from 14 trials. Desvenlafaxine in vivo dissolution from the IRF and ERF was predicted from in vitro solubility studies and biorelevant dissolution studies (using the USP3 dissolution apparatus), respectively. Desvenlafaxine apparent permeability (Papp) at varying apical pH was investigated using the Caco-2 cell line and extrapolated to effective intestinal permeability (Peff) in human duodenum, jejunum, ileum and colon. Desvenlafaxine pKa values and octanol-water partition coefficients (Do:w) were determined experimentally. Due to predicted rapid dissolution after IRF administration, desvenlafaxine was predicted to be available for permeation in the duodenum. Desvenlafaxine Do:w and Papp increased approximately 13-fold when increasing apical pH from 5.5 to 7.4. Desvenlafaxine Peff thus increased with pH down the small intestine.
Consequently, desvenlafaxine absorption from an IRF appears rate-limited by low Peff in the upper small intestine, which "delays" the predicted time to the maximal plasma concentration (tmax), consistent with clinical data. Conversely, desvenlafaxine absorption from the ERF appears rate-limited by dissolution due to the formulation, which tends to negate the influence of pH-dependent permeability on absorption. We suggest that desvenlafaxine Peff is mainly driven by transcellular diffusion of the unionized form. In the case of desvenlafaxine, poor metabolism does not imply low intestinal permeability, as indicated by the BDDCS, merely low duodenal/jejunal permeability.
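The pH dependence described above follows from the Henderson-Hasselbalch relation for a basic drug: only the unionized fraction diffuses transcellularly, and that fraction grows with apical pH. A minimal sketch (the pKa value is hypothetical, not the measured desvenlafaxine pKa; the ideal-fraction ratio overshoots the reported ~13-fold Papp change because real Papp also reflects other permeation resistances):

```python
def fraction_unionized_base(pH, pKa):
    """Henderson-Hasselbalch: fraction of a monoprotic base in the neutral form."""
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

pKa = 9.5  # hypothetical value for illustration only

f_duodenum = fraction_unionized_base(5.5, pKa)  # acidic apical pH
f_ileum = fraction_unionized_base(7.4, pKa)     # near-neutral apical pH

# If transcellular diffusion of the unionized form dominates, Papp scales with f.
print(f"ideal fold increase pH 5.5 -> 7.4: {f_ileum / f_duodenum:.1f}")
```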
Simulating malaria transmission in the current and future climate of West Africa
NASA Astrophysics Data System (ADS)
Yamana, T. K.; Bomblies, A.; Eltahir, E. A. B.
2015-12-01
Malaria transmission in West Africa is closely tied to climate, as rain-fed water pools provide breeding habitat for the Anopheles mosquito vector, and temperature affects the mosquito's ability to spread disease. We present results of a highly detailed, spatially explicit mechanistic modelling study exploring the relationships between the environment and malaria in the current and future climate of West Africa. A mechanistic model of human immunity was incorporated into an existing agent-based model of malaria transmission, allowing us to move beyond entomological measures such as mosquito density and vectorial capacity to analyzing the prevalence of the malaria parasite within human populations. The result is a novel modelling tool that mechanistically simulates all of the key processes linking environment to malaria transmission. Simulations were conducted across climate zones in West Africa, linking temperature and rainfall to entomological and epidemiological variables with a focus on nonlinearities due to threshold effects and interannual variability. Comparisons to observations from the region confirmed that the model provides a reasonable representation of the entomological and epidemiological conditions in this region. We used the predictions of future climate from the most credible CMIP5 climate models to predict the change in frequency and severity of malaria epidemics in West Africa as a result of climate change.
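One of the threshold nonlinearities such models capture is the temperature dependence of parasite development inside the mosquito. A common Detinova-type degree-day relation (the constants below are the widely quoted values for P. falciparum, used here as an illustrative assumption, not the study's specific parameterization) makes the extrinsic incubation period blow up near the developmental threshold:

```python
def sporogony_days(temp_c, dd=111.0, t_min=16.0):
    """Degree-day model for the extrinsic incubation period.
    Returns None at or below the threshold t_min (transmission shuts off)."""
    if temp_c <= t_min:
        return None
    return dd / (temp_c - t_min)

for t in (15.0, 18.0, 22.0, 28.0):
    print(t, sporogony_days(t))
```

Because mosquito survival is limited, the steep rise of the incubation period near 16 °C acts as a hard transmission threshold.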
Fitzpatrick, Megan J; Mathewson, Paul D; Porter, Warren P
2015-01-01
Mechanistic models provide a powerful, minimally invasive tool for gaining a deeper understanding of the ecology of animals across geographic space and time. In this paper, we modified and validated the accuracy of the mechanistic model Niche Mapper for simulating heat exchanges of animals with counter-current heat exchange mechanisms in their legs and animals that wade in water. We then used Niche Mapper to explore the effects of wading and counter-current heat exchange on the energy expenditures of Whooping Cranes, a long-legged wading bird. We validated model accuracy against the energy expenditure of two captive Whooping Cranes measured using the doubly-labeled water method and time energy budgets. Energy expenditure values modeled by Niche Mapper were similar to values measured by the doubly-labeled water method and values estimated from time-energy budgets. Future studies will be able to use Niche Mapper as a non-invasive tool to explore energy-based limits to the fundamental niche of Whooping Cranes and apply this knowledge to management decisions. Basic questions about the importance of counter-current exchange and wading to animal physiological tolerances can also now be explored with the model.
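The energetic effect of counter-current exchange can be illustrated with a deliberately simple steady-state sketch: an exchanger recovers a fraction of the heat carried into the leg, reducing Newtonian heat loss to the water. The effectiveness and conductance values are hypothetical and are not Niche Mapper's actual formulation:

```python
def leg_heat_loss(t_core, t_water, conductance, effectiveness):
    """Heat lost through a wading bird's leg (W). A counter-current exchanger
    recovers a fraction 'effectiveness' of the heat in arterial blood;
    effectiveness=0 reduces to plain Newtonian cooling."""
    gradient = t_core - t_water
    return conductance * gradient * (1.0 - effectiveness)

no_exchange = leg_heat_loss(41.0, 10.0, 0.5, 0.0)    # toy conductance, W/K
with_exchange = leg_heat_loss(41.0, 10.0, 0.5, 0.8)
print(no_exchange, with_exchange)
```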
Proposal of an in silico profiler for categorisation of repeat dose toxicity data of hair dyes.
Nelms, M D; Ates, G; Madden, J C; Vinken, M; Cronin, M T D; Rogiers, V; Enoch, S J
2015-05-01
This study outlines the analysis of 94 chemicals with repeat dose toxicity data taken from Scientific Committee on Consumer Safety opinions for commonly used hair dyes in the European Union. Structural similarity was applied to group these chemicals into categories. Subsequent mechanistic analysis suggested that toxicity to mitochondria is potentially a key driver of repeat dose toxicity for chemicals within each of the categories. The mechanistic hypothesis allowed for an in silico profiler consisting of four mechanism-based structural alerts to be proposed. These structural alerts related to a number of important chemical classes such as quinones, anthraquinones, substituted nitrobenzenes and aromatic azos. This in silico profiler is intended for grouping chemicals into mechanism-based categories within the adverse outcome pathway paradigm.
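Grouping by structural similarity, as in the category formation above, is commonly done with a Tanimoto coefficient over structural fingerprints. A stdlib sketch using invented fragment sets in place of real fingerprints (actual work would use a cheminformatics toolkit and SMARTS-based alerts):

```python
# Each chemical is represented by a set of structural fragments (hypothetical).
chemicals = {
    "dye_A": {"quinone", "phenol"},
    "dye_B": {"quinone", "phenol", "amine"},
    "dye_C": {"nitrobenzene", "amine", "azo"},
    "dye_D": {"nitrobenzene", "azo"},
}

def tanimoto(a, b):
    """Tanimoto similarity of two fragment sets."""
    return len(a & b) / len(a | b)

def group_by_similarity(items, threshold=0.3):
    """Greedy single-link grouping: a chemical joins the first category
    containing any member at or above the similarity threshold."""
    categories = []
    for name, frags in items.items():
        for cat in categories:
            if any(tanimoto(frags, items[m]) >= threshold for m in cat):
                cat.append(name)
                break
        else:
            categories.append([name])
    return categories

print(group_by_similarity(chemicals))
```

Each resulting category would then be inspected for a shared mechanistic alert (e.g. a quinone or nitroaromatic motif), as the study does.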
NASA Astrophysics Data System (ADS)
Jaiswal, D.; Long, S.; Parton, W. J.; Hartman, M.
2012-12-01
A coupled modeling system of crop growth model (BioCro) and biogeochemical model (DayCent) has been developed to assess the two-way interactions between plant growth and biogeochemistry. Crop growth in BioCro is simulated using a detailed mechanistic biochemical and biophysical multi-layer canopy model and partitioning of dry biomass into different plant organs according to phenological stages. Using hourly weather records, the model partitions light between dynamically changing sunlit and shaded portions of the canopy and computes carbon and water exchange with the atmosphere and through the canopy for each hour of the day, each day of the year. The model has been parameterized for the bioenergy crops sugarcane, Miscanthus and switchgrass, and validation has shown it to predict growth cycles and partitioning of biomass to a high degree of accuracy. As such it provides an ideal input for a soil biogeochemical model. DayCent is an established model for predicting long-term changes in soil C & N and soil-atmosphere exchanges of greenhouse gases. At present, DayCent uses a relatively simple productivity model. In this project BioCro has replaced this simple model to provide DayCent with a productivity and growth model equal in detail to its biogeochemistry. Dynamic coupling of these two models to produce CroCent allows for differential C: N ratios of litter fall (based on rates of senescence of different plant organs) and calibration of the model for realistic plant productivity in a mechanistic way. A process-based approach to modeling plant growth is needed for bioenergy crops because research on these crops (especially second generation feedstocks) has started only recently, and detailed agronomic information for growth, yield and management is too limited for effective empirical models. 
The coupled model provides a means to test and improve the model against high-resolution data, such as that obtained by eddy covariance, and to explore the yield implications of different crop and soil management.
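The coupling loop described above can be sketched schematically: each day the crop component supplies productivity and organ-specific litter (carbon plus C:N ratio) to the soil component, which returns mineral nitrogen availability. All class names, rates and the N-limitation form below are hypothetical stand-ins, not the CroCent interface:

```python
class CropModel:
    """Stand-in for the canopy/growth component (BioCro-like)."""
    def daily_step(self, soil_nitrogen):
        npp = 10.0 * min(1.0, soil_nitrogen / 5.0)  # g C/m2/day, N-limited
        # Litter by organ: (g C shed, C:N ratio) -> differential C:N in litter fall.
        litter = {"leaf": (0.1 * npp, 25.0), "root": (0.05 * npp, 60.0)}
        return npp, litter

class SoilModel:
    """Stand-in for the biogeochemistry component (DayCent-like)."""
    def __init__(self):
        self.mineral_n = 6.0  # g N/m2
    def daily_step(self, litter):
        for carbon, cn_ratio in litter.values():
            self.mineral_n += 0.2 * carbon / cn_ratio  # partial mineralization
        self.mineral_n *= 0.99                         # leaching/gaseous losses
        return self.mineral_n

crop, soil = CropModel(), SoilModel()
for day in range(30):
    npp, litter = crop.daily_step(soil.mineral_n)
    soil.daily_step(litter)
print(round(npp, 2), round(soil.mineral_n, 2))
```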
Modelling algae-duckweed interaction under chemical pressure within a laboratory microcosm.
Lamonica, Dominique; Clément, Bernard; Charles, Sandrine; Lopes, Christelle
2016-06-01
Contaminant effects on species are generally assessed with single-species bioassays. As a consequence, interactions between species that occur in ecosystems are not taken into account. To investigate the effects of contaminants on interacting species dynamics, our study describes the functioning of a 2-L laboratory microcosm with two species, the duckweed Lemna minor and the microalgae Pseudokirchneriella subcapitata, exposed to cadmium contamination. We modelled the dynamics of both species and their interactions using a mechanistic model based on coupled ordinary differential equations. The main processes occurring in this two-species microcosm were thus formalised, including growth and settling of algae, growth of duckweeds, interspecific competition between the two species and cadmium effects. We estimated model parameters by Bayesian inference, using simultaneously all the data issued from multiple laboratory experiments specifically conducted for this study. Cadmium concentrations ranged between 0 and 50 μg·L⁻¹. For all parameters of our model, we obtained biologically realistic values and reasonable uncertainties. Only duckweed dynamics was affected by interspecific competition, while algal dynamics was not impaired. The growth rate of both species decreased with cadmium concentration, as did competition intensity, showing that the interspecific competition pressure on duckweed decreased with cadmium concentration. This innovative combination of mechanistic modelling and model-guided experiments was successful in understanding the functioning of the algae-duckweed microcosm with and without contaminant. This approach appears promising for including interactions between species when studying contaminant effects on ecosystem functioning.
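The structure of such a coupled ODE system can be sketched with a Lotka-Volterra-style model integrated by simple Euler steps (stdlib only). All parameter values and the exponential dose-response are hypothetical, not the paper's fitted posteriors; the asymmetry (only duckweed feels competition, only algae settle) mirrors the processes listed above:

```python
import math

def simulate(cd_ug_per_l, days=30.0, dt=0.01):
    """Euler integration of duckweed (L) and algae (A) dynamics under cadmium."""
    stress = math.exp(-0.03 * cd_ug_per_l)   # hypothetical dose-response (0..1]
    r_l, r_a = 0.3 * stress, 0.5 * stress    # intrinsic growth rates /day
    k_l, k_a = 100.0, 500.0                  # carrying capacities
    beta = 0.2 * stress                      # algae -> duckweed competition
    settle = 0.05                            # algal settling /day
    L, A = 10.0, 20.0
    for _ in range(int(days / dt)):
        dL = r_l * L * (1 - (L + beta * A) / k_l)   # competition hits duckweed only
        dA = r_a * A * (1 - A / k_a) - settle * A
        L += dL * dt
        A += dA * dt
    return L, A

L0, A0 = simulate(0.0)
L50, A50 = simulate(50.0)
print(round(L0, 1), round(A0, 1), round(L50, 1), round(A50, 1))
```

Bayesian inference would then fit the rate constants and the dose-response parameters to all experiments jointly.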
DOT National Transportation Integrated Search
2014-01-01
The main objective of this study was to collect and evaluate climatic and soil data pertaining to Oklahoma for the climatic model (EICM) in the mechanistic-empirical design guide for pavements. The EICM climatic input files were updated and extended ...
Use of Mechanistic Models to Improve Understanding: differential, mass-balance, process-based approaches; spatial and temporal resolution; necessary simplifications of system complexity; combining field monitoring and modeling efforts; balance between capturing complexity and maintaining...
USDA-ARS?s Scientific Manuscript database
Ammonia volatilization from treatment lagoons varies widely with the total ammonia concentration, pH, temperature, suspended solids, atmospheric ammonia concentration above the water surface, and wind speed. Ammonia emissions were estimated with a process-based mechanistic model integrating ammonia ...
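The drivers listed above combine naturally in a two-film-type flux sketch: the emitting species is the free (un-ionized) NH3 fraction, set by pH and a temperature-dependent pKa, and the flux scales with a wind-dependent mass-transfer coefficient. The pKa(T) relation below is a commonly used empirical form (Emerson-type); the mass-transfer coefficient is a purely hypothetical linear wind dependence:

```python
def free_nh3_fraction(pH, temp_c):
    """Fraction of total ammoniacal N present as free (un-ionized) NH3."""
    t_k = temp_c + 273.15
    pKa = 0.09018 + 2729.92 / t_k   # commonly used empirical relation
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

def nh3_flux(tan_mg_l, pH, temp_c, wind_m_s, nh3_atm_mg_l=0.0):
    """Emission flux ~ k(wind) * (free NH3 in solution - atmospheric NH3).
    k is an illustrative toy coefficient, not a fitted value."""
    k = 0.001 + 0.002 * wind_m_s
    return k * (free_nh3_fraction(pH, temp_c) * tan_mg_l - nh3_atm_mg_l)

print(nh3_flux(200.0, 8.0, 25.0, 3.0))
```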
Mechanistic Enzyme Models: Pyridoxal and Metal Ions.
ERIC Educational Resources Information Center
Hamilton, S. E.; And Others
1984-01-01
Background information, procedures, and results are presented for experiments on the pyridoxal/metal ion model system. These experiments illustrate catalysis through Schiff's base formation between aldehydes/ketones and primary amines, catalysis by metal ions, and the predictable manner in which metal ions inhibit or catalyze reactions. (JN)
Mechanistic Considerations Used in the Development of the PROFIT PCI Failure Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pankaskie, P. J.
A fuel Pellet-Zircaloy Cladding (thermo-mechanical-chemical) Interactions (PCI) failure model for estimating the probability of failure in transient increases in power (PROFIT) was developed. PROFIT is based on 1) standard statistical methods applied to available PCI fuel failure data and 2) a mechanistic analysis of the environmental and strain-rate-dependent stress versus strain characteristics of Zircaloy cladding. The statistical analysis of fuel failures attributable to PCI suggested that parameters in addition to power, transient increase in power, and burnup are needed to define PCI fuel failures in terms of probability estimates with known confidence limits. The PROFIT model, therefore, introduces an environmental and strain-rate dependent strain energy absorption to failure (SEAF) concept to account for the stress versus strain anomalies attributable to interstitial-dislocation interaction effects in the Zircaloy cladding. Assuming that the power ramping rate is the operating corollary of strain-rate in the Zircaloy cladding, the variables of first order importance in the PCI fuel failure phenomenon are postulated to be: 1. pre-transient fuel rod power, P_I, 2. transient increase in fuel rod power, ΔP, 3. fuel burnup, Bu, and 4. the constitutive material property of the Zircaloy cladding, SEAF.
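The postulated variable set maps naturally onto a probabilistic failure model. A toy logistic sketch (the coefficients are invented for illustration and are not the fitted PROFIT statistics): higher ramps and burnup raise the failure odds, while higher strain energy absorption to failure lowers them:

```python
import math

def pci_failure_probability(power_kw_m, delta_p_kw_m, burnup_mwd_kgu, seaf):
    """Toy logistic model of PCI failure probability.
    Coefficients are illustrative placeholders, not fitted PROFIT values."""
    z = (-6.0
         + 0.05 * power_kw_m        # pre-transient power P_I
         + 0.15 * delta_p_kw_m      # transient power increase
         + 0.08 * burnup_mwd_kgu    # burnup Bu
         - 0.5 * seaf)              # strain energy absorption to failure
    return 1.0 / (1.0 + math.exp(-z))

mild = pci_failure_probability(25.0, 5.0, 10.0, 6.0)
severe = pci_failure_probability(35.0, 20.0, 40.0, 2.0)
print(round(mild, 3), round(severe, 3))
```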
Shankaran, Harish; Zhang, Yi; Chrisler, William B.; Ewald, Jonathan A.; Wiley, H. Steven; Resat, Haluk
2012-01-01
The epidermal growth factor receptor (EGFR) belongs to the ErbB family of receptor tyrosine kinases, and controls a diverse set of cellular responses relevant to development and tumorigenesis. ErbB activation is a complex process involving receptor-ligand binding, receptor dimerization, phosphorylation, and trafficking (internalization, recycling and degradation), which together dictate the spatio-temporal distribution of active receptors within the cell. The ability to predict this distribution, and elucidation of the factors regulating it, would help to establish a mechanistic link between ErbB expression levels and the cellular response. Towards this end, we constructed mathematical models to determine the contributions of receptor dimerization and phosphorylation to EGFR activation, and to examine the dependence of these processes on sub-cellular location. We collected experimental datasets for EGFR activation dynamics in human mammary epithelial cells, with the specific goal of model parameterization, and used the data to estimate parameters for several alternate models. Model-based analysis indicated that: 1) signal termination via receptor dephosphorylation in late endosomes, prior to degradation, is an important component of the response, 2) less than 40% of the receptors in the cell are phosphorylated at any given time, even at saturating ligand doses, and 3) receptor phosphorylation kinetics at the cell surface and early endosomes are comparable. We validated the last finding by measuring the EGFR dephosphorylation rates at various times following ligand addition both in whole cells and in endosomes using ELISAs and fluorescent imaging. Overall, our results provide important information on how EGFR phosphorylation levels are regulated within cells. This study demonstrates that an iterative cycle of experiments and modeling can be used to gain mechanistic insight regarding complex cell signaling networks. PMID:22952062
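The compartmental logic of such trafficking models (surface activation, internalization, endosomal maturation, late-endosomal dephosphorylation before degradation) can be sketched as a small linear ODE chain. All rate constants below are hypothetical, not the fitted values from the study:

```python
def simulate_egfr(hours=2.0, dt=0.001):
    """Compartments: surface phospho-receptor (Ps), early-endosome (Pe),
    late-endosome (Pl). Dephosphorylation in late endosomes precedes
    degradation, terminating the signal. Returns total phospho-receptor."""
    k_phos, k_int, k_mat = 1.0, 0.5, 0.4   # /h: activation, internalization, maturation
    k_dephos, k_deg = 2.0, 0.3             # /h: late-endosomal dephosphorylation, degradation
    unphos, Ps, Pe, Pl = 100.0, 0.0, 0.0, 0.0
    for _ in range(int(hours / dt)):
        d_u = -k_phos * unphos
        d_Ps = k_phos * unphos - k_int * Ps
        d_Pe = k_int * Ps - k_mat * Pe
        d_Pl = k_mat * Pe - (k_dephos + k_deg) * Pl
        unphos += d_u * dt; Ps += d_Ps * dt; Pe += d_Pe * dt; Pl += d_Pl * dt
    return Ps + Pe + Pl

print(round(simulate_egfr(), 1))
```

Model parameterization, as in the study, would fit such rate constants to measured phosphorylation time courses in whole cells and endosomes.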
Bunker, Alex; Magarkar, Aniket; Viitala, Tapani
2016-10-01
Combined experimental and computational studies of lipid membranes and liposomes, with the aim to attain mechanistic understanding, result in a synergy that makes possible the rational design of liposomal drug delivery system (LDS) based therapies. The LDS is the leading form of nanoscale drug delivery platform, an avenue in drug research, known as "nanomedicine", that holds the promise to transcend the current paradigm of drug development that has led to diminishing returns. Unfortunately this field of research has, so far, been far more successful in generating publications than new drug therapies. This partly results from the trial-and-error-based methodologies used. We discuss experimental techniques capable of obtaining mechanistic insight into LDS structure and behavior. Insight obtained purely experimentally is, however, limited; computational modeling using molecular dynamics simulation can provide insight not otherwise available. We review computational research that makes use of the multiscale modeling paradigm, simulating the phospholipid membrane with all-atom resolution and the entire liposome with coarse-grained models. We discuss in greater detail the computational modeling of liposome PEGylation. Overall, we wish to convey the power that lies in the combined use of experimental and computational methodologies; we hope to provide a roadmap for the rational design of LDS based therapies. Computational modeling is able to provide mechanistic insight that explains the context of experimental results and can also take the lead and inspire new directions for experimental research into LDS development. This article is part of a Special Issue entitled: Biosimulations edited by Ilpo Vattulainen and Tomasz Róg.
Probabilistic calibration of the SPITFIRE fire spread model using Earth observation data
NASA Astrophysics Data System (ADS)
Gomez-Dans, Jose; Wooster, Martin; Lewis, Philip; Spessa, Allan
2010-05-01
There is a great interest in understanding how fire affects vegetation distribution and dynamics in the context of global vegetation modelling. A way to include these effects is through the development of embedded fire spread models. However, fire is a complex phenomenon and thus difficult to model. Statistical models based on fire return intervals or fire danger indices need large amounts of data for calibration, and are often prisoner to the epoch they were calibrated to. Mechanistic models, such as SPITFIRE, try to model the complete fire phenomenon based on simple physical rules, making these models mostly independent of calibration data. However, the processes expressed in models such as SPITFIRE require many parameters. These parametrisations are often reliant on site-specific experiments, or in some other cases, parameters might not be measured directly. Additionally, in many cases, changes in temporal and/or spatial resolution result in parameters becoming effective. To address the difficulties with parametrisation and the often-used fitting methodologies, we propose using a probabilistic framework to calibrate some areas of the SPITFIRE fire spread model. We calibrate the model against Earth Observation (EO) data, a global and ever-expanding source of relevant data. We develop a methodology that incorporates the limitations of the EO data and reasonable prior values for parameters, and that results in distributions of parameters, which can be used to infer uncertainty due to parameter estimates. Additionally, the covariance structure of parameters and observations is also derived, which can help inform data-gathering efforts and model development, respectively. For this work, we focus on Southern African savannas, an important ecosystem for fire studies, and one with a good amount of EO data relevant to fire studies. As calibration datasets, we use burned area data, estimated numbers of fires and vegetation moisture dynamics.
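Probabilistic calibration of this kind is typically done with Markov chain Monte Carlo: run the model, score it against observations under a likelihood that encodes the data uncertainty, and accumulate a posterior distribution over parameters. A minimal random-walk Metropolis sketch with a one-parameter toy spread model (the model, prior range and observation are invented; they stand in for SPITFIRE and EO burned-area data):

```python
import math
import random

random.seed(42)

def modelled_burned_area(rate):
    """Toy stand-in for a fire-spread model run: burned fraction vs spread rate."""
    return 1.0 - math.exp(-rate)

observed, sigma = 0.4, 0.05  # EO-derived burned fraction and its uncertainty (toy)

def log_posterior(rate):
    if not 0.0 < rate < 5.0:           # flat prior on a plausible range
        return -math.inf
    resid = (modelled_burned_area(rate) - observed) / sigma
    return -0.5 * resid * resid

# Random-walk Metropolis: the chain samples the posterior over the parameter.
chain, current = [], 1.0
lp = log_posterior(current)
for _ in range(5000):
    prop = current + random.gauss(0.0, 0.1)
    lp_prop = log_posterior(prop)
    if math.log(random.random()) < lp_prop - lp:
        current, lp = prop, lp_prop
    chain.append(current)

post = chain[1000:]  # discard burn-in
mean_rate = sum(post) / len(post)
print(round(mean_rate, 2))
```

The spread of the retained chain, and the joint chains for several parameters, give exactly the parameter uncertainties and covariance structure the abstract describes.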
Vollert, Jan; Magerl, Walter; Baron, Ralf; Binder, Andreas; Enax-Krumova, Elena K; Geisslinger, Gerd; Gierthmühlen, Janne; Henrich, Florian; Hüllemann, Philipp; Klein, Thomas; Lötsch, Jörn; Maier, Christoph; Oertel, Bruno; Schuh-Hofer, Sigrid; Tölle, Thomas R; Treede, Rolf-Detlef
2018-06-01
As an indirect approach to relate previously identified sensory phenotypes of patients suffering from peripheral neuropathic pain to underlying mechanisms, we used a published sorting algorithm to estimate the prevalence of denervation, peripheral and central sensitization in 657 healthy subjects undergoing experimental models of nerve block (NB) (compression block and topical lidocaine), primary hyperalgesia (PH) (sunburn and topical capsaicin), or secondary hyperalgesia (intradermal capsaicin and electrical high-frequency stimulation), and in 902 patients suffering from neuropathic pain. Some of the data have been previously published. Randomized split-half analysis verified a good concordance with a priori mechanistic sensory profile assignment in the training (79%, Cohen κ = 0.54, n = 265) and the test set (81%, Cohen κ = 0.56, n = 279). Nerve blocks were characterized by pronounced thermal and mechanical sensory loss, but also mild pinprick hyperalgesia and paradoxical heat sensations. Primary hyperalgesia was characterized by pronounced gain for heat, pressure and pinprick pain, and mild thermal sensory loss. Secondary hyperalgesia was characterized by pronounced pinprick hyperalgesia and mild thermal sensory loss. Topical lidocaine plus topical capsaicin induced a combined phenotype of NB plus PH. Topical menthol was the only model with significant cold hyperalgesia. Sorting of the 902 patients into these mechanistic phenotypes led to a similar distribution as the original heuristic clustering (65% identity, Cohen κ = 0.44), but the denervation phenotype was more frequent than in heuristic clustering. These data suggest that sorting according to human surrogate models may be useful for mechanism-based stratification of neuropathic pain patients for future clinical trials, as encouraged by the European Medicines Agency.
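The concordance statistic reported above (79-81% agreement, Cohen κ ≈ 0.54-0.56) is chance-corrected agreement between the a priori assignment and the algorithmic sorting. A small sketch of its computation; the label sequences are hypothetical, not the study's data:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two label assignments."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    ca, cb = Counter(labels_a), Counter(labels_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1.0 - expected)

# Hypothetical assignments: NB = nerve block, PH = primary hyperalgesia,
# SH = secondary hyperalgesia.
a_priori =       ["NB", "NB", "PH", "PH", "SH", "SH", "NB", "PH"]
sorted_by_algo = ["NB", "NB", "PH", "SH", "SH", "SH", "NB", "PH"]
print(round(cohens_kappa(a_priori, sorted_by_algo), 2))
```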
De Kauwe, Martin G; Medlyn, Belinda E; Zaehle, Sönke; Walker, Anthony P; Dietze, Michael C; Wang, Ying-Ping; Luo, Yiqi; Jain, Atul K; El-Masri, Bassil; Hickler, Thomas; Wårlind, David; Weng, Ensheng; Parton, William J; Thornton, Peter E; Wang, Shusen; Prentice, I Colin; Asao, Shinichi; Smith, Benjamin; McCarthy, Heather R; Iversen, Colleen M; Hanson, Paul J; Warren, Jeffrey M; Oren, Ram; Norby, Richard J
2014-01-01
Elevated atmospheric CO2 concentration (eCO2) has the potential to increase vegetation carbon storage if increased net primary production causes increased long-lived biomass. Model predictions of eCO2 effects on vegetation carbon storage depend on how allocation and turnover processes are represented. We used data from two temperate forest free-air CO2 enrichment (FACE) experiments to evaluate representations of allocation and turnover in 11 ecosystem models. Observed eCO2 effects on allocation were dynamic. Allocation schemes based on functional relationships among biomass fractions that vary with resource availability were best able to capture the general features of the observations. Allocation schemes based on constant fractions or resource limitations performed less well, with some models having unintended outcomes. Few models represent turnover processes mechanistically and there was wide variation in predictions of tissue lifespan. Consequently, models did not perform well at predicting eCO2 effects on vegetation carbon storage. Our recommendations to reduce uncertainty include: use of allocation schemes constrained by biomass fractions; careful testing of allocation schemes; and synthesis of allocation and turnover data in terms of model parameters. Data from intensively studied ecosystem manipulation experiments are invaluable for constraining models and we recommend that such experiments should attempt to fully quantify carbon, water and nutrient budgets. PMID:24844873
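The contrast drawn above, constant allocation fractions versus functional relationships that respond to resource availability, can be sketched directly. The functional form below is an illustrative functional-balance-style rule (scarce belowground resources shift carbon to roots), not any particular model's scheme:

```python
def fixed_allocation(npp):
    """Constant coefficients, regardless of conditions."""
    return {"wood": 0.4 * npp, "leaf": 0.3 * npp, "root": 0.3 * npp}

def functional_allocation(npp, water_avail, n_avail):
    """Functional-balance style: scarce belowground resources shift carbon
    to roots; the remainder is split between wood and leaves (illustrative)."""
    limitation = min(water_avail, n_avail)   # 0 (scarce) .. 1 (abundant)
    f_root = 0.5 - 0.3 * limitation          # 0.2 when abundant, 0.5 when scarce
    f_leaf = 0.3 * limitation + 0.15
    f_wood = 1.0 - f_root - f_leaf
    return {"wood": f_wood * npp, "leaf": f_leaf * npp, "root": f_root * npp}

dry = functional_allocation(100.0, 0.2, 0.8)
wet = functional_allocation(100.0, 0.9, 0.8)
print(round(dry["root"], 1), round(wet["root"], 1))
```

Under eCO2, nutrient availability per unit carbon falls, so a functional scheme dynamically raises root allocation where a fixed-fraction scheme cannot, which is the behavior the FACE data favored.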
Atomic scale simulations for improved CRUD and fuel performance modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andersson, Anders David Ragnar; Cooper, Michael William Donald
2017-01-06
A more mechanistic description of fuel performance codes can be achieved by deriving models and parameters from atomistic scale simulations rather than fitting models empirically to experimental data. The same argument applies to modeling deposition of corrosion products on fuel rods (CRUD). Here are some results from publications in 2016 carried out using the CASL allocation at LANL.
NASA Astrophysics Data System (ADS)
Brewe, Eric; Traxler, Adrienne; de la Garza, Jorge; Kramer, Laird H.
2013-12-01
We report on a multiyear study of student attitudes measured with the Colorado Learning Attitudes about Science Survey in calculus-based introductory physics taught with the Modeling Instruction curriculum. We find that five of six instructors and eight of nine sections using Modeling Instruction showed significantly improved attitudes from pre- to postcourse. Cohen's d effect sizes range from 0.08 to 0.95 for individual instructors. The average effect was d=0.45, with a 95% confidence interval of (0.26-0.64). These results build on previously published results showing positive shifts in attitudes from Modeling Instruction classes. We interpret these data in light of other published positive attitudinal shifts and explore mechanistic explanations for the similarities and differences.
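The reported effect sizes are Cohen's d: the pre-to-post mean difference standardized by the pooled standard deviation. A small sketch of the computation; the survey scores below are fabricated for illustration, not the study's data:

```python
import math

def cohens_d(group1, group2):
    """Standardized mean difference with pooled (sample) standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled_sd

pre = [52.0, 48.0, 55.0, 60.0, 45.0, 50.0]   # hypothetical % favorable, precourse
post = [58.0, 55.0, 61.0, 66.0, 50.0, 57.0]  # postcourse
print(round(cohens_d(pre, post), 2))
```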
Thomas, Reuben; Thomas, Russell S.; Auerbach, Scott S.; Portier, Christopher J.
2013-01-01
Background: Several groups have employed genomic data from subchronic chemical toxicity studies in rodents (90 days) to derive gene-centric predictors of chronic toxicity and carcinogenicity. Genes are annotated to belong to biological processes or molecular pathways that are mechanistically well understood and are described in public databases. Objectives: To develop a molecular pathway-based prediction model of long-term hepatocarcinogenicity using 90-day gene expression data and to evaluate the performance of this model with respect to both intra-species, dose-dependent and cross-species predictions. Methods: Genome-wide hepatic mRNA expression was retrospectively measured in B6C3F1 mice following subchronic exposure to twenty-six (26) chemicals (10 were positive, 2 equivocal and 14 negative for liver tumors) previously studied by the US National Toxicology Program. Using these data, a pathway-based predictor model for long-term liver cancer risk was derived using random forests. The prediction model was independently validated on test sets associated with liver cancer risk obtained from mice, rats and humans. Results: Using 5-fold cross validation, the developed prediction model had reasonable predictive performance, with the area under the receiver-operator curve (AUC) equal to 0.66. The developed prediction model was then used to extrapolate the results to data associated with rat and human liver cancer. The extrapolated model worked well for both extrapolated species (AUC of 0.74 for rats and 0.91 for humans). The prediction models implied a balanced interplay between all pathway responses leading to carcinogenicity predictions. Conclusions: Pathway-based prediction models estimated from sub-chronic data hold promise for predicting long-term carcinogenicity and for their ability to extrapolate results across multiple species. PMID:23737943
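The AUC values reported above have a simple rank interpretation: the probability that a randomly chosen positive (carcinogen) receives a higher predicted score than a randomly chosen negative, with ties counted as one half. A stdlib sketch with hypothetical classifier scores:

```python
def auc(scores_pos, scores_neg):
    """Probability a random positive outranks a random negative (ties = 1/2).
    Numerically equivalent to the area under the ROC curve."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted carcinogenicity scores from a pathway-based classifier.
carcinogens = [0.9, 0.7, 0.65, 0.4]
non_carcinogens = [0.6, 0.5, 0.3, 0.2, 0.35]
print(round(auc(carcinogens, non_carcinogens), 2))
```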
Application of the DART Code for the Assessment of Advanced Fuel Behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.; Totev, T.
2007-07-01
The Dispersion Analysis Research Tool (DART) code is a dispersion fuel analysis code that contains mechanistically based fuel and reaction-product swelling models, a one-dimensional heat transfer analysis, and mechanical deformation models. DART has been used to simulate the irradiation behavior of uranium oxide, uranium silicide, and uranium-molybdenum aluminum dispersion fuels, as well as their monolithic counterparts. The thermal-mechanical DART code has been validated against RERTR tests performed in the ATR for irradiation data on interaction thickness; fuel, matrix, and reaction product volume fractions; and plate thickness changes. The DART fission gas behavior model has been validated against UO2 fission gas release data as well as measured fission gas-bubble size distributions. Here DART is utilized to analyze various aspects of the observed bubble growth in U-Mo/Al interaction product.
Thomas J. Urbanik; Edmond P. Saliklis
2002-01-01
Conventional compression strength formulas for corrugated fiberboard boxes are limited to geometry and material that produce an elastic postbuckling failure. Inelastic postbuckling can occur in squatty boxes and trays, but a mechanistic rationale for unifying observed strength data is lacking. This study employs a finite element model, instead of actual experiments, to...
DOT National Transportation Integrated Search
2017-03-01
This report describes the efforts undertaken to review the status of falling weight deflectometer (FWD) equipment, data collection, analysis, and interpretation, including dynamic backcalculation, as they relate to the models and procedures incorpora...
Do the same traffic rules apply? Directional chromosome segregation by SpoIIIE and FtsK.
Besprozvannaya, Marina; Burton, Briana M
2014-08-01
Over a decade of studies have tackled the question of how FtsK/SpoIIIE translocases establish and maintain directional DNA translocation during chromosome segregation in bacteria. FtsK/SpoIIIE translocases move DNA in a highly processive, directional manner, where directionality is facilitated by sequences on the substrate DNA molecules being transported. In recent years, structural, biochemical, single-molecule, and high-resolution microscopic studies have provided new insight into the mechanistic details of directional DNA segregation. Out of this body of work a series of models has emerged, ultimately yielding two seemingly opposing candidates: the loading model and the target search model. We review these recent mechanistic insights into directional DNA movement, discuss the data that may serve to unite the two models, and propose future directions that may ultimately settle the debate. © 2014 John Wiley & Sons Ltd.
Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara
2016-01-01
Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. PMID:26799810
Molecular Signaling Network Motifs Provide a Mechanistic Basis for Cellular Threshold Responses
Bhattacharya, Sudin; Conolly, Rory B.; Clewell, Harvey J.; Kaminski, Norbert E.; Andersen, Melvin E.
2014-01-01
Background: Increasingly, there is a move toward using in vitro toxicity testing to assess human health risk due to chemical exposure. As with in vivo toxicity testing, an important question for in vitro results is whether there are thresholds for adverse cellular responses. Empirical evaluations may show consistency with thresholds, but the main evidence has to come from mechanistic considerations. Objectives: Cellular response behaviors depend on the molecular pathway and circuitry in the cell and the manner in which chemicals perturb these circuits. Understanding circuit structures that are inherently capable of resisting small perturbations and producing threshold responses is an important step towards mechanistically interpreting in vitro testing data. Methods: Here we have examined dose–response characteristics for several biochemical network motifs. These network motifs are basic building blocks of molecular circuits underpinning a variety of cellular functions, including adaptation, homeostasis, proliferation, differentiation, and apoptosis. For each motif, we present biological examples and models to illustrate how thresholds arise from specific network structures. Discussion and Conclusion: Integral feedback, feedforward, and transcritical bifurcation motifs can generate thresholds. Other motifs (e.g., proportional feedback and ultrasensitivity) produce responses where the slope in the low-dose region is small and stays close to the baseline. Feedforward control may lead to nonmonotonic or hormetic responses. We conclude that network motifs provide a basis for understanding thresholds for cellular responses. Computational pathway modeling of these motifs and their combinations occurring in molecular signaling networks will be a key element in new risk assessment approaches based on in vitro cellular assays. Citation: Zhang Q, Bhattacharya S, Conolly RB, Clewell HJ III, Kaminski NE, Andersen ME. 2014. 
Molecular signaling network motifs provide a mechanistic basis for cellular threshold responses. Environ Health Perspect 122:1261–1270; http://dx.doi.org/10.1289/ehp.1408244 PMID:25117432
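The ultrasensitivity motif mentioned above can be illustrated with a one-line cooperative (Hill) response, where a Hill coefficient above 1 keeps the low-dose slope near baseline; the half-maximal constant K and coefficient n below are arbitrary illustrative values, not parameters from the paper.

```python
def hill(dose, K=1.0, n=4):
    """Cooperative (ultrasensitive) dose response. For n > 1 the response
    in the low-dose region stays close to baseline, one of the
    threshold-like motif behaviours discussed above; n = 1 recovers the
    non-cooperative Michaelis-Menten form for comparison."""
    return dose**n / (K**n + dose**n)

baseline = hill(0.1)        # essentially zero for n = 4
reference = hill(0.1, n=1)  # clearly above baseline without cooperativity
```

At one tenth of K the cooperative response is below 0.001 while the n = 1 form is near 0.09, a numeric picture of why some motifs yield threshold-like behaviour and others a shallow but nonzero low-dose slope.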
Organism and population-level ecological models for ...
Ecological risk assessment typically focuses on animal populations as endpoints for regulatory ecotoxicology. Scientists at USEPA are developing models for animal populations exposed to a wide range of chemicals, from pesticides to emerging contaminants. Modeled taxa include aquatic and terrestrial invertebrates, fish, amphibians, and birds; the models employ a wide range of methods, from matrix-based projection models to mechanistic bioenergetics models and spatially explicit population models.
INTEGRATED CHEMICAL INFORMATION TECHNOLOGIES ...
A central regulatory mandate of the Environmental Protection Agency, spanning many Program Offices and issues, is to assess the potential health and environmental risks of large numbers of chemicals released into the environment, often in the absence of relevant test data. Models for predicting potential adverse effects of chemicals based primarily on chemical structure play a central role in prioritization and screening strategies yet are highly dependent and conditional upon the data used for developing such models. Hence, limits on data quantity, quality, and availability are considered by many to be the largest hurdles to improving prediction models in diverse areas of toxicology. Generation of new toxicity data for additional chemicals and endpoints, development of new high-throughput, mechanistically relevant bioassays, and increased generation of genomics and proteomics data that can clarify relevant mechanisms will all play important roles in improving future SAR prediction models. The potential for much greater immediate gains, across large domains of chemical and toxicity space, comes from maximizing the ability to mine and model useful information from existing toxicity data, data that represent huge past investment in research and testing expenditures. In addition, the ability to place newer “omics” data, data that potentially span many possible domains of toxicological effects, in the broader context of historical data is the means for opti
An Open Source Simulation Model for Soil and Sediment Bioturbation
Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin
2011-01-01
Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches limits the application of such models to a limited range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted by experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data is routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest the predictive power of the approach. PMID:22162997
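The core idea of the framework, rule-based lattice mixing driven by a parameterisable displacement density, can be sketched in a few lines. The Gaussian choice, depth of 50 layers, and function name below are illustrative assumptions; in the actual framework, species-specific distributions would be fitted to tracer-profile data.

```python
import random

def mix(particles, steps, sigma=1.0, depth=50, rng=None):
    """Rule-based lattice sketch in the spirit of the framework above:
    each particle occupies a depth layer (0 = surface), and at each step
    its displacement is drawn from a parameterisable density (here a
    discretised Gaussian with standard deviation sigma), reflected at the
    sediment surface and truncated at the bottom of the lattice."""
    rng = rng or random.Random()
    for _ in range(steps):
        particles = [min(depth - 1, max(0, p + round(rng.gauss(0.0, sigma))))
                     for p in particles]
    return particles
```

Starting all tracer particles at the surface and comparing the simulated depth distribution after a fixed number of steps against an observed luminophore profile is one way such a model's displacement parameters could be fitted.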
Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan
2016-01-01
Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. 
Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism. PMID:27239563
COLLABORATION ON NHEERL EPIDEMIOLOGY STUDIES
This task will continue ORD's efforts to develop a biologically plausible, quantitative health risk model for particulate matter (PM) based on epidemiological, toxicological, and mechanistic studies using matched exposure assessments. The NERL, in collaboration with the NHEERL, ...
A dynamic and mechanistic model of PCB bioaccumulation in the European hake ( Merluccius merluccius)
NASA Astrophysics Data System (ADS)
Bodiguel, Xavier; Maury, Olivier; Mellon-Duval, Capucine; Roupsard, François; Le Guellec, Anne-Marie; Loizeau, Véronique
2009-08-01
Bioaccumulation is difficult to document because responses differ among chemical compounds, with environmental conditions, and with the physiological processes characteristic of each species. We use a mechanistic model, based on the Dynamic Energy Budget (DEB) theory, to take this complexity into account and study the factors affecting accumulation of organic pollutants in fish through ontogeny. The proposed bioaccumulation model is a comprehensive approach that relates the evolution of hake PCB contamination to physiological information about the fish, such as diet, metabolism, reserves, and reproductive status. The species studied is the European hake (Merluccius merluccius, L. 1758). The model is applied to study the total concentration and the lipid-normalised concentration of 4 PCB congeners in male and female hake from the Gulf of Lions (NW Mediterranean Sea) and the Bay of Biscay (NE Atlantic Ocean). Outputs of the model compare consistently to measurements over the life span of the fish. Simulation results clearly demonstrate the relative effects of food contamination, growth, and reproduction on PCB bioaccumulation in hake. The same species living in different habitats and exposed to different PCB prey concentrations exhibits marked differences in body accumulation of PCBs. At the adult stage, female hake have a lower PCB concentration than males at a given length. We successfully simulated these sex-specific PCB concentrations by considering two mechanisms: a higher energy allocation to growth in females and a transfer of PCBs from the female to her eggs when lipids are allocated from reserve to eggs. Finally, through its mechanistic description of physiological processes, the model is relevant for other species and sets the stage for a mechanistic understanding of the toxicity and ecological effects of organic contaminants in marine organisms.
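The competing effects of dietary uptake, elimination, and growth dilution described above can be caricatured with a one-compartment balance; this is a drastically reduced stand-in for the DEB-based model, and every rate constant below is an illustrative placeholder, not one of the authors' values.

```python
def pcb_trajectory(c_food, assim=0.8, intake=0.05, k_elim=0.001,
                   growth=0.002, dt=1.0, days=1000):
    """Toy body-burden dynamics: concentration rises with assimilated
    intake from contaminated food and is diluted by elimination plus
    somatic growth (forward-Euler integration, daily time step).
    Returns the simulated concentration time series."""
    c, out = 0.0, []
    for _ in range(days):
        c += dt * (assim * intake * c_food - (k_elim + growth) * c)
        out.append(c)
    return out
```

Even this toy form reproduces the qualitative pattern in the abstract: higher prey contamination (c_food) raises the plateau proportionally, while a larger growth term lowers it, the growth-dilution effect invoked to explain the sex difference in hake.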
Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration
NASA Astrophysics Data System (ADS)
Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.
2017-12-01
Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or only informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model brings its own hypothesis uncertainty as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for hypotheses. Model inter-comparison projects allow some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically: because such models combine sub-models of many systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to work out exactly why a model produces the results it does or to identify which assumptions are key. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We argue that multi-hypothesis modelling should be considered in conjunction with other capabilities (e.g., the Predictive Ecosystem Analyzer, PEcAn) and statistical methods (e.g., sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and enhance our predictive understanding of biological systems.
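The core of the multi-assumption idea is a factorial sweep over alternative process representations. A minimal sketch follows; the real MAAT tool automates far more (wrapping, I/O, sensitivity analysis), and the option names in the usage example are illustrative, not MAAT's actual identifiers.

```python
from itertools import product

def run_ensemble(process_options, run_model):
    """Enumerate every combination of alternative process representations
    and run the resulting model variant, so hypothesis uncertainty can be
    examined factorially rather than via ad hoc model inter-comparison.
    process_options maps a process name to its candidate representations;
    run_model receives one {process: representation} dict per variant."""
    names = list(process_options)
    results = {}
    for combo in product(*(process_options[n] for n in names)):
        results[combo] = run_model(dict(zip(names, combo)))
    return results
```

For example, two photosynthesis schemes crossed with two stomatal-conductance schemes yield a four-member ensemble, and the spread of outputs across members is a direct, if coarse, measure of model hypothesis uncertainty.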
Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling
USDA-ARS?s Scientific Manuscript database
We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...
Integrating Cellular Metabolism into a Multiscale Whole-Body Model
Krauss, Markus; Schaller, Stephan; Borchers, Steffen; Findeisen, Rolf; Lippert, Jörg; Kuepfer, Lars
2012-01-01
Cellular metabolism continuously processes an enormous range of external compounds into endogenous metabolites and is as such a key element in human physiology. The multifaceted physiological role of the metabolic network fulfilling the catalytic conversions can only be fully understood from a whole-body perspective where the causal interplay of the metabolic states of individual cells, the surrounding tissue and the whole organism are simultaneously considered. We here present an approach relying on dynamic flux balance analysis that allows the integration of metabolic networks at the cellular scale into standardized physiologically-based pharmacokinetic models at the whole-body level. To evaluate our approach we integrated a genome-scale network reconstruction of a human hepatocyte into the liver tissue of a physiologically-based pharmacokinetic model of a human adult. The resulting multiscale model was used to investigate hyperuricemia therapy, ammonia detoxification and paracetamol-induced toxication at a systems level. The specific models simultaneously integrate multiple layers of biological organization and offer mechanistic insights into pathology and medication. The approach presented may in future support a mechanistic understanding in diagnostics and drug development. PMID:23133351
An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.
Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong
2016-01-01
With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.
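The observation-driven style described above, where division behaviour is driven by statistical models of the data rather than by regulatory mechanisms, can be sketched as a minimal lineage simulator. The (mean, sd) division-timing table and the 'a'/'p' daughter-naming convention below stand in for the paper's fitted behavioural models and are illustrative only.

```python
import random

def simulate_lineage(div_time, t_end, rng=None):
    """Minimal observation-driven lineage sketch: each named cell divides
    at a time drawn from a per-lineage statistical model (mean, sd); cells
    with no model entry, or whose expected division falls outside the
    observation window, remain as leaves. Returns sorted leaf names."""
    rng = rng or random.Random()
    live, leaves = [("P0", 0.0)], []
    while live:
        name, born = live.pop()
        mu_sd = div_time.get(name)
        if mu_sd is None or born + mu_sd[0] > t_end:
            leaves.append(name)  # undivided within the window
            continue
        t_div = born + max(0.0, rng.gauss(*mu_sd))
        live += [(name + "a", t_div), (name + "p", t_div)]
    return sorted(leaves)
```

The appeal of this structure is exactly what the abstract argues: the statistical `div_time` table can later be replaced, entry by entry, with mechanistic regulatory models while the surrounding simulation loop stays unchanged.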
Howell, Brett A; Chauhan, Anuj
2010-08-01
Physiologically based pharmacokinetic (PBPK) models were developed for design and optimization of liposome therapy for treatment of overdoses of tricyclic antidepressants and local anesthetics. In vitro drug-binding data for pegylated, anionic liposomes and published mechanistic equations for partition coefficients were used to develop the models. The models were proven reliable through comparisons to intravenous data. The liposomes were predicted to be highly effective at treating amitriptyline overdoses, with reductions in the area under the concentration versus time curves (AUC) of 64% for the heart and brain. Peak heart and brain drug concentrations were predicted to drop by 20%. Bupivacaine AUC and peak concentration reductions were lower at 15.4% and 17.3%, respectively, for the heart and brain. The predicted pharmacokinetic profiles following liposome administration agreed well with data from clinical studies where protein fragments were administered to patients for overdose treatment. Published data on local cardiac function were used to relate the predicted concentrations in the body to local pharmacodynamic effects in the heart. While the results offer encouragement for future liposome therapies geared toward overdose, it is imperative to point out that animal experiments and phase I clinical trials are the next steps to ensuring the efficacy of the treatment. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
A climate-driven mechanistic population model of Aedes albopictus with diapause.
Jia, Pengfei; Lu, Liang; Chen, Xiang; Chen, Jin; Guo, Li; Yu, Xiao; Liu, Qiyong
2016-03-24
The mosquito Aedes albopictus is a competent vector for the transmission of many blood-borne pathogens. Important factors affecting the mosquitoes' development and spread are climatic, such as temperature, precipitation, and photoperiod. Existing climate-driven mechanistic models overlook the seasonal pattern of diapause, the survival strategy whereby mosquito eggs remain dormant and unable to hatch under extreme weather. With respect to diapause, several issues remain unaddressed, including identifying the times when diapause eggs are laid and hatched under different climatic conditions, demarcating the thresholds of the diapause and non-diapause periods, and accounting for the mortality rate of diapause eggs. Here we propose a generic climate-driven mechanistic population model of Ae. albopictus applicable to most Ae. albopictus-colonized areas. The new model improves on previous work by incorporating diapause behaviors, with many modifications to the stage-specific mechanisms of the mosquito's life cycle. The monthly Container Index (CI) of Ae. albopictus collected in two Chinese cities, Guangzhou and Shanghai, is used for model validation. Simulation results from the proposed model agree well with the entomological field data, with Pearson correlation r² = 0.84 in Guangzhou and r² = 0.90 in Shanghai; consolidating the diapause-related adjustments and temperature-related parameters yields a significant improvement over the basic model. The model highlights the importance of considering diapause when simulating Ae. albopictus populations. It also corroborates that temperature and photoperiod significantly affect the population dynamics of the mosquito. By refining the relationship between the Ae. albopictus population and climatic factors, the model establishes a mechanistic link to the growth and decline of the species.
A better understanding of this relationship will benefit studies of the transmission and spatiotemporal distribution of mosquito-borne epidemics and ultimately facilitate early warning and control of the diseases.
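The role of a separate diapause-egg pool can be illustrated with one daily update of a drastically simplified stage-structured model: short days route newly laid eggs into diapause, and diapause eggs resume development only once days are long and warm. All rates and thresholds below are illustrative placeholders, not the paper's fitted values.

```python
def step(state, temp_c, photoperiod_h, lay=5.0, hatch=0.1,
         develop=0.05, mortality=0.02, cpp=13.5, t_min=10.0):
    """One daily update of a toy Ae. albopictus model with an explicit
    diapause-egg pool. state holds eggs, diapause_eggs, juveniles, adults;
    cpp is a critical photoperiod (h) and t_min a hatching temperature."""
    e, d, j, a = (state[k] for k in ("eggs", "diapause_eggs", "juveniles", "adults"))
    warm, long_day = temp_c >= t_min, photoperiod_h >= cpp
    new_eggs = lay * a
    hatched = hatch * e if warm else 0.0
    resumed = hatch * d if (warm and long_day) else 0.0  # diapause ends
    return {
        "eggs": e + (new_eggs if long_day else 0.0) - hatched,
        "diapause_eggs": d + (0.0 if long_day else new_eggs) - resumed,
        "juveniles": j + hatched + resumed - develop * j,
        "adults": a * (1.0 - mortality) + develop * j,
    }
```

Iterating this update over a year of daily temperature and photoperiod series reproduces the qualitative pattern the model targets: the active population collapses in winter while the diapause-egg pool accumulates, then seeds the spring resurgence.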
Veltman, Karin; Huijbregts, Mark A J; Hendriks, A Jan
2010-07-01
Both biotic ligand models (BLM) and bioaccumulation models aim to quantify metal exposure from mechanistic knowledge, but the key factors included in the description of metal uptake differ between the two approaches. Here, we present a quantitative comparison of both approaches and show that BLM and bioaccumulation kinetics can be merged into a common mechanistic framework for metal uptake in aquatic organisms. Our results show that metal-specific absorption efficiencies calculated from BLM parameters for freshwater fish are highly comparable, i.e., within a factor of 2.4 for silver, cadmium, copper, and zinc, to bioaccumulation absorption efficiencies for predominantly marine fish. Conditional affinity constants are significantly related to the metal-specific covalent index. Additionally, the affinity constants of calcium, cadmium, copper, sodium, and zinc are comparable across aquatic species, including molluscs, daphnids, and fish. This suggests that affinity constants can be estimated from the covalent index and extrapolated across species. A new model is proposed that integrates the combined effects on metal uptake by aquatic organisms of metal chemodynamics, such as speciation, competition, and ligand affinity, and of species characteristics, such as size. An important direction for further research is the quantitative comparison of the proposed model with acute toxicity values for organisms of different size classes.
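The speciation and competition terms mentioned above enter the BLM through a standard competitive Langmuir occupancy of the biotic ligand, which can be written directly; the numeric values in the checks below are illustrative, not the conditional affinity constants reported for any species.

```python
def fraction_bound(free_metal, k_metal, competitors=()):
    """Competitive Langmuir occupancy of the biotic ligand (standard BLM
    form): fraction of ligand sites occupied by the free metal ion, with
    competing cations (e.g. Ca2+, H+, Na+) given as (concentration,
    affinity constant) pairs that lower occupancy, and hence uptake."""
    comp = sum(conc * k for conc, k in competitors)
    return k_metal * free_metal / (1.0 + k_metal * free_metal + comp)
```

Multiplying this occupancy by a size-dependent maximal uptake rate is one way to express the merged framework the abstract proposes, with chemodynamics in the occupancy term and species characteristics in the rate term.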
A rat model system to study complex disease risks, fitness, aging, and longevity.
Koch, Lauren Gerard; Britton, Steven L; Wisløff, Ulrik
2012-02-01
The association between low exercise capacity and all-cause morbidity and mortality is statistically strong yet mechanistically unresolved. By connecting clinical observation with a theoretical base, we developed a working hypothesis that variation in capacity for oxygen metabolism is the central mechanistic determinant between disease and health (aerobic hypothesis). As an unbiased test, we show that two-way artificial selective breeding of rats for low and high intrinsic endurance exercise capacity also produces rats that differ for numerous disease risks, including the metabolic syndrome, cardiovascular complications, premature aging, and reduced longevity. This contrasting animal model system may prove to be translationally superior relative to more widely used simplistic models for understanding geriatric biology and medicine. Copyright © 2012 Elsevier Inc. All rights reserved.
Understanding essential tremor: progress on the biological front.
Louis, Elan D
2014-06-01
For many years, little was written about the underlying biology of essential tremor (ET), despite its high prevalence. Discussions of disease mechanisms were dominated by a focus on tremor physiology. The traditional model of ET, the olivary model, was proposed in the 1970s. That model suffers from several critical problems, and its relevance to ET has been questioned. Recent mechanistic research has focused on the cerebellum: clinical and neuroimaging studies strongly implicate the importance of this brain region in ET, and the work has been grounded more in tissue-based changes (i.e., postmortem studies of the brain). These studies have collectively and systematically identified a sizable number of changes in the ET cerebellum and have led to a new model of ET, referred to as the cerebellar degenerative model. Hence, there is a renewed interest in the science behind the biology of ET. How the new understanding of ET will translate into treatment changes is an open question.
Berryhill, Marian E.; Chein, Jason; Olson, Ingrid R.
2011-01-01
Portions of the posterior parietal cortex (PPC) play a role in working memory (WM) yet the precise mechanistic function of this region remains poorly understood. The pure storage hypothesis proposes that this region functions as a short-lived modality-specific memory store. Alternatively, the internal attention hypothesis proposes that the PPC functions as an attention-based storage and refreshing mechanism deployable as an alternative to material-specific rehearsal. These models were tested in patients with bilateral PPC lesions. Our findings discount the pure storage hypothesis because variables indexing storage capacity and longevity were not disproportionately affected by PPC damage. Instead, our data support the internal attention account by showing that (a) normal participants tend to use a rehearsal-based WM maintenance strategy for recall tasks but not for recognition tasks; (b) patients with PPC lesions performed normally on WM tasks that relied on material-specific rehearsal strategies but poorly on WM tasks that relied on attention-based maintenance strategies and patient strategy usage could be shifted by task or instructions; (c) patients’ memory deficits extended into the long-term domain. These findings suggest that the PPC maintains or shifts internal attention among the representations of items in WM. PMID:21345344
SUMMARY: The major accomplishment of NTD’s air toxics program is the development of an exposure-dose- response model for acute exposure to volatile organic compounds (VOCs), based on momentary brain concentration as the dose metric associated with acute neurological impairments...
Gaussian process regression for forecasting battery state of health
NASA Astrophysics Data System (ADS)
Richardson, Robert R.; Osborne, Michael A.; Howey, David A.
2017-07-01
Accurately predicting the future capacity and remaining useful life of batteries is necessary to ensure reliable system operation and to minimise maintenance costs. The complex nature of battery degradation has meant that mechanistic modelling of capacity fade has thus far remained intractable; however, with the advent of cloud-connected devices, data from cells in various applications is becoming increasingly available, and the feasibility of data-driven methods for battery prognostics is increasing. Here we propose Gaussian process (GP) regression for forecasting battery state of health, and highlight various advantages of GPs over other data-driven and mechanistic approaches. GPs are a type of Bayesian non-parametric method, and hence can model complex systems whilst handling uncertainty in a principled manner. Prior information can be exploited by GPs in a variety of ways: explicit mean functions can be used if the functional form of the underlying degradation model is available, and multiple-output GPs can effectively exploit correlations between data from different cells. We demonstrate the predictive capability of GPs for short-term and long-term (remaining useful life) forecasting on a selection of capacity vs. cycle datasets from lithium-ion cells.
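A minimal from-scratch version of such a GP forecast can illustrate the two properties highlighted above: the posterior mean tracks the fade trend, and the predictive variance grows as the forecast horizon extends beyond the data. The squared-exponential kernel, its hyperparameters, and the synthetic capacity-fade data below are all assumed for illustration, not taken from the paper's cell datasets:

```python
import numpy as np

def sqexp(a, b, length=60.0, amp=0.1):
    """Squared-exponential covariance between cycle numbers a and b."""
    return amp**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_forecast(cycles, capacity, query, noise=1e-3):
    """Posterior mean and variance of capacity at the query cycles."""
    mu = capacity.mean()                     # centre the data; GP models the residual
    K = sqexp(cycles, cycles) + noise**2 * np.eye(len(cycles))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, capacity - mu))
    Ks = sqexp(cycles, query)
    mean = mu + Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(sqexp(query, query) - v.T @ v)
    return mean, var

# Synthetic linear fade: 16 measurements over 300 cycles, then forecast at an
# interior cycle (150) and well beyond the data (600).
cycles = np.arange(0.0, 301.0, 20.0)
capacity = 1.0 - 4e-4 * cycles
mean, var = gp_forecast(cycles, capacity, np.array([150.0, 600.0]))
```

An explicit mean function encoding a known degradation shape, as the authors suggest, would replace the constant `mu` here and keep long-horizon extrapolations from reverting to the data mean.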
Prot, Jean Matthieu; Leclerc, Eric
2012-06-01
In this paper, we will consider new in vitro cell culture platforms and the progress made, based on the microfluidic liver biochips dedicated to pharmacological and toxicological studies. Particular emphasis will be given to recent developments in the microfluidic tools dedicated to cell culture (more particularly liver cell culture), in silico opportunities for Physiologically Based PharmacoKinetic (PBPK) modelling, the challenge of the mechanistic interpretations offered by the approaches resulting from "multi-omics" data (transcriptomics, proteomics, metabolomics, cytomics) and imaging microfluidic platforms. Finally, we will discuss the critical features regarding microfabrication, design and materials, and cell functionality as the key points for the future development of new microfluidic liver biochips.
Woodhouse, Steven; Piterman, Nir; Wintersteiger, Christoph M; Göttgens, Berthold; Fisher, Jasmin
2018-05-25
Reconstruction of executable mechanistic models from single-cell gene expression data represents a powerful approach to understanding developmental and disease processes. New ambitious efforts like the Human Cell Atlas will soon lead to an explosion of data with potential for uncovering and understanding the regulatory networks which underlie the behaviour of all human cells. In order to take advantage of this data, however, there is a need for general-purpose, user-friendly and efficient computational tools that can be readily used by biologists who do not have specialist computer science knowledge. The Single Cell Network Synthesis toolkit (SCNS) is a general-purpose computational tool for the reconstruction and analysis of executable models from single-cell gene expression data. Through a graphical user interface, SCNS takes single-cell qPCR or RNA-sequencing data taken across a time course, and searches for logical rules that drive transitions from early cell states towards late cell states. Because the resulting reconstructed models are executable, they can be used to make predictions about the effect of specific gene perturbations on the generation of specific lineages. SCNS should be of broad interest to the growing number of researchers working in single-cell genomics and will help further facilitate the generation of valuable mechanistic insights into developmental, homeostatic and disease processes.
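What "searching for logical rules that drive transitions" means can be pictured with a brute-force sketch. SCNS itself encodes the problem for a constraint solver; the hypothetical two-regulator Boolean rule search below is only a conceptual illustration on bit-tuple cell states:

```python
from itertools import product

def find_update_rules(transitions, target):
    """Return (regulator pair, truth table) rules that reproduce `target`'s
    next state in every observed transition. States are 0/1 tuples."""
    n = len(transitions[0][0])
    rules = []
    for i, j in product(range(n), repeat=2):
        for table in product((0, 1), repeat=4):      # all 16 two-input functions
            ok = all(table[2 * s[i] + s[j]] == s_next[target]
                     for s, s_next in transitions)
            if ok:
                rules.append(((i, j), table))
    return rules

# Toy data generated from the rule gene2' = gene0 AND gene1 over all 8 states
# of a 3-gene system; the search should recover the AND truth table (0,0,0,1).
states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
transitions = [(s, (s[0], s[1], s[0] & s[1])) for s in states]
rules = find_update_rules(transitions, target=2)
```

Because a recovered rule set is executable, in-silico knockouts amount to fixing a gene's bit to 0 and re-running the update rules, which is the kind of perturbation prediction described above.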
Constrained variability of modeled T:ET ratio across biomes
NASA Astrophysics Data System (ADS)
Fatichi, Simone; Pappas, Christoforos
2017-07-01
A large variability (35-90%) in the ratio of transpiration to total evapotranspiration (referred to here as T:ET) across biomes or even at the global scale has been documented by a number of studies carried out with different methodologies. Previous empirical results also suggest that T:ET does not covary with mean precipitation and has a positive dependence on leaf area index (LAI). Here we use a mechanistic ecohydrological model, with a refined process-based description of evaporation from the soil surface, to investigate the variability of T:ET across biomes. Numerical results reveal a more constrained range and higher mean of T:ET (70 ± 9%, mean ± standard deviation) when compared to observation-based estimates. T:ET is confirmed to be independent of mean precipitation, while it is found to be correlated with LAI seasonally but uncorrelated across multiple sites. Larger LAI increases evaporation from interception but diminishes ground evaporation, with the two effects largely compensating each other. These results contribute mechanistic, model-based evidence to the ongoing research about the patterns of T:ET and the factors influencing its magnitude across biomes.
Modelling insights on the partition of evapotranspiration components across biomes
NASA Astrophysics Data System (ADS)
Fatichi, Simone; Pappas, Christoforos
2017-04-01
Recent studies using various methodologies have found a large variability (from 35 to 90%) in the ratio of transpiration to total evapotranspiration (denoted as T:ET) across biomes or even at the global scale. Concurrently, previous results suggest that T:ET is independent of mean precipitation and has a positive correlation with leaf area index (LAI). We used the mechanistic ecohydrological model T&C, with a refined process-based description of soil resistance and a detailed treatment of canopy biophysics and ecophysiology, to investigate T:ET across multiple biomes. In contrast to observation-based estimates, simulation results highlight a well-constrained range of mean T:ET across biomes that is also robust to perturbations of the most sensitive parameters. Simulated T:ET was confirmed to be independent of average precipitation, while it was found to be uncorrelated with LAI across biomes. Higher values of LAI increase evaporation from interception but suppress ground evaporation, with the two effects largely cancelling each other at many sites. These results offer mechanistic, model-based evidence to the ongoing research about the range of T:ET and the factors affecting its magnitude across biomes.
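The compensation mechanism invoked in both abstracts can be illustrated with a toy Beer's-law partition of the flux components. The extinction coefficient and potential rates below are arbitrary illustrative numbers, not T&C parameters:

```python
import math

def partition(lai, k=0.5, pot_t=2.0, pot_int=1.0, pot_soil=1.0):
    """Toy flux partition (arbitrary units): canopy terms scale with light
    absorbed by the canopy, ground evaporation with light reaching the soil
    (Beer's law attenuation with leaf area index)."""
    absorbed = 1.0 - math.exp(-k * lai)
    t = pot_t * absorbed                  # transpiration
    e_int = pot_int * absorbed            # evaporation of intercepted water
    e_soil = pot_soil * (1.0 - absorbed)  # ground evaporation
    return t, e_int, e_soil

t2, i2, s2 = partition(2.0)   # sparse canopy
t6, i6, s6 = partition(6.0)   # dense canopy
```

With comparable potential rates for interception and ground evaporation, the sum of the two varies far less with LAI than either term alone, which is exactly why a constrained T:ET can coexist with strong LAI effects on the individual components.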
Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice
2017-01-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170
Kasprak, Alan; Caster, Joshua J.; Bangen, Sara G.; Sankey, Joel B.
2017-01-01
The ability to quantify the processes driving geomorphic change in river valley margins is vital to geomorphologists seeking to understand the relative role of transport mechanisms (e.g. fluvial, aeolian, and hillslope processes) in landscape dynamics. High-resolution, repeat topographic data are becoming readily available to geomorphologists. By contrasting digital elevation models derived from repeat surveys, the transport processes driving topographic changes can be inferred, a method termed ‘mechanistic segregation.’ Unfortunately, mechanistic segregation largely relies on subjective and time consuming manual classification, which has implications both for its reproducibility and the practical scale of its application. Here we present a novel computational workflow for the mechanistic segregation of geomorphic transport processes in geospatial datasets. We apply the workflow to seven sites along the Colorado River in the Grand Canyon, where geomorphic transport is driven by a diverse suite of mechanisms. The workflow performs well when compared to field observations, with an overall predictive accuracy of 84% across 113 validation points. The approach most accurately predicts changes due to fluvial processes (100% accuracy) and aeolian processes (96%), with reduced accuracy in predictions of alluvial and colluvial processes (64% and 73%, respectively). Our workflow is designed to be applicable to a diversity of river systems and will likely provide a rapid and objective understanding of the processes driving geomorphic change at the reach and network scales. We anticipate that such an understanding will allow insight into the response of geomorphic transport processes to external forcings, such as shifts in climate, land use, or river regulation, with implications for process-based river management and restoration.
Martínez-Pernía, David; González-Castán, Óscar; Huepe, David
2017-02-01
The development of rehabilitation has traditionally focused on measurements of motor disorders and of the improvements produced during the therapeutic process; however, the physical rehabilitation sciences have not focused on understanding the philosophical and scientific principles behind clinical intervention and how they are interrelated. The main aim of this paper is to explain the foundation stones of the disciplines of physical therapy, occupational therapy, and speech/language therapy in recovery from motor disorder. To reach our goals, the mechanistic view and how it is integrated into physical rehabilitation will first be explained. Next, a classification of mechanistic therapy into an old version (automaton model) and a technological version (cyborg model) will be shown. It will then be shown how the physical rehabilitation sciences found a new perspective on motor recovery, based on functionalism, during the cognitive revolution of the 1960s. Through this cognitive theory, physical rehabilitation incorporated into motor recovery those therapeutic strategies that solicit the activation of the brain and/or symbolic processing, aspects that were not taken into account in mechanistic therapy. In addition, a classification of functionalist rehabilitation into a computational therapy and a brain therapy will be shown. At the end of the article, the methodological principles of the physical rehabilitation sciences will be explained. This will allow us to go deeper into the differences and similarities between therapeutic mechanism and therapeutic functionalism.
Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data
Kümmel, Anne; Panke, Sven; Heinemann, Matthias
2006-01-01
As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example, it identifies reactions that are subject to active allosteric or genetic regulation, as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
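The thermodynamic constraint at the heart of NET analysis is the sign of the reaction Gibbs energy at measured metabolite levels: ΔrG = ΔrG°′ + RT ln Q must be negative for a reaction to run forward. A minimal feasibility check follows; the standard-transformed Gibbs energy and concentrations are illustrative values, not data from the paper:

```python
import math

R = 8.314e-3   # kJ mol^-1 K^-1
T = 310.15     # K

def reaction_gibbs(dg0_prime, substrates, products):
    """Reaction Gibbs energy (kJ/mol) at given metabolite concentrations (mol/L):
    dG = dG0' + R*T*ln(Q), with Q the mass-action ratio."""
    ln_q = sum(map(math.log, products)) - sum(map(math.log, substrates))
    return dg0_prime + R * T * ln_q

# Illustrative isomerisation with dG0' = +1.7 kJ/mol: feasible in the forward
# direction only when the measured product/substrate ratio is low enough.
forward = reaction_gibbs(1.7, substrates=[1e-3], products=[1e-4])   # negative
reverse = reaction_gibbs(1.7, substrates=[1e-4], products=[1e-3])   # positive
```

In NET analysis, a reaction whose ΔrG remains far from equilibrium for every concentration set consistent with the data is flagged as a candidate site of active allosteric or genetic regulation.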
Kuhla, Angela; Rühlmann, Claire; Lindner, Tobias; Polei, Stefan; Hadlich, Stefan; Krause, Bernd J; Vollmar, Brigitte; Teipel, Stefan J
2017-01-01
Transgenic animal models of Aβ pathology provide mechanistic insight into some aspects of Alzheimer disease (AD) pathology related to Aβ accumulation. Quantitative neuroimaging is a possible aid to improve translation of mechanistic findings in transgenic models to human end phenotypes of brain morphology or function. Therefore, we combined MRI-based morphometry, MRS-based NAA assessment, and quantitative histology of neurons and amyloid plaque load in the APPswe/PS1dE9 mouse model to determine the interrelationship of morphological changes, changes in neuron numbers, and amyloid plaque load with reductions of NAA levels as a marker of neuronal functional viability. The APPswe/PS1dE9 mouse showed an increase of Aβ plaques, loss of neurons, and an impairment of the NAA/Cr ratio, which, however, was not accompanied by brain atrophy. As brain atrophy is one main characteristic of human AD, conclusions from murine to human AD pathology should be drawn with caution.
Higher plant modelling for life support applications: first results of a simple mechanistic model
NASA Astrophysics Data System (ADS)
Hezard, Pauline; Dussap, Claude-Gilles; Sasidharan L, Swathy
2012-07-01
In closed ecological life support systems, air and water regeneration and food production are performed using microorganisms and higher plants. Wheat, rice, soybean, lettuce, tomato, or other edible annual plants produce fresh food while recycling CO2 into breathable oxygen. Additionally, they evaporate a large quantity of water, which can be condensed and used as potable water. The recycling functions of air revitalization and food production are therefore completely linked. Consequently, the control of a growth chamber for higher plant production has to be performed with efficient mechanistic models, in order to ensure a realistic prediction of plant behaviour and of water and gas recycling whatever the environmental conditions. Purely mechanistic models of plant production in controlled environments are not yet available, which is why new models must be developed and validated. This work concerns the design and testing of a simplified mathematical model coupling plant architecture and mass balances, whose results are compared with available data for lettuce grown in closed, controlled chambers. The carbon exchange rate, water absorption and evaporation rates, biomass fresh weight, and leaf surface are modelled and compared with the data. The model consists of four modules. The first evaluates plant architecture: total leaf surface, leaf area index, and stem length. The second calculates the rates of matter and energy exchange depending on architectural and environmental data: light absorption in the canopy, CO2 uptake or release, water uptake, and evapotranspiration. The third module determines which of the previous rates limits overall biomass growth, and the last calculates the biomass growth rate from the matter exchange rates, using a global stoichiometric equation.
These rates form a set of differential equations that are integrated over time to give the total biomass fresh weight over the full growth duration. The model predicts growth that is exponential at the beginning and becomes linear towards the end; this follows the experimental data rather accurately. Even if this model is too simple to be realistic for more complex plants in changing environments, it is a first step towards an integrated approach to plant growth that accounts for architectural and mass transfer limitations.
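The limiting-rate structure described above can be caricatured as a one-state toy model: growth proceeds at the minimum of an intrinsic exponential capacity and a light-limited ceiling, which reproduces the exponential-then-linear shape. All parameter values are assumed for illustration; the real model tracks architecture, light, CO2, and water explicitly:

```python
def grow(days=30, dt=0.01, mu=0.3, light_cap=1.5, x0=0.1):
    """Toy biomass fresh weight (g), sampled daily. Growth is the minimum of
    the intrinsic exponential capacity (mu * X) and a light-limited ceiling,
    mimicking the 'limiting module' in the four-module structure."""
    x, t = x0, 0.0
    daily = {0: x0}
    while t < days - 1e-9:
        x += min(mu * x, light_cap) * dt   # limiting-rate step
        t += dt
        if abs(t - round(t)) < dt / 2:     # record once per simulated day
            daily[round(t)] = x
    return daily

w = grow()   # small canopy grows exponentially, closed canopy linearly
```

Daily increments accelerate early on (exponential phase) and become constant once the light ceiling binds (linear phase), matching the qualitative behaviour reported.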
NASA Astrophysics Data System (ADS)
Ilie, Iulia; Dittrich, Peter; Carvalhais, Nuno; Jung, Martin; Heinemeyer, Andreas; Migliavacca, Mirco; Morison, James I. L.; Sippel, Sebastian; Subke, Jens-Arne; Wilkinson, Matthew; Mahecha, Miguel D.
2017-09-01
Accurate model representation of land-atmosphere carbon fluxes is essential for climate projections. However, the exact responses of carbon cycle processes to climatic drivers often remain uncertain. Presently, knowledge derived from experiments, complemented by a steadily evolving body of mechanistic theory, provides the main basis for developing such models. The strongly increasing availability of measurements may facilitate new ways of identifying suitable model structures using machine learning. Here, we explore the potential of gene expression programming (GEP) to derive relevant model formulations based solely on the signals present in data, by automatically applying various mathematical transformations to potential predictors and repeatedly evolving the resulting model structures. In contrast to most other machine learning regression techniques, the GEP approach generates readable models that allow for prediction and possibly for interpretation. Our study is based on two cases: artificially generated data and real observations. Simulations based on artificial data show that GEP is successful in identifying prescribed functions, with the prediction capacity of the models comparable to four state-of-the-art machine learning methods (random forests, support vector machines, artificial neural networks, and kernel ridge regressions). Based on real observations, we explore the responses of the different components of terrestrial respiration at an oak forest in south-eastern England. We find that the GEP-retrieved models often predict better than some established respiration models. Based on their structures, we find previously unconsidered exponential dependencies of respiration on seasonal ecosystem carbon assimilation and water dynamics. We noticed that the GEP models are only partly portable across respiration components, the identification of a general terrestrial respiration model possibly being prevented by equifinality issues. Overall, GEP is a promising tool for uncovering new model structures for terrestrial ecology in the data-rich era, complementing more traditional modelling approaches.
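To make the idea of "readable models" concrete, the sketch below evaluates tuple-encoded expression trees and keeps the best-fitting one found by random search. This is not GEP proper (no Karva-notation chromosomes, no genetic operators); it only illustrates selecting a symbolic, human-readable expression by data fit:

```python
import math
import random

def evaluate(expr, x):
    """Recursively evaluate a tuple-encoded expression tree at x."""
    if expr == 'x':
        return x
    if isinstance(expr, float):
        return expr
    op, a, b = expr
    u, v = evaluate(a, x), evaluate(b, x)
    return u + v if op == 'add' else u * v

def random_expr(rng, depth=2):
    """Draw a random small expression over {+, *}, 'x', and constants."""
    if depth == 0 or rng.random() < 0.3:
        return 'x' if rng.random() < 0.5 else round(rng.uniform(-2, 2), 2)
    op = rng.choice(['add', 'mul'])
    return (op, random_expr(rng, depth - 1), random_expr(rng, depth - 1))

def search(xs, ys, n=500, seed=0):
    """Keep the readable expression with the lowest RMSE on the data."""
    rng = random.Random(seed)
    best, best_rmse = None, float('inf')
    for _ in range(n):
        e = random_expr(rng)
        rmse = math.sqrt(sum((evaluate(e, x) - y) ** 2
                             for x, y in zip(xs, ys)) / len(xs))
        if rmse < best_rmse:
            best, best_rmse = e, rmse
    return best, best_rmse
```

GEP replaces the blind random draw with crossover and mutation of linear chromosomes, but the output has the same character: an explicit formula whose structure (e.g. an exponential dependence on a driver) can be inspected and interpreted.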
Schneck, Karen B; Zhang, Xin; Bauer, Robert; Karlsson, Mats O; Sinha, Vikram P
2013-02-01
A proof of concept study was conducted to investigate the safety and tolerability of a novel oral glucokinase activator, LY2599506, during multiple dose administration to healthy volunteers and subjects with Type 2 diabetes mellitus (T2DM). To analyze the study data, a previously established semi-mechanistic integrated glucose-insulin model was extended to include characterization of glucagon dynamics. The model captured endogenous glucose and insulin dynamics, including the amplifying effects of glucose on insulin production and of insulin on glucose elimination, as well as the inhibitory influence of glucose and insulin on hepatic glucose production. The hepatic glucose production in the model was increased by glucagon and glucagon production was inhibited by elevated glucose concentrations. The contribution of exogenous factors to glycemic response, such as ingestion of carbohydrates in meals, was also included in the model. The effect of LY2599506 on glucose homeostasis in subjects with T2DM was investigated by linking a one-compartment, pharmacokinetic model to the semi-mechanistic, integrated glucose-insulin-glucagon system. Drug effects were included on pancreatic insulin secretion and hepatic glucose production. The relationships between LY2599506, glucose, insulin, and glucagon concentrations were described quantitatively and consequently, the improved understanding of the drug-response system could be used to support further clinical study planning during drug development, such as dose selection.
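The feedback structure described (glucose stimulates insulin secretion, insulin promotes glucose clearance and suppresses hepatic production, glucagon raises hepatic production and is itself suppressed by glucose, and the drug amplifies insulin secretion) can be caricatured in a toy ODE system. Every parameter below is arbitrary; this is a conceptual sketch, not the published semi-mechanistic model:

```python
def simulate(secretion_gain=1.0, t_end=300.0, dt=0.01):
    """Toy glucose (G), insulin (I), glucagon (N) loop in arbitrary units.
    `secretion_gain` mimics a glucokinase activator boosting insulin secretion."""
    G, I, N = 5.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        hepatic = 1.0 * (1 + 0.5 * N) / (1 + I / 5.0)  # glucagon up-, insulin down-regulated
        uptake = 0.05 * G + 0.02 * I * G               # insulin-dependent clearance
        dG = hepatic - uptake
        dI = secretion_gain * 0.05 * G - 0.2 * I       # glucose-stimulated secretion
        dN = 0.2 / (1 + (G / 4.0) ** 2) - 0.2 * N      # glucose suppresses glucagon
        G += dG * dt
        I += dI * dt
        N += dN * dt
    return G, I, N

g_base, _, _ = simulate(1.0)   # no drug effect
g_drug, _, _ = simulate(2.0)   # amplified insulin secretion
```

Even this caricature shows the qualitative drug response the study quantified: amplifying glucose-stimulated insulin secretion lowers the steady-state glucose level.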
Optimizing construction quality management of pavements using mechanistic performance analysis.
DOT National Transportation Integrated Search
2004-08-01
This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...
Rotary ultrasonic machining of CFRP: a mechanistic predictive model for cutting force.
Cong, W L; Pei, Z J; Sun, X; Zhang, C L
2014-02-01
Cutting force is one of the most important output variables in rotary ultrasonic machining (RUM) of carbon fiber reinforced plastic (CFRP) composites. Many experimental investigations on cutting force in RUM of CFRP have been reported. However, in the literature, there are no cutting force models for RUM of CFRP. This paper develops a mechanistic predictive model for cutting force in RUM of CFRP. The material removal mechanism of CFRP in RUM has been analyzed first. The model is based on the assumption that brittle fracture is the dominant mode of material removal. CFRP micromechanical analysis has been conducted to represent CFRP as an equivalent homogeneous material to obtain the mechanical properties of CFRP from its components. Based on this model, relationships between input variables (including ultrasonic vibration amplitude, tool rotation speed, feedrate, abrasive size, and abrasive concentration) and cutting force can be predicted. The relationships between input variables and important intermediate variables (indentation depth, effective contact time, and maximum impact force of single abrasive grain) have been investigated to explain predicted trends of cutting force. Experiments are conducted to verify the model, and experimental results agree well with predicted trends from this model. Copyright © 2013 Elsevier B.V. All rights reserved.
Kumar, Nagi; Crocker, Theresa; Smith, Tiffany; Connors, Shahnjayla; Pow-Sang, Julio; Spiess, Philippe E; Egan, Kathleen; Quinn, Gwen; Schell, Michael; Sebti, Said; Kazi, Aslam; Chuang, Tian; Salup, Raoul; Helal, Mohamed; Zagaja, Gregory; Trabulsi, Edouard; McLarty, Jerry; Fazili, Tajammul; Williams, Christopher R; Schreiber, Fred; Anderson, Kyle
2012-01-21
In spite of the large number of nutrient-derived agents demonstrating promise as potential chemopreventive agents, most have failed to prove effectiveness in clinical trials. Critical requirements for moving nutrient-derived agents to recommendation for clinical use include adopting a systematic, molecular-mechanism-based approach and utilizing the same ethical and rigorous methods as are used to evaluate other pharmacological agents. Preliminary data on a mechanistic rationale for chemoprevention activity as observed from epidemiological, in vitro, and preclinical studies, phase I data on safety in suitable cohorts, duration of intervention based on time to progression of preneoplastic disease to cancer, and the use of a valid panel of biomarkers representing the hypothesized carcinogenesis pathway for measuring efficacy must inform the design of phase II clinical trials. The goal of this paper is to provide a model for evaluating a well-characterized agent, Polyphenon E, in a phase II clinical trial of prostate cancer chemoprevention.
Describing dengue epidemics: Insights from simple mechanistic models
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Stollenwerk, Nico; Kooi, Bob W.
2012-09-01
We present a set of nested models to be applied to dengue fever epidemiology. We perform a qualitative study in order to show how much complexity we really need to add into epidemiological models to be able to describe the fluctuations observed in empirical dengue hemorrhagic fever incidence data offering a promising perspective on inference of parameter values from dengue case notifications.
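The simplest member of such a nested hierarchy is the classic SIR model; multi-strain and temporary-cross-immunity variants, as used for dengue, are built by nesting additional compartments on top of it. A minimal Euler-integrated sketch with illustrative parameters:

```python
def sir(beta, gamma, s0=0.99, i0=0.01, days=200, dt=0.05):
    """Euler-integrated SIR fractions; returns the infected time series.
    beta: transmission rate, gamma: recovery rate (R0 = beta/gamma)."""
    s, i = s0, i0
    series = [i]
    for _ in range(int(days / dt)):
        new_inf = beta * s * i
        s -= new_inf * dt
        i += (new_inf - gamma * i) * dt
        series.append(i)
    return series

epi = sir(0.4, 0.2)    # R0 = 2: an outbreak grows, peaks, then fades
fade = sir(0.1, 0.2)   # R0 = 0.5: infections decline monotonically
```

The qualitative point of the nested-model study is that this base model alone cannot reproduce the irregular multi-annual fluctuations seen in dengue hemorrhagic fever notifications; added layers (seasonality, second strains, cross-immunity) are what introduce the richer dynamics.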
NASA Astrophysics Data System (ADS)
Carles Brangarí, Albert; Sanchez-Vila, Xavier; Freixa, Anna; Romaní, Anna M.; Fernàndez-Garcia, Daniel
2017-04-01
The distribution, amount, and characteristics of biofilms and their components govern the capacity of soils to let water through and transport solutes, as well as the reactions occurring within them. Therefore, unraveling the relationship between microbial dynamics and the hydraulic properties of soils is of concern for the management of natural systems and many technological applications. However, the complexity of both the microbial communities and the geochemical processes they entail means that the phenomenon of bioclogging remains poorly understood. This highlights the need for a better understanding of the microbial components, such as live and dead bacteria and extracellular polymeric substances (EPS), as well as of their spatial distribution. This work tries to shed some light on these issues, providing experimental data and a new mechanistic model that predicts the variably saturated hydraulic properties of bio-amended soils based on these data. We first present a long-term laboratory infiltration experiment that aims at studying the temporal variation of selected biogeochemical parameters along the infiltration path. The setup consists of a 120-cm-high soil tank instrumented with an array of sensors plus soil and liquid samplers. The sensors continuously measured a wide range of parameters, such as volumetric water content, electrical conductivity, temperature, water pressure, soil suction, dissolved oxygen, and pH. Samples were kept for chemical and biological analyses. Results indicate that: i) biofilm is present at all depths, denoting the potential for deep bioclogging; ii) the redox conditions profile shows different stages, indicating that the community was adapted to changing redox conditions; iii) bacterial activity, richness, and diversity also exhibit zonation with depth; and iv) the hydraulic properties of the soil experienced significant changes as biofilm proliferated.
Based on this experimental evidence, we propose a tool to predict changes in the hydraulic properties of bio-amended, variably saturated soils. The new mechanistic model provides analytical equations for the water retention curve and the relative permeability. The approach assumes that the porous medium behaves as an ensemble of capillary tubes, which may be obtained from the experimental saturation profile. This premise is extended by considering biofilm bodies composed of bacteria and EPS. These compounds display a channeled geometry that reshapes the pore space at the pore scale following specific geometrical patterns and changes its volume with suction. The hydraulic properties of the bio-amended soil can then be derived from the integrated contributions of the two biofilm compounds treated separately. The model successfully reproduces displacements of the soil-water retention curve towards higher saturations and permeability reductions of different orders of magnitude.
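As an illustration of the capillary-bundle premise, the sketch below treats the pore space as parallel tubes whose effective radii shrink as biofilm occupies a volume fraction of each tube; the uniform-coating geometry, the single surface-tension constant, and all parameter values are simplifying assumptions, not the paper's actual equations.

```python
import math

SIGMA = 0.072  # surface tension of water [N/m] at ~25 degC (assumed constant)

def bundle_properties(radii, biofilm_fraction, suction):
    """Capillary-bundle sketch: a tube stays water-filled while its entry
    pressure 2*sigma/r exceeds the applied suction [Pa]; biofilm uniformly
    occupies a volume fraction of every tube, shrinking its radius."""
    r_eff = [r * math.sqrt(1.0 - biofilm_fraction) for r in radii]
    filled = [r for r in r_eff if suction <= 2.0 * SIGMA / r]
    # saturation: water-filled cross-section relative to the bio-amended pore space
    saturation = sum(r * r for r in filled) / sum(r * r for r in r_eff)
    # Hagen-Poiseuille: tube conductance scales as r**4; reference is the clean bundle
    k_rel = sum(r ** 4 for r in filled) / sum(r ** 4 for r in radii)
    return saturation, k_rel
```

Even at full saturation, a 50% biofilm fraction cuts relative permeability to (1 - 0.5)^2 = 0.25 in this geometry, illustrating how modest clogging produces large conductivity losses.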
Predicting Biological Information Flow in a Model Oxygen Minimum Zone
NASA Astrophysics Data System (ADS)
Louca, S.; Hawley, A. K.; Katsev, S.; Beltran, M. T.; Bhatia, M. P.; Michiels, C.; Capelle, D.; Lavik, G.; Doebeli, M.; Crowe, S.; Hallam, S. J.
2016-02-01
Microbial activity drives marine biochemical fluxes and nutrient cycling at global scales. Geochemical measurements as well as molecular techniques such as metagenomics, metatranscriptomics and metaproteomics provide great insight into microbial activity. However, an integration of molecular and geochemical data into mechanistic biogeochemical models is still lacking. Recent work suggests that microbial metabolic pathways are, at the ecosystem level, strongly shaped by stoichiometric and energetic constraints. Hence, models rooted in fluxes of matter and energy may yield a holistic understanding of biogeochemistry. Furthermore, such pathway-centric models would allow a direct consolidation with meta'omic data. Here we present a pathway-centric biogeochemical model for the seasonal oxygen minimum zone in Saanich Inlet, a fjord off the coast of Vancouver Island. The model considers key dissimilatory nitrogen and sulfur fluxes, as well as the population dynamics of the genes that mediate them. By assuming a direct translation of biocatalyzed energy fluxes to biosynthesis rates, we make predictions about the distribution and activity of the corresponding genes. A comparison of the model to molecular measurements indicates that the model explains observed DNA, RNA, protein and cell depth profiles. This suggests that microbial activity in marine ecosystems such as oxygen minimum zones is well described by DNA abundance, which, in conjunction with geochemical constraints, determines pathway expression and process rates. Our work further demonstrates how meta'omic data can be mechanistically linked to environmental redox conditions and biogeochemical processes.
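The pathway-centric idea, translating a biocatalyzed energy flux directly into biosynthesis of the genes that mediate it, can be sketched as a minimal two-variable model; the Euler integration and all rate constants here are hypothetical, not the Saanich Inlet model's parameterization.

```python
def simulate_pathway(biomass, substrate, vmax, half_sat, yield_coeff,
                     mortality, inflow, dt=0.1, steps=20000):
    """Euler sketch: gene (biomass) abundance grows in proportion to the
    Michaelis-Menten flux it catalyses in the electron donor; the donor is
    resupplied at a fixed rate and consumed by the pathway."""
    for _ in range(steps):
        rate = vmax * substrate / (half_sat + substrate) * biomass  # catalysed flux
        biomass += dt * (yield_coeff * rate - mortality * biomass)
        substrate += dt * (inflow - rate)
        substrate = max(substrate, 0.0)
    return biomass, substrate
```

At steady state the catalysed flux balances the inflow, so gene abundance settles at yield_coeff * inflow / mortality, which is the sense in which geochemical fluxes constrain DNA abundance.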
Stoffenmanager exposure model: company-specific exposure assessments using a Bayesian methodology.
van de Ven, Peter; Fransman, Wouter; Schinkel, Jody; Rubingh, Carina; Warren, Nicholas; Tielemans, Erik
2010-04-01
The web-based tool "Stoffenmanager" was initially developed to assist small- and medium-sized enterprises in the Netherlands to make qualitative risk assessments and to provide advice on control at the workplace. The tool uses a mechanistic model to arrive at a "Stoffenmanager score" for exposure. A recent study showed that variability in exposure measurements given a certain Stoffenmanager score is still substantial. This article discusses an extension to the tool that uses a Bayesian methodology for quantitative workplace/scenario-specific exposure assessment. This methodology allows real exposure data observed in the company of interest to be combined with the prior estimate (based on the Stoffenmanager model). The output of the tool is a company-specific assessment of exposure levels for a scenario for which data are available. The Bayesian approach provides a transparent way of synthesizing different types of information and is especially preferred in situations where available data are sparse, as is often the case in small- and medium-sized enterprises. Real-world examples as well as simulation studies were used to assess how different parameters such as sample size, difference between prior and data, uncertainty in prior, and variance in the data affect the eventual posterior distribution of a Bayesian exposure assessment.
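A minimal sketch of such a Bayesian update, assuming (as a simplification of the tool's actual methodology) a conjugate normal model on the log scale with a known within-scenario standard deviation:

```python
import math

def posterior_log_exposure(prior_mean, prior_sd, measurements, sd_within):
    """Conjugate normal-normal update on the log scale: combine a
    Stoffenmanager-style prior on the ln geometric mean with observed
    ln-concentrations, weighting each source by its precision."""
    logs = [math.log(x) for x in measurements]
    n = len(logs)
    prec_prior = 1.0 / prior_sd ** 2
    prec_data = n / sd_within ** 2
    post_var = 1.0 / (prec_prior + prec_data)
    post_mean = post_var * (prec_prior * prior_mean + prec_data * (sum(logs) / n))
    return post_mean, math.sqrt(post_var)
```

With few measurements the posterior stays close to the prior; as n grows, the data precision n/sd_within^2 dominates, which is exactly the sparse-data behaviour the abstract motivates.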
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.
1995-08-01
This report describes the primary physical models that form the basis of the DART mechanistic computer model for calculating fission-product-induced swelling of aluminum dispersion fuels; the calculated results are compared with test data. In addition, DART calculates irradiation-induced changes in the thermal conductivity of the dispersion fuel, as well as fuel restructuring due to aluminum fuel reaction, amorphization, and recrystallization. Input instructions for execution on mainframe, workstation, and personal computers are provided, as is a description of DART output. The theory of fission gas behavior and its effect on fuel swelling is discussed. The behavior of these fission products in both crystalline and amorphous fuel and in the presence of irradiation-induced recrystallization and crystalline-to-amorphous-phase change phenomena is presented, as are models for these irradiation-induced processes.
(Q)SARs to predict environmental toxicities: current status and future needs.
Cronin, Mark T D
2017-03-22
The current state of the art of (Quantitative) Structure-Activity Relationships ((Q)SARs) to predict environmental toxicity is assessed, along with recommendations to develop these models further. The acute toxicity of compounds acting by the non-polar narcotic mechanism of action can be predicted well; however, other approaches, including read-across, may be required for compounds acting by specific mechanisms of action. The chronic toxicity of compounds to environmental species is more difficult to predict from (Q)SARs, with robust data sets and more mechanistic information required. In addition, the toxicity of mixtures is little addressed by (Q)SAR approaches. Developments in environmental toxicology, including Adverse Outcome Pathways (AOPs) and omics responses, should be utilised to develop better, more mechanistically relevant (Q)SAR models.
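Baseline (non-polar narcosis) QSARs of the kind described take the linear form log(1/LC50) = a·logKow + b. The sketch below uses illustrative placeholder coefficients, not values endorsed by this review:

```python
def narcosis_log_inv_lc50(log_kow, slope=0.87, intercept=-4.87):
    """Baseline narcosis QSAR sketch: acute toxicity (log of the reciprocal
    lethal concentration, mol/L) rises linearly with hydrophobicity (logKow).
    Slope and intercept here are illustrative placeholders."""
    return slope * log_kow + intercept
```

The linear-in-logKow form encodes the mechanistic assumption that narcosis depends only on accumulation in cell membranes; compounds with specific modes of action fall off this baseline, which is why the review recommends read-across for them.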
Bundy, Jacob G; Sidhu, Jasmin K; Rana, Faisal; Spurgeon, David J; Svendsen, Claus; Wren, Jodie F; Stürzenbaum, Stephen R; Morgan, A John; Kille, Peter
2008-06-03
New methods are needed for research into non-model organisms, to monitor the effects of toxic disruption at both the molecular and functional organism level. We exposed earthworms (Lumbricus rubellus Hoffmeister) to sub-lethal levels of copper (10-480 mg/kg soil) for 70 days as a real-world situation, and monitored both molecular (cDNA transcript microarrays and nuclear magnetic resonance-based metabolic profiling: metabolomics) and ecological/functional endpoints (reproduction rate and weight change, which have direct relevance to population-level impacts). Both of the molecular endpoints, metabolomics and transcriptomics, were highly sensitive, with clear copper-induced differences even at levels below those that caused a reduction in reproductive parameters. The microarray and metabolomic data provided evidence that the copper exposure led to a disruption of energy metabolism: transcripts of enzymes from oxidative phosphorylation were significantly over-represented, and increases in transcripts of carbohydrate metabolising enzymes (maltase-glucoamylase, mannosidase) had corresponding decreases in small-molecule metabolites (glucose, mannose). Treating both enzymes and metabolites as functional cohorts led to clear inferences about changes in energetic metabolism (carbohydrate use and oxidative phosphorylation), which would not have been possible by taking a 'biomarker' approach to data analysis. Multiple post-genomic techniques can be combined to provide mechanistic information about the toxic effects of chemical contaminants, even for non-model organisms with few additional mechanistic toxicological data. With 70-day no-observed-effect and lowest-observed-effect concentrations (NOEC and LOEC) of 10 and 40 mg kg-1 for metabolomic and microarray profiles, copper is shown to interfere with energy metabolism in an important soil organism at an ecologically and functionally relevant level.
MECHANISTIC DOSIMETRY MODELS OF NANOMATERIAL DEPOSITION IN THE RESPIRATORY TRACT
Accurate health risk assessments of inhalation exposure to nanomaterials will require dosimetry models that account for interspecies differences in dose delivered to the respiratory tract. Mechanistic models offer the advantage to interspecies extrapolation that physicochemica...
Fazl, Arash; Grossberg, Stephen; Mingolla, Ennio
2009-02-01
How does the brain learn to recognize an object from multiple viewpoints while scanning a scene with eye movements? How does the brain avoid the problem of erroneously classifying parts of different objects together? How are attention and eye movements intelligently coordinated to facilitate object learning? A neural model provides a unified mechanistic explanation of how spatial and object attention work together to search a scene and learn what is in it. The ARTSCAN model predicts how an object's surface representation generates a form-fitting distribution of spatial attention, or "attentional shroud". All surface representations dynamically compete for spatial attention to form a shroud. The winning shroud persists during active scanning of the object. The shroud maintains sustained activity of an emerging view-invariant category representation while multiple view-specific category representations are learned and are linked through associative learning to the view-invariant object category. The shroud also helps to restrict scanning eye movements to salient features on the attended object. Object attention plays a role in controlling and stabilizing the learning of view-specific object categories. Spatial attention hereby coordinates the deployment of object attention during object category learning. Shroud collapse releases a reset signal that inhibits the active view-invariant category in the What cortical processing stream. Then a new shroud, corresponding to a different object, forms in the Where cortical processing stream, and search using attention shifts and eye movements continues to learn new objects throughout a scene. The model mechanistically clarifies basic properties of attention shifts (engage, move, disengage) and inhibition of return. It simulates human reaction time data about object-based spatial attention shifts, and learns with 98.1% accuracy and a compression of 430 on a letter database whose letters vary in size, position, and orientation. 
The model provides a powerful framework for unifying many data about spatial and object attention, and their interactions during perception, cognition, and action.
NASA Astrophysics Data System (ADS)
Bridgham, S. D.
2015-12-01
Wetlands emit a third to half of the global CH4 flux and have the largest uncertainty of any emission source. Moreover, wetlands have provided an important radiative feedback to climate in the geologic and recent past. A number of large-scale wetland CH4 models have been developed recently, but intermodel comparisons show wide discrepancies in their predictions. I present an empiricist's overview of the current limitations and challenges of more accurately modeling wetland CH4 emissions. One of the largest limitations is simply the poor knowledge of wetland area, with estimated global values varying by more than a factor of three. The areas of seasonal and tropical wetlands are particularly poorly constrained. There are also few wetlands with complete, multi-year datasets for all of the input variables for many models, and this lack of data is particularly alarming in tropical wetlands given that they are arguably the single largest natural or anthropogenic global CH4 source. Almost all large-scale CH4 models have little biogeochemical mechanistic detail and treat anaerobic carbon cycling in a highly simplified manner. The CH4:CO2 ratio in anaerobic carbon mineralization is a central parameter in many models, but is at most set at a few values with no mechanistic underpinning. However, empirical data show that this ratio varies by five orders of magnitude in different wetlands, and tropical wetlands appear to be particularly methanogenic, all for reasons that are very poorly understood. The predominance of the acetoclastic pathway of methanogenesis appears to be related to total CH4 production, but different methanogenesis pathways are generally not incorporated into models. Other important anaerobic processes such as humic substances acting as terminal electron acceptors, fermentation, homoacetogenesis, and anaerobic CH4 oxidation are also not included in most models despite evidence of their importance in empirical studies.
Moreover, there has been an explosion of microbial studies in wetlands using high-throughput molecular techniques, but microbial community and functional parameters are largely missing from models. However, recently developed trait-based models show promise for reducing the multivariate complexity of this data into manageable parameters for large-scale CH4 models.
Understanding lizard's microhabitat use based on a mechanistic model of behavioral thermoregulation
NASA Astrophysics Data System (ADS)
Fei, Teng; Venus, Valentijn; Toxopeus, Bert; Skidmore, Andrew K.; Schlerf, Martin; Liu, Yaolin; van Overdijk, Sjef; Bian, Meng
2008-12-01
Lizards are an "excellent group of organisms" for examining habitat and microhabitat use, mainly because their ecology and physiology are well studied. Because they regulate body temperature behaviorally, the thermal environment is especially tightly linked to their habitat use. In this study, to map and understand lizard distribution at the microhabitat scale, an individual of Timon lepidus was kept and monitored in a terrarium (245 × 120 × 115 cm) in which sand, rocks, burrows, hatching chambers, UV lamps, fog generators, and heating devices were placed to simulate its natural habitat. Optical cameras, thermal cameras, and other data loggers were installed to record the lizard's body temperature, ground surface temperature, air temperature, radiation, and other important environmental parameters. By analysing the collected data, we propose a Cellular Automata (CA) model by which the movement of lizards is simulated and translated into their distribution. This paper explores the capabilities of applying GIS techniques to thermoregulatory activity studies at the microhabitat scale. We conclude that the microhabitat use of lizards can be explained to some degree by the rule-based CA model.
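A single update step of such a rule-based CA might look like the sketch below; the move-to-the-cell-closest-to-preferred-temperature rule is a hypothetical simplification, not the rule set actually proposed in the study.

```python
def step(pos, temps, t_pref):
    """One CA update: the agent moves to the neighbouring cell (or stays put)
    whose surface temperature is closest to its preferred body temperature.
    temps is a 2-D grid of cell temperatures; pos is a (row, col) tuple."""
    rows, cols = len(temps), len(temps[0])
    r, c = pos
    candidates = [(r + dr, c + dc)
                  for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                  if 0 <= r + dr < rows and 0 <= c + dc < cols]
    return min(candidates, key=lambda rc: abs(temps[rc[0]][rc[1]] - t_pref))
```

Iterating this rule over many agents and time steps, and accumulating visit counts per cell, turns simulated movement into a predicted microhabitat-use distribution, which is the translation the abstract describes.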
NASA Astrophysics Data System (ADS)
Ghimire, B.; Riley, W. J.; Koven, C. D.; Randerson, J. T.; Mu, M.; Kattge, J.; Rogers, A.; Reich, P. B.
2014-12-01
In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However, a mechanistic representation of nitrogen uptake linked to root traits, and of functional nitrogen allocation among the leaf enzymes involved in respiration and photosynthesis, is currently lacking in Earth System models. The linkage between nitrogen availability and plant productivity is simplistically represented by potential photosynthesis rates, which are subsequently downregulated depending on nitrogen supply and other nitrogen consumers in the model (e.g., nitrification). This type of potential photosynthesis rate calculation is problematic for several reasons. Firstly, plants do not photosynthesize at potential rates and then downregulate. Secondly, the meaning of potential photosynthesis rates is considerably subjective. Thirdly, it is poorly understood how such potential rates should be modeled in a changing climate. In addition to these model structural issues, the role of plant roots in nutrient acquisition has been largely ignored in Earth System models. For example, in CLM4.5, nitrogen uptake is linked to leaf-level processes (e.g., primary productivity) rather than to the root-scale processes involved in nitrogen uptake. We present a new plant model for CLM with an improved mechanistic representation of plant nitrogen uptake based on root-scale Michaelis-Menten kinetics, and stronger linkages between leaf nitrogen and plant productivity inferred from relationships observed in global databases of plant traits (including the TRY database and several individual studies). We also incorporate an improved representation of leaf nitrogen allocation, especially in tropical regions, where CLM4.5 simulations significantly over-predict plant growth and productivity. We evaluate our improved global model simulations using the International Land Model Benchmarking (ILAMB) framework.
We conclude that mechanistic representation of leaf-level nitrogen allocation and a theoretically consistent treatment of competition with belowground consumers leads to overall improvements in CLM4.5's global carbon cycling predictions.
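A root-scale Michaelis-Menten uptake term of the kind described can be sketched as follows; the function and parameter names are illustrative, not CLM's actual variables.

```python
def root_n_uptake(vmax, km, root_biomass, soil_n):
    """Michaelis-Menten sketch: nitrogen uptake scales linearly with root
    biomass and saturates in soil mineral N. vmax is the maximum uptake per
    unit root; km is the half-saturation N concentration."""
    return vmax * root_biomass * soil_n / (km + soil_n)
```

Because uptake here is driven by root traits (biomass, vmax, km) rather than by leaf-level potential photosynthesis, supply and demand can be reconciled mechanistically, which is the structural change the abstract argues for.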
Comparing spatial diversification and meta-population models in the Indo-Australian Archipelago
Chalmandrier, Loïc; Albouy, Camille; Descombes, Patrice; Sandel, Brody; Faurby, Soren; Svenning, Jens-Christian; Zimmermann, Niklaus E.
2018-01-01
Reconstructing the processes that have shaped the emergence of biodiversity gradients is critical to understand the dynamics of diversification of life on Earth. Islands have traditionally been used as model systems to unravel the processes shaping biological diversity. MacArthur and Wilson's island biogeographic model predicts diversity to be based on dynamic interactions between colonization and extinction rates, while treating islands themselves as geologically static entities. The current spatial configuration of islands should influence meta-population dynamics, but long-term geological changes within archipelagos are also expected to have shaped island biodiversity, in part by driving diversification. Here, we compare two mechanistic models providing inferences on species richness at a biogeographic scale: a mechanistic spatial-temporal model of species diversification and a spatial meta-population model. While the meta-population model operates over a static landscape, the diversification model is driven by changes in the size and spatial configuration of islands through time. We compare the inferences of both models to floristic diversity patterns among land patches of the Indo-Australian Archipelago. Simulation results from the diversification model better matched observed diversity than a meta-population model constrained only by the contemporary landscape. The diversification model suggests that the dynamic re-positioning of islands promoting land disconnection and reconnection induced an accumulation of particularly high species diversity on Borneo, which is central within the island network. By contrast, the meta-population model predicts a higher diversity on the mainlands, which is less compatible with empirical data. Our analyses highlight that, by comparing models with contrasting assumptions, we can pinpoint the processes that are most compatible with extant biodiversity patterns. PMID:29657753
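For intuition on the static-landscape side of this comparison, the classic Levins metapopulation model gives the equilibrium fraction of occupied patches in closed form; this toy is a sketch of the colonization-extinction balance, not the spatial meta-population model used in the study.

```python
def levins_equilibrium(c, e):
    """Levins metapopulation sketch: dp/dt = c*p*(1 - p) - e*p, where p is
    the occupied patch fraction, c the colonization rate, and e the
    extinction rate. The non-trivial equilibrium is p* = 1 - e/c,
    clipped to 0 when extinction dominates colonization."""
    return max(0.0, 1.0 - e / c)
```

Spatially explicit meta-population models replace the single colonization rate c with patch-to-patch rates that decay with distance, which is why the contemporary island configuration alone drives their predictions.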
Claassen, Karina; Willmann, Stefan; Eissing, Thomas; Preusser, Tobias; Block, Michael
2013-01-01
The renin-angiotensin-aldosterone system (RAAS) plays a key role in the pathogenesis of cardiovascular disorders including hypertension and is one of the most important targets for drugs. A whole body physiologically based pharmacokinetic (wb PBPK) model integrating this hormone circulation system and its inhibition can be used to explore the influence of drugs that interfere with this system, and thus to improve the understanding of interactions between drugs and the target system. In this study, we describe the development of a mechanistic RAAS model and exemplify drug action by a simulation of enalapril administration. Enalapril and its metabolite enalaprilat are potent inhibitors of the angiotensin-converting-enzyme (ACE). To this end, a coupled dynamic parent-metabolite PBPK model was developed and linked with the RAAS model that consists of seven coupled PBPK models for aldosterone, ACE, angiotensin 1, angiotensin 2, angiotensin 2 receptor type 1, renin, and prorenin. The results indicate that the model represents the interactions in the RAAS in response to the pharmacokinetics (PK) and pharmacodynamics (PD) of enalapril and enalaprilat in an accurate manner. The full set of RAAS-hormone profiles and interactions are consistently described at pre- and post-administration steady state as well as during their dynamic transition and show a good agreement with literature data. The model allows a simultaneous representation of the parent-metabolite conversion to the active form as well as the effect of the drug on the hormone levels, offering a detailed mechanistic insight into the hormone cascade and its inhibition. This model constitutes a first major step to establish a PBPK-PD-model including the PK and the mode of action (MoA) of a drug acting on a dynamic RAAS that can be further used to link to clinical endpoints such as blood pressure. PMID:23404365
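A drastically reduced sketch of the inhibition logic, competitive ACE inhibition lowering steady-state angiotensin 2, is shown below; the two-pool structure, the competitive-inhibition form, and all rate constants are assumptions for illustration, not the wb-PBPK model itself.

```python
def steady_ang2(renin_rate, k_ace, k_other, k_deg, inhibitor_conc, ki):
    """Steady-state sketch: angiotensin 1 is produced by renin and either
    converted to angiotensin 2 by ACE (competitively inhibited by
    enalaprilat-like drug at concentration inhibitor_conc, constant ki)
    or cleared by other peptidases; angiotensin 2 decays first-order."""
    k_eff = k_ace / (1.0 + inhibitor_conc / ki)  # competitive inhibition
    ang1 = renin_rate / (k_eff + k_other)        # angiotensin 1 steady state
    return k_eff * ang1 / k_deg                  # angiotensin 2 steady state
```

Note the alternative clearance route k_other is essential in this toy: without it, steady-state angiotensin 2 would be insensitive to ACE inhibition, since all produced angiotensin 1 would eventually be converted anyway.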
Ecological Forecasting in Chesapeake Bay: Using a Mechanistic-Empirical Modelling Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C. W.; Hood, Raleigh R.; Long, Wen
The Chesapeake Bay Ecological Prediction System (CBEPS) automatically generates daily nowcasts and three-day forecasts of several environmental variables, such as sea-surface temperature and salinity, the concentrations of chlorophyll, nitrate, and dissolved oxygen, and the likelihood of encountering several noxious species, including harmful algal blooms and water-borne pathogens, for the purpose of monitoring the Bay's ecosystem. While the physical and biogeochemical variables are forecast mechanistically using the Regional Ocean Modeling System configured for the Chesapeake Bay, the species predictions are generated using a novel mechanistic-empirical approach, whereby real-time output from the coupled physical-biogeochemical model drives multivariate empirical habitat models of the target species. The predictions, in the form of digital images, are available via the World Wide Web to interested groups to guide recreational, management, and research activities. Though full validation of the integrated forecasts for all species is still a work in progress, we argue that the mechanistic-empirical approach can be used to generate a wide variety of short-term ecological forecasts, and that it can be applied in any marine system where sufficient data exist to develop empirical habitat models. This paper provides an overview of this system, its predictions, and the approach taken.
Transgenerational Adaptation to Pollution Changes Energy Allocation in Populations of Nematodes.
Goussen, Benoit; Péry, Alexandre R R; Bonzom, Jean-Marc; Beaudouin, Rémy
2015-10-20
Assessing the evolutionary responses of long-term exposed populations requires multigeneration ecotoxicity tests. However, the analysis of the data from these tests is not straightforward. Mechanistic models allow in-depth analysis of the variation of physiological traits over many generations, by quantifying the trend of the physiological and toxicological parameters of the model. In the present study, a bioenergetic mechanistic model was used to assess the evolution of two populations of the nematode Caenorhabditis elegans kept in control conditions or exposed to uranium. This evolutionary pressure resulted in a brood size reduction of 60%. We showed an adaptation of individuals of both populations to experimental conditions (increase of maximal length, decrease of growth rate, decrease of brood size, and decrease of the elimination rate). In addition, differential evolution between the two populations was also highlighted once maternal effects had diminished after several generations. Thus, individuals with greater maximal length, but apparently greater sensitivity to uranium, were selected in the uranium-exposed population. In this study, we showed that this bioenergetic mechanistic modeling approach provides a precise and powerful analysis of the life strategy of C. elegans populations exposed to heavy metals under evolutionary pressure across successive generations.
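The growth core of such bioenergetic models is often a von Bertalanffy curve; the sketch below adds a hypothetical stress factor scaling the asymptotic length, as an illustration of how toxicant effects enter physiological parameters rather than the study's actual model.

```python
import math

def length_at_age(t, l_max, growth_rate, stress=0.0, l0=0.05):
    """von Bertalanffy sketch: body length approaches an asymptote l_inf at
    rate growth_rate from an initial length l0. A toxic stress factor in
    [0, 1) scales the asymptote down (an illustrative assumption)."""
    l_inf = l_max * (1.0 - stress)
    return l_inf - (l_inf - l0) * math.exp(-growth_rate * t)
```

Fitting such parameters (asymptotic length, growth rate, elimination rate) generation by generation is what lets the model quantify the trait trends the abstract reports.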
Livestock Helminths in a Changing Climate: Approaches and Restrictions to Meaningful Predictions.
Fox, Naomi J; Marion, Glenn; Davidson, Ross S; White, Piran C L; Hutchings, Michael R
2012-03-06
Climate change is a driving force for livestock parasite risk. This is especially true for helminths including the nematodes Haemonchus contortus, Teladorsagia circumcincta, Nematodirus battus, and the trematode Fasciola hepatica, since survival and development of free-living stages is chiefly affected by temperature and moisture. The paucity of long term predictions of helminth risk under climate change has driven us to explore optimal modelling approaches and identify current bottlenecks to generating meaningful predictions. We classify approaches as correlative or mechanistic, exploring their strengths and limitations. Climate is one aspect of a complex system and, at the farm level, husbandry has a dominant influence on helminth transmission. Continuing environmental change will necessitate the adoption of mitigation and adaptation strategies in husbandry. Long term predictive models need to have the architecture to incorporate these changes. Ultimately, an optimal modelling approach is likely to combine mechanistic processes and physiological thresholds with correlative bioclimatic modelling, incorporating changes in livestock husbandry and disease control. Irrespective of approach, the principal limitation to parasite predictions is the availability of active surveillance data and empirical data on physiological responses to climate variables. By combining improved empirical data and refined models with a broad view of the livestock system, robust projections of helminth risk can be developed.
Predicting neuroblastoma using developmental signals and a logic-based model.
Kasemeier-Kulesa, Jennifer C; Schnell, Santiago; Woolley, Thomas; Spengler, Jennifer A; Morrison, Jason A; McKinney, Mary C; Pushel, Irina; Wolfe, Lauren A; Kulesa, Paul M
2018-07-01
Genomic information from human patient samples of pediatric neuroblastoma cancers and known outcomes have led to specific gene lists put forward as high risk for disease progression. However, the reliance on gene expression correlations rather than mechanistic insight has shown limited potential and suggests a critical need for molecular network models that better predict neuroblastoma progression. In this study, we construct and simulate a molecular network of developmental genes and downstream signals in a 6-gene input logic model that predicts a favorable/unfavorable outcome based on the outcome of the four cell states including cell differentiation, proliferation, apoptosis, and angiogenesis. We simulate the mis-expression of the tyrosine receptor kinases, trkA and trkB, two prognostic indicators of neuroblastoma, and find differences in the number and probability distribution of steady state outcomes. We validate the mechanistic model assumptions using RNAseq of the SHSY5Y human neuroblastoma cell line to define the input states and confirm the predicted outcome with antibody staining. Lastly, we apply input gene signatures from 77 published human patient samples and show that our model makes more accurate disease outcome predictions for early stage disease than any current neuroblastoma gene list. These findings highlight the predictive strength of a logic-based model based on developmental genes and offer a better understanding of the molecular network interactions during neuroblastoma disease progression. Copyright © 2018. Published by Elsevier B.V.
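A logic-based model of this kind reduces to finding the steady states of a Boolean network. The sketch below enumerates fixed points for a hypothetical 3-node toy, not the paper's 6-gene network or its actual rules.

```python
from itertools import product

def fixed_points(update_fns, n):
    """Exhaustively enumerate steady states of a synchronous Boolean network:
    a state is steady when every node's update rule returns the node's
    current value. Feasible for small n (2**n states)."""
    return [s for s in product((0, 1), repeat=n)
            if all(f(s) == s[i] for i, f in enumerate(update_fns))]

# Hypothetical 3-node toy: C activates A, A activates B, B activates C.
rules = [lambda s: s[2], lambda s: s[0], lambda s: s[1]]
```

Each fixed point corresponds to a stable cell-state outcome (e.g., differentiation versus proliferation), and perturbing input nodes, analogous to the trkA/trkB mis-expression simulations, changes which fixed points remain reachable.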
Adediran, S A; Ratkowsky, D A; Donaghy, D J; Malau-Aduli, A E O
2012-09-01
Fourteen lactation models were fitted to average and individual cow lactation data from pasture-based dairy systems in the Australian states of Victoria and Tasmania. The models included a new "log-quadratic" model, and a major objective was to evaluate and compare the performance of this model with the other models. Nine empirical and 5 mechanistic models were first fitted to average test-day milk yield of Holstein-Friesian dairy cows using the nonlinear procedure in SAS. Two additional semiparametric models were fitted using a linear model in ASReml. To investigate the influence of days to first test-day and the number of test-days, 5 of the best-fitting models were then fitted to individual cow lactation data. Model goodness of fit was evaluated using criteria such as the residual mean square, the distribution of residuals, the correlation between actual and predicted values, and the Wald-Wolfowitz runs test. Goodness of fit was similar in all but one of the models when fitting the average lactation, but the models differed in their ability to predict individual lactations; the widely used incomplete gamma model displayed this failing most prominently. The new log-quadratic model was robust in fitting average and individual lactations, was less affected by sampled data, and was more parsimonious in having only 3 parameters, each of which lends itself to biological interpretation. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
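For reference, the widely used incomplete gamma (Wood) model mentioned above has the form y(t) = a·t^b·e^(-ct), which peaks at t = b/c days in milk; the parameter values below are illustrative only.

```python
import math

def wood_yield(t, a, b, c):
    """Wood's incomplete gamma curve, the classic 3-parameter empirical
    lactation model: a scales overall yield, b controls the rise to peak,
    and c the post-peak decline. Daily yield peaks at t = b / c."""
    return a * t ** b * math.exp(-c * t)
```

Like the log-quadratic alternative, it has three interpretable parameters; the difference highlighted in the abstract lies in robustness when fitting noisy individual-cow test-day records rather than smooth herd averages.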
NASA Astrophysics Data System (ADS)
Ockenden, Mary C.; Tych, Wlodek; Beven, Keith J.; Collins, Adrian L.; Evans, Robert; Falloon, Peter D.; Forber, Kirsty J.; Hiscock, Kevin M.; Hollaway, Michael J.; Kahana, Ron; Macleod, Christopher J. A.; Villamizar, Martha L.; Wearing, Catherine; Withers, Paul J. A.; Zhou, Jian G.; Benskin, Clare McW. H.; Burke, Sean; Cooper, Richard J.; Freer, Jim E.; Haygarth, Philip M.
2017-12-01
Excess nutrients in surface waters, such as phosphorus (P) from agriculture, result in poor water quality, with adverse effects on ecological health and costs for remediation. However, understanding and prediction of P transfers in catchments have been limited by inadequate data and over-parameterised models with high uncertainty. We show that, with high temporal resolution data, we are able to identify simple dynamic models that capture the P load dynamics in three contrasting agricultural catchments in the UK. For a flashy catchment, a linear, second-order (two pathways) model for discharge gave high simulation efficiencies for short-term storm sequences and was useful in highlighting uncertainties in out-of-bank flows. A model with non-linear rainfall input was appropriate for predicting seasonal or annual cumulative P loads where antecedent conditions affected the catchment response. For second-order models, the time constant for the fast pathway varied between 2 and 15 h for all three catchments and for both discharge and P, confirming that high temporal resolution data are necessary to capture the dynamic responses in small catchments (10-50 km²). The models led to a better understanding of the dominant nutrient transfer modes, which will be helpful in determining phosphorus transfers following changes in precipitation patterns in the future.
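A second-order (two-pathway) linear model of the kind identified here can be sketched as two parallel first-order stores in discrete time. The form and parameter values below are illustrative assumptions, not the transfer functions fitted to the three catchments.

```python
import math

def simulate_two_pathway(u, alpha_fast, beta_fast, alpha_slow, beta_slow):
    """Discrete-time second-order model built from two parallel first-order
    stores: each pathway evolves as q[k+1] = alpha*q[k] + beta*u[k], and the
    simulated output (discharge or P load) is their sum."""
    q_fast = q_slow = 0.0
    out = []
    for uk in u:
        q_fast = alpha_fast * q_fast + beta_fast * uk
        q_slow = alpha_slow * q_slow + beta_slow * uk
        out.append(q_fast + q_slow)
    return out

def time_constant(alpha, dt_hours=1.0):
    """Continuous-time time constant (hours) implied by discrete pole alpha."""
    return -dt_hours / math.log(alpha)
```

With hourly data, a fast-pathway pole of 0.8 corresponds to a time constant of about 4.5 h, inside the 2-15 h range reported above.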
Mechanistic modeling of reactive soil nitrogen emissions across agricultural management practices
NASA Astrophysics Data System (ADS)
Rasool, Q. Z.; Miller, D. J.; Bash, J. O.; Venterea, R. T.; Cooter, E. J.; Hastings, M. G.; Cohan, D. S.
2017-12-01
The global reactive nitrogen (N) budget has increased by a factor of 2-3 from pre-industrial levels. This increase is especially pronounced in highly N-fertilized agricultural regions in summer. The reactive N emissions from soil to atmosphere can be in reduced (NH3) or oxidized (NO, HONO, N2O) forms, depending on complex biogeochemical transformations of soil N reservoirs. Air quality models like CMAQ typically neglect soil emissions of HONO and N2O. Previously, soil NO emissions estimated by models like CMAQ remained parametric and inconsistent with soil NH3 emissions. Thus, there is a need to represent more mechanistically and consistently the soil N processes that lead to reactive N emissions to the atmosphere. Our updated approach estimates soil NO, HONO and N2O emissions by incorporating detailed agricultural fertilizer inputs from EPIC, and CMAQ-modeled N deposition, into the soil N pool. EPIC addresses the nitrification, denitrification and volatilization rates along with soil N pools for agricultural soils. Suitable updates will also be made to account for factors not addressed in EPIC, such as nitrite (NO2-) accumulation. The NO and N2O emissions from nitrification and denitrification are computed mechanistically using the N sub-model of DAYCENT. These mechanistic definitions use soil water content, temperature, NH4+ and NO3- concentrations, gas diffusivity and labile C availability as dependent parameters at various soil layers. Soil HONO emissions, found to be most probable under high NO2- availability, will be based on observed ratios of HONO to NO emissions under different soil moistures, pH values and soil types. The updated scheme will utilize field-specific soil properties and N inputs across differing manure management practices such as tillage. Comparison of the modeled soil NO emission rates from the new mechanistic and existing schemes against field measurements will be discussed.
Our updated framework will help to predict the diurnal and daily variability of different reactive N emissions (NO, HONO, N2O) with soil temperature, moisture and N inputs.
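A heavily simplified sketch of how such a mechanistic scheme couples a soil NH4+ pool to temperature and moisture scalars is shown below. The response functions, their constants, and the NO partitioning fraction are hypothetical placeholders for illustration, not the DAYCENT or EPIC formulations.

```python
import math

def temp_factor(T, T_opt=30.0, width=12.0):
    """Hypothetical Gaussian temperature response, peaking at T_opt (deg C)."""
    return math.exp(-((T - T_opt) / width) ** 2)

def moisture_factor(wfps, wfps_opt=0.6):
    """Hypothetical triangular response to water-filled pore space (0-1),
    peaking at wfps_opt and falling to zero at the extremes."""
    return max(0.0, 1.0 - abs(wfps - wfps_opt) / wfps_opt)

def step_nitrification(nh4, k_base, T, wfps, dt=1.0, no_frac=0.02):
    """One timestep of first-order nitrification of an NH4+ pool; a fixed
    fraction of nitrified N is emitted as NO (assumed partitioning).
    Returns (updated NH4+ pool, NO emitted this step)."""
    rate = k_base * temp_factor(T) * moisture_factor(wfps) * nh4
    nitrified = min(nh4, rate * dt)
    return nh4 - nitrified, no_frac * nitrified
```

Stepping this pool forward with hourly temperature and moisture inputs reproduces the kind of diurnal emission variability the abstract describes, albeit with none of the real scheme's detail.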
Venkata Mohan, S; Chandrasekhar Rao, N; Karthikeyan, J
2002-03-01
This communication presents the results of an investigation of color removal of the trisazo direct dye C.I. Direct Brown 1:1 by adsorption onto coal-based sorbents, viz. charfines, lignite coal and bituminous coal, comparing the results with activated carbon (Filtrasorb-400). The kinetic sorption data indicated the sorption capacity of the different coal-based sorbents. The sorption interaction of the direct dye onto coal-based sorbents obeys a first-order irreversible rate equation, while activated carbon fits a first-order reversible rate equation. Intraparticle diffusion studies revealed that the dye sorption interaction was complex and that intraparticle diffusion was not the only rate-limiting step. Isothermal data fit well with the rearranged Langmuir adsorption model. The R(L) factor revealed the favorable nature of the isotherm of the dye-coal system. Neutral solution pH yielded maximum dye color removal. Desorption and interruption studies further indicated that the coal-based sorbents facilitated chemisorption in the process of dye sorption, while activated carbon resulted in physisorption interaction.
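The first-order rate laws and the Langmuir R(L) separation factor referred to above can be written compactly as follows; the parameter values used in the checks are illustrative, not the fitted constants for these sorbents.

```python
import math

def irreversible_first_order(C0, k, t):
    """First-order irreversible removal: C(t) = C0 * exp(-k t)."""
    return C0 * math.exp(-k * t)

def reversible_first_order(C0, Ce, k_overall, t):
    """First-order reversible sorption approaching equilibrium Ce:
    C(t) = Ce + (C0 - Ce) * exp(-k_overall * t)."""
    return Ce + (C0 - Ce) * math.exp(-k_overall * t)

def langmuir_RL(KL, C0):
    """Langmuir separation factor R_L = 1 / (1 + K_L * C0);
    0 < R_L < 1 indicates a favorable isotherm."""
    return 1.0 / (1.0 + KL * C0)
```

The irreversible form drives the dye concentration to zero, whereas the reversible form plateaus at the equilibrium concentration, which is the qualitative distinction the abstract draws between the coal sorbents and activated carbon.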
Improving Teaching and Learning: Three Models to Reshape Educational Practice
ERIC Educational Resources Information Center
Roberson, Sam
2014-01-01
The work of schools is teaching and learning. However, the current educational culture is dominated by three characteristics: (1) the mechanistic view of organization and its practice based on the assembly-line model, where students progress along a value-added conveyor; (2) the predominance of the Essentialist philosophy of education, in which the…
An, Gary; Bartels, John; Vodovotz, Yoram
2011-01-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and high-content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism. PMID:21552346
A Mechanistic-Based Healing Model for Self-Healing Glass Seals Used in Solid Oxide Fuel Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Wei; Sun, Xin; Stephens, Elizabeth V.
The usage of self-healing glass as hermetic seals is a recent advancement in sealing technology development for planar solid oxide fuel cells (SOFCs). Because of its capability of restoring the mechanical properties at elevated temperatures, the self-healing glass seal is expected to provide high reliability in maintaining the long-term structural integrity and functionality of SOFCs. In order to accommodate the design and to evaluate the effectiveness of such engineering seals under various thermo-mechanical operating conditions, a computational modeling framework needs to be developed to accurately capture and predict the healing behavior of the glass material. In the present work, a mechanistic-based two-stage model was developed to study the stress- and temperature-dependent crack healing of the self-healing glass materials. The model was first calibrated by experimental measurements combined with the kinetic Monte Carlo (kMC) simulation results and then implemented into the finite element analysis (FEA). The effects of various factors, i.e. stress, temperature and crack morphology, on the healing behavior of the glass were investigated and discussed.
Fang, Baishan; Niu, Jin; Ren, Hong; Guo, Yingxia; Wang, Shizhen
2014-01-01
Mechanistic insights regarding the activity enhancement of dehydrogenase by metal ion substitution were investigated by a simple method using a kinetic and thermodynamic analysis. By profiling the binding energy of both the substrate and product, the metal ion's role in catalysis enhancement was revealed. Glycerol dehydrogenase (GDH) from Klebsiella pneumoniae sp., which demonstrated an improvement in activity by the substitution of a zinc ion with a manganese ion, was used as a model for the mechanistic study of metal ion substitution. A kinetic model based on an ordered Bi-Bi mechanism was proposed considering the noncompetitive product inhibition of dihydroxyacetone (DHA) and the competitive product inhibition of NADH. By obtaining preliminary kinetic parameters of substrate and product inhibition, the number of estimated parameters was reduced from 10 to 4 for a nonlinear regression-based kinetic parameter estimation. The simulated values of time-concentration curves fit the experimental values well, with an average relative error of 11.5% and 12.7% for Mn-GDH and GDH, respectively. A comparison of the binding energy of the enzyme ternary complex for Mn-GDH and GDH derived from kinetic parameters indicated that metal ion substitution accelerated the release of dihydroxyacetone. The metal ion's role in catalysis enhancement was explicated.
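The product-inhibition structure described (noncompetitive DHA inhibition and competitive NADH inhibition) can be sketched in a simplified form. The sketch below collapses the full ordered Bi-Bi rate equation into an apparent Michaelis-Menten law, with the noncompetitive inhibitor scaling Vmax and the competitive inhibitor scaling Km; it is an illustration of the inhibition pattern only, not the authors' fitted model.

```python
def rate(S, Vmax, Km,
         P_dha=0.0, Ki_dha=float("inf"),
         P_nadh=0.0, Ki_nadh=float("inf")):
    """Apparent Michaelis-Menten rate with two product inhibitions:
    competitive NADH inhibition raises the apparent Km,
    noncompetitive DHA inhibition lowers the apparent Vmax."""
    Km_app = Km * (1.0 + P_nadh / Ki_nadh)
    Vmax_app = Vmax / (1.0 + P_dha / Ki_dha)
    return Vmax_app * S / (Km_app + S)
```

At S = Km and no inhibitors the rate is Vmax/2; adding competitive product shifts the half-saturation point, while adding noncompetitive product depresses the ceiling, which is how the two inhibition modes are distinguished kinetically.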
Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N; Mantalaris, Athanasios
2012-01-01
The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.
NASA Astrophysics Data System (ADS)
Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.
2003-03-01
A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
Neural network modeling of the kinetics of SO2 removal by fly ash-based sorbent.
Raymond-Ooi, E H; Lee, K T; Mohamed, A R; Chu, K H
2006-01-01
The mechanistic modeling of the sulfation reaction between fly ash-based sorbent and SO2 is a challenging task due to a variety of reasons, including the complexity of the reaction itself and the inability to measure some of the key parameters of the reaction. In this work, the possibility of modeling the sulfation reaction kinetics using a purely data-driven neural network was investigated. Experiments on SO2 removal by a sorbent prepared from coal fly ash/CaO/CaSO4 were conducted using a fixed bed reactor to generate a database to train and validate the neural network model. Extensive SO2 removal data points were obtained by varying three process variables, namely, SO2 inlet concentration (500-2000 mg/L), reaction temperature (60-80 °C), and relative humidity (50-70%), as a function of reaction time (0-60 min). Modeling results show that the neural network can provide excellent fits to the SO2 removal data after considerable training and can be successfully used to predict the extent of SO2 removal as a function of time even when the process variables are outside the training domain. From a modeling standpoint, the suitably trained and validated neural network with excellent interpolation and extrapolation properties could have immediate practical benefits in the absence of a theoretical model.
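A minimal from-scratch feedforward network of the kind used for such data-driven fits might look as follows. This sketch (one tanh hidden layer, batch gradient descent on squared error, a single scalar input for simplicity) makes no claim to match the architecture, inputs, or training procedure of the study.

```python
import math, random

def train_mlp(xs, ys, hidden=5, epochs=5000, lr=0.3, seed=0):
    """Train a 1-input, 1-output network with one tanh hidden layer by
    full-batch gradient descent on mean squared error. Returns a predictor."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b2 = 0.0
    n = float(len(xs))
    for _ in range(epochs):
        gw1 = [0.0] * hidden; gb1 = [0.0] * hidden
        gw2 = [0.0] * hidden; gb2 = 0.0
        for x, y in zip(xs, ys):
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            err = sum(w2[j] * h[j] for j in range(hidden)) + b2 - y
            gb2 += err
            for j in range(hidden):
                gw2[j] += err * h[j]
                dh = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
                gw1[j] += dh * x
                gb1[j] += dh
        b2 -= lr * gb2 / n
        for j in range(hidden):
            w2[j] -= lr * gw2[j] / n
            w1[j] -= lr * gw1[j] / n
            b1[j] -= lr * gb1[j] / n
    return lambda x: sum(w2[j] * math.tanh(w1[j] * x + b1[j])
                         for j in range(hidden)) + b2
```

Trained on a toy nonlinear target (y = x² on [-1, 1]), the network fits the curve far better than a constant predictor, which is the essential point: no mechanistic rate equation is supplied, only input-output data.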
Uncertainty in temperature-based determination of time of death
NASA Astrophysics Data System (ADS)
Weiser, Martin; Erdmann, Bodo; Schenkl, Sebastian; Muggenthaler, Holger; Hubig, Michael; Mall, Gita; Zachow, Stefan
2018-03-01
Temperature-based estimation of time of death (ToD) can be performed either with the help of simple phenomenological models of corpse cooling or with detailed mechanistic (thermodynamic) heat transfer models. The latter are much more complex, but allow a higher accuracy of ToD estimation since, in principle, all relevant cooling mechanisms can be taken into account. The potentially higher accuracy depends on the accuracy of tissue and environmental parameters as well as on the geometric resolution. We investigate the impact of parameter variations and geometry representation on the estimated ToD. For this, numerical simulation of analytic heat transport models is performed on a highly detailed 3D corpse model that has been segmented and geometrically reconstructed from a computed tomography (CT) data set, differentiating various organs and tissue types. From that and prior information available on thermal parameters and their variability, we identify the most crucial parameters to measure or estimate, and obtain an a priori uncertainty quantification for the ToD.
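The simple phenomenological end of the modelling spectrum can be illustrated with single-exponential (Newtonian) cooling inverted for time since death. The rate constant and initial core temperature below are illustrative assumptions, far cruder than the study's mechanistic 3D model, but they show concretely how uncertainty in a parameter propagates into a ToD interval.

```python
import math

def time_since_death(T_measured, T_ambient, T0=37.2, k=0.06):
    """Invert Newtonian cooling T(t) = Ta + (T0 - Ta) * exp(-k t) for t (hours).
    T0 and k are illustrative values, not forensic standards."""
    return -math.log((T_measured - T_ambient) / (T0 - T_ambient)) / k

def tod_interval(T_measured, T_ambient, k_low=0.045, k_high=0.075):
    """Propagate uncertainty in the cooling constant k into a ToD interval:
    a faster assumed cooling rate implies a shorter elapsed time."""
    return (time_since_death(T_measured, T_ambient, k=k_high),
            time_since_death(T_measured, T_ambient, k=k_low))
```

Even this toy model makes the paper's point: a ±25% uncertainty in a single cooling parameter widens the estimated interval by hours, motivating the formal uncertainty quantification performed on the mechanistic model.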
DOT National Transportation Integrated Search
2015-08-31
Proper calibration of mechanistic-empirical (M-E) design and rehabilitation performance models to meet Texas conditions is essential for cost-effective flexible pavement designs. Such a calibration effort would require a reliable source of ...
Kirk, Devin; Jones, Natalie; Peacock, Stephanie; Phillips, Jessica; Molnár, Péter K; Krkošek, Martin; Luijckx, Pepijn
2018-02-01
The complexity of host-parasite interactions makes it difficult to predict how host-parasite systems will respond to climate change. In particular, host and parasite traits such as survival and virulence may have distinct temperature dependencies that must be integrated into models of disease dynamics. Using experimental data from Daphnia magna and a microsporidian parasite, we fitted a mechanistic model of the within-host parasite population dynamics. Model parameters comprising host aging and mortality, as well as parasite growth, virulence, and equilibrium abundance, were specified by relationships arising from the metabolic theory of ecology. The model effectively predicts host survival, parasite growth, and the cost of infection across temperature while using less than half the parameters compared to modeling temperatures discretely. Our results serve as a proof of concept that linking simple metabolic models with a mechanistic host-parasite framework can be used to predict temperature responses of parasite population dynamics at the within-host level.
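The Boltzmann-Arrhenius temperature scaling at the heart of the metabolic theory of ecology can be sketched as follows; the activation energy and reference rate are illustrative values, not the fitted Daphnia parameters.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def mte_rate(T_celsius, r_ref, E_eV=0.65, T_ref_celsius=20.0):
    """Boltzmann-Arrhenius scaling of a biological rate (metabolic theory):
    r(T) = r_ref * exp(-(E/k) * (1/T - 1/T_ref)), temperatures in kelvin."""
    T = T_celsius + 273.15
    T_ref = T_ref_celsius + 273.15
    return r_ref * math.exp(-(E_eV / BOLTZMANN_EV) * (1.0 / T - 1.0 / T_ref))
```

Specifying each temperature-dependent trait (host mortality, parasite growth, virulence) by one such relationship is what lets the model span a temperature gradient with far fewer parameters than fitting each temperature separately.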
Kim, Sean H. J.; Jackson, Andre J.; Hunt, C. Anthony
2014-01-01
The objective of this study was to develop and explore new, in silico experimental methods for deciphering complex, highly variable absorption and food interaction pharmacokinetics observed for a modified-release drug product. Toward that aim, we constructed an executable software analog of study participants to whom product was administered orally. The analog is an object- and agent-oriented, discrete event system, which consists of grid spaces and event mechanisms that map abstractly to different physiological features and processes. Analog mechanisms were made sufficiently complicated to achieve prespecified similarity criteria. An equation-based gastrointestinal transit model with nonlinear mixed effects analysis provided a standard for comparison. Subject-specific parameterizations enabled each executed analog’s plasma profile to mimic features of the corresponding six individual pairs of subject plasma profiles. All achieved prespecified, quantitative similarity criteria, and outperformed the gastrointestinal transit model estimations. We observed important subject-specific interactions within the simulation and mechanistic differences between the two models. We hypothesize that mechanisms, events, and their causes occurring during simulations had counterparts within the food interaction study: they are working, evolvable, concrete theories of dynamic interactions occurring within individual subjects. The approach presented provides new, experimental strategies for unraveling the mechanistic basis of complex pharmacological interactions and observed variability. PMID:25268237
Mechanistic modeling of modular co-rotating twin-screw extruders.
Eitzlmayr, Andreas; Koscher, Gerold; Reynolds, Gavin; Huang, Zhenyu; Booth, Jonathan; Shering, Philip; Khinast, Johannes
2014-10-20
In this study, we present a one-dimensional (1D) model of the metering zone of a modular, co-rotating twin-screw extruder for pharmaceutical hot melt extrusion (HME). The model accounts for filling ratio, pressure, melt temperature in screw channels and gaps, driving power, torque and the residence time distribution (RTD). It requires two empirical parameters for each screw element to be determined experimentally or numerically using computational fluid dynamics (CFD). The required Nusselt correlation for the heat transfer to the barrel was determined from experimental data. We present results for a fluid with a constant viscosity in comparison to literature data obtained from CFD simulations. Moreover, we show how to incorporate the rheology of a typical, non-Newtonian polymer melt, and present results in comparison to measurements. For both cases, we achieved excellent agreement. Furthermore, we present results for the RTD, based on experimental data from the literature, and found good agreement with simulations, in which the entire HME process was approximated with the metering model, assuming a constant viscosity for the polymer melt. Copyright © 2014. Published by Elsevier B.V.
Development and evaluation of spatial point process models for epidermal nerve fibers.
Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila
2013-06-01
We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
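The two-process structure (a Poisson base-point process with a random number of end points per base) can be sketched as follows; the unit-square domain and intensities are illustrative, and the sketch samples only counts and base locations, not full fiber geometries.

```python
import math, random

def poisson(mu, rng=random):
    """Poisson sample via Knuth's multiplication method."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_enf(lambda_base, mean_ends_per_base, rng=random):
    """Base points as a homogeneous Poisson process on the unit square; each
    base receives a Poisson number of end points (a simple cluster process).
    Returns (base locations, end-point counts per base)."""
    bases = [(rng.random(), rng.random())
             for _ in range(poisson(lambda_base, rng))]
    ends_per_base = [poisson(mean_ends_per_base, rng) for _ in bases]
    return bases, ends_per_base
```

Quantities of direct interest to neurologists, such as the distribution of fibers per base, then follow by Monte Carlo from the simulated counts.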
Disentangling the Role of Domain-Specific Knowledge in Student Modeling
NASA Astrophysics Data System (ADS)
Ruppert, John; Duncan, Ravit Golan; Chinn, Clark A.
2017-08-01
This study explores the role of domain-specific knowledge in students' modeling practice and how this knowledge interacts with two domain-general modeling strategies: use of evidence and developing a causal mechanism. We analyzed models made by middle school students who had a year of intensive model-based instruction. These models were made to explain a familiar but unstudied biological phenomenon: late onset muscle pain. Students were provided with three pieces of evidence related to this phenomenon and asked to construct a model to account for this evidence. Findings indicate that domain-specific resources play a significant role in the extent to which the models accounted for provided evidence. On the other hand, familiarity with the situation appeared to contribute to the mechanistic character of models. Our results indicate that modeling strategies alone are insufficient for the development of a mechanistic model that accounts for provided evidence and that, while learners can develop a tentative model with a basic familiarity of the situation, scaffolding certain domain-specific knowledge is necessary to assist students with incorporating evidence in modeling tasks.
Household water use and conservation models using Monte Carlo techniques
NASA Astrophysics Data System (ADS)
Cahill, R.; Lund, J. R.; DeOreo, B.; Medellín-Azuara, J.
2013-10-01
The increased availability of end use measurement studies allows for mechanistic and detailed approaches to estimating household water demand and conservation potential. This study simulates water use in a single-family residential neighborhood using end-water-use parameter probability distributions generated from Monte Carlo sampling. This model represents existing water use conditions in 2010 and is calibrated to 2006-2011 metered data. A two-stage mixed integer optimization model is then developed to estimate the least-cost combination of long- and short-term conservation actions for each household. This least-cost conservation model provides an estimate of the upper bound of reasonable conservation potential for varying pricing and rebate conditions. The models were adapted from previous work in Jordan and are applied to a neighborhood in San Ramon, California, in the eastern San Francisco Bay Area. The existing conditions model produces seasonal use results very close to the metered data. The least-cost conservation model suggests clothes washer rebates are among the most cost-effective rebate programs for indoor uses. Retrofit of faucets and toilets is also cost-effective and holds the highest potential for water savings from indoor uses. This mechanistic modeling approach can improve understanding of water demand and estimate cost-effectiveness of water conservation programs.
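A minimal version of the Monte Carlo end-use simulation might look like this. The end-use list and all distribution parameters are hypothetical placeholders for illustration, not the calibrated San Ramon values, and per-day volumes are drawn from simple truncated normals rather than the study's fitted distributions.

```python
import random

# Hypothetical indoor end uses (gallons per event); illustrative values only.
END_USES = {
    "toilet":         {"events_per_day": 10.0, "mean": 2.6,  "sd": 0.8},
    "shower":         {"events_per_day": 2.0,  "mean": 17.0, "sd": 5.0},
    "clothes_washer": {"events_per_day": 0.8,  "mean": 30.0, "sd": 8.0},
}

def simulate_household_day(rng, end_uses=END_USES):
    """Sample one day of indoor use: each end use's daily volume is drawn
    around events_per_day * mean_volume and truncated at zero."""
    total = 0.0
    for p in end_uses.values():
        daily_mean = p["events_per_day"] * p["mean"]
        daily_sd = p["events_per_day"] ** 0.5 * p["sd"]
        total += max(0.0, rng.gauss(daily_mean, daily_sd))
    return total

def monte_carlo_demand(n_days=5000, seed=42):
    """Average daily demand over many Monte Carlo draws."""
    rng = random.Random(seed)
    return sum(simulate_household_day(rng) for _ in range(n_days)) / n_days
```

Summing such draws across households and seasons is the basic mechanism by which the existing-conditions model is built up and then compared against metered data.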
Steinmetz, Nicholas A.; Moore, Tirin; Knudsen, Eric I.
2017-01-01
Distinct networks in the forebrain and the midbrain coordinate to control spatial attention. The critical involvement of the superior colliculus (SC)—the central structure in the midbrain network—in visuospatial attention has been shown by four seminal, published studies in monkeys (Macaca mulatta) performing multialternative tasks. However, due to the lack of a mechanistic framework for interpreting behavioral data in such tasks, the nature of the SC's contribution to attention remains unclear. Here we present and validate a novel decision framework for analyzing behavioral data in multialternative attention tasks. We apply this framework to re-examine the behavioral evidence from these published studies. Our model is a multidimensional extension to signal detection theory that distinguishes between two major classes of attentional mechanisms: those that alter the quality of sensory information or “sensitivity,” and those that alter the selective gating of sensory information or “choice bias.” Model-based simulations and model-based analyses of data from these published studies revealed a converging pattern of results that indicated that choice-bias changes, rather than sensitivity changes, were the primary outcome of SC manipulation. Our results suggest that the SC contributes to attentional performance predominantly by generating a spatial choice bias for stimuli at a selected location, and that this bias operates downstream of forebrain mechanisms that enhance sensitivity. The findings lead to a testable mechanistic framework of how the midbrain and forebrain networks interact to control spatial attention. SIGNIFICANCE STATEMENT Attention involves the selection of the most relevant information for differential sensory processing and decision making. 
While the mechanisms by which attention alters sensory encoding (sensitivity control) are well studied, the mechanisms by which attention alters decisional weighting of sensory evidence (choice-bias control) are poorly understood. Here, we introduce a model of multialternative decision making that distinguishes bias from sensitivity effects in attention tasks. With our model, we simulate experimental data from four seminal studies that microstimulated or inactivated a key attention-related midbrain structure, the superior colliculus (SC). We demonstrate that the experimental effects of SC manipulation are entirely consistent with the SC controlling attention by changing choice bias, thereby shedding new light on how the brain mediates attention. PMID:28100734
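The sensitivity-versus-bias distinction is easiest to see in the one-dimensional, equal-variance signal detection model that the paper's multidimensional framework generalises. The sketch below shows that shifting the decision criterion (a bias change) moves hit and false-alarm rates together while leaving the recoverable d' (sensitivity) unchanged; it is a didactic reduction, not the authors' multialternative model.

```python
from statistics import NormalDist

Z = NormalDist()  # standard normal

def rates(d_prime, criterion):
    """Equal-variance Gaussian SDT: signal evidence ~ N(d', 1),
    noise ~ N(0, 1); respond 'signal' when evidence exceeds the criterion."""
    hit = 1.0 - Z.cdf(criterion - d_prime)
    false_alarm = 1.0 - Z.cdf(criterion)
    return hit, false_alarm

def recover(hit, false_alarm):
    """Invert observed rates: d' = z(hit) - z(fa), criterion c = -z(fa)."""
    d_prime = Z.inv_cdf(hit) - Z.inv_cdf(false_alarm)
    criterion = -Z.inv_cdf(false_alarm)
    return d_prime, criterion
```

If an experimental manipulation (e.g. SC microstimulation) changed only the criterion, the recovered d' would stay constant across conditions, which is the signature pattern the model-based analyses report.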
Sridharan, Devarajan; Steinmetz, Nicholas A; Moore, Tirin; Knudsen, Eric I
2017-01-18
Distinct networks in the forebrain and the midbrain coordinate to control spatial attention. The critical involvement of the superior colliculus (SC)-the central structure in the midbrain network-in visuospatial attention has been shown by four seminal, published studies in monkeys (Macaca mulatta) performing multialternative tasks. However, due to the lack of a mechanistic framework for interpreting behavioral data in such tasks, the nature of the SC's contribution to attention remains unclear. Here we present and validate a novel decision framework for analyzing behavioral data in multialternative attention tasks. We apply this framework to re-examine the behavioral evidence from these published studies. Our model is a multidimensional extension to signal detection theory that distinguishes between two major classes of attentional mechanisms: those that alter the quality of sensory information or "sensitivity," and those that alter the selective gating of sensory information or "choice bias." Model-based simulations and model-based analyses of data from these published studies revealed a converging pattern of results that indicated that choice-bias changes, rather than sensitivity changes, were the primary outcome of SC manipulation. Our results suggest that the SC contributes to attentional performance predominantly by generating a spatial choice bias for stimuli at a selected location, and that this bias operates downstream of forebrain mechanisms that enhance sensitivity. The findings lead to a testable mechanistic framework of how the midbrain and forebrain networks interact to control spatial attention. Attention involves the selection of the most relevant information for differential sensory processing and decision making. While the mechanisms by which attention alters sensory encoding (sensitivity control) are well studied, the mechanisms by which attention alters decisional weighting of sensory evidence (choice-bias control) are poorly understood. 
Here, we introduce a model of multialternative decision making that distinguishes bias from sensitivity effects in attention tasks. With our model, we simulate experimental data from four seminal studies that microstimulated or inactivated a key attention-related midbrain structure, the superior colliculus (SC). We demonstrate that the experimental effects of SC manipulation are entirely consistent with the SC controlling attention by changing choice bias, thereby shedding new light on how the brain mediates attention. Copyright © 2017 the authors.
Towards new approaches in phenological modelling
NASA Astrophysics Data System (ADS)
Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas
2014-05-01
Modelling of phenological stages has been based on temperature sums for many decades, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. There is now a growing body of opinion calling for new methods in phenological modelling and for more in-depth studies of dormancy release in woody plants. This requirement is easy to understand if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised against observations, although some basic physiological knowledge of the chilling and forcing requirements of plants is already considered in these approaches (semi-mechanistic models). What limits a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution to this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop such models, changes in metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr) and high-resolution (weekly samples between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot study on sweet cherries. The suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
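The growing-degree-day concept credited above to Reaumur can be sketched as a minimal forcing model: daily warmth above a base temperature is accumulated until a critical forcing sum triggers the phenological stage. A Python sketch, in which the base temperature (5 °C) and forcing requirement (150 degree-days) are illustrative values, not fitted parameters:

```python
def growing_degree_days(daily_mean_temps, t_base=5.0):
    """Degree-day sum: daily warmth accumulated above a base temperature."""
    return sum(max(t - t_base, 0.0) for t in daily_mean_temps)

def predict_stage_onset(daily_mean_temps, t_base=5.0, f_crit=150.0):
    """First day (1-indexed) on which the forcing sum reaches the
    critical requirement f_crit, or None if it is never reached."""
    total = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        total += max(t - t_base, 0.0)
        if total >= f_crit:
            return day
    return None
```

A semi-mechanistic chilling-forcing model of the kind discussed above would add an analogous chilling-sum stage that must be satisfied before forcing accumulation begins.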
Mechanistic-empirical Pavement Design Guide Implementation
DOT National Transportation Integrated Search
2010-06-01
The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and associated computer software provides a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...
Integrated computational model of the bioenergetics of isolated lung mitochondria
Zhang, Xiao; Dash, Ranjan K; Jacobs, Elizabeth R; Camara, Amadou K S; Clough, Anne V; Audi, Said H
2018-01-01
Integrated computational modeling provides a mechanistic and quantitative framework for describing lung mitochondrial bioenergetics. Thus, the objective of this study was to develop and validate a thermodynamically-constrained integrated computational model of the bioenergetics of isolated lung mitochondria. The model incorporates the major biochemical reactions and transport processes in lung mitochondria. A general framework was developed to model those biochemical reactions and transport processes. Intrinsic model parameters such as binding constants were estimated using previously published isolated enzymes and transporters kinetic data. Extrinsic model parameters such as maximal reaction and transport velocities were estimated by fitting the integrated bioenergetics model to published and new tricarboxylic acid cycle and respirometry data measured in isolated rat lung mitochondria. The integrated model was then validated by assessing its ability to predict experimental data not used for the estimation of the extrinsic model parameters. For example, the model was able to predict reasonably well the substrate and temperature dependency of mitochondrial oxygen consumption, kinetics of NADH redox status, and the kinetics of mitochondrial accumulation of the cationic dye rhodamine 123, driven by mitochondrial membrane potential, under different respiratory states. The latter required the coupling of the integrated bioenergetics model to a pharmacokinetic model for the mitochondrial uptake of rhodamine 123 from buffer. The integrated bioenergetics model provides a mechanistic and quantitative framework for 1) integrating experimental data from isolated lung mitochondria under diverse experimental conditions, and 2) assessing the impact of a change in one or more mitochondrial processes on overall lung mitochondrial bioenergetics. 
In addition, the model provides important insights into the bioenergetics and respiration of lung mitochondria and how they differ from those of mitochondria from other organs. To the best of our knowledge, this model is the first for the bioenergetics of isolated lung mitochondria. PMID:29889855
NASA Astrophysics Data System (ADS)
Singer, M. B.; Sargeant, C. I.; Vallet-Coulomb, C.; Evans, C.; Bates, C. R.
2014-12-01
Water availability to riparian trees in lowlands is controlled through precipitation and its infiltration into floodplain soils, and through river discharge additions to the hyporheic water table. The relative contributions of both water sources to the root zone within river floodplains vary through time, depending on climatic fluctuations. There is currently limited understanding of how climatic fluctuations are expressed at local scales, especially in 'critical zone' hydrology, which is fundamental to the health and sustainability of riparian forest ecosystems. This knowledge is particularly important in water-stressed Mediterranean climate systems, considering climatic trends and projections toward hotter and drier growing seasons, which have the potential to dramatically reduce water availability to riparian forests. Our aim is to identify and quantify the relative contributions of hyporheic (discharge) water vs. infiltrated precipitation to water uptake by riparian Mediterranean trees for several distinct hydrologic years, selected to isolate contrasts in water availability from these sources. Our approach includes isotopic analyses of water and tree-ring cellulose, mechanistic modeling of water uptake and wood production, and physically based modeling of subsurface hydrology. We utilize an extensive database of oxygen isotope (δ18O) measurements in surface water and precipitation alongside recent measurements of δ18O in groundwater, soil water, and tree-ring cellulose. We use a mechanistic model to back-calculate source water δ18O based on δ18O in cellulose and climate data. Finally, we test our results via 1-D hydrologic modeling of precipitation infiltration and water table rise and fall. These steps enable us to interpret hydrologic cycle variability within the 'critical zone' and its potential impact on riparian trees.
Quantitative evaluation of simulated functional brain networks in graph theoretical analysis.
Lee, Won Hee; Bullmore, Ed; Frangou, Sophia
2017-02-01
There is increasing interest in the potential of whole-brain computational models to provide mechanistic insights into resting-state brain networks. It is therefore important to determine the degree to which computational models reproduce the topological features of empirical functional brain networks. We used empirical connectivity data derived from diffusion spectrum and resting-state functional magnetic resonance imaging data from healthy individuals. Empirical and simulated functional networks, constrained by structural connectivity, were defined based on 66 brain anatomical regions (nodes). Simulated functional data were generated using the Kuramoto model in which each anatomical region acts as a phase oscillator. Network topology was studied using graph theory in the empirical and simulated data. The difference (relative error) between graph theory measures derived from empirical and simulated data was then estimated. We found that simulated data can be used with confidence to model graph measures of global network organization at different dynamic states and highlight the sensitive dependence of the solutions obtained in simulated data on the specified connection densities. This study provides a method for the quantitative evaluation and external validation of graph theory metrics derived from simulated data that can be used to inform future study designs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
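The phase-oscillator simulation described above can be sketched with a minimal Kuramoto model, in which each anatomical region is a phase oscillator coupled through a connectivity matrix. The all-to-all coupling matrix, zero natural frequencies, and forward-Euler integration below are illustrative simplifications, not the connectome-constrained setup of the study:

```python
import numpy as np

def simulate_kuramoto(C, omega, k=1.0, dt=0.01, steps=5000, seed=0):
    """Euler-integrate dtheta_i/dt = omega_i + k * sum_j C[i,j]*sin(theta_j - theta_i)."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, len(omega))
    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]   # diff[i, j] = theta_j - theta_i
        theta = theta + dt * (omega + k * (C * np.sin(diff)).sum(axis=1))
    return theta % (2.0 * np.pi)

def order_parameter(theta):
    """Kuramoto synchrony measure r in [0, 1]; r near 1 means phase-locked."""
    return abs(np.exp(1j * np.asarray(theta)).mean())
```

In a study of this kind, simulated functional signals would then be derived from the oscillator phases before graph metrics are computed on their correlation structure.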
Exploring Organic Mechanistic Puzzles with Molecular Modeling
ERIC Educational Resources Information Center
Horowitz, Gail; Schwartz, Gary
2004-01-01
Molecular modeling was used to reinforce more general skills such as deducing and drawing reaction mechanisms, analyzing reaction kinetics and thermodynamics, and drawing reaction coordinate energy diagrams. This modeling was done through the design of mechanistic puzzles involving reactions not familiar to the students.
Mechanistic models of biofilm growth in porous media
NASA Astrophysics Data System (ADS)
Jaiswal, Priyank; Al-Hadrami, Fathiya; Atekwana, Estella A.; Atekwana, Eliot A.
2014-07-01
Nondestructive acoustic methods can be used to monitor in situ biofilm growth in porous media. In practice, however, acoustic methods remain underutilized due to the lack of models that can translate acoustic data into rock properties in the context of biofilm. In this paper we present mechanistic models of biofilm growth in porous media. The models are used to quantitatively interpret arrival times and amplitudes recorded in the 29-day-long Davis et al. (2010) physical-scale biostimulation experiment in terms of biofilm morphologies and saturation. The model pivots on addressing the sediment elastic behavior using the lower Hashin-Shtrikman bounds for grain mixing and Gassmann substitution for fluid saturation. The time-lapse P-wave velocity (V_P; a function of arrival times) is explained by a combination of two rock models (morphologies): "load bearing," which treats the biofilm as an additional mineral in the rock matrix, and "pore filling," which treats the biofilm as an additional fluid phase in the pores. The time-lapse attenuation (Q_P^-1; a function of amplitudes), on the other hand, can be explained adequately in two ways: first, through squirt flow, where energy is lost from relative motion between rock matrix and pore fluid, and second, through an empirical function of porosity (φ), permeability (κ), and grain size. The squirt-flow model fitting results in higher internal φ (7% versus 5%) and more oblate pores (0.33 versus 0.67 aspect ratio) for the load-bearing morphology than for the pore-filling morphology. The empirical model fitting results in up to a 10% increase in κ at the initial stages of the load-bearing morphology. The two morphologies, which exhibit distinct mechanical and hydraulic behavior, could be a function of pore throat size. The biofilm mechanistic models developed in this study can be used for the interpretation of seismic data critical for the evaluation of biobarriers in bioremediation, microbial enhanced oil recovery, and CO2 sequestration.
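The fluid-substitution step of the rock-physics workflow described above can be illustrated with Gassmann's equation, which maps a dry-frame bulk modulus to a saturated one. The moduli below (quartz mineral, water, a generic dry frame) are textbook-style illustrative values, not the parameters fitted in the study, and the Hashin-Shtrikman grain-mixing step is omitted for brevity:

```python
import math

def gassmann_k_sat(k_dry, k_min, k_fl, phi):
    """Gassmann fluid substitution: saturated-rock bulk modulus (GPa)
    from dry-frame, mineral, and pore-fluid bulk moduli and porosity."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def p_velocity(k, mu, rho):
    """V_P = sqrt((K + 4/3*mu) / rho); moduli in GPa and density in
    g/cm^3 give velocity in km/s."""
    return math.sqrt((k + 4.0 * mu / 3.0) / rho)
```

For example, a dry frame of 9 GPa in a quartz matrix (37 GPa) saturated with water (2.25 GPa) at 30% porosity stiffens to roughly 12.9 GPa, and V_P follows from the shear modulus and bulk density; the biofilm morphologies above modify either the matrix term (load bearing) or the fluid term (pore filling).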
Model-based image analysis of a tethered Brownian fibre for shear stress sensing
2017-01-01
The measurement of fluid dynamic shear stress acting on a biologically relevant surface is a challenging problem, particularly in the complex environment of, for example, the vasculature. While an experimental method for the direct detection of wall shear stress via the imaging of a synthetic biology nanorod has recently been developed, the data interpretation so far has been limited to phenomenological random walk modelling, small-angle approximation, and image analysis techniques which do not take into account the production of an image from a three-dimensional subject. In this report, we develop a mathematical and statistical framework to estimate shear stress from rapid imaging sequences based firstly on stochastic modelling of the dynamics of a tethered Brownian fibre in shear flow, and secondly on a novel model-based image analysis, which reconstructs fibre positions by solving the inverse problem of image formation. This framework is tested on experimental data, providing the first mechanistically rational analysis of the novel assay. What follows further develops the established theory for an untethered particle in a semi-dilute suspension, which is of relevance to, for example, the study of Brownian nanowires without flow, and presents new ideas in the field of multi-disciplinary image analysis. PMID:29212755
Kocic, Ivana; Homsek, Irena; Dacevic, Mirjana; Grbic, Sandra; Parojcic, Jelena; Vucicevic, Katarina; Prostran, Milica; Miljkovic, Branislava
2012-04-01
The aim of this case study was to develop a drug-specific absorption model for levothyroxine (LT4) using mechanistic gastrointestinal simulation technology (GIST) implemented in the GastroPlus™ software package. The required input parameters were determined experimentally, predicted in silico, and/or taken from the literature. The simulated plasma profile was in good agreement with the data observed in the in vivo bioequivalence study, indicating that the GIST model gave an accurate prediction of LT4 oral absorption. Additionally, plasma concentration-time profiles were simulated based on a set of experimental and virtual in vitro dissolution data in order to estimate the influence of different in vitro drug dissolution kinetics on the simulated plasma profiles and to identify a biorelevant dissolution specification for LT4 immediate-release (IR) tablets. A set of experimental and virtual in vitro data was also used for correlation purposes. An in vitro-in vivo correlation model based on the convolution approach was applied in order to assess the relationship between the in vitro and in vivo data. The obtained results suggest that more than 85% LT4 dissolved in 60 min might be considered a biorelevant dissolution specification criterion for LT4 IR tablets. Copyright © 2012 John Wiley & Sons, Ltd.
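A convolution-based in vitro-in vivo correlation of the kind mentioned above can be sketched as a discrete convolution of a drug input rate with a unit impulse response. The one-compartment disposition function and its elimination rate constant below are illustrative assumptions, not the parameters of the LT4 study:

```python
import numpy as np

def convolve_profile(input_rate, uir, dt):
    """Predicted plasma profile: discrete convolution of the drug input
    rate (e.g. an in vitro dissolution rate) with a unit impulse response."""
    return np.convolve(input_rate, uir)[:len(input_rate)] * dt

# Illustrative one-compartment unit impulse response (elimination rate 0.1 per hour).
dt = 0.1                                 # hours per sample
t = np.arange(0.0, 24.0, dt)
uir = np.exp(-0.1 * t)
```

As a sanity check, a bolus input reproduces the impulse response itself; a slower in vitro dissolution curve flattens and delays the simulated peak, which is how different dissolution kinetics translate into different plasma profiles.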
Chemical kinetic mechanistic models to investigate cancer biology and impact cancer medicine.
Stites, Edward C
2013-04-01
Traditional experimental biology has provided a mechanistic understanding of cancer in which the malignancy develops through the acquisition of mutations that disrupt cellular processes. Several drugs developed to target such mutations have now demonstrated clinical value. These advances are unequivocal testaments to the value of traditional cellular and molecular biology. However, several features of cancer may limit the pace of progress that can be made with established experimental approaches alone. The mutated genes (and resultant mutant proteins) function within large biochemical networks. Biochemical networks typically have a large number of component molecules and are characterized by a large number of quantitative properties. Responses to a stimulus or perturbation are typically nonlinear and can display qualitative changes that depend upon the specific values of variable system properties. Features such as these can complicate the interpretation of experimental data and the formulation of logical hypotheses that drive further research. Mathematical models based upon the molecular reactions that define these networks combined with computational studies have the potential to deal with these obstacles and to enable currently available information to be more completely utilized. Many of the pressing problems in cancer biology and cancer medicine may benefit from a mathematical treatment. As work in this area advances, one can envision a future where such models may meaningfully contribute to the clinical management of cancer patients.
Mellor, Jonathan E; Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara
2016-04-01
Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. Copyright © 2016 Elsevier B.V. All rights reserved.
Mechanistic modeling of insecticide risks to breeding birds in North American agroecosystems
Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of ...
THE EFFECTS OF NITROGEN LOADING AND FRESHWATER RESIDENCE TIME ON THE ESTUARINE ECOSYSTEM
A simple mechanistic model, designed to predict annual average concentrations of total nitrogen (TN) concentrations from nitrogen inputs and freshwater residence time in estuaries, was applied to data for several North American estuaries from previously published literature. The ...
RISK 0301 - MOLECULAR MODELING
Risk assessment practices, in general, for a range of diseases now encourage the use of mechanistic data to enhance the ability to predict responses at low, environmental exposures. In particular, the pathway from normal biology to pathologic state can be described by a set of m...
20180312 - Mechanistic Modeling of Developmental Defects through Computational Embryology (SOT)
Significant advances in the genome sciences, in automated high-throughput screening (HTS), and in alternative methods for testing enable rapid profiling of chemical libraries for quantitative effects on diverse cellular activities. While a surfeit of HTS data and information is n...
Adverse outcome pathway (AOP) development II: Best practices
Organization of existing and emerging toxicological knowledge into adverse outcome pathway (AOP) descriptions can facilitate greater application of mechanistic data, including high throughput in vitro, high content omics and imaging, and biomarkers, in risk-based decision-making....
Roberts, David W; Patlewicz, Grace; Kern, Petra S; Gerberick, Frank; Kimber, Ian; Dearman, Rebecca J; Ryan, Cindy A; Basketter, David A; Aptula, Aynur O
2007-07-01
The goal of eliminating animal testing in the predictive identification of chemicals with the intrinsic ability to cause skin sensitization is an important target, the attainment of which has recently been brought into even sharper relief by the EU Cosmetics Directive and the requirements of the REACH legislation. Development of alternative methods requires that the chemicals used to evaluate and validate novel approaches comprise not only confirmed skin sensitizers and non-sensitizers but also substances that span the full chemical mechanistic spectrum associated with skin sensitization. To this end, a recently published database of more than 200 chemicals tested in the mouse local lymph node assay (LLNA) has been examined in relation to various chemical reaction mechanistic domains known to be associated with sensitization. It is demonstrated here that the dataset does cover the main reaction mechanistic domains. In addition, it is shown that assignment to a reaction mechanistic domain is a critical first step in a strategic approach to understanding, ultimately on a quantitative basis, how chemical properties influence the potency of skin sensitizing chemicals. This understanding is necessary if reliable non-animal approaches, including (quantitative) structure-activity relationships (Q)SARs, read-across, and experimental chemistry based models, are to be developed.
Preventable Exposures Associated With Human Cancers
Baan, Robert; Straif, Kurt; Grosse, Yann; Lauby-Secretan, Béatrice; El Ghissassi, Fatiha; Bouvard, Véronique; Benbrahim-Tallaa, Lamia; Guha, Neela; Freeman, Crystal; Galichet, Laurent; Wild, Christopher P.
2011-01-01
Information on the causes of cancer at specific sites is important to cancer control planners, cancer researchers, cancer patients, and the general public. The International Agency for Research on Cancer (IARC) Monograph series, which has classified human carcinogens for more than 40 years, recently completed a review to provide up-to-date information on the cancer sites associated with more than 100 carcinogenic agents. Based on IARC’s review, we listed the cancer sites associated with each agent and then rearranged this information to list the known and suspected causes of cancer at each site. We also summarized the rationale for classifications that were based on mechanistic data. This information, based on the forthcoming IARC Monographs Volume 100, offers insights into the current state-of-the-science of carcinogen identification. Use of mechanistic data to identify carcinogens is increasing, and epidemiological research is identifying additional carcinogens and cancer sites or confirming carcinogenic potential under conditions of lower exposure. Nevertheless, some common human cancers still have few (or no) identified causal agents. PMID:22158127
Fluid mechanics of Windkessel effect.
Mei, C C; Zhang, J; Jing, H X
2018-01-08
We describe a mechanistic model of the Windkessel phenomenon based on the linear dynamics of fluid-structure interactions. The phenomenon has its origin in an old-fashioned fire-fighting apparatus in which an air chamber serves to transform the intermittent influx from a pump into a steadier stream out of the hose. A similar mechanism exists in the cardiovascular system, where blood injected intermittently from the heart becomes rather smooth after passing through an elastic aorta. In the existing haemodynamics literature, this mechanism is explained on the basis of an electric-circuit analogy with empirical impedances. We present a mechanistic theory based on the principles of fluid/structure interactions. Using a simple one-dimensional model, wave motion in the elastic aorta is coupled to the viscous flow in the rigid peripheral artery. Explicit formulas are derived that exhibit the role of material properties such as the blood density and viscosity, wall elasticity, and the radii and lengths of the vessels. The current two-element model in haemodynamics is shown to be the limit of a short aorta and low injection frequency, and the impedance coefficients are derived theoretically. Numerical results for different aorta lengths and radii are discussed to demonstrate their effects on the time variations of blood pressure, wall shear stress, and discharge. Graphical Abstract: A mechanistic analysis of the Windkessel effect is described which confirms theoretically the well-known feature that intermittent influx becomes continuous outflow. The theory depends only on the density and viscosity of the blood and the elasticity and dimensions of the vessel. Empirical impedance parameters are avoided.
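The two-element Windkessel limit mentioned above reduces to a single ODE, C dP/dt = Q_in(t) - P/R, which can be integrated for an intermittent inflow. The resistance, compliance, and inflow waveform below are illustrative values, not the paper's derived impedance coefficients:

```python
def simulate_windkessel(r, c, q_in, t_end=30.0, dt=1e-3, p0=80.0):
    """Two-element Windkessel, C dP/dt = Q_in(t) - P/R, by forward Euler.
    Units are illustrative: R in mmHg*s/mL, C in mL/mmHg, Q in mL/s."""
    pressures, outflows = [], []
    p = p0
    for i in range(int(t_end / dt)):
        p += dt * (q_in(i * dt) - p / r) / c
        pressures.append(p)
        outflows.append(p / r)   # peripheral (hose) outflow
    return pressures, outflows

def pulsatile_inflow(t, period=1.0, amplitude=300.0):
    """Intermittent pump/heart ejection: on for the first third of each beat."""
    return amplitude if (t % period) < period / 3.0 else 0.0
```

With R = 1 and C = 1.5, the 0-to-300 intermittent inflow is smoothed into an outflow oscillating modestly around its mean of about 100, the qualitative behavior the mechanistic theory above recovers as its short-aorta, low-frequency limit.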
NASA Astrophysics Data System (ADS)
Guarracino, L.; Jougnot, D.
2018-01-01
Among the different contributions generating self-potential, the streaming potential is of particular interest in hydrogeology for its sensitivity to water flow. Estimating water flux in porous media using streaming potential data relies on our capacity to understand, model, and upscale the electrokinetic coupling at the mineral-solution interface. Different approaches have been proposed to predict streaming potential generation in porous media. One of these approaches is flux averaging, which is based on determining the excess charge that is effectively dragged in the medium by water flow. In this study, we develop a physically based analytical model to predict the effective excess charge in saturated porous media using a flux-averaging approach in a bundle of capillary tubes with a fractal pore-size distribution. The proposed model allows the determination of the effective excess charge as a function of pore water ionic concentration and hydrogeological parameters like porosity, permeability, and tortuosity. The new model has been successfully tested against different sets of experimental data from the literature. One of the main findings of this study is the mechanistic explanation for the empirical dependence between the effective excess charge and the permeability that has been found by several researchers. The proposed model also highlights the link to other lithological properties, and it is able to reproduce the evolution of the effective excess charge with electrolyte concentration.
NASA Astrophysics Data System (ADS)
Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo
2018-02-01
Models are useful tools for predicting the impact of global change on species distribution and abundance. As ectotherms, fish are being challenged to adapt or track changes in their environment, either in time through a phenological shift or in space by a biogeographic shift. Past modelling efforts have largely been based on correlative Species Distribution Models, which use known occurrences of species across landscapes of interest to define sets of conditions under which species are likely to maintain populations. The practical advantages of this correlative approach are its simplicity and the flexibility in terms of data requirements. However, effective conservation management requires models that make projections beyond the range of available data. One way to deal with such an extrapolation is to use a mechanistic approach based on physiological processes underlying climate change effects on organisms. Here we illustrate two approaches for developing physiology-based models to characterize fish habitat suitability. (i) Aerobic Scope Models (ASM) are based on the relationship between environmental factors and aerobic scope (defined as the difference between maximum and standard (basal) metabolism). This approach is based on experimental data collected by using a number of treatments that allow a function to be derived to predict aerobic metabolic scope from the stressor/environmental factor(s). This function is then integrated with environmental (oceanographic) data of current and future scenarios. For any given species, this approach allows habitat suitability maps to be generated at various spatiotemporal scales. The strength of the ASM approach relies on the estimate of relative performance when comparing, for example, different locations or different species. (ii) Dynamic Energy Budget (DEB) models are based on first principles including the idea that metabolism is organised in the same way within all animals. 
The (standard) DEB model aims to describe empirical relationships which can be found consistently within physiological data across the animal kingdom. The advantages of the DEB models are that they make use of the generalities found in terms of animal physiology and can therefore be applied to species for which little data or empirical observations are available. In addition, the limitations as well as useful potential refinements of these and other physiology-based modelling approaches are discussed. Inclusion of the physiological response of various life stages and modelling the patterns of extreme events observed in nature are suggested for future work.
Aziza, Fanny; Mettler, Eric; Daudin, Jean-Jacques; Sanaa, Moez
2006-06-01
Cheese smearing is a complex process and the potential for cross-contamination with pathogenic or undesirable microorganisms is critical. During ripening, cheeses are salted and washed with brine to develop flavor and remove molds that could develop on the surfaces. Considering the potential for cross-contamination of this process in quantitative risk assessments could contribute to a better understanding of this phenomenon and, eventually, improve its control. The purpose of this article is to model the cross-contamination of smear-ripened cheeses due to the smearing operation under industrial conditions. A compartmental, dynamic, and stochastic model is proposed for mechanical brush smearing. This model has been developed to describe the exchange of microorganisms between compartments. Based on the analytical solution of the model equations and on experimental data collected with an industrial smearing machine, we assessed the values of the transfer parameters of the model. Monte Carlo simulations, using the distributions of transfer parameters, provide the final number of contaminated products in a batch and their final level of contamination for a given scenario taking into account the initial number of contaminated cheeses of the batch and their contaminant load. Based on analytical results, the model provides indicators for smearing efficiency and propensity of the process for cross-contamination. Unlike traditional approaches in mechanistic models, our approach captures the variability and uncertainty inherent in the process and the experimental data. More generally, this model could represent a generic base to use in modeling similar processes prone to cross-contamination.
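The brush-to-cheese exchange described above can be sketched as a deterministic mean-value version of such a compartmental transfer model. The pickup and deposit fractions below are hypothetical illustrative parameters, and a stochastic version of the kind the authors describe would sample binomial transfers instead of applying fixed fractions:

```python
def smear_batch(initial_loads, pickup=0.3, deposit=0.1):
    """Sequentially smear a batch of cheeses with one brush.
    At each cheese the brush picks up a fraction of that cheese's
    surface load and deposits a fraction of its own accumulated load.
    Returns the final cheese loads and the residual brush load."""
    brush = 0.0
    final_loads = []
    for load in initial_loads:
        to_brush = pickup * load       # cheese -> brush transfer
        to_cheese = deposit * brush    # brush -> cheese transfer
        brush += to_brush - to_cheese
        final_loads.append(load - to_brush + to_cheese)
    return final_loads, brush
```

Starting a batch with a single contaminated cheese, the sketch reproduces the key qualitative behavior: downstream cheeses acquire a geometrically decaying contaminant load from the brush while total mass is conserved across the compartments.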
A mechanistic model to predict the capture of gas-phase mercury species using in-situ generated titania nanosize particles activated by UV irradiation is developed. The model is an extension of a recently reported model [1] for photochemical reactions that accounts for the rates of...
Modelling the ecological niche from functional traits
Kearney, Michael; Simpson, Stephen J.; Raubenheimer, David; Helmuth, Brian
2010-01-01
The niche concept is central to ecology but is often depicted descriptively through observing associations between organisms and habitats. Here, we argue for the importance of mechanistically modelling niches based on functional traits of organisms and explore the possibilities for achieving this through the integration of three theoretical frameworks: biophysical ecology (BE), the geometric framework for nutrition (GF) and dynamic energy budget (DEB) models. These three frameworks are fundamentally based on the conservation laws of thermodynamics, describing energy and mass balance at the level of the individual and capturing the prodigious predictive power of the concepts of ‘homeostasis’ and ‘evolutionary fitness’. BE and the GF provide mechanistic multi-dimensional depictions of climatic and nutritional niches, respectively, providing a foundation for linking organismal traits (morphology, physiology, behaviour) with habitat characteristics. In turn, they provide driving inputs and cost functions for mass/energy allocation within the individual as determined by DEB models. We show how integration of the three frameworks permits calculation of activity constraints, vital rates (survival, development, growth, reproduction) and ultimately population growth rates and species distributions. When integrated with contemporary niche theory, functional trait niche models hold great promise for tackling major questions in ecology and evolutionary biology. PMID:20921046
Rational and Mechanistic Perspectives on Reinforcement Learning
ERIC Educational Resources Information Center
Chater, Nick
2009-01-01
This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…
Investigation of mechanistic deterioration modeling for bridge design and management.
DOT National Transportation Integrated Search
2017-04-01
The ongoing deterioration of highway bridges in Colorado dictates that an effective method for allocating limited management resources be developed. In order to predict bridge deterioration in advance, mechanistic models that analyze the physical pro...
Integration of QSAR and in vitro toxicology.
Barratt, M D
1998-01-01
The principles of quantitative structure-activity relationships (QSAR) are based on the premise that the properties of a chemical are implicit in its molecular structure. Therefore, if a mechanistic hypothesis can be proposed linking a group of related chemicals with a particular toxic end point, the hypothesis can be used to define relevant parameters to establish a QSAR. Ways in which QSAR and in vitro toxicology can complement each other in development of alternatives to live animal experiments are described and illustrated by examples from acute toxicological end points. Integration of QSAR and in vitro methods is examined in the context of assessing mechanistic competence and improving the design of in vitro assays and the development of prediction models. The nature of biological variability is explored together with its implications for the selection of sets of chemicals for test development, optimization, and validation. Methods are described to support the use of data from in vivo tests that do not meet today's stringent requirements of acceptability. Integration of QSAR and in vitro methods into strategic approaches for the replacement, reduction, and refinement of the use of animals is described with examples. PMID:9599692
Assessment of the impact of climate shifts on malaria transmission in the Sahel.
Bomblies, Arne; Eltahir, Elfatih A B
2009-09-01
Climate affects malaria transmission through a complex network of causative pathways. We seek to evaluate the impact of hypothetical climate change scenarios on malaria transmission in the Sahel by using a novel mechanistic, high spatial- and temporal-resolution coupled hydrology and agent-based entomology model. The hydrology model component resolves individual precipitation events and individual breeding pools. The impact of future potential climate shifts on the representative Sahel village of Banizoumbou, Niger, is estimated by forcing the model of the Banizoumbou environment with meteorological data from two locations along the north-south climatological gradient observed in the Sahel--both for warmer, drier scenarios from the north and cooler, wetter scenarios from the south. These shifts in climate represent hypothetical but historically realistic climate change scenarios. For Banizoumbou climatic conditions (latitude 13.54 N), a shift toward cooler, wetter conditions may dramatically increase mosquito abundance; however, our modeling results indicate that the increased malaria transmissibility is not simply proportional to the precipitation increase. The cooler, wetter conditions increase the length of the sporogonic cycle, dampening a large vectorial capacity increase otherwise brought about by increased mosquito survival and greater overall abundance. Furthermore, simulations varying rainfall event frequency demonstrate the importance of precipitation patterns, rather than simply average or time-integrated precipitation, as a controlling factor of these dynamics. Modeling results suggest that in addition to changes in temperature and total precipitation, changes in rainfall patterns are very important to predict changes in disease susceptibility resulting from climate shifts. The combined effect of these climate-shift-induced perturbations can be represented with the aid of a detailed mechanistic model.
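The damping effect described above can be illustrated with the classic Macdonald-style vectorial capacity formula, which the agent-based model effectively generalizes. The parameter values here are purely illustrative, not model outputs:

```python
import math

def vectorial_capacity(m, a, p, n):
    """Macdonald-style vectorial capacity: m = mosquitoes per human,
    a = daily human-biting rate, p = daily survival probability,
    n = sporogonic cycle length in days. p**n is the chance a mosquito
    survives long enough to become infectious."""
    return m * a ** 2 * p ** n / -math.log(p)

# Hypothetical cooler, wetter scenario: abundance m and survival p both rise.
# Compare capacity with and without the temperature-driven lengthening of the
# sporogonic cycle n (10 -> 16 days).
cool_short = vectorial_capacity(m=30.0, a=0.3, p=0.90, n=10)  # cycle unchanged
cool_long = vectorial_capacity(m=30.0, a=0.3, p=0.90, n=16)   # cycle lengthened
```

Lengthening n nearly halves the capacity here, which is the mechanism by which cooler conditions damp the gains from higher survival and abundance.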
Ebrahimi, Ali; Or, Dani
2016-09-01
Microbial communities inhabiting soil aggregates dynamically adjust their activity and composition in response to variations in hydration and other external conditions. These rapid dynamics shape signatures of biogeochemical activity and gas fluxes emitted from soil profiles. Recent mechanistic models of microbial processes in unsaturated aggregate-like pore networks revealed a highly dynamic interplay between oxic and anoxic microsites jointly shaped by hydration conditions and by aerobic and anaerobic microbial community abundance and self-organization. The spatial extent of anoxic niches (hotspots) flickers in time (hot moments) and supports substantial anaerobic microbial activity even in aerated soil profiles. We employed an individual-based model for microbial community life in soil aggregate assemblies represented by 3D angular pore networks. Model aggregates of different sizes were subjected to variable water, carbon and oxygen contents that varied with soil depth as boundary conditions. The study integrates microbial activity within aggregates of different sizes and soil depth to obtain estimates of biogeochemical fluxes from the soil profile. The results quantify impacts of dynamic shifts in microbial community composition on CO2 and N2O production rates in soil profiles in good agreement with experimental data. Aggregate size distribution and the shape of resource profiles in a soil determine how hydration dynamics shape denitrification and carbon utilization rates. Results from the mechanistic model for microbial activity in aggregates of different sizes were used to derive parameters for analytical representation of soil biogeochemical processes across large scales of practical interest for hydrological and climate models. © 2016 John Wiley & Sons Ltd.
Semi-mechanistic modelling of ammonia absorption in an acid spray wet scrubber based on mass balance
USDA-ARS?s Scientific Manuscript database
A model to describe reactive absorption of ammonia (NH3) in an acid spray scrubber was developed as a function of the combined overall mass transfer coefficient K. An experimental study of NH3 absorption using 1% dilute sulphuric acid was carried out under different operating conditions. An empiric...
NASA Astrophysics Data System (ADS)
Keppel-Aleks, G.; Butterfield, Z.; Doney, S. C.; Dlugokencky, E. J.; Miller, J.; Morton, D. C.
2017-12-01
Quantifying the climatic drivers of variations in atmospheric CO2 observations over a range of timescales is necessary to develop a mechanistic understanding of the global carbon cycle that will enable prediction of future changes. Here, we combine NOAA cooperative global air sampling network CO2 observations, remote sensing data, and a flux perturbation model to quantify the feedbacks between interannual variability in physical climate and the atmospheric CO2 growth rate. In particular, we focus on the differences between the 1997/1998 El Niño and the 2015/2016 El Niño during which atmospheric CO2 increased at an unprecedented rate. The flux perturbation model was trained on data from 1997 to 2012, and then used to predict regional atmospheric CO2 growth rate anomalies for the period from 2013 through 2016. Given gridded temperature anomalies from the Hadley Center's Climate Research Unit (CRU), precipitation anomalies from the Global Precipitation Climatology Project (GPCP), and fire emissions from the Global Fire Emissions Database (GFEDv4s), the model was able to reproduce regional growth rate variations observed at marine boundary layer stations in the NOAA network, including the rapid CO2 growth rate in 2015/2016. The flux perturbation model output suggests that the carbon cycle responses differed for the 1997 and 2015 El Niño periods, with tropical precipitation anomalies causing a much larger net flux of CO2 to the atmosphere during the latter period, while direct fire emissions dominated the former. The flux perturbation model also suggests that high temperature stress in the Northern Hemisphere extratropics contributed almost one-third of the CO2 growth rate enhancement during the 2015 El Niño. We use satellite-based metrics for atmospheric column CO2, vegetation, and moisture to corroborate the regional El Niño impacts from the flux perturbation model.
Finally, we discuss how these observational results and independent data on ocean air-sea flux anomalies, couched in an empirical model, may be useful for evaluating the fidelity of mechanistic land models.
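The flux perturbation approach (train linear climate sensitivities on one period, predict growth-rate anomalies in another) can be sketched on synthetic data. The arrays and sensitivity values below are invented stand-ins for the CRU/GPCP/GFED inputs, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 192                                  # e.g. 16 years of monthly anomalies
temp_anom = rng.normal(0.0, 1.0, n)      # synthetic temperature anomaly
precip_anom = rng.normal(0.0, 1.0, n)    # synthetic precipitation anomaly
fire_emiss = rng.gamma(2.0, 0.5, n)      # synthetic fire emissions (>= 0)
# Invented "true" sensitivities: warm/dry anomalies and fire push CO2 upward.
growth_anom = (0.8 * temp_anom - 0.5 * precip_anom
               + 1.0 * fire_emiss + rng.normal(0.0, 0.1, n))

# Fit linear flux sensitivities on a training window, then predict the rest,
# mimicking the train-on-1997-2012 / predict-2013-2016 design.
X = np.column_stack([temp_anom, precip_anom, fire_emiss, np.ones(n)])
beta, *_ = np.linalg.lstsq(X[:160], growth_anom[:160], rcond=None)
predicted = X[160:] @ beta               # out-of-sample growth-rate anomalies
```

The recovered `beta` plays the role of the model's regional climate sensitivities; the real model's structure is richer than this single linear regression.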
NASA Astrophysics Data System (ADS)
Turner, D. P.; Jacobson, A. R.; Nemani, R. R.
2013-12-01
The recent development of large spatially-explicit datasets for multiple variables relevant to monitoring terrestrial carbon flux offers the opportunity to estimate the terrestrial land flux using several alternative, potentially complementary, approaches. Here we developed and compared regional estimates of net ecosystem exchange (NEE) over the Pacific Northwest region of the U.S. using three approaches. In the prognostic modeling approach, the process-based Biome-BGC model was driven by distributed meteorological station data and was informed by Landsat-based coverages of forest stand age and disturbance regime. In the diagnostic modeling approach, the quasi-mechanistic CFLUX model estimated net ecosystem production (NEP) by upscaling eddy covariance flux tower observations. The model was driven by distributed climate data and MODIS FPAR (the fraction of incident PAR that is absorbed by the vegetation canopy). It was informed by coarse resolution (1 km) data about forest stand age. In both the prognostic and diagnostic modeling approaches, emissions estimates for biomass burning, harvested products, and river/stream evasion were added to model-based NEP to get NEE. The inversion model (CarbonTracker) relied on observations of atmospheric CO2 concentration to optimize prior surface carbon flux estimates. The Pacific Northwest is heterogeneous with respect to land cover and forest management, and repeated surveys of forest inventory plots support the presence of a strong regional carbon sink. The diagnostic model suggested a stronger carbon sink than the prognostic model, and a much larger sink than the inversion model. The introduction of Landsat data on disturbance history served to reduce uncertainty with respect to regional NEE in the diagnostic and prognostic modeling approaches. The FPAR data were particularly helpful in capturing the seasonality of the carbon flux using the diagnostic modeling approach.
The inversion approach took advantage of a global network of CO2 observation stations, but had difficulty resolving regional fluxes such as that in the PNW given the still sparse nature of the CO2 measurement network.
Boosted structured additive regression for Escherichia coli fed-batch fermentation modeling.
Melcher, Michael; Scharl, Theresa; Luchner, Markus; Striedner, Gerald; Leisch, Friedrich
2017-02-01
The quality of biopharmaceuticals and patients' safety are of highest priority and there are tremendous efforts to replace empirical production process designs with knowledge-based approaches. The main challenge in this context is that real-time access to process variables related to product quality and quantity is severely limited. To date, comprehensive on- and offline monitoring platforms are used to generate process data sets that allow for development of mechanistic and/or data-driven models for real-time prediction of these important quantities. The ultimate goal is to implement model-based feedback control loops that facilitate online control of product quality. In this contribution, we explore structured additive regression (STAR) models in combination with boosting as a variable selection tool for modeling the cell dry mass, product concentration, and optical density on the basis of online available process variables and two-dimensional fluorescence spectroscopic data. STAR models are powerful extensions of linear models allowing for inclusion of smooth effects or interactions between predictors. Boosting constructs the final model in a stepwise manner and provides a variable importance measure via predictor selection frequencies. Our results show that the cell dry mass can be modeled with a relative error of about ±3%, the optical density with ±6%, the soluble protein with ±16%, and the insoluble product with an accuracy of ±12%. Biotechnol. Bioeng. 2017;114: 321-334. © 2016 Wiley Periodicals, Inc.
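Boosting with predictor selection frequencies as a variable-importance measure can be sketched with componentwise L2 boosting. The base learners here are simple linear fits rather than the STAR smooth terms used in the paper:

```python
import numpy as np

def componentwise_l2_boost(X, y, n_steps=150, nu=0.1):
    """Componentwise L2 boosting: at each step, fit every predictor
    separately to the current residuals by least squares, keep only the
    best-fitting one, and add a shrunken (nu) copy of it to the model.
    Predictor selection frequencies serve as the importance measure."""
    n, p = X.shape
    coef = np.zeros(p)
    offset = y.mean()
    resid = y - offset
    freq = np.zeros(p, dtype=int)
    for _ in range(n_steps):
        b = X.T @ resid / (X ** 2).sum(axis=0)          # univariate LS slopes
        sse = ((resid[:, None] - X * b) ** 2).sum(axis=0)
        j = int(np.argmin(sse))                         # best single predictor
        coef[j] += nu * b[j]
        resid = resid - nu * b[j] * X[:, j]
        freq[j] += 1
    return offset, coef, freq
```

On data where only a couple of predictors carry signal, the selection frequencies concentrate on them, which is exactly the importance measure the abstract describes.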
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice
2017-02-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
A Perspective on the Role of Computational Models in Immunology.
Chakraborty, Arup K
2017-04-26
This is an exciting time for immunology because the future promises to be replete with exciting new discoveries that can be translated to improve health and treat disease in novel ways. Immunologists are attempting to answer increasingly complex questions concerning phenomena that range from the genetic, molecular, and cellular scales to that of organs, whole animals or humans, and populations of humans and pathogens. An important goal is to understand how the many different components involved interact with each other within and across these scales for immune responses to emerge, and how aberrant regulation of these processes causes disease. To aid this quest, large amounts of data can be collected using high-throughput instrumentation. The nonlinear, cooperative, and stochastic character of the interactions between components of the immune system as well as the overwhelming amounts of data can make it difficult to intuit patterns in the data or a mechanistic understanding of the phenomena being studied. Computational models are increasingly important in confronting and overcoming these challenges. I first describe an iterative paradigm of research that integrates laboratory experiments, clinical data, computational inference, and mechanistic computational models. I then illustrate this paradigm with a few examples from the recent literature that make vivid the power of bringing together diverse types of computational models with experimental and clinical studies to fruitfully interrogate the immune system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shankaran, Harish; Zhang, Yi; Chrisler, William B.
2012-10-02
The epidermal growth factor receptor (EGFR) belongs to the ErbB family of receptor tyrosine kinases, and controls a diverse set of cellular responses relevant to development and tumorigenesis. ErbB activation is a complex process involving receptor-ligand binding, receptor dimerization, phosphorylation, and trafficking (internalization, recycling and degradation), which together dictate the spatio-temporal distribution of active receptors within the cell. The ability to predict this distribution, and elucidation of the factors regulating it, would help to establish a mechanistic link between ErbB expression levels and the cellular response. Towards this end, we constructed mathematical models for deconvolving the contributions of receptor dimerization and phosphorylation to EGFR activation, and to examine the dependence of these processes on sub-cellular location. We collected experimental datasets for EGFR activation dynamics in human mammary epithelial cells, with the specific goal of model parameterization, and used the data to estimate parameters for several alternate models. Model-based analysis indicated that: 1) signal termination via receptor dephosphorylation in late endosomes, prior to degradation, is an important component of the response, 2) less than 40% of the receptors in the cell are phosphorylated at any given time, even at saturating ligand doses, and 3) receptor dephosphorylation rates at the cell surface and early endosomes are comparable. We validated the last finding by measuring EGFR dephosphorylation rates at various times following ligand addition both in whole cells, and in endosomes using ELISAs and fluorescent imaging. Overall, our results provide important information on how EGFR phosphorylation levels are regulated within cells. Further, the mathematical model described here can be extended to determine receptor dimer abundances in cells co-expressing various levels of ErbB receptors.
This study demonstrates that an iterative cycle of experiments and modeling can be used to gain mechanistic insight regarding complex cell signaling networks.
Application of PBPK modelling in drug discovery and development at Pfizer.
Jones, Hannah M; Dickins, Maurice; Youdim, Kuresh; Gosset, James R; Attkins, Neil J; Hay, Tanya L; Gurrell, Ian K; Logan, Y Raj; Bungay, Peter J; Jones, Barry C; Gardner, Iain B
2012-01-01
Early prediction of human pharmacokinetics (PK) and drug-drug interactions (DDI) in drug discovery and development allows for more informed decision making. Physiologically based pharmacokinetic (PBPK) modelling can be used to answer a number of questions throughout the process of drug discovery and development and is thus becoming a very popular tool. PBPK models provide the opportunity to integrate key input parameters from different sources to not only estimate PK parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. Using examples from the literature and our own company, we have shown how PBPK techniques can be utilized through the stages of drug discovery and development to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising; however, some limitations need to be addressed to realize its application and utility more broadly.
The biological processes by which environmental pollutants induce adverse health effects are most likely regulated by complex interactions dependent upon the route of exposure, dose, kinetics of distribution, and multiple cellular responses. To further complicate deciphering thes...
The Japanese utilities' expectations for subchannel analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toba, Akio; Omoto, Akira
1995-12-01
Boiling water reactor (BWR) utilities in Japan began to consider the development of a mechanistic model to describe the critical heat transfer conditions in the BWR fuel subchannel. Such a mechanistic model will not only decrease the necessity of tests, but will also help by removing some overly conservative safety margins in thermal hydraulics. With the use of a postdryout heat transfer correlation, new acceptance criteria may be applicable to evaluate the fuel integrity. Mechanistic subchannel analysis models will certainly back up this approach. This model will also be applicable to the analysis of large-size fuel bundles and examination of corrosion behavior.
International Guide to Highway Transportation Information: Volume 2 - Websites
DOT National Transportation Integrated Search
2013-10-01
"This guide addresses the selection and use of axle loading defaults for Mechanistic-Empirical Pavement Design Guide (MEPDG) applications. The defaults were developed based on weigh-in-motion (WIM) data from the Long- Term Pavement Performance (LTPP)...
Dixit, Anshuman; Verkhivker, Gennady M.
2009-01-01
Structural and functional studies of the ABL and EGFR kinase domains have recently suggested a common mechanism of activation by cancer-causing mutations. However, dynamics and mechanistic aspects of kinase activation by cancer mutations that stimulate conformational transitions and thermodynamic stabilization of the constitutively active kinase form remain elusive. We present a large-scale computational investigation of activation mechanisms in the ABL and EGFR kinase domains by a panel of clinically important cancer mutants ABL-T315I, ABL-L387M, EGFR-T790M, and EGFR-L858R. We have also simulated the activating effect of the gatekeeper mutation on conformational dynamics and allosteric interactions in functional states of the ABL-SH2-SH3 regulatory complexes. A comprehensive analysis was conducted using a hierarchy of computational approaches that included homology modeling, molecular dynamics simulations, protein stability analysis, targeted molecular dynamics, and molecular docking. Collectively, the results of this study have revealed thermodynamic and mechanistic catalysts of kinase activation by major cancer-causing mutations in the ABL and EGFR kinase domains. By using multiple crystallographic states of ABL and EGFR, computer simulations have allowed one to map dynamics of conformational fluctuations and transitions in the normal (wild-type) and oncogenic kinase forms. A proposed multi-stage mechanistic model of activation involves a series of cooperative transitions between different conformational states, including assembly of the hydrophobic spine, the formation of the Src-like intermediate structure, and a cooperative breakage and formation of characteristic salt bridges, which signify transition to the active kinase form. 
We suggest that molecular mechanisms of activation by cancer mutations could mimic the activation process of the normal kinase, while exploiting conserved structural catalysts to accelerate the conformational transition and enhance stabilization of the active kinase form. The results of this study reconcile current experimental data with insights from theoretical approaches, pointing to general mechanistic aspects of activating transitions in protein kinases. PMID:19714203
Xue, Ling; Holford, Nick; Ding, Xiao-Liang; Shen, Zhen-Ya; Huang, Chen-Rong; Zhang, Hua; Zhang, Jing-Jing; Guo, Zhe-Ning; Xie, Cheng; Zhou, Ling; Chen, Zhi-Yao; Liu, Lin-Sheng; Miao, Li-Yan
2017-04-01
The aims of this study are to apply a theory-based mechanistic model to describe the pharmacokinetics (PK) and pharmacodynamics (PD) of S- and R-warfarin. Clinical data were obtained from 264 patients. Total concentrations for S- and R-warfarin were measured by ultra-high performance liquid tandem mass spectrometry. Genotypes were measured using pyrosequencing. A sequential population PK parameter with data method was used to describe the international normalized ratio (INR) time course. Data were analyzed with NONMEM. Model evaluation was based on parameter plausibility and prediction-corrected visual predictive checks. Warfarin PK was described using a one-compartment model. CYP2C9 *1/*3 genotype had reduced clearance for S-warfarin, but increased clearance for R-warfarin. The in vitro parameters for the relationship between prothrombin complex activity (PCA) and INR were markedly different (A = 0.560, B = 0.386) from the theory-based values (A = 1, B = 0). There was a small difference between healthy subjects and patients. A sigmoid Emax PD model inhibiting PCA synthesis as a function of S-warfarin concentration predicted INR. A small R-warfarin effect was described by competitive antagonism of S-warfarin inhibition. Patients with VKORC1 AA and CYP4F2 CC or CT genotypes had lower C50 for S-warfarin. A theory-based PKPD model describes warfarin concentrations and clinical response. Expected PK and PD genotype effects were confirmed. The role of predicted fat free mass with theory-based allometric scaling of PK parameters was identified. R-warfarin had a minor effect compared with S-warfarin on PCA synthesis. INR is predictable from 1/PCA in vivo. © 2016 The British Pharmacological Society.
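The PD structure described (sigmoid Emax inhibition of PCA synthesis, with INR approximated as 1/PCA) can be sketched as an indirect-response turnover model. All rate constants and the C50 below are hypothetical, not the paper's estimates:

```python
def inhibition(cs, c50, emax=1.0, hill=2.0):
    """Sigmoid Emax inhibition of PCA synthesis by S-warfarin concentration cs.
    emax, hill, and c50 values are illustrative assumptions."""
    return emax * cs ** hill / (c50 ** hill + cs ** hill)

def simulate_inr(cs, c50=1.0, kout=0.1, dt=0.1, t_end=120.0):
    """Indirect-response (turnover) model: synthesis of prothrombin complex
    activity (PCA) is inhibited by warfarin; INR is approximated as 1/PCA,
    following the abstract's in-vivo relationship. Simple Euler integration
    to steady state at a constant warfarin concentration."""
    pca = 1.0                        # baseline PCA, normalized to 1
    kin = kout * pca                 # baseline synthesis balances elimination
    t = 0.0
    while t < t_end:
        dpca = kin * (1.0 - inhibition(cs, c50)) - kout * pca
        pca += dpca * dt
        t += dt
    return pca, 1.0 / pca            # steady-state PCA and the implied INR
```

At cs = C50 the synthesis rate is halved, so PCA settles at half its baseline and the implied INR doubles; at cs = 0 the INR stays at 1.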
Specialists without spirit: limitations of the mechanistic biomedical model.
Hewa, S; Hetherington, R W
1995-06-01
This paper examines the origin and the development of the mechanistic model of the human body and health in terms of Max Weber's theory of rationalization. It is argued that the development of Western scientific medicine is a part of the broad process of rationalization that began in sixteenth century Europe as a result of the Reformation. The development of the mechanistic view of the human body in Western medicine is consistent with the ideas of calculability, predictability, and control, the major tenets of the process of rationalization as described by Weber. In recent years, however, the limitations of the mechanistic model have been the topic of many discussions. George Engel, an advocate of general systems theory, is one of the leading proponents of a new medical model which includes general quality of life, a clean environment, and psychological or spiritual stability. The paper concludes with consideration of the potential of Engel's proposed new model in the context of the current state of rationalization in modern industrialized society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Lei; Shi, Zhenqing; Lu, Yang
Understanding the kinetics of toxic ion reactions with ferrihydrite is crucial for predicting the dynamic behavior of contaminants in soil environments. In this study, the kinetics of As(V), Cr(VI), Cu, and Pb adsorption and desorption on ferrihydrite were investigated with a combination of laboratory macroscopic experiments, microscopic investigation, and mechanistic modeling. The rates of As(V), Cr(VI), Cu, and Pb adsorption and desorption on ferrihydrite, as systematically studied using a stirred-flow method, were highly dependent on the reaction pH and metal concentrations and varied significantly among the four metals. Spherical aberration-corrected scanning transmission electron microscopy (Cs-STEM) showed that, at sub-nano scales, all four metals were distributed homogeneously within the ferrihydrite particle aggregates after adsorption reactions, with no evidence of surface diffusion-controlled processes. Based on the experimental results, we developed a unifying kinetics model for both cation and oxyanion adsorption/desorption on ferrihydrite built on the mechanism-based equilibrium model CD-MUSIC. Overall, the model described the kinetic results well, and we quantitatively demonstrated how the equilibrium properties of cation and oxyanion binding to various ferrihydrite sites affected the adsorption and desorption rates. Our results provide a unifying quantitative modeling method for the kinetics of both cation and oxyanion adsorption/desorption on iron minerals.
A systems biology-led insight into the role of the proteome in neurodegenerative diseases.
Fasano, Mauro; Monti, Chiara; Alberio, Tiziana
2016-09-01
Multifactorial disorders are the result of nonlinear interactions of several factors; therefore, a reductionist approach does not appear to be appropriate. Proteomics is a global approach that can be efficiently used to investigate pathogenetic mechanisms of neurodegenerative diseases. Here, we report a general introduction about the systems biology approach and mechanistic insights recently obtained by over-representation analysis of proteomics data of cellular and animal models of Alzheimer's disease, Parkinson's disease and other neurodegenerative disorders, as well as of affected human tissues. Expert commentary: As an inductive method, proteomics is based on unbiased observations that further require validation of generated hypotheses. Pathway databases and over-representation analysis tools allow researchers to assign an expectation value to pathogenetic mechanisms linked to neurodegenerative diseases. The systems biology approach based on omics data may be the key to unravel the complex mechanisms underlying neurodegeneration.
O'Connor, B.L.; Hondzo, Miki; Harvey, J.W.
2009-01-01
Traditionally, dissolved oxygen (DO) fluxes have been calculated using the thin-film theory with DO microstructure data in systems characterized by fine sediments and low velocities. However, recent experimental evidence of fluctuating DO concentrations near the sediment-water interface suggests that turbulence and coherent motions control the mass transfer, and the surface renewal theory gives a more mechanistic model for quantifying fluxes. Both models involve quantifying the mass transfer coefficient (k) and the relevant concentration difference (ΔC). This study compared several empirical models for quantifying k based on both thin-film and surface renewal theories, as well as presents a new method for quantifying ΔC (dynamic approach) that is consistent with the observed DO concentration fluctuations near the interface. Data were used from a series of flume experiments that includes both physical and kinetic uptake limitations of the flux. Results indicated that methods for quantifying k and ΔC using the surface renewal theory better estimated the DO flux across a range of fluid-flow conditions. © 2009 ASCE.
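Both theories reduce to the same flux law, J = k·ΔC, and differ only in how k is estimated. A minimal numerical comparison, with placeholder values for the film thickness and renewal time (these numbers are assumptions, not the study's data):

```python
import math

# Illustrative numbers only (not the paper's data): both theories write
# the interfacial flux as J = k * dC but estimate the mass-transfer
# coefficient k differently. Film thickness and renewal time below are
# assumed placeholder values.

D = 2.0e-9         # molecular diffusivity of O2 in water, m^2/s (typical)
delta = 1.0e-3     # assumed diffusive sublayer thickness, m (thin-film theory)
t_renewal = 5.0    # assumed mean time between surface-renewal events, s

k_film = D / delta                                      # thin-film: k = D / delta
k_renewal = 2.0 * math.sqrt(D / (math.pi * t_renewal))  # Higbie-style renewal

dC = 2.0e-3        # assumed bulk-to-interface DO difference, kg/m^3
J_film = k_film * dC
J_renewal = k_renewal * dC
# With these placeholders the renewal model predicts roughly a tenfold
# larger k than the stagnant-film estimate, illustrating why the choice
# of theory matters for flux estimates.
```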
Modelling the mating system of polar bears: a mechanistic approach to the Allee effect.
Molnár, Péter K; Derocher, Andrew E; Lewis, Mark A; Taylor, Mitchell K
2008-01-22
Allee effects may render exploited animal populations extinction prone, but empirical data are often lacking to describe the circumstances leading to an Allee effect. Arbitrary assumptions regarding Allee effects could lead to erroneous management decisions so that predictive modelling approaches are needed that identify the circumstances leading to an Allee effect before such a scenario occurs. We present a predictive approach of Allee effects for polar bears where low population densities, an unpredictable habitat and harvest-depleted male populations result in infrequent mating encounters. We develop a mechanistic model for the polar bear mating system that predicts the proportion of fertilized females at the end of the mating season given population density and operational sex ratio. The model is parametrized using pairing data from Lancaster Sound, Canada, and describes the observed pairing dynamics well. Female mating success is shown to be a nonlinear function of the operational sex ratio, so that a sudden and rapid reproductive collapse could occur if males are severely depleted. The operational sex ratio where an Allee effect is expected is dependent on population density. We focus on the prediction of Allee effects in polar bears but our approach is also applicable to other species.
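The nonlinear dependence of female mating success on the operational sex ratio can be caricatured with a random-encounter argument. This toy (not the authors' parametrized pairing model; the constant `a` is a hypothetical lumping of density and season length) shows the saturating shape that produces a sudden collapse when males are depleted:

```python
import math

# Toy sketch of the qualitative behaviour described in the abstract (not
# the authors' parametrized pairing model): if searching males encounter
# females at a rate proportional to male availability, the expected
# fraction of females fertilized by season's end is 1 - exp(-a * r),
# with r the operational sex ratio (males per female) and 'a' a
# hypothetical constant lumping population density and season length.

def fraction_fertilized(sex_ratio, a=3.0):
    return 1.0 - math.exp(-a * sex_ratio)

# Female mating success is a saturating, nonlinear function of the sex
# ratio: near-complete fertilization at a balanced ratio, then a sharp
# drop once males are heavily depleted.
high = fraction_fertilized(1.0)
low = fraction_fertilized(0.1)
```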
Darwich, Adam S; Pade, Devendra; Ammori, Basil J; Jamei, Masoud; Ashcroft, Darren M; Rostami-Hodjegan, Amin
2012-07-01
Due to the multi-factorial physiological implications of bariatric surgery, attempts to explain trends in oral bioavailability following bariatric surgery using singular attributes of drugs or simplified categorisations such as the biopharmaceutics classification system have been unsuccessful. We have therefore attempted to use mechanistic models to assess changes in the bioavailability of model drugs. Pharmacokinetic post bariatric surgery models were created for Roux-en-Y gastric bypass, biliopancreatic diversion with duodenal switch, sleeve gastrectomy and jejunoileal bypass, through altering the 'Advanced Dissolution Absorption and Metabolism' (ADAM) model incorporated into the Simcyp® Simulator. Post to pre surgical simulations were carried out for five drugs with varying characteristics regarding their gut wall metabolism, dissolution and permeability (simvastatin, omeprazole, diclofenac, fluconazole and ciprofloxacin). The trends in oral bioavailability pre to post surgery were found to be dependent on a combination of drug parameters, including solubility, permeability and gastrointestinal metabolism as well as the surgical procedure carried out. In the absence of clinical studies, the ability to project the direction and the magnitude of changes in bioavailability of drug therapy, using evidence-based mechanistic pharmacokinetic in silico models would be of significant value in guiding prescribers to make the necessary adjustments to dosage regimens for an increasing population of patients who are undergoing bariatric surgery. © 2012 The Authors. JPP © 2012 Royal Pharmaceutical Society.
Incorporating zebrafish omics into chemical biology and toxicology.
Sukardi, Hendrian; Ung, Choong Yong; Gong, Zhiyuan; Lam, Siew Hong
2010-03-01
In this communication, we describe the general aspects of omics approaches for analyses of transcriptome, proteome, and metabolome, and how they can be strategically incorporated into chemical screening and perturbation studies using the zebrafish system. Pharmacological efficacy and selectivity of chemicals can be evaluated based on chemical-induced phenotypic effects; however, phenotypic observation has limitations in identifying mechanistic action of chemicals. We suggest adapting gene-expression-based high-throughput screening as a complementary strategy to zebrafish-phenotype-based screening for mechanistic insights into the mode of action and toxicity of a chemical, for large-scale predictive applications, and for comparative analysis of chemical-induced omics signatures, which are useful for identifying conserved biological responses, signaling pathways, and biomarkers. The potential mechanistic, predictive, and comparative applications of omics approaches can be implemented in the zebrafish system. Examples using the omics approaches in zebrafish, including data of ours and others, are presented and discussed. Omics also facilitates the translatability of zebrafish studies across species through comparison of conserved chemical-induced responses. This review is intended to update interested readers with the current omics approaches that have been applied in chemical studies on zebrafish and their potential in enhancing discovery in chemical biology.
The Mechanistic Indicators of Childhood Asthma (MICA) study in Detroit, Michigan introduced a participant-based approach to reduce the resource burden associated with collection of indoor and outdoor residential air sampling data. A subset of participants designated as MICA-Air c...
Assessing and correcting spatial representativeness of tower eddy-covariance flux measurements
NASA Astrophysics Data System (ADS)
Metzger, S.; Xu, K.; Desai, A. R.; Taylor, J. R.; Kljun, N.; Blanken, P.; Burns, S. P.; Scott, R. L.
2014-12-01
Estimating the landscape-scale exchange of ecologically relevant trace gas and energy fluxes from tower eddy-covariance (EC) measurements is often complicated by surface heterogeneity. For example, a tower EC measurement may represent less than 1% of a grid cell resolved by mechanistic models (order 100-1000 km²). In particular for data assimilation or comparison with large-scale observations, it is hence critical to assess and correct the spatial representativeness of tower EC measurements. We present a procedure that determines from a single EC tower the spatio-temporally explicit flux field of its surrounding. The underlying principle is to extract the relationship between biophysical drivers and ecological responses from measurements under varying environmental conditions. For this purpose, high-frequency EC flux processing and source area calculations (≈60 h⁻¹) are combined with remote sensing retrievals of land surface properties and subsequent machine learning. Methodological details are provided in our companion presentation "Towards the spatial rectification of tower-based eddy-covariance flux observations". We apply the procedure to one year of data from each of four AmeriFlux sites under different climate and ecological environments: Lost Creek shrub fen wetland, Niwot Ridge subalpine conifer, Park Falls mixed forest, and Santa Rita mesquite savanna. We find that heat fluxes from the Park Falls 122-m-high EC measurement and from a surrounding 100 km² target area differ up to 100 W m⁻², or 65%. Moreover, 85% and 24% of the EC flux observations are adequate surrogates of the mean surface-atmosphere exchange and its spatial variability across a 900 km² target area, respectively, at 5% significance and 80% representativeness levels. Alternatively, the resulting flux grids can be summarized as probability density functions, and used to inform mechanistic models directly with the mean flux value and its spatial variability across a model grid cell.
Lastly, for each site we evaluate the applicability of the procedure based on a full bottom-up uncertainty budget.
Mechanistic interpretation of nondestructive pavement testing deflections
NASA Astrophysics Data System (ADS)
Hoffman, M. S.; Thompson, M. R.
1981-06-01
A method for the back-calculation of material properties in flexible pavements based on the interpretation of surface deflection measurements is proposed. ILLI-PAVE, a stress-dependent finite-element pavement model, was used to generate data for developing algorithms and nomographs for deflection basin interpretation. Twenty-four different flexible pavement sections throughout the State of Illinois were studied. Deflections were measured and loading-mode effects on pavement response were investigated. The factors controlling the pavement response to different loading modes are identified and explained. Correlations between different devices are developed. The back-calculated parameters derived from the proposed evaluation procedure can be used as inputs for asphalt concrete overlay design.
Modeling receptor kinetics in the analysis of survival data for organophosphorus pesticides.
Jager, Tjalling; Kooijman, Sebastiaan A L M
2005-11-01
Acute ecotoxicological tests usually focus on survival at a standardized exposure time. However, LC50s decrease over time in a manner that depends both on the chemical and on the organism. DEBtox is an existing approach to analyze toxicity data in time, based on hazard modeling (the internal concentration increases the probability of dying). However, certain chemicals elicit their response through (irreversible) interaction with a specific receptor, such as inhibition of acetylcholinesterase (AChE). Effects therefore do not solely depend on the actual internal concentration, but also on its (recent) past. In this paper, the DEBtox method is extended with a simple mechanistic model to deal with receptor interactions. We analyzed data from the literature for organophosphorus pesticides in guppies, fathead minnows, and springtails. Overall, the observed survival patterns do not clearly differ from those of chemicals with a less-specific mode of action. However, using the receptor model, resulting parameter estimates are easier to interpret in terms of underlying mechanisms and reveal similarities between the various pesticides. We observed that the no-effect concentration estimated from the receptor model is basically identical to the value from standard DEBtox, illustrating the robustness of this summary statistic.
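The receptor extension can be sketched in a few lines: internal concentration follows one-compartment toxicokinetics, the receptor is bound irreversibly, and the hazard rate tracks the inhibited fraction rather than the concentration itself. All parameters below are hypothetical, not the paper's fitted values:

```python
import math

# Minimal sketch of the receptor idea (hypothetical parameters, not the
# fitted values): internal concentration follows one-compartment
# toxicokinetics, the receptor (e.g. AChE) is inhibited irreversibly in
# proportion to the internal concentration, and the hazard rate is driven
# by the inhibited fraction rather than by concentration alone.

def survival(c_ext, t_end, k_e=0.5, k_i=0.2, b=1.0, dt=0.01):
    """Survival probability after t_end under constant external exposure."""
    c_int = 0.0       # scaled internal concentration
    inhibited = 0.0   # fraction of receptor irreversibly bound
    cum_hazard = 0.0
    t = 0.0
    while t < t_end:
        c_int += k_e * (c_ext - c_int) * dt                 # uptake/elimination
        inhibited += k_i * c_int * (1.0 - inhibited) * dt   # irreversible binding
        cum_hazard += b * inhibited * dt                    # hazard from inhibition
        t += dt
    return math.exp(-cum_hazard)

# Because the inhibited fraction never recovers, mortality keeps accruing
# even after the internal concentration plateaus, which is how the model
# encodes dependence on the exposure's "(recent) past".
```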
Toward a mechanistic modeling of nitrogen limitation on vegetation dynamics.
Xu, Chonggang; Fisher, Rosie; Wullschleger, Stan D; Wilson, Cathy J; Cai, Michael; McDowell, Nate G
2012-01-01
Nitrogen is a dominant regulator of vegetation dynamics, net primary production, and terrestrial carbon cycles; however, most ecosystem models use a rather simplistic relationship between leaf nitrogen content and photosynthetic capacity. Such an approach does not consider how patterns of nitrogen allocation may change with differences in light intensity, growing-season temperature and CO2 concentration. To account for this known variability in nitrogen-photosynthesis relationships, we develop a mechanistic nitrogen allocation model based on a trade-off of nitrogen allocated between growth and storage, and an optimization of nitrogen allocated among light capture, electron transport, carboxylation, and respiration. The developed model is able to predict the acclimation of photosynthetic capacity to changes in CO2 concentration, temperature, and radiation when evaluated against published data of Vc,max (maximum carboxylation rate) and Jmax (maximum electron transport rate). A sensitivity analysis of the model for herbaceous plants, deciduous and evergreen trees implies that elevated CO2 concentrations lead to lower allocation of nitrogen to carboxylation but higher allocation to storage. Higher growing-season temperatures cause lower allocation of nitrogen to carboxylation, due to higher nitrogen requirements for light capture pigments and for storage. Lower levels of radiation have a much stronger effect on allocation of nitrogen to carboxylation for herbaceous plants than for trees, resulting from higher nitrogen requirements for light capture for herbaceous plants. As far as we know, this is the first model of complete nitrogen allocation that simultaneously considers nitrogen allocation to light capture, electron transport, carboxylation, respiration and storage, and the responses of each to altered environmental conditions.
We expect this model could potentially improve our confidence in simulations of carbon-nitrogen interactions and the vegetation feedbacks to climate in Earth system models.
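The core optimization idea, splitting a fixed nitrogen budget so that the co-limited photosynthetic rate is maximized, can be shown with a deliberately tiny toy. The coefficients below are invented for illustration and are not the published model:

```python
# Toy illustration of the optimization idea only (invented coefficients,
# not the published model): photosynthesis is co-limited by carboxylation
# and electron transport, each proportional to the nitrogen invested in
# it, and a grid search finds the split of a fixed nitrogen budget that
# maximizes the co-limited rate.

def photosynthesis(n_carb, n_light, light):
    v_cmax = 50.0 * n_carb          # carboxylation-limited rate
    j = 30.0 * n_light * light      # light-capture/electron-transport-limited rate
    return min(v_cmax, j)

def optimal_carb_fraction(n_total=1.0, light=1.0, steps=1000):
    best = max(
        (photosynthesis(f * n_total, (1.0 - f) * n_total, light), f)
        for f in (i / steps for i in range(steps + 1))
    )
    return best[1]

# Under lower light the optimum shifts nitrogen away from carboxylation
# toward light capture, mirroring the acclimation pattern described above.
f_high_light = optimal_carb_fraction(light=1.0)
f_low_light = optimal_carb_fraction(light=0.5)
```

The analytical optimum sits where the two limiting rates intersect; the grid search simply finds that balance point numerically.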
Reconciled rat and human metabolic networks for comparative toxicogenomics and biomarker predictions
Blais, Edik M.; Rawls, Kristopher D.; Dougherty, Bonnie V.; Li, Zhuo I.; Kolling, Glynis L.; Ye, Ping; Wallqvist, Anders; Papin, Jason A.
2017-01-01
The laboratory rat has been used as a surrogate to study human biology for more than a century. Here we present the first genome-scale network reconstruction of Rattus norvegicus metabolism, iRno, and a significantly improved reconstruction of human metabolism, iHsa. These curated models comprehensively capture metabolic features known to distinguish rats from humans including vitamin C and bile acid synthesis pathways. After reconciling network differences between iRno and iHsa, we integrate toxicogenomics data from rat and human hepatocytes to generate biomarker predictions in response to 76 drugs. We validate comparative predictions for xanthine derivatives with new experimental data and literature-based evidence delineating metabolite biomarkers unique to humans. Our results provide mechanistic insights into species-specific metabolism and facilitate the selection of biomarkers consistent with rat and human biology. These models can serve as powerful computational platforms for contextualizing experimental data and making functional predictions for clinical and basic science applications. PMID:28176778
Phenomenological vs. biophysical models of thermal stress in aquatic eggs
NASA Astrophysics Data System (ADS)
Martin, B.
2016-12-01
Predicting species responses to climate change is a central challenge in ecology, with most efforts relying on lab-derived phenomenological relationships between temperature and fitness metrics. We tested one of these models using the embryonic stage of a Chinook salmon population. We parameterized the model with laboratory data, applied it to predict survival in the field, and found that it significantly underestimated field-derived estimates of thermal mortality. We used a biophysical model based on mass-transfer theory to show that the discrepancy was due to the differences in water flow velocities between the lab and the field. This mechanistic approach provides testable predictions for how the thermal tolerance of embryos depends on egg size and flow velocity of the surrounding water. We found support for these predictions across more than 180 fish species, suggesting that flow- and temperature-mediated oxygen limitation is a general mechanism underlying the thermal tolerance of embryos.
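The mass-transfer argument reduces to a supply-demand crossing: metabolic oxygen demand grows roughly exponentially with temperature, while boundary-layer supply grows with flow and shrinks with egg size. A sketch with entirely hypothetical coefficients (not the paper's biophysical model):

```python
import math

# Toy sketch of the mass-transfer argument (all coefficients are
# hypothetical): metabolic oxygen demand rises roughly exponentially with
# temperature (a Q10 rule), while boundary-layer oxygen supply rises with
# water velocity and falls with egg size; the predicted thermal limit is
# where demand first exceeds supply.

def o2_demand(temp_c, q10=2.5, d_ref=1.0, t_ref=10.0):
    return d_ref * q10 ** ((temp_c - t_ref) / 10.0)

def o2_supply(velocity, egg_radius):
    # Sherwood-number-style scaling: supply ~ sqrt(velocity) / radius.
    return 3.0 * math.sqrt(velocity) / egg_radius

def critical_temperature(velocity, egg_radius):
    t = 0.0
    while o2_demand(t) < o2_supply(velocity, egg_radius) and t < 40.0:
        t += 0.1
    return t

# Faster flow raises the predicted thermal limit; larger eggs lower it,
# the two testable predictions highlighted in the abstract.
t_fast = critical_temperature(velocity=0.5, egg_radius=1.0)
t_slow = critical_temperature(velocity=0.05, egg_radius=1.0)
```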
A Systems Approach to Climate, Water and Diarrhea in Hubli-Dharward, India
NASA Astrophysics Data System (ADS)
Mellor, J. E.; Zimmerman, J.
2014-12-01
Although evidence suggests that climate change will negatively impact water resources and hence diarrheal disease rates in the developing world, there is uncertainty surrounding prior studies. This is due to the complexity of the pathways by which climate impacts diarrhea rates, making it difficult to develop interventions. Therefore, our goal was to develop a mechanistic systems approach that incorporates the complex climate, human, engineered and water systems to relate climate change to diarrhea rates under future climate scenarios. To do this, we developed an agent-based model (ABM). Our agents are households and children living in Hubli-Dharward, India. The model was informed with 15 months of weather, water quality, ethnographic and diarrhea incidence data. The model's front end is a stochastic weather simulator incorporating 15 global climate models to simulate rainfall and temperature. The water quality available to agents (residents) on a model "day" is a function of the simulated day's weather and is fully validated with field data. As with the field data, as the ambient temperature increases or it rains, the quality of water available to residents in the model deteriorates. The propensity for a resident to get diarrhea is calculated with an integrated Quantitative Microbial Risk Assessment model, with uncertainty simulated using a bootstrap method. Other factors include hand-washing, improved water sources, household water treatment and improved sanitation. The benefits of our approach are as follows: Our mechanistic method allows us to develop scientifically derived adaptation strategies. We can quantitatively link climate scenarios with diarrhea incidence over long time periods. We can explore the complex climate and water system dynamics, rank risk factor importance, examine a broad range of scenarios and identify tipping points. Our approach is modular and expandable such that new datasets can be integrated to study climate impacts on a larger scale.
Our results indicate that climate change will have a serious effect on diarrhea incidence in the region. However, adaptation strategies including more reliable water supplies and household water treatment can mitigate these impacts.
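The skeleton of such an agent-based model is compact: a weather draw per day, a water-quality response, and a per-household infection trial. The sketch below uses invented rates and coefficients purely to show the structure (it is not the Hubli-Dharward model):

```python
import math
import random

# Minimal agent-based sketch of the structure described above (all rates
# and coefficients invented for illustration): each simulated day, weather
# degrades source-water quality, and every household faces an infection
# risk that grows with contamination unless it treats its water.

random.seed(42)

def daily_risk(contamination, treats_water):
    risk = 1.0 - math.exp(-0.05 * contamination)   # dose-response-style curve
    return risk * (0.1 if treats_water else 1.0)   # treatment cuts exposure

def simulate(days, mean_temp, households=500, treat_fraction=0.3):
    treats = [random.random() < treat_fraction for _ in range(households)]
    cases = 0
    for _ in range(days):
        rained = random.random() < 0.3
        contamination = 0.2 * mean_temp + (5.0 if rained else 0.0)
        for h in range(households):
            if random.random() < daily_risk(contamination, treats[h]):
                cases += 1
    return cases

# Warmer weather degrades water quality and raises the simulated caseload,
# the kind of climate-to-diarrhea linkage the agent-based model explores;
# raising treat_fraction is the analogue of an adaptation strategy.
baseline_cases = simulate(days=100, mean_temp=25.0)
warmer_cases = simulate(days=100, mean_temp=30.0)
```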
Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs
NASA Astrophysics Data System (ADS)
Harvey, David Benjamin Paul
A one-dimensional multi-scale coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the 5 layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include the use of a multi-step implementation of the HOR reaction on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically-based experimental performance data; this model represents the first stochastic input driven unit cell performance model. The stochastic input driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potential low performing MEA materials, provide explanation for the performance of low-Pt loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.
Sun, Dajun D; Lee, Ping I
2013-11-04
The combination of a rapidly dissolving and supersaturating "spring" with a precipitation retarding "parachute" has often been pursued as an effective formulation strategy for amorphous solid dispersions (ASDs) to enhance the rate and extent of oral absorption. However, the interplay between these two rate processes in achieving and maintaining supersaturation remains inadequately understood, and the effect of rate of supersaturation buildup on the overall time evolution of supersaturation during the dissolution of amorphous solids has not been explored. The objective of this study is to investigate the effect of supersaturation generation rate on the resulting kinetic solubility profiles of amorphous pharmaceuticals and to delineate the evolution of supersaturation from a mechanistic viewpoint. Experimental concentration-time curves under varying rates of supersaturation generation and recrystallization for model drugs, indomethacin (IND), naproxen (NAP) and piroxicam (PIR), were generated from infusing dissolved drug (e.g., in ethanol) into the dissolution medium and compared with those predicted from a comprehensive mechanistic model based on the classical nucleation theory taking into account both the particle growth and ripening processes. In the absence of any dissolved polymer to inhibit drug precipitation, both our experimental and predicted results show that the maximum achievable supersaturation (i.e., kinetic solubility) of the amorphous solids increases, the time to reach maximum decreases, and the rate of concentration decline in the de-supersaturation phase increases, with increasing rate of supersaturation generation (i.e., dissolution rate). Our mechanistic model also predicts the existence of an optimal supersaturation rate which maximizes the area under the curve (AUC) of the kinetic solubility concentration-time profile, which agrees well with experimental data.
In the presence of a dissolved polymer from ASD dissolution, these observed trends also hold true except the de-supersaturation phase is more extended due to the crystallization inhibition effect. Since the observed kinetic solubility of nonequilibrium amorphous solids depends on the rate of supersaturation generation, our results also highlight the underlying difficulty in determining a reproducible solubility advantage for amorphous solids.
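The competition between supersaturation generation and precipitation can be caricatured with toy rate laws (this is not the paper's classical-nucleation-theory model; all constants are placeholders): drug is "infused" at a chosen rate until a fixed dose is delivered, while precipitation removes drug at a rate that rises steeply with the degree of supersaturation.

```python
# Qualitative sketch of the spring-and-parachute interplay (toy rate laws
# with placeholder constants, not the paper's nucleation-theory model).

def kinetic_profile(gen_rate, dose=3.0, c_eq=1.0, k_p=0.05, n=3,
                    t_end=400.0, dt=0.01):
    """Return (peak concentration, time of peak) for one generation rate."""
    c, t, delivered = 0.0, 0.0, 0.0
    c_max, t_max = 0.0, 0.0
    while t < t_end:
        feed = gen_rate if delivered < dose else 0.0
        delivered += feed * dt
        s = max(c / c_eq - 1.0, 0.0)          # degree of supersaturation
        c += (feed - k_p * s ** n) * dt       # generation minus precipitation
        t += dt
        if c > c_max:
            c_max, t_max = c, t
    return c_max, t_max

# Faster generation reaches a higher peak ("kinetic solubility") and
# reaches it sooner, consistent with the trend reported above.
fast_max, fast_t = kinetic_profile(gen_rate=0.05)
slow_max, slow_t = kinetic_profile(gen_rate=0.005)
```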
NASA Astrophysics Data System (ADS)
Michael, R. A.; Stuart, A. L.
2007-12-01
Phase partitioning during freezing affects the transport and distribution of volatile chemical species in convective clouds. This in turn can affect tropospheric chemistry, air quality, pollutant deposition, and climate. Here, we discuss the development, evaluation, and application of a mechanistic model for the study and prediction of volatile chemical partitioning during steady-state hailstone growth. The model estimates the fraction of a chemical species retained in a two-phase freezing hailstone. It is based upon mass rate balances over water and solute for accretion under wet-growth conditions. Expressions for the calculation of model components, including the rates of super-cooled drop collection, shedding, evaporation, and hail growth were developed and implemented based on available cloud microphysics literature. Solute fate calculations assume equilibrium partitioning at air-liquid and liquid-ice interfaces. Currently, we are testing the model by performing mass balance calculations, sensitivity analyses, and comparison to available experimental data. Application of the model will improve understanding of the effects of cloud conditions and chemical properties on the fate of dissolved chemical species during hail growth.
Influence of urban pattern on inundation flow in floodplains of lowland rivers.
Bruwier, M; Mustafa, A; Aliaga, D G; Archambeau, P; Erpicum, S; Nishida, G; Zhang, X; Pirotton, M; Teller, J; Dewals, B
2018-05-01
The objective of this paper is to investigate the respective influence of various urban pattern characteristics on inundation flow. A set of 2000 synthetic urban patterns was generated using an urban procedural model providing locations and shapes of streets and buildings over a square domain of 1 × 1 km². Steady two-dimensional hydraulic computations were performed over the 2000 urban patterns with identical hydraulic boundary conditions. To run such a large number of simulations, the computational efficiency of the hydraulic model was improved by using an anisotropic porosity model. This model computes on relatively coarse computational cells, but preserves information from the detailed topographic data through porosity parameters. Relationships between urban characteristics and the computed inundation water depths were derived using multiple linear regression. Finally, a simple mechanistic model based on two district-scale porosity parameters, combining several urban characteristics, is shown to capture satisfactorily the influence of urban characteristics on inundation water depths. The findings of this study give guidelines for more flood-resilient urban planning. Copyright © 2017 Elsevier B.V. All rights reserved.
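The anisotropic porosity idea can be illustrated with a toy geometry (hypothetical, not the paper's implementation): a coarse cell retains sub-grid building information through a storage porosity, the open fraction of its plan area, and directional conveyance porosities, the open fraction of its edges, so buildings throttle flow direction by direction without being resolved by the mesh.

```python
# Toy illustration of storage vs. conveyance porosities on a 10 x 10
# sub-grid (hypothetical geometry, not the paper's implementation).

def cell_porosities(blocked, nx=10, ny=10):
    """blocked: set of (i, j) sub-cells occupied by buildings."""
    storage = 1.0 - len(blocked) / (nx * ny)
    # open fraction of the x-facing edge (sub-cells at i = nx - 1)
    conv_x = sum(1 for j in range(ny) if (nx - 1, j) not in blocked) / ny
    # open fraction of the y-facing edge (sub-cells at j = ny - 1)
    conv_y = sum(1 for i in range(nx) if (i, ny - 1) not in blocked) / nx
    return storage, conv_x, conv_y

# A street-aligned row of buildings barely changes storage (92% open)
# but strongly throttles conveyance across it in one direction only:
row = {(i, 9) for i in range(8)}
storage, conv_x, conv_y = cell_porosities(row)
```

This is why the porosity description is called anisotropic: the same footprint yields very different edge porosities in the two directions.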
The structural basis of secondary active transport mechanisms.
Forrest, Lucy R; Krämer, Reinhard; Ziegler, Christine
2011-02-01
Secondary active transporters couple the free energy of the electrochemical potential of one solute to the transmembrane movement of another. As a basic mechanistic explanation for their transport function the model of alternating access was put forward more than 40 years ago, and has been supported by numerous kinetic, biochemical and biophysical studies. According to this model, the transporter exposes its substrate binding site(s) to one side of the membrane or the other during transport catalysis, requiring a substantial conformational change of the carrier protein. In the light of recent structural data for a number of secondary transport proteins, we analyze the model of alternating access in more detail, and correlate it with specific structural and chemical properties of the transporters, such as their assignment to different functional states in the catalytic cycle of the respective transporter, the definition of substrate binding sites, the type of movement of the central part of the carrier harboring the substrate binding site, as well as the impact of symmetry on fold-specific conformational changes. Besides mediating the transmembrane movement of solutes, the mechanism of secondary carriers inherently involves a mechanistic coupling of substrate flux to the electrochemical potential of co-substrate ions or solutes. Mainly because of limitations in resolution of available transporter structures, this important aspect of secondary transport cannot yet be substantiated by structural data to the same extent as the conformational change aspect. We summarize the concepts of coupling in secondary transport and discuss them in the context of the available evidence for ion binding to specific sites and the impact of the ions on the conformational state of the carrier protein, which together lead to mechanistic models for coupling. Copyright © 2010 Elsevier B.V. All rights reserved.
Faris, Allison T.; Seed, Raymond B.; Kayen, Robert E.; Wu, Jiaer
2006-01-01
During the 1906 San Francisco Earthquake, liquefaction-induced lateral spreading and resultant ground displacements damaged bridges, buried utilities and lifelines, conventional structures, and other developed works. This paper presents an improved engineering tool for the prediction of maximum displacement due to liquefaction-induced lateral spreading. A semi-empirical approach is employed, combining mechanistic understanding and data from laboratory testing with data and lessons from full-scale earthquake field case histories. The principle of strain potential index, based primarily on correlation of cyclic simple-shear laboratory testing results with in-situ Standard Penetration Test (SPT) results, is used to characterize the deformation potential of soils after they liquefy. A Bayesian probabilistic approach is adopted for development of the final predictive model, in order to take fullest advantage of the data available and to deal with the uncertainties intrinsic to the back-analyses of field case histories. A case history from the 1906 San Francisco Earthquake is utilized to demonstrate the ability of the resultant semi-empirical model to estimate maximum horizontal displacement due to liquefaction-induced lateral spreading.
Sarkar, Joydeep
2018-01-01
Iron plays vital roles in the human body including enzymatic processes, oxygen transport via hemoglobin and immune response. Iron metabolism is characterized by ~95% recycling and minor replenishment through diet. Anemia of chronic kidney disease (CKD) is characterized by a lack of synthesis of erythropoietin leading to reduced red blood cell (RBC) formation and aberrant iron recycling. Treatment of CKD anemia aims to normalize RBC count and serum hemoglobin. Clinically, the various fluxes of iron transport and accumulation are not measured, so changes during disease (e.g., CKD) and treatment are unknown. Unwanted iron accumulation in patients is known to lead to adverse effects. Current whole-body models lack the mechanistic details of iron transport related to RBC maturation, transferrin (Tf and TfR) dynamics and assume passive iron efflux from macrophages. Hence, they are not predictive of whole-body iron dynamics and cannot be used to design individualized patient treatment. For prediction, we developed a mechanistic, multi-scale computational model of whole-body iron metabolism incorporating four compartments containing major pools of iron and the RBC generation process. The model accounts for multiple forms of iron in vivo, mechanisms involved in iron uptake and release and their regulation. Furthermore, the model is interfaced with drug pharmacokinetics to allow simulation of treatment dynamics. We calibrated our model with experimental and clinical data from peer-reviewed literature to reliably simulate CKD anemia and the effects of current treatment involving a combination of epoetin alfa and iron dextran. This in silico whole-body model of iron metabolism predicts that a year of treatment can potentially lead to 90% downregulation of ferroportin (FPN) levels, a 15-fold increase in iron stores with only a 20% increase in iron flux from the reticulo-endothelial system (RES).
Model simulations quantified unmeasured iron fluxes and previously unknown effects of treatment on FPN levels and iron stores in the RES. This mechanistic whole-body model can be the basis for future studies that incorporate iron metabolism together with related clinical experiments. Such an approach could pave the way for the development of effective personalized treatment of CKD anemia. PMID:29659573
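To make the compartmental idea concrete, here is a deliberately minimal two-compartment sketch (plasma vs. the RES store) with a ferroportin-scaled efflux. The real model has four compartments, Tf/TfR dynamics, and PK coupling; every pool size and rate below is invented for illustration.

```python
# Toy two-compartment iron exchange: plasma <-> RES store, with a
# ferroportin (FPN)-scaled, zero-order release term. Units arbitrary;
# not the published four-compartment model.

def simulate(days, dt=0.01, k_uptake=0.8, k_fpn=0.4, fpn=1.0):
    """Forward-Euler integration of the two-pool exchange."""
    plasma, res = 3.0, 600.0              # illustrative iron pools (mg)
    for _ in range(int(days / dt)):
        uptake = k_uptake * plasma        # RES uptake of plasma iron
        efflux = k_fpn * fpn              # FPN-mediated release from RES
        plasma += (efflux - uptake) * dt
        res += (uptake - efflux) * dt     # mass moves, total is conserved
    return plasma, res

p_hi, res_hi = simulate(days=30, fpn=1.0)   # normal FPN expression
p_lo, res_lo = simulate(days=30, fpn=0.1)   # strong FPN downregulation
print(res_lo > res_hi)
```

Even this toy reproduces the qualitative direction reported above: downregulating FPN traps iron in the RES store while total body iron is conserved.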
DOT National Transportation Integrated Search
1998-04-01
The study reported here was conducted to assess how well some of the existing asphalt pavement mechanistic-empirical distress prediction models performed when used in conjunction with the data being collected as part of the national Long Term Pavemen...
Alierta, J A; Pérez, M A; Seral, B; García-Aznar, J M
2016-09-01
The aim of this study is to evaluate fracture union or non-union for a specific patient who presented oblique fractures of the tibia and fibula, using a mechanistic-based bone healing model. Normally, this kind of fracture can be treated with an intramedullary nail in one of two possible configurations that depend on the mechanical stabilisation: static and dynamic. Both cases are simulated under different fracture geometries in order to understand the effect of mechanical stabilisation on the fracture healing outcome. The results of both simulations are in good agreement with previous clinical experience. The results demonstrate that dynamization of the fracture improves healing in comparison with static or rigid fixation. This work shows the versatility and potential of a mechanistic-based bone healing model to predict the final outcome (union, non-union, delayed union) of realistic 3D fractures in which more than one bone is involved.
Providing data science support for systems pharmacology and its implications to drug discovery.
Hart, Thomas; Xie, Lei
2016-01-01
The conventional one-drug-one-target-one-disease drug discovery process has been less successful in tackling multi-genic, multi-faceted complex diseases. Systems pharmacology has emerged as a new discipline to tackle the current challenges in drug discovery. The goal of systems pharmacology is to transform huge, heterogeneous, and dynamic biological and clinical data into interpretable and actionable mechanistic models for decision making in drug discovery and patient treatment. Thus, big data technology and data science will play an essential role in systems pharmacology. This paper critically reviews the impact of three fundamental concepts of data science on systems pharmacology: similarity inference, overfitting avoidance, and disentangling causality from correlation. The authors then discuss recent advances and future directions in applying the three concepts of data science to drug discovery, with a focus on proteome-wide context-specific quantitative drug target deconvolution and personalized adverse drug reaction prediction. Data science will facilitate reducing the complexity of systems pharmacology modeling, detecting hidden correlations between complex data sets, and distinguishing causation from correlation. The power of data science can only be fully realized when integrated with mechanism-based multi-scale modeling that explicitly takes into account the hierarchical organization of biological systems from nucleic acids to proteins, to molecular interaction networks, to cells, to tissues, to patients, and to populations.
CAAT Altex workshop paper entitled "Towards Good Read-Across Practice (GRAP) Guidance"
Grouping of substances and utilizing read-across within those groups represents an important data gap filling technique for chemical safety assessments. Categories/analogue groups are typically developed based on structural similarity, and increasingly often, also on mechanistic ...
DOT National Transportation Integrated Search
2014-11-01
The main objective of Part 3 was to locally calibrate and validate the mechanistic-empirical pavement : design guide (Pavement-ME) performance models to Michigan conditions. The local calibration of the : performance models in the Pavement-ME is a ch...
Luechtefeld, Thomas; Maertens, Alexandra; McKim, James M; Hartung, Thomas; Kleensang, Andre; Sá-Rocha, Vanessa
2015-11-01
Supervised learning methods promise to improve integrated testing strategies (ITS), but must be adjusted to handle high dimensionality and dose-response data. ITS approaches are currently fueled by the increasing mechanistic understanding of adverse outcome pathways (AOP) and the development of tests reflecting these mechanisms. Simple approaches to combining skin sensitization data sets, such as weight of evidence, fail due to problems of information redundancy and high dimensionality. The problem is further amplified when potency information (dose/response) for hazards is to be estimated. Skin sensitization currently serves as the foster child for AOP and ITS development, as legislative pressures combined with a very good mechanistic understanding of contact dermatitis have led to test development and relatively large high-quality data sets. We curated such a data set and applied a recursive variable selection algorithm to evaluate the information available through in silico, in chemico and in vitro assays. Chemical similarity alone could not cluster chemicals' potency, and in vitro models consistently ranked high in recursive feature elimination. This allows reducing the number of tests included in an ITS. Next, we analyzed the data with a hidden Markov model that takes advantage of an intrinsic inter-relationship among the local lymph node assay classes, i.e. the monotonic connection between local lymph node assay class and dose. The dose-informed random forest/hidden Markov model was superior to the dose-naive random forest model on all data sets. Although the balanced accuracy improvement may seem small, this obscures the actual improvement in misclassifications, as the dose-informed hidden Markov model strongly reduced "false negatives" (i.e. extreme sensitizers classified as non-sensitizers) on all data sets.
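The recursive feature elimination idea above can be sketched in a few lines. The study ranked features with a random forest; as a stand-in, the toy below ranks each assay by the absolute correlation of its readout with the potency label and repeatedly drops the weakest. The assay names and numbers are invented.

```python
# Minimal recursive feature elimination (RFE) sketch for reducing a
# test battery. Correlation ranking is a stand-in for the study's
# random-forest importance; data are synthetic.

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def rfe(features, labels, keep):
    """features: {assay_name: [readouts]}; drop the weakest assay
    until only `keep` remain."""
    feats = dict(features)
    while len(feats) > keep:
        weakest = min(feats, key=lambda f: abs(corr(feats[f], labels)))
        del feats[weakest]
    return sorted(feats)

# Synthetic panel: 'in_vitro' tracks potency, 'noise' does not.
labels = [0, 0, 1, 1, 2, 2]
panel = {
    "in_vitro":  [0.1, 0.2, 1.0, 1.1, 2.0, 2.2],
    "in_silico": [0.0, 0.5, 0.4, 1.5, 1.2, 2.5],
    "noise":     [1.0, -1.0, 1.0, -1.0, 1.0, -1.0],
}
kept = rfe(panel, labels, keep=2)
print(kept)
```

The uninformative assay is eliminated first, which is exactly how RFE "allows reducing the number of tests included in an ITS."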
Root plasticity buffers competition among plants: theory meets experimental data.
Schiffers, Katja; Tielbörger, Katja; Tietjen, Britta; Jeltsch, Florian
2011-03-01
Morphological plasticity is a striking characteristic of plants in natural communities. In the context of foraging behavior particularly, root plasticity has been documented for numerous species. Root plasticity is known to mitigate competitive interactions by reducing the overlap of the individuals' rhizospheres. But despite its obvious effect on resource acquisition, plasticity has been generally neglected in previous empirical and theoretical studies estimating interaction intensity among plants. In this study, we developed a semi-mechanistic model that addresses this shortcoming by introducing the idea of compensatory growth into the classical zone-of-influence (ZOI) and field-of-neighborhood (FON) approaches. The model parameters describing the belowground plastic sphere of influence (PSI) were parameterized using data from an accompanying field experiment. Measurements of the uptake of a stable nutrient analogue at distinct distances to the neighboring plants showed that the study species responded plastically to belowground competition by avoiding overlap of individuals' rhizospheres. An unexpected finding was that the sphere of influence of the study species Bromus hordeaceus could be best described by a unimodal function of distance to the plant's center and not with a continuously decreasing function as commonly assumed. We employed the parameterized model to investigate the interplay between plasticity and two other important factors determining the intensity of competitive interactions: overall plant density and the distribution of individuals in space. The simulation results confirm that the reduction of competition intensity due to morphological plasticity strongly depends on the spatial structure of the competitive environment. We advocate the use of semi-mechanistic simulations that explicitly consider morphological plasticity to improve our mechanistic understanding of plant interactions.
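The geometric core of the ZOI idea can be sketched as disc overlap: each plant claims a disc, contested area is split, and "plasticity" is mimicked by letting a plant shift its disc away from a neighbour. This is a toy geometry exercise, not the PSI model parameterized in the study.

```python
import math

# Toy zone-of-influence (ZOI) sketch. Radii and distances are
# illustrative; the study's plastic sphere of influence (PSI) is
# richer (unimodal uptake profiles, compensatory growth).

def overlap(r1, r2, d):
    """Intersection area of two discs with radii r1, r2, centres d apart."""
    if d >= r1 + r2:
        return 0.0                         # discs disjoint
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2  # smaller disc fully inside
    a1 = math.acos((d * d + r1 * r1 - r2 * r2) / (2.0 * d * r1))
    a2 = math.acos((d * d + r2 * r2 - r1 * r1) / (2.0 * d * r2))
    # Lens area = sum of the two circular segments.
    return (r1 * r1 * (a1 - math.sin(2.0 * a1) / 2.0)
            + r2 * r2 * (a2 - math.sin(2.0 * a2) / 2.0))

def capture(r, neighbour_r, d):
    """Resource captured: own disc area minus half the contested overlap."""
    return math.pi * r * r - 0.5 * overlap(r, neighbour_r, d)

rigid = capture(1.0, 1.0, d=1.0)    # rhizospheres overlap strongly
plastic = capture(1.0, 1.0, d=1.6)  # plastic shift away from the neighbour
print(plastic > rigid)
```

Shifting the sphere of influence away from the neighbour raises resource capture, which is the mechanism by which plasticity buffers competition in the model above.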
An, Gary
2015-01-01
Agent-based modeling has been used to characterize the nested control loops and non-linear dynamics associated with inflammatory and immune responses, particularly as a means of visualizing putative mechanistic hypotheses. This process is termed dynamic knowledge representation and serves a critical role in facilitating the ability to test and potentially falsify hypotheses in the current data- and hypothesis-rich biomedical research environment. Importantly, dynamic computational modeling aids in identifying useful abstractions, a fundamental scientific principle that pervades the physical sciences. Recognizing the critical scientific role of abstraction provides an intellectual and methodological counterweight to the tendency in biology to emphasize comprehensive description as the primary manifestation of biological knowledge. Transplant immunology represents yet another example of the challenge of identifying sufficient understanding of the inflammatory/immune response in order to develop and refine clinically effective interventions. Advances in immunosuppressive therapies have greatly improved solid organ transplant (SOT) outcomes, most notably by reducing and treating acute rejection. The end goal of these transplant immune strategies is to facilitate effective control of the balance between regulatory T cells and the effector/cytotoxic T-cell populations in order to generate, and ideally maintain, a tolerant phenotype. Characterizing the dynamics of immune cell populations and the interactive feedback loops that lead to graft rejection or tolerance is extremely challenging, but is necessary if rational modulation to induce transplant tolerance is to be accomplished. Herein is presented the solid organ agent-based model (SOTABM) as an initial example of an agent-based model (ABM) that abstractly reproduces the cellular and molecular components of the immune response to SOT. 
Despite its abstract nature, the SOTABM is able to qualitatively reproduce acute rejection and the suppression of acute rejection by immunosuppression to generate transplant tolerance. The SOTABM is intended as an initial example of how ABMs can be used to dynamically represent mechanistic knowledge concerning transplant immunology in a scalable and expandable form and can thus potentially serve as useful adjuncts to the investigation and development of control strategies to induce transplant tolerance. PMID:26594211
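The kind of rule-based loop an ABM such as the SOTABM iterates can be caricatured in a few lines: effector T cells injure the graft, regulatory T cells suppress effectors, and "immunosuppression" scales effector activation. Every rule, rate, and population size below is invented for illustration and is not taken from the SOTABM.

```python
import random

# Highly abstracted, hypothetical sketch of an agent-based update loop
# for transplant tolerance. Not the SOTABM's actual rules.

def run(immunosuppression, steps=200, seed=1):
    rng = random.Random(seed)              # fixed seed: reproducible run
    graft, teff, treg = 100.0, 10.0, 5.0   # graft health, effector, Treg
    for _ in range(steps):
        # Stochastic effector activation, damped by immunosuppression.
        activation = (1.0 - immunosuppression) * rng.uniform(0.8, 1.2)
        teff += 0.05 * activation * teff - 0.002 * treg * teff
        treg *= 1.01                        # slow regulatory expansion
        graft = max(graft - 0.001 * teff, 0.0)  # effector-mediated injury
    return graft

untreated = run(immunosuppression=0.0)
treated = run(immunosuppression=0.9)
print(treated > untreated)
```

Even this caricature shows the qualitative behaviour the abstract describes: suppressing effector activation preserves graft health, while the untreated run drifts toward rejection as effectors outpace regulation.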
USDA-ARS?s Scientific Manuscript database
A new mechanistic growth model was developed to describe microbial growth under isothermal conditions. The new mathematical model was derived from the basic observation of bacterial growth that may include lag, exponential, and stationary phases. With this model, the lag phase duration and exponen...
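The three-phase shape (lag, exponential, stationary) described above can be sketched with a generic lag-plus-logistic curve on the log10 scale. This is an illustrative stand-in with invented parameters, not the model derived in the manuscript.

```python
import math

# Illustrative isothermal growth curve: flat lag phase, then a logistic
# rise toward the stationary level. Parameters are generic; the two
# pieces meet only approximately at t = lag.

def log_count(t, n0=3.0, nmax=9.0, mu=0.6, lag=2.0, t_half=10.0):
    """log10 cell count at time t (hours); t_half is the time after
    the lag at which the curve is halfway between n0 and nmax."""
    if t <= lag:
        return n0                                     # lag phase
    return n0 + (nmax - n0) / (1.0 + math.exp(-mu * (t - lag - t_half)))

print(log_count(1.0), log_count(12.0), log_count(40.0))
```

Early times return the inoculum level, the midpoint sits halfway up the rise, and late times approach the stationary plateau, reproducing the lag/exponential/stationary sequence the abstract refers to.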
Development of a Mechanistic-Based Healing Model for Self-Healing Glass Seals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Wei; Stephens, Elizabeth V.; Sun, Xin
Self-healing glass, a recent development in hermetic sealant materials, has the ability to effectively repair damage when heated to elevated temperatures, thus extending its service life. Since crack-healing morphological changes in the glass material are usually temperature and stress dependent, quantitative studies to determine the effects of thermo-mechanical conditions on the healing behavior of self-healing glass sealants are extremely useful for the design and optimization of the sealing systems within SOFCs. The goal of this task is to develop a mechanistic-based healing model to quantify the stress- and temperature-dependent healing behavior. A two-step healing mechanism was developed and implemented into finite element (FE) models through user subroutines. An integrated experimental/kinetic Monte Carlo (kMC) simulation methodology was used to calibrate the model parameters. The crack healing model is able to investigate the effects of various thermo-mechanical factors and can therefore determine the critical conditions under which the healing mechanism will be activated. Furthermore, the predicted results can be used to formulate the continuum damage-healing model and to assist the SOFC stack-level simulations in predicting and evaluating the effectiveness and performance of various engineering seal designs.
Linking 3D spatial models of fuels and fire: Effects of spatial heterogeneity on fire behavior
Russell A. Parsons; William E. Mell; Peter McCauley
2011-01-01
Crownfire endangers fire fighters and can have severe ecological consequences. Prediction of fire behavior in tree crowns is essential to informed decisions in fire management. Current methods used in fire management do not address variability in crown fuels. New mechanistic physics-based fire models address convective heat transfer with computational fluid dynamics (...
Thermodynamics-based models of transcriptional regulation with gene sequence.
Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing
2015-12-01
Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or on heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential-equation description of transcriptional dynamics. Sequence features of the promoter are exploited to derive the binding affinity based on statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model yields more biologically meaningful results.
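The statistical-thermodynamic step can be illustrated with the simplest two-state case: the probability that a transcription factor occupies its site follows from the Boltzmann weight of the bound state, and expression is taken proportional to occupancy. Energies and concentrations below are illustrative, not fitted values from the paper.

```python
import math

# Two-state thermodynamic occupancy sketch for an activator binding a
# promoter site. All numbers are illustrative.

def occupancy(tf_conc, dG, kT=0.593):
    """Fraction of time the site is bound: Boltzmann weight of the
    bound state over the partition sum (kT in kcal/mol at ~298 K)."""
    w = tf_conc * math.exp(-dG / kT)   # weight of the bound state
    return w / (1.0 + w)

def expression_rate(tf_conc, dG, v_max=10.0):
    """Transcription rate proportional to activator occupancy."""
    return v_max * occupancy(tf_conc, dG)

low = expression_rate(0.1, dG=-2.0)
high = expression_rate(10.0, dG=-2.0)
print(low < high)
```

Stronger binding (more negative dG) or higher TF concentration raises occupancy and hence predicted expression, which is the link between sequence-derived affinity and transcriptional dynamics the abstract describes.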
Predictive and mechanistic multivariate linear regression models for reaction development
Santiago, Celine B.; Guo, Jing-Yao
2018-01-01
Multivariate Linear Regression (MLR) models utilizing computationally-derived and empirically-derived physical organic molecular descriptors are described in this review. Several reports demonstrating the effectiveness of this methodological approach towards reaction optimization and mechanistic interrogation are discussed. A detailed protocol to access quantitative and predictive MLR models is provided as a guide for model development and parameter analysis. PMID:29719711
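A descriptor-based MLR fit of the kind reviewed above amounts to ordinary least squares over physical organic descriptors. The sketch below solves the normal equations for two descriptors plus an intercept; the descriptor values and "yields" are invented so that the fit is exact, purely to illustrate the workflow.

```python
# Hypothetical descriptor-based MLR: yield ~ a*steric + b*electronic + c,
# solved by ordinary least squares via the normal equations.

def fit_mlr(X, y):
    """OLS coefficients from Gaussian elimination on X^T X w = X^T y."""
    cols = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(cols)]
         for i in range(cols)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(cols)]
    for i in range(cols):                      # forward elimination
        p = max(range(i, cols), key=lambda k: abs(A[k][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for k in range(i + 1, cols):
            f = A[k][i] / A[i][i]
            A[k] = [a - f * ai for a, ai in zip(A[k], A[i])]
            b[k] -= f * b[i]
    w = [0.0] * cols                           # back substitution
    for i in reversed(range(cols)):
        w[i] = (b[i] - sum(A[i][j] * w[j]
                           for j in range(i + 1, cols))) / A[i][i]
    return w

# Rows: [steric descriptor, electronic descriptor, 1 (intercept)];
# "yields" generated as 2*steric + 3*electronic + 1.
X = [[1.0, 0.5, 1.0], [2.0, 0.1, 1.0], [3.0, 0.9, 1.0], [4.0, 0.4, 1.0]]
y = [4.5, 5.3, 9.7, 10.2]
w = fit_mlr(X, y)
pred = sum(wi * xi for wi, xi in zip(w, [2.5, 0.6, 1.0]))
print(w, pred)
```

In practice the coefficients, not just the predictions, carry the mechanistic content: the sign and magnitude of each descriptor's weight are what support the "mechanistic interrogation" the review discusses.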
Soil carbon stocks across tropical forests of Panama regulated by base cation effects on fine roots
Cusack, Daniela F.; Markesteijn, Lars; Condit, Richard; ...
2018-01-02
We report that tropical forests are the most carbon (C)-rich ecosystems on Earth, containing 25–40% of global terrestrial C stocks. While large-scale quantification of aboveground biomass in tropical forests has improved recently, soil C dynamics remain one of the largest sources of uncertainty in Earth system models, which inhibits our ability to predict future climate. Globally, soil texture and climate predict ~30% of the variation in soil C stocks, so ecosystem models often predict soil C using measures of aboveground plant growth. However, this approach can underestimate tropical soil C stocks, and has proven inaccurate when compared with data for soil C in data-rich northern ecosystems. By quantifying soil organic C stocks to 1 m depth for 48 humid tropical forest plots across gradients of rainfall and soil fertility in Panama, we show that soil C does not correlate with common predictors used in models, such as plant biomass or litter production. Instead, a structural equation model including base cations, soil clay content, and rainfall as exogenous factors and root biomass as an endogenous factor predicted nearly 50% of the variation in tropical soil C stocks, indicating a strong indirect effect of base cation availability on tropical soil C storage. Including soil base cations in C cycle models, and thus emphasizing mechanistic links among nutrients, root biomass, and soil C stocks, will improve prediction of climate-soil feedbacks in tropical forests.
Melin, Johanna; Parra-Guillen, Zinnia P; Hartung, Niklas; Huisinga, Wilhelm; Ross, Richard J; Whitaker, Martin J; Kloft, Charlotte
2018-04-01
Optimisation of hydrocortisone replacement therapy in children is challenging as there is currently no licensed formulation and dose in Europe for children under 6 years of age. In addition, hydrocortisone has non-linear pharmacokinetics caused by saturable plasma protein binding. A paediatric hydrocortisone formulation, Infacort® oral hydrocortisone granules with taste masking, has therefore been developed. The objective of this study was to establish a population pharmacokinetic model based on studies in healthy adult volunteers to predict hydrocortisone exposure in paediatric patients with adrenal insufficiency. Cortisol and binding protein concentrations were evaluated in the absence and presence of dexamethasone in healthy volunteers (n = 30). Dexamethasone was used to suppress endogenous cortisol concentrations prior to and after single doses of 0.5, 2, 5 and 10 mg of Infacort®, or 20 mg of Infacort®, hydrocortisone tablet, or hydrocortisone intravenously. A plasma protein binding model was established using unbound and total cortisol concentrations, and sequentially integrated into the pharmacokinetic model. Both specific (non-linear) and non-specific (linear) protein binding were included in the cortisol binding model. A two-compartment disposition model with saturable absorption and a constant endogenous cortisol baseline (Baseline_cort, 15.5 nmol/L) described the data accurately. The predicted cortisol exposure for a given dose varied considerably within a small body weight range in individuals weighing <20 kg. Our semi-mechanistic population pharmacokinetic model for hydrocortisone captures the complex pharmacokinetics of hydrocortisone in a simplified but comprehensive framework. The predicted cortisol exposure indicated the importance of defining an accurate hydrocortisone dose to mimic physiological concentrations for neonates and infants weighing <20 kg. EudraCT numbers: 2013-000260-28, 2013-000259-42.
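The saturable-plus-linear binding step described above can be sketched as a quadratic: with specific (saturable) binding Bmax·Cu/(Kd + Cu) and non-specific (linear) binding NS·Cu, the unbound concentration Cu follows from total cortisol by solving C_total = Cu·(1 + NS) + Bmax·Cu/(Kd + Cu). The Bmax, Kd, and NS values below are illustrative, not the fitted study parameters.

```python
import math

# Sketch of saturable (e.g. CBG) plus linear (e.g. albumin) cortisol
# binding: solve the quadratic a*Cu^2 + b*Cu + c = 0 for Cu > 0.
# Parameter values are invented for illustration.

def unbound(c_total, bmax=500.0, kd=30.0, ns=0.4):
    """Unbound concentration Cu given total concentration c_total."""
    a = 1.0 + ns
    b = a * kd + bmax - c_total
    c = -kd * c_total
    return (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

# Saturable binding means the free fraction rises with concentration,
# which is the source of hydrocortisone's non-linear pharmacokinetics.
ff_low = unbound(100.0) / 100.0
ff_high = unbound(1000.0) / 1000.0
print(ff_low < ff_high)
```

Because the free fraction is concentration-dependent, exposure does not scale linearly with dose, which is why the binding model had to be integrated into the disposition model rather than treated as a constant.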
Defence mechanisms: the role of physiology in current and future environmental protection paradigms
Glover, Chris N
2018-01-01
Ecological risk assessments principally rely on simplified metrics of organismal sensitivity that do not consider mechanism or biological traits. As such, they are unable to adequately extrapolate from standard laboratory tests to real-world settings, and largely fail to account for the diversity of organisms and environmental variables that occur in natural environments. However, an understanding of how stressors influence organism health can compensate for these limitations. Mechanistic knowledge can be used to account for species differences in basal biological function and variability in environmental factors, including spatial and temporal changes in the chemical, physical and biological milieu. Consequently, physiological understanding of biological function, and how this is altered by stressor exposure, can facilitate proactive, predictive risk assessment. In this perspective article, existing frameworks that utilize physiological knowledge (e.g. biotic ligand models, adverse outcome pathways and mechanistic effect models) are outlined, and specific examples of how mechanistic understanding has been used to predict risk are highlighted. Future research approaches and data needs for extending the incorporation of physiological information into ecological risk assessments are discussed. Although the review focuses on chemical toxicants in aquatic systems, physical and biological stressors and terrestrial environments are also briefly considered. PMID:29564135
Comparison of mechanistic transport cycle models of ABC exporters.
Szöllősi, Dániel; Rose-Sperling, Dania; Hellmich, Ute A; Stockner, Thomas
2018-04-01
ABC (ATP binding cassette) transporters, ubiquitous in all kingdoms of life, carry out essential substrate transport reactions across cell membranes. Their transmembrane domains bind and translocate substrates and are connected to a pair of nucleotide binding domains, which bind and hydrolyze ATP to energize import or export of substrates. Over four decades of investigations into ABC transporters have revealed numerous details from atomic-level structural insights to their functional and physiological roles. Despite all these advances, a comprehensive understanding of the mechanistic principles of ABC transporter function remains elusive. The human multidrug resistance transporter ABCB1, also referred to as P-glycoprotein (P-gp), is one of the most intensively studied ABC exporters. Using ABCB1 as the reference point, we aim to compare the dominating mechanistic models of substrate transport and ATP hydrolysis for ABC exporters and to highlight the experimental and computational evidence in their support. In particular, we point out in silico studies that enhance and complement available biochemical data.