Compartmental and Spatial Rule-Based Modeling with Virtual Cell.
Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M
2017-10-03
In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Modeling disease transmission near eradication: An equation free approach
NASA Astrophysics Data System (ADS)
Williams, Matthew O.; Proctor, Joshua L.; Kutz, J. Nathan
2015-01-01
Although disease transmission in the near eradication regime is inherently stochastic, deterministic quantities such as the probability of eradication are of interest to policy makers and researchers. Rather than running large ensembles of discrete stochastic simulations over long intervals in time to compute these deterministic quantities, we create a data-driven and deterministic "coarse" model for them using the Equation Free (EF) framework. In lieu of deriving an explicit coarse model, the EF framework approximates any needed information, such as coarse time derivatives, by running short computational experiments. However, the choice of the coarse variables (i.e., the state of the coarse system) is critical if the resulting model is to be accurate. In this manuscript, we propose a set of coarse variables that result in an accurate model in the endemic and near eradication regimes, and demonstrate this on a compartmental model representing the spread of Poliomyelitis. When combined with adaptive time-stepping coarse projective integrators, this approach can yield over a factor of two speedup compared to direct simulation, and due to its lower dimensionality, could be beneficial when conducting systems level tasks such as designing eradication or monitoring campaigns.
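The coarse projective integration strategy referenced above can be sketched in a few lines: run a short burst of fine-scale steps, estimate the coarse time derivative from the last two states, and take a large projective jump. The toy logistic "fine" model and all step sizes below are illustrative assumptions, not the polio model or parameters from the study.

```python
# Illustrative sketch of coarse projective integration (not the paper's model).
# Fine model: logistic growth integrated with small Euler steps; the coarse
# step extrapolates linearly from the end of a short burst of fine steps.

def fine_step(x, dt=0.01, r=1.0, k=1.0):
    """One small explicit-Euler step of logistic growth dx/dt = r*x*(1 - x/k)."""
    return x + dt * r * x * (1.0 - x / k)

def projective_step(x, n_burst=10, dt=0.01, jump=0.2):
    """Run a burst of fine steps, estimate dx/dt, then project forward."""
    xs = [x]
    for _ in range(n_burst):
        xs.append(fine_step(xs[-1], dt))
    slope = (xs[-1] - xs[-2]) / dt      # coarse time-derivative estimate
    return xs[-1] + jump * slope        # large projective jump

x, t = 0.05, 0.0
while t < 10.0:
    x = projective_step(x)
    t += 10 * 0.01 + 0.2                # burst time plus jump time

print(round(x, 2))                      # approaches the carrying capacity k = 1
```

The speedup comes from the jump: only a third of the interval is covered by expensive fine-scale simulation in this sketch.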
A model for Huanglongbing spread between citrus plants including delay times and human intervention
NASA Astrophysics Data System (ADS)
Vilamiu, Raphael G. d'A.; Ternes, Sonia; Braga, Guilherme A.; Laranjeira, Francisco F.
2012-09-01
The objective of this work was to present a compartmental deterministic mathematical model representing the dynamics of HLB disease in a citrus orchard, including a delay in the disease's incubation phase in the plants and a delay period in the nymphal stage of Diaphorina citri, the most important HLB insect vector in Brazil. Numerical simulations were performed to assess the possible impacts of the efficiency of human detection of symptomatic plants, as well as the influence of a long incubation period of HLB in the plant.
Scribner, Richard; Ackleh, Azmy S; Fitzpatrick, Ben G; Jacquez, Geoffrey; Thibodeaux, Jeremy J; Rommel, Robert; Simonsen, Neal
2009-09-01
The misuse and abuse of alcohol among college students remain persistent problems. Using a systems approach to understand the dynamics of student drinking behavior and thus forecasting the impact of campus policy to address the problem represents a novel approach. Toward this end, the successful development of a predictive mathematical model of college drinking would represent a significant advance for prevention efforts. A deterministic, compartmental model of college drinking was developed, incorporating three processes: (1) individual factors, (2) social interactions, and (3) social norms. The model quantifies these processes in terms of the movement of students between drinking compartments characterized by five styles of college drinking: abstainers, light drinkers, moderate drinkers, problem drinkers, and heavy episodic drinkers. Predictions from the model were first compared with actual campus-level data and then used to predict the effects of several simulated interventions to address heavy episodic drinking. First, the model provides a reasonable fit of actual drinking styles of students attending Social Norms Marketing Research Project campuses varying by "wetness" and by drinking styles of matriculating students. Second, the model predicts that a combination of simulated interventions targeting heavy episodic drinkers at a moderately "dry" campus would extinguish heavy episodic drinkers, replacing them with light and moderate drinkers. Instituting the same combination of simulated interventions at a moderately "wet" campus would result in only a moderate reduction in heavy episodic drinkers (i.e., 50% to 35%). A simple, five-state compartmental model adequately predicted the actual drinking patterns of students from a variety of campuses surveyed in the Social Norms Marketing Research Project study. The model predicted the impact on drinking patterns of several simulated interventions to address heavy episodic drinking on various types of campuses.
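A five-compartment flow model of the general kind described above can be sketched as a linear system moving students between drinking styles. The transition-rate matrix below is an illustrative placeholder, not the fitted rates or structure from the study.

```python
# Toy five-compartment model in the spirit of the drinking-style model above.
# Compartments: abstainers, light, moderate, problem, heavy episodic drinkers.
# Q[i][j] is a per-capita rate of moving from style i to style j; all numbers
# are illustrative assumptions, not estimates from the study.
Q = [
    [0.0, 0.3, 0.0, 0.0, 0.0],
    [0.1, 0.0, 0.2, 0.0, 0.0],
    [0.0, 0.1, 0.0, 0.1, 0.1],
    [0.0, 0.0, 0.1, 0.0, 0.1],
    [0.0, 0.0, 0.1, 0.1, 0.0],
]

def step(y, dt=0.01):
    """Explicit-Euler step of dy_i/dt = sum_j (Q[j][i]*y[j] - Q[i][j]*y[i])."""
    out = list(y)
    for i in range(len(y)):
        for j in range(len(y)):
            out[i] += dt * (Q[j][i] * y[j] - Q[i][j] * y[i])
    return out

y = [1000.0, 0.0, 0.0, 0.0, 0.0]   # matriculating class, all abstainers
for _ in range(20000):              # integrate to t = 200
    y = step(y)

print([round(v) for v in y])        # long-run distribution across the styles
```

An intervention can be represented by changing entries of Q (e.g., raising the de-escalation rates out of the heavy-episodic compartment) and re-running the integration.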
Bittig, Arne T; Uhrmacher, Adelinde M
2017-01-01
Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial stochastic simulation algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact model descriptions and makes it easy to adapt the spatial resolution of models.
Modelling the effect of an alternative host population on the spread of citrus Huanglongbing
NASA Astrophysics Data System (ADS)
d'A. Vilamiu, Raphael G.; Ternes, Sonia; Laranjeira, Francisco F.; de C. Santos, Tâmara T.
2013-10-01
The objective of this work was to model the spread of citrus Huanglongbing (HLB) considering the presence of a population of alternative hosts (Murraya paniculata). We developed a compartmental deterministic mathematical model representing the dynamics of HLB disease in a citrus orchard, including delays in the latency and incubation phases of the disease in the plants and a delay period in the nymphal stage of Diaphorina citri, the insect vector of HLB in Brazil. The results of numerical simulations indicate that alternative hosts should not play a crucial role in HLB dynamics in a typical scenario for the Recôncavo Baiano region of Brazil. Also, the current policy of removing symptomatic plants every three months should not be expected to significantly hinder HLB spread.
Inverse problems and computational cell metabolic models: a statistical approach
NASA Astrophysics Data System (ADS)
Calvetti, D.; Somersalo, E.
2008-07-01
In this article, we give an overview of the Bayesian modelling of metabolic systems at the cellular and subcellular level. The models are based on a detailed description of key biochemical reactions occurring in tissue, which may in turn be compartmentalized into cytosol and mitochondria, and of the transports between the compartments. The classical deterministic approach, which models metabolic systems as dynamical systems with Michaelis-Menten kinetics, is replaced by a stochastic extension in which the model parameters are interpreted as random variables with an appropriate probability density. The inverse problem of cell metabolism in this setting consists of estimating the density of the model parameters. After discussing some possible approaches to solving the problem, we address the issue of how to assess the reliability of the predictions of a stochastic model by proposing an output analysis in terms of model uncertainties. Visualization modalities for organizing the large amount of information provided by the Bayesian dynamic sensitivity analysis are also illustrated.
Compartmental and Data-Based Modeling of Cerebral Hemodynamics: Linear Analysis.
Henley, B C; Shin, D C; Zhang, R; Marmarelis, V Z
Compartmental and data-based modeling of cerebral hemodynamics are alternative approaches that utilize distinct model forms and have been employed in the quantitative study of cerebral hemodynamics. This paper examines the relation between a compartmental equivalent-circuit and a data-based input-output model of dynamic cerebral autoregulation (DCA) and CO2-vasomotor reactivity (DVR). The compartmental model is constructed as an equivalent-circuit utilizing putative first principles and previously proposed hypothesis-based models. The linear input-output dynamics of this compartmental model are compared with data-based estimates of the DCA-DVR process. This comparative study indicates that there are some qualitative similarities between the two-input compartmental model and experimental results.
Rönn, Minttu M; Wolf, Emory E; Chesson, Harrell; Menzies, Nicolas A; Galer, Kara; Gorwitz, Rachel; Gift, Thomas; Hsu, Katherine; Salomon, Joshua A
2017-05-01
Mathematical models of chlamydia transmission can help inform disease control policy decisions when direct empirical evaluation of alternatives is impractical. We reviewed published chlamydia models to understand the range of approaches used for policy analyses and how the studies have responded to developments in the field. We performed a literature review by searching Medline and Google Scholar (up to October 2015) to identify publications describing dynamic chlamydia transmission models used to address public health policy questions. We extracted information on modeling methodology, interventions, and key findings. We identified 47 publications (including two model comparison studies), which reported collectively on 29 distinct mathematical models. Nine models were individual-based, and 20 were deterministic compartmental models. The earliest studies evaluated the benefits of national-level screening programs and predicted potentially large benefits from increased screening. Subsequent trials and further modeling analyses suggested the impact might have been overestimated. Partner notification has been increasingly evaluated in mathematical modeling, whereas behavioral interventions have received relatively limited attention. Our review provides an overview of chlamydia transmission models and gives a perspective on how mathematical modeling has responded to increasing empirical evidence and addressed policy questions related to prevention of chlamydia infection and sequelae.
Popović, Jovan K; Atanacković, Milica T; Pilipović, Ana S; Rapaić, Milan R; Pilipović, Stevan; Atanacković, Teodor M
2010-04-01
This study presents a new two-compartment model and its application to the evaluation of diclofenac pharmacokinetics in a small number of healthy adults during a bioequivalence trial. In the model, the integer-order derivatives are replaced by derivatives of real order, often called fractional-order derivatives. Physically, this means that the history (memory) of a biological process, realized as a transfer from one compartment to another with mass-balance conservation, is taken into account. This line of investigation in pharmacokinetics was initiated by Dokoumetzidis and Macheras for one-compartment models; our contribution is the analysis of multi-dimensional compartmental models, with application of the two-compartment model to the evaluation of diclofenac pharmacokinetics. Two experiments were performed with 12 healthy volunteers using two slow-release 100 mg diclofenac tablet formulations. The agreement of the values predicted by the proposed model with the values obtained through experiments is shown to be good. Thus, the pharmacokinetics of slow-release diclofenac can be described well by a specific two-compartment model with fractional derivatives of the same order. Parameters in the model are determined by the least-squares method using the Particle Swarm Optimization (PSO) numerical procedure. The results show that the fractional-order two-compartment model for diclofenac is superior to the classical two-compartment model; indeed, this holds in the general case, since the classical model is a special case of the fractional one.
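For reference, the classical integer-order two-compartment model that the fractional model above generalizes can be sketched as a pair of coupled ODEs. The rate constants below are illustrative assumptions, not the fitted diclofenac parameters from the trial.

```python
# Classical two-compartment pharmacokinetic model (the integer-order special
# case of the fractional model discussed above). k12, k21, ke are illustrative
# rate constants, not fitted diclofenac values.

def simulate(dose=100.0, k12=0.5, k21=0.3, ke=0.2, dt=0.001, t_end=24.0):
    """Euler-integrate dc1/dt = -(k12+ke)*c1 + k21*c2, dc2/dt = k12*c1 - k21*c2."""
    c1, c2 = dose, 0.0              # amounts in central and peripheral compartments
    t = 0.0
    while t < t_end:
        d1 = -(k12 + ke) * c1 + k21 * c2   # exchange plus elimination from central
        d2 = k12 * c1 - k21 * c2           # exchange with peripheral compartment
        c1 += dt * d1
        c2 += dt * d2
        t += dt
    return c1, c2

c1, c2 = simulate()
print(round(c1, 3), round(c2, 3))   # both decay as drug is eliminated via ke
```

Replacing the integer-order time derivatives here with Caputo fractional derivatives of a common order is what introduces the memory effect the study exploits.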
Grenfell, B T; Lonergan, M E; Harwood, J
1992-04-20
This paper uses simple mathematical models to examine the long-term dynamic consequences of the 1988 epizootic of phocine distemper virus (PDV) infection in Northern European common seal populations. In a preliminary analysis of single outbreaks of infection deterministic compartmental models are used to estimate feasible ranges for the transmission rate of the infection and the level of disease-induced mortality. These results also indicate that the level of transmission in 1988 was probably sufficient to eradicate the infection throughout the Northern European common seal populations by the end of the first outbreak. An analysis of longer-term infection dynamics, which takes account of the density-dependent recovery of seal population levels, corroborates this finding. It also indicates that a reintroduction of the virus would be unlikely to cause an outbreak on the scale of the 1988 epizootic until the seal population had recovered for at least 10 years. The general ecological implications of these results are discussed.
Modelling and Optimal Control of Typhoid Fever Disease with Cost-Effective Strategies.
Tilahun, Getachew Teshome; Makinde, Oluwole Daniel; Malonza, David
2017-01-01
We propose and analyze a compartmental nonlinear deterministic mathematical model for a typhoid fever outbreak and optimal control strategies in a community with varying population. The model is studied qualitatively using the stability theory of differential equations, and the basic reproduction number, which serves as the epidemic threshold indicator, is obtained as the largest eigenvalue of the next-generation matrix. Both local and global asymptotic stability conditions for the disease-free and endemic equilibria are determined. The model exhibits a forward transcritical bifurcation, and a sensitivity analysis is performed. The optimal control problem is designed by applying Pontryagin's maximum principle with three control strategies, namely, prevention through sanitation, proper hygiene, and vaccination; treatment through application of appropriate medicine; and screening of carriers. The cost functional accounts for the costs involved in prevention, screening, and treatment, together with the total number of infected persons averted. Numerical results for the typhoid outbreak dynamics and its optimal control reveal that a combination of prevention and treatment is the most cost-effective strategy for eradicating the disease.
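The next-generation-matrix computation of the basic reproduction number mentioned above can be illustrated on a toy exposed-infectious model (not the paper's typhoid model). F collects rates of new infections, V collects transitions and removals, both linearized at the disease-free equilibrium; the parameter values are assumptions.

```python
# Sketch of the next-generation-matrix method for R0, on a toy SEI-type model
# (illustrative; not the typhoid model of the study above).
import numpy as np

beta, sigma, gamma = 0.5, 0.25, 0.2   # illustrative parameter values

# Infected compartments: E (exposed), I (infectious), linearized at the DFE.
F = np.array([[0.0, beta],
              [0.0, 0.0]])            # new infections enter E via contact with I
V = np.array([[sigma, 0.0],
              [-sigma, gamma]])       # progression E -> I and recovery of I

K = F @ np.linalg.inv(V)              # next-generation matrix
R0 = max(abs(np.linalg.eigvals(K)))   # spectral radius
print(round(R0, 3))                   # -> 2.5 (= beta/gamma for this toy model)
```

Here every exposed individual eventually becomes infectious, so R0 reduces to beta/gamma; in richer models such as the typhoid model above, the same spectral-radius recipe handles carriers and multiple transmission routes.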
Hybrid stochastic simulations of intracellular reaction-diffusion systems.
Kalantzis, Georgios
2009-06-01
With the growing recognition that stochasticity is important in biological systems, stochastic chemical kinetics has begun to receive wider interest. While Monte Carlo discrete-event simulation most accurately captures the variability of molecular species, it becomes computationally costly for complex reaction-diffusion systems with large populations of molecules. Continuous-time models, on the other hand, are computationally efficient but fail to capture any variability in the molecular species. In this study, a hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed an adaptive partitioning strategy in which processes with high frequency are simulated with deterministic rate-based equations, and those with low frequency using the exact stochastic algorithm of Gillespie. The stochastic behavior of cellular pathways is thereby preserved while the method remains applicable to large populations of molecules. We describe our method and demonstrate its accuracy and efficiency compared with the Gillespie algorithm for two different systems: first, a model of intracellular viral kinetics with two steady states, and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca2+ and NMDA receptors.
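The exact stochastic algorithm of Gillespie referenced above can be sketched for the simplest possible system, a birth-death process. This is only the direct method, not the paper's adaptive hybrid partitioning; the rates are illustrative.

```python
# Minimal Gillespie direct-method sketch for a birth-death process
# 0 -> X at rate `birth`, X -> 0 at rate `death * n`.
# Illustrative only; not the hybrid scheme of the study above.
import random
random.seed(0)

def gillespie(n0=0, birth=10.0, death=1.0, t_end=50.0):
    """Exact stochastic simulation: sample waiting times and event types."""
    n, t = n0, 0.0
    while t < t_end:
        a1, a2 = birth, death * n      # propensities of the two reactions
        a0 = a1 + a2
        t += random.expovariate(a0)    # exponential time to next event
        if random.random() * a0 < a1:  # pick event proportional to propensity
            n += 1
        else:
            n -= 1
    return n

n = gillespie()
print(n)   # fluctuates around the stationary mean birth/death = 10
```

A hybrid scheme in the spirit of the study would replace this event-by-event loop with a rate-based ODE update whenever the propensities of a reaction channel become large.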
Sellei, R M; Hingmann, S J; Kobbe, P; Weber, C; Grice, J E; Zimmerman, F; Jeromin, S; Gansslen, A; Hildebrand, F; Pape, H C
2015-01-01
PURPOSE OF THE STUDY Decision-making in the treatment of an acute compartment syndrome is based on clinical assessment, supported by invasive monitoring. Thus, an evolving compartment syndrome may require repeated pressure measurements. In suspected cases of potential compartment syndrome, clinical assessment alone seems to be unreliable. The objective of this study was to investigate the feasibility of a non-invasive ultrasound application for estimating whole-compartment elasticity, which may improve diagnostic accuracy. MATERIAL AND METHODS In an in-vitro model using an artificial container simulating the dimensions of the human anterior tibial compartment, intracompartmental pressures (p) were raised successively up to 80 mm Hg by infusion of saline solution. The compartmental depth (mm) in the cross-section view was measured before and after manual probe compression (100 mm Hg) upon the surface, resulting in a linear compartmental displacement (Δd). This was repeated at rising compartmental pressures. The resulting displacements were related to the corresponding intra-compartmental pressures simulated in our model. A hypothesized relationship between pressure-related compartmental displacement and the elasticity at elevated compartment pressures was investigated. RESULTS With rising compartmental pressures, a non-linear, reciprocally proportional relation between the displacement (mm) and the intra-compartmental pressure (mm Hg) occurred. The Pearson coefficient showed a high correlation (r2 = -0.960). The intra-observer reliability value kappa indicated statistically high reliability (κ = 0.840). The inter-observer value indicated fair reliability (κ = 0.640). CONCLUSIONS Our model reveals that a strong correlation occurs between compartmental strain displacements assessed by ultrasound and intra-compartmental pressure changes. Further studies are required to prove whether this assessment is transferable to human muscle tissue.
Determining the complete compartmental elasticity by ultrasound enhancement, this application may improve detection of early signs of potential compartment syndrome. Key words: compartment syndrome, intra-compartmental pressure, non-invasive diagnostic, elasticity measurement, elastography.
Compartmentalization of decayed wood associated with Armillaria mellea in several tree species
Alex L. Shigo; Joanna T. Tippett
1981-01-01
Decayed wood associated with Armillaria mellea was compartmentalized according to the CODIT (Compartmentalization Of Decay In Trees) model. Compartmentalization in the sapwood began after the tree walled off the area of dead cambium associated with infection by the fungus. The fungus spread into dying sapwood beneath and beyond the area of...
Transit times and mean ages for nonautonomous and autonomous compartmental systems
Rasmussen, Martin; Hastings, Alan; Smith, Matthew J.; ...
2016-04-01
In this study, we develop a theory for transit times and mean ages for nonautonomous compartmental systems. Using the McKendrick–von Förster equation, we show that the mean ages of mass in a compartmental system satisfy a linear nonautonomous ordinary differential equation that is exponentially stable. We then define a nonautonomous version of transit time as the mean age of mass leaving the compartmental system at a particular time and show that our nonautonomous theory generalises the autonomous case. We apply these results to study a nine-dimensional nonautonomous compartmental system modeling the terrestrial carbon cycle, which is a modification of the Carnegie–Ames–Stanford approach model, and we demonstrate that the nonautonomous versions of transit time and mean age differ significantly from the autonomous quantities when calculated for that model.
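In the autonomous special case generalized by the work above, where dx/dt = Bx + u with constant compartmental matrix B and input u, the mean transit time reduces to steady-state mass divided by total input flux. A small numerical illustration follows; the matrix and input are made up, not the nine-dimensional carbon-cycle model.

```python
# Autonomous special case of the transit-time concept discussed above:
# for dx/dt = B x + u with constant compartmental B and input u, the steady
# state is x* = -B^{-1} u and the mean transit time is sum(x*) / sum(u).
# B and u below are illustrative, not the CASA-based model of the study.
import numpy as np

B = np.array([[-1.0, 0.0, 0.0],
              [0.5, -0.5, 0.0],
              [0.0, 0.25, -0.25]])   # internal transfers below the diagonal,
                                     # total per-capita losses on the diagonal
u = np.array([1.0, 0.0, 0.0])        # external input into compartment 1 only

x_star = np.linalg.solve(-B, u)      # steady-state mass in each compartment
transit_time = x_star.sum() / u.sum()
print(round(transit_time, 3))        # -> 3.0 for these illustrative values
```

The nonautonomous theory of the study replaces this single steady-state ratio with a time-dependent mean age satisfying its own exponentially stable ODE.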
Global identifiability of linear compartmental models--a computer algebra algorithm.
Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C
1998-01-01
A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is, thus, a prerequisite for parameter estimation of biological dynamic models. Global identifiability is however difficult to test, since it requires solving a system of algebraic nonlinear equations which increases both in nonlinearity degree and number of terms and unknowns with increasing model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability) is presented, which combines the topological transfer function method with the Buchberger algorithm, to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general structure compartmental models from general multi input-multi output experiments. Examples of usage of GLOBI to analyze a priori global identifiability of some complex biological compartmental models are provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, Martin; Hastings, Alan; Smith, Matthew J.
We develop a theory for residence times and mean ages for nonautonomous compartmental systems. Using the McKendrick–von Förster equation, we show that the mean ages of mass in a compartmental system satisfy a linear nonautonomous ordinary differential equation that is exponentially stable. We then define a nonautonomous version of residence time as the mean age of mass leaving the compartmental system at a particular time and show that our nonautonomous theory is consistent with the autonomous case. We apply these results to study a nine-dimensional nonautonomous compartmental system modeling the carbon cycle, which is a simplified version of the Carnegie–Ames–Stanford approach (CASA) model.
A Network Thermodynamic Approach to Compartmental Analysis
Mikulecky, D. C.; Huf, E. G.; Thomas, S. R.
1979-01-01
We introduce a general network thermodynamic method for compartmental analysis which uses a compartmental model of sodium flows through frog skin as an illustrative example (Huf and Howell, 1974a). We use network thermodynamics (Mikulecky et al., 1977b) to formulate the problem, and a circuit simulation program (ASTEC 2, SPICE2, or PCAP) for computation. In this way, the compartment concentrations and net fluxes between compartments are readily obtained for a set of experimental conditions involving a square-wave pulse of labeled sodium at the outer surface of the skin. Qualitative features of the influx at the outer surface correlate very well with those observed for the short circuit current under another similar set of conditions by Morel and LeBlanc (1975). In related work, the compartmental model is used as a basis for simulation of the short circuit current and sodium flows simultaneously using a two-port network (Mikulecky et al., 1977a, and Mikulecky et al., A network thermodynamic model for short circuit current transients in frog skin. Manuscript in preparation; Gary-Bobo et al., 1978). The network approach lends itself to computation of classic compartmental problems in a simple manner using circuit simulation programs (Chua and Lin, 1975), and it further extends the compartmental models to more complicated situations involving coupled flows and non-linearities such as concentration dependencies, chemical reaction kinetics, etc. PMID:262387
Compartmental model of nitrate retention in streams
A compartmental modeling approach is presented to route nitrate retention along a cascade of stream reach sections. A process transfer function is used for transient storage equations with first order reaction terms to represent nitrate uptake in the free stream, and denitrifica...
An integrated hybrid spatial-compartmental modeling approach is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass ...
Analytical Modelling of the Spread of Disease in Confined and Crowded Spaces
NASA Astrophysics Data System (ADS)
Goscé, Lara; Barton, David A. W.; Johansson, Anders
2014-05-01
Since 1927 and until recently, most models describing the spread of disease have been of compartmental type, based on the assumption that populations are homogeneous and well-mixed. Recent models have utilised agent-based models and complex networks to explicitly study heterogeneous interaction patterns, but this leads to increased computational complexity. Compartmental models are appealing because of their simplicity, but their parameters, especially the transmission rate, are complex and depend on a number of factors, which makes it hard to predict how a change of a single environmental, demographic, or epidemiological factor will affect the population. Therefore, in this contribution we propose a middle ground, utilising crowd-behaviour research to improve compartmental models in crowded situations. We show how both the rate of infection and the walking speed depend on the local crowd density around an infected individual. The combined effect is that the rate of infection at a population scale has an analytically tractable non-linear dependency on crowd density. We model the spread of a hypothetical disease in a corridor and compare our new model with a typical compartmental model, which highlights the regime in which current models may not produce credible results.
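The density dependence described above can be sketched as a compartmental SIR model whose transmission rate is a saturating function of local crowd density. The functional form and all parameter values below are illustrative assumptions, not the paper's fitted crowd-behaviour model:

```python
def beta(density, beta0=0.5, d_half=2.0):
    """Transmission rate rising nonlinearly with local crowd density
    (people per square metre); saturating form chosen purely for
    illustration, not the paper's derived dependency."""
    return beta0 * density / (d_half + density)

def simulate_sir(density, days=60.0, dt=0.01, gamma=0.2, n=1000.0, i0=1.0):
    """Forward-Euler SIR epidemic under a fixed crowd density."""
    s, i, r = n - i0, i0, 0.0
    b = beta(density)
    for _ in range(int(round(days / dt))):
        new_inf = b * s * i / n * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# Denser crowds push the effective reproduction number above 1 and
# enlarge the epidemic (r = cumulative recovered):
_, _, r_sparse = simulate_sir(density=0.5)  # beta/gamma = 0.5
_, _, r_dense = simulate_sir(density=4.0)   # beta/gamma ~ 1.67
```

With these assumed numbers the sparse-crowd run stays subcritical while the dense-crowd run produces a substantial outbreak, mirroring the regime change the abstract describes.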
Deterministic Models of Inhalational Anthrax in New Zealand White Rabbits
2014-01-01
Computational models describing bacterial kinetics were developed for inhalational anthrax in New Zealand white (NZW) rabbits following inhalation of Ames strain B. anthracis. The data used to parameterize the models included bacterial numbers in the airways, lung tissue, draining lymph nodes, and blood. Initial bacterial numbers were set to the deposited spore dose. The first model was a single exponential ordinary differential equation (ODE) with 3 rate parameters that described mucociliary (physical) clearance, immune clearance (bacterial killing), and bacterial growth. At 36 hours postexposure, the ODE model predicted 1.7×10^7 bacteria in the rabbit, which agreed well with data from actual experiments (4.0×10^7 bacteria at 36 hours). Next, building on the single ODE model, a physiological-based biokinetic (PBBK) compartmentalized model was developed in which 1 physiological compartment was the lumen of the airways and the other was the rabbit body (lung tissue, lymph nodes, blood). The 2 compartments were connected with a parameter describing transport of bacteria from the airways into the body. The PBBK model predicted 4.9×10^7 bacteria in the body at 36 hours, and by 45 hours the model showed all clearance mechanisms were saturated, suggesting the rabbit would quickly succumb to the infection. As with the ODE model, the PBBK model results agreed well with laboratory observations. These data are discussed along with the need for and potential application of the models in risk assessment, drug development, and as a general aid to the experimentalist studying inhalational anthrax. PMID:24527843
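The two-compartment (PBBK) structure described above can be sketched with forward-Euler integration: an airway-lumen compartment that is cleared or drained inward, and a body compartment with net bacterial growth. The rate constants below are hypothetical placeholders, not the paper's fitted values:

```python
def simulate_anthrax(dose=1e5, hours=36, dt=0.001,
                     k_mucociliary=0.10,  # physical clearance from airways (1/h)
                     k_transport=0.05,    # airway lumen -> body transport (1/h)
                     k_growth=0.30,       # bacterial growth in the body (1/h)
                     k_kill=0.15):        # immune killing in the body (1/h)
    """Forward-Euler integration of the two-compartment structure:
    airway lumen (cleared, or transported inward) and rabbit body
    (growth minus immune clearance). All rates are hypothetical."""
    airways, body = float(dose), 0.0
    for _ in range(int(round(hours / dt))):
        d_airways = -(k_mucociliary + k_transport) * airways
        d_body = k_transport * airways + (k_growth - k_kill) * body
        airways += d_airways * dt
        body += d_body * dt
    return airways, body

airways_36h, body_36h = simulate_anthrax()
```

With these assumed rates, airway numbers decay while the body burden grows past the initial dose by 36 hours, the qualitative behaviour the abstract reports.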
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L
Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
Robust global identifiability theory using potentials--Application to compartmental models.
Wongvanich, N; Hann, C E; Sirisena, H R
2015-04-01
This paper presents a global practical identifiability theory for analyzing and identifying linear and nonlinear compartmental models. The compartmental system is prolonged onto the potential jet space to formulate a set of input-output equations that are integrals in terms of the measured data, which allows for robust identification of parameters without requiring any simulation of the model differential equations. Two classes of linear and non-linear compartmental models are considered. The theory is first applied to analyze the linear nitrous oxide (N2O) uptake model. The fitting accuracy of the identified models from differential jet space and potential jet space identifiability theories is compared with a realistic noise level of 3% which is derived from sensor noise data in the literature. The potential jet space approach gave a match that was well within the coefficient of variation. The differential jet space formulation was unstable and not suitable for parameter identification. The proposed theory is then applied to a nonlinear immunological model for mastitis in cows. In addition, the model formulation is extended to include an iterative method which allows initial conditions to be accurately identified. With up to 10% noise, the potential jet space theory predicts the normalized population concentration infected with pathogens, to within 9% of the true curve. Copyright © 2015 Elsevier Inc. All rights reserved.
Leander, Jacob; Almquist, Joachim; Ahlström, Christine; Gabrielsson, Johan; Jirstrand, Mats
2015-05-01
Inclusion of stochastic differential equations in mixed effects models provides a means to quantify and distinguish three sources of variability in data. In addition to the two commonly encountered sources, measurement error and interindividual variability, we also consider uncertainty in the dynamical model itself. To this end, we extend the ordinary differential equation setting used in nonlinear mixed effects models to include stochastic differential equations. The approximate population likelihood is derived using the first-order conditional estimation with interaction method and extended Kalman filtering. To illustrate the application of the stochastic differential mixed effects model, two pharmacokinetic models are considered. First, we use a stochastic one-compartmental model with first-order input and nonlinear elimination to generate synthetic data in a simulated study. We show that by using the proposed method, the three sources of variability can be successfully separated. If the stochastic part is neglected, the parameter estimates become biased, and the measurement error variance is significantly overestimated. Second, we consider an extension to a stochastic pharmacokinetic model in a preclinical study of nicotinic acid kinetics in obese Zucker rats. The parameter estimates are compared between a deterministic and a stochastic NiAc disposition model, respectively. Discrepancies between model predictions and observations, previously described as measurement noise only, are now separated into a comparatively lower level of measurement noise and a significant uncertainty in model dynamics. These examples demonstrate that stochastic differential mixed effects models are useful tools for identifying incomplete or inaccurate model dynamics and for reducing potential bias in parameter estimates due to such model deficiencies.
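A minimal Euler-Maruyama sketch of the first example above: a stochastic one-compartment model with first-order input from a depot and nonlinear (here assumed Michaelis-Menten) elimination. The parameter values and the additive form of the system noise are assumptions for illustration, not the paper's estimated model:

```python
import random

def simulate_pk_sde(sigma, t_end=24.0, dt=0.001, seed=1,
                    ka=1.0,       # first-order absorption rate (1/h)
                    vmax=5.0,     # maximal elimination rate (amount/h)
                    km=2.0,       # Michaelis constant (amount)
                    dose=100.0):  # dose placed in the absorption depot
    """Euler-Maruyama integration of dA = (input - Vmax*A/(Km+A)) dt + sigma dW,
    with first-order input from a depot compartment. sigma = 0 recovers the
    deterministic ODE model."""
    rng = random.Random(seed)
    depot, a = dose, 0.0
    path = [a]
    for _ in range(int(round(t_end / dt))):
        absorbed = ka * depot * dt
        depot -= absorbed
        drift = absorbed - vmax * a / (km + a) * dt
        a = max(a + drift + sigma * rng.gauss(0.0, dt ** 0.5), 0.0)
        path.append(a)
    return path

deterministic = simulate_pk_sde(sigma=0.0)  # pure ODE solution
stochastic = simulate_pk_sde(sigma=1.0)     # adds uncertainty in the dynamics
```

Setting sigma to zero collapses the model to its deterministic counterpart, which is the comparison the abstract draws when it notes that neglecting the stochastic part biases the estimates.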
Two Approaches to Calibration in Metrology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campanelli, Mark
2014-04-01
Inferring mathematical relationships with quantified uncertainty from measurement data is common to computational science and metrology. Sufficient knowledge of measurement process noise enables Bayesian inference. Otherwise, an alternative approach is required, here termed compartmentalized inference, because collection of uncertain data and model inference occur independently. Bayesian parameterized model inference is compared to a Bayesian-compatible compartmentalized approach for ISO-GUM compliant calibration problems in renewable energy metrology. In either approach, model evidence can help reduce model discrepancy.
Armbruster, Benjamin; Roy, Sourya; Kapur, Abhinav; Schneider, John A
2013-01-01
Men who have sex with men (MSM) practice role segregation - insertive or receptive only sex positions instead of a versatile role - in several international settings where candidate biomedical HIV prevention interventions (e.g., circumcision, anal microbicide) will be tested. The effects of these position-specific interventions on HIV incidence are modeled. We developed a deterministic compartmental model to predict HIV incidence among Indian MSM using data from 2003-2010. The model's sex mixing matrix was derived from network data of Indian MSM (n=4604). Our model captures the changing distribution of sex roles over time. We modeled microbicide and circumcision efficacy based on trials with heterosexuals. Increasing numbers of versatile MSM resulted in little change in HIV incidence over 20 years. Anal microbicides and circumcision would decrease the HIV prevalence at 10 years from 15.6% to 12.9% and 12.7%, respectively. Anal microbicides would provide similar protection to circumcision at the population level despite lower modeled efficacy (54% and 60% risk reduction, respectively). The combination of the interventions was additive: in 5 years, the reduction in HIV prevalence of the combination (-3.2%) is almost the sum of their individual reductions in HIV prevalence (-1.8% and -1.7%). MSM sex role segregation and mixing, unlike changes in the sex role distribution, may be important for evaluating HIV prevention interventions in international settings. Synergies between some position-specific prevention interventions such as circumcision and anal microbicides warrant further study.
Fractional two-compartmental model for articaine serum levels
NASA Astrophysics Data System (ADS)
Petronijevic, Branislava; Sarcev, Ivan; Zorica, Dusan; Janev, Marko; Atanackovic, Teodor M.
2016-06-01
Two fractional two-compartmental models are applied to the pharmacokinetics of articaine. Integer order derivatives are replaced by fractional derivatives, either of different, or of same orders. Models are formulated so that the mass balance is preserved. Explicit forms of the solutions are obtained in terms of the Mittag-Leffler functions. Pharmacokinetic parameters are determined by the use of the evolutionary algorithm and trust regions optimization to recover the experimental data.
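The Mittag-Leffler form of the solutions mentioned above can be illustrated with a truncated series for the one-parameter function (a naive sketch adequate only for moderate arguments; the paper's models and fitted parameters are not reproduced here):

```python
import math

def mittag_leffler(alpha, z, terms=80):
    """Truncated series E_alpha(z) = sum_{k>=0} z^k / Gamma(alpha*k + 1).
    Naive summation; suitable only for moderate |z|."""
    return sum(z ** k / math.gamma(alpha * k + 1) for k in range(terms))

# alpha = 1 recovers the ordinary exponential, E_1(-t) = exp(-t): the
# classical integer-order compartmental decay law that the fractional
# two-compartment models generalize.
classical = mittag_leffler(1.0, -1.0)
```

A useful sanity check is the known identity E_{1/2}(-x) = exp(x^2) erfc(x), which the truncated series reproduces to high accuracy at x = 1.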
Accounting for Ecohydrologic Separation Alters Interpreted Catchment Hydrology
NASA Astrophysics Data System (ADS)
Cain, M. R.; Ward, A. S.; Hrachowitz, M.
2017-12-01
Recent studies have demonstrated that in some catchments, compartmentalized pools of water supply either plant transpiration (poorly mobile water) or streamflow and groundwater (highly mobile water), a phenomenon referred to as ecohydrologic separation. Although the literature has acknowledged that omission of ecohydrologic separation in hydrological models may influence estimates of residence times of water and solutes, no study has investigated how and when this compartmentalization might alter interpretations of fluxes and storages within a catchment. In this study, we develop two hydrochemical lumped rainfall-runoff models, one which incorporates ecohydrologic separation and one which does not, for a watershed at the H.J. Andrews Experimental Forest (Oregon, USA), the study site where ecohydrologic separation was first observed. The models are calibrated against stream discharge, as well as stream chloride concentration. The objectives of this study are (1) to compare calibrated parameters and identifiability across models, (2) to determine how and when compartmentalization of water in the vadose zone might alter interpretations of fluxes and stores within the catchment, and (3) to identify how and when these changes alter residence times. Preliminary results suggest that compartmentalization of the vadose zone alters interpretations of fluxes and storages in the catchment and improves our ability to simulate solute transport.
The past, present and future of cyber-physical systems: a focus on models.
Lee, Edward A
2015-02-26
This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical. PMID:25730486
Davidson, Natalie R; Godfrey, Keith R; Alquaddoomi, Faisal; Nola, David; DiStefano, Joseph J
2017-05-01
We describe and illustrate use of DISTING, a novel web application for computing alternative structurally identifiable linear compartmental models that are input-output indistinguishable from a postulated linear compartmental model. Several computer packages are available for analysing the structural identifiability of such models, but DISTING is the first to be made available for assessing indistinguishability. The computational algorithms embedded in DISTING are based on advanced versions of established geometric and algebraic properties of linear compartmental models, embedded in a user-friendly graphical user interface. Novel computational tools greatly speed up the overall procedure. These include algorithms for Jacobian matrix reduction, submatrix rank reduction, and parallelization of candidate rank computations in symbolic matrix analysis. The application of DISTING to three postulated models with respectively two, three and four compartments is given. The two-compartment example is used to illustrate the indistinguishability problem; the original (unidentifiable) model is found to have two structurally identifiable models that are indistinguishable from it. The three-compartment example has three structurally identifiable indistinguishable models. It is found from DISTING that the four-compartment example has five structurally identifiable models indistinguishable from the original postulated model. This example shows that care is needed when dealing with models that have two or more compartments which are neither perturbed nor observed, because the numbering of these compartments may be arbitrary. DISTING is universally and freely available via the Internet. It is easy to use and circumvents tedious and complicated algebraic analysis previously done by hand. Copyright © 2017 Elsevier B.V. All rights reserved.
Schryer, David W; Peterson, Pearu; Paalme, Toomas; Vendelin, Marko
2009-04-17
Isotope labeling is one of the few methods of revealing the in vivo bidirectionality and compartmentalization of metabolic fluxes within metabolic networks. We argue that a shift from steady state to dynamic isotopomer analysis is required to deal with these cellular complexities and provide a review of dynamic studies of compartmentalized energy fluxes in eukaryotic cells including cardiac muscle, plants, and astrocytes. Knowledge of complex metabolic behaviour on a molecular level is prerequisite for the intelligent design of genetically modified organisms able to realize their potential of revolutionizing food, energy, and pharmaceutical production. We describe techniques to explore the bidirectionality and compartmentalization of metabolic fluxes using information contained in the isotopic transient, and discuss the integration of kinetic models with MFA. The flux parameters of an example metabolic network were optimized to examine the compartmentalization of metabolites and the bidirectionality of fluxes in the TCA cycle of Saccharomyces uvarum for steady-state respiratory growth.
Horn, Johannes; Damm, Oliver; Greiner, Wolfgang; Hengel, Hartmut; Kretzschmar, Mirjam E; Siedler, Anette; Ultsch, Bernhard; Weidemann, Felix; Wichmann, Ole; Karch, André; Mikolajczyk, Rafael T
2018-01-09
Epidemiological studies suggest that reduced exposure to varicella might lead to an increased risk for herpes zoster (HZ). Reduction of exposure to varicella is a consequence of varicella vaccination but also of demographic changes. We analyzed how the combination of vaccination programs and demographic dynamics will affect the epidemiology of varicella and HZ in Germany over the next 50 years. We used a deterministic dynamic compartmental model to assess the impact of different varicella and HZ vaccination strategies on varicella and HZ epidemiology in three demographic scenarios, namely the projected population for Germany, the projected population additionally accounting for increased immigration as observed in 2015/2016, and a stationary population. Projected demographic changes alone result in an increase of annual HZ cases by 18.3% and a decrease of varicella cases by 45.7% between 1990 and 2060. Independently of the demographic scenario, varicella vaccination reduces the cumulative number of varicella cases until 2060 by approximately 70%, but also increases HZ cases by 10%. Unlike the currently licensed live attenuated HZ vaccine, the new subunit vaccine candidate might completely counteract this effect. Relative vaccine effects were consistent across all demographic scenarios. Demographic dynamics will be a major determinant of HZ epidemiology in the next 50 years. While stationary population models are appropriate for assessing vaccination impact, models incorporating realistic population structures allow a direct comparison to surveillance data and can thus provide additional input for immunization decision-making and resource planning.
Talaminos, A; López-Cerero, L; Calvillo, J; Pascual, A; Roa, L M; Rodríguez-Baño, J
2016-07-01
ST131 Escherichia coli is an emergent clonal group that has achieved successful worldwide spread through a combination of virulence and antimicrobial resistance. Our aim was to develop a mathematical model, based on current knowledge of the epidemiology of ESBL-producing and non-ESBL-producing ST131 E. coli, to provide a framework enabling a better understanding of its spread within the community, in hospitals and long-term care facilities, and the potential impact of specific interventions on the rates of infection. A model belonging to the SEIS (Susceptible-Exposed-Infected-Susceptible) class of compartmental models, with specific modifications, was developed. Quantification of the model is based on the law of mass preservation, which helps determine the relationships between flows of individuals and different compartments. Quantification is deterministic or probabilistic depending on subpopulation size. The assumptions for the model are based on several developed epidemiological studies. Based on the assumptions of the model, an intervention capable of sustaining a 25% reduction in person-to-person transmission shows a significant reduction in the rate of infections caused by ST131; the impact is higher for non-ESBL-producing ST131 isolates than for ESBL producers. On the other hand, an isolated intervention reducing exposure to antimicrobial agents has much more limited impact on the rate of ST131 infection. Our results suggest that interventions achieving a continuous reduction in the transmission of ST131 in households, nursing homes and hospitals offer the best chance of reducing the burden of the infections caused by these isolates.
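The intervention comparison described above can be sketched with a minimal SEIS model, contrasting endemic infection levels with and without a sustained 25% cut in person-to-person transmission. All rates are hypothetical and the sketch omits the paper's compartment-specific modifications and subpopulation structure:

```python
def seis_endemic(beta, sigma=0.2, gamma=0.1, days=2000, dt=0.05):
    """Forward-Euler SEIS (Susceptible-Exposed-Infected-Susceptible)
    dynamics on population fractions; returns the infected fraction
    after a long run-in. All rates (per day) are hypothetical."""
    s, e, i = 0.99, 0.0, 0.01
    for _ in range(int(round(days / dt))):
        new_exposed = beta * s * i * dt      # colonization/infection
        new_infectious = sigma * e * dt      # end of the exposed stage
        new_susceptible = gamma * i * dt     # clearance without immunity
        s += new_susceptible - new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_susceptible
    return i

baseline = seis_endemic(beta=0.3)
reduced = seis_endemic(beta=0.3 * 0.75)  # sustained 25% transmission cut
```

At equilibrium the infected fraction is (1 - gamma/beta)/(1 + gamma/sigma), so the 25% cut in beta lowers the endemic level noticeably, the direction of effect the abstract reports for transmission-reduction interventions.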
Malunguza, Noble; Mushayabasa, Steady; Chiyaka, Christinah; Mukandavire, Zindoga
2010-09-01
A deterministic compartmental sex-structured HIV/AIDS model for assessing the effects of homosexuals and bisexuals in heterosexual settings in which homosexuality and bisexuality issues have remained taboo is presented. We extend the model to focus on the effects of condom use as a single strategy approach in HIV prevention in the absence of any other intervention strategies. Initially, we model the use of male condoms, followed by incorporating the use of both the female and male condoms. The model includes two primary factors in condom use to control HIV which are condom efficacy and compliance. Reproductive numbers for these models are computed and compared to assess the effectiveness of male and female condom use in a community. We also extend the basic model to consider the effects of antiretroviral therapy as a single strategy. The results from the study show that condoms can reduce the number of secondary infectives and thus can slow the development of the HIV/AIDS epidemic. Further, we note from the study that treatment of AIDS patients may enlarge the epidemic when the treatment drugs are not 100% effective and when treated AIDS patients indulge in risky sexual behaviour. Thus, the treatment with amelioration of AIDS patients should be accompanied with intense public health educational programs, which are capable of changing the attitude of treated AIDS patients towards safe sex. It is also shown from the study that the use of condoms in settings with the treatment may help in reducing the number of secondary infections thus slowing the epidemic.
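The two condom factors named above, efficacy and compliance, enter the reproduction number multiplicatively in the simplest treatment. A sketch with a hypothetical baseline R0 (not the paper's fitted value):

```python
def effective_r0(r0, efficacy, compliance):
    """Reproduction number when a fraction `compliance` of sex acts are
    condom-protected and the condom blocks transmission with probability
    `efficacy` (simplest multiplicative treatment; illustrative only)."""
    return r0 * (1.0 - efficacy * compliance)

baseline = 2.5  # hypothetical basic reproduction number
# Either factor alone is not enough; their product drives control:
high_eff_low_comp = effective_r0(baseline, efficacy=0.9, compliance=0.5)
high_eff_high_comp = effective_r0(baseline, efficacy=0.9, compliance=0.9)
```

With these hypothetical numbers, high efficacy combined with low compliance leaves the effective reproduction number above 1 (the epidemic persists), while high values of both push it below the threshold, illustrating why the abstract treats efficacy and compliance as the two primary factors.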
Molina-Romero, Miguel; Gómez, Pedro A; Sperl, Jonathan I; Czisch, Michael; Sämann, Philipp G; Jones, Derek K; Menzel, Marion I; Menze, Bjoern H
2018-03-23
The compartmental nature of brain tissue microstructure is typically studied by diffusion MRI, MR relaxometry or their correlation. Diffusion MRI relies on signal representations or biophysical models, while MR relaxometry and correlation studies are based on regularized inverse Laplace transforms (ILTs). Here we introduce a general framework for characterizing microstructure that does not depend on diffusion modeling and replaces ill-posed ILTs with blind source separation (BSS). This framework yields proton density, relaxation times, volume fractions, and signal disentanglement, allowing for separation of the free-water component. Diffusion experiments repeated for several different echo times contain entangled diffusion and relaxation compartmental information. These can be disentangled by BSS using a physically constrained nonnegative matrix factorization. Computer simulations and phantom studies, together with repeatability and reproducibility experiments, demonstrated that BSS is capable of estimating proton density, compartmental volume fractions and transversal relaxations. In vivo results proved its potential to correct for free-water contamination and to estimate tissue parameters. Formulation of the diffusion-relaxation dependence as a BSS problem introduces a new framework for studying microstructure compartmentalization, and a novel tool for free-water elimination. © 2018 International Society for Magnetic Resonance in Medicine.
Deterministic and stochastic CTMC models from Zika disease transmission
NASA Astrophysics Data System (ADS)
Zevika, Mona; Soewono, Edy
2018-03-01
Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women with the Zika virus are at risk of having a fetus or infant with a congenital defect and suffering from microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain stochastic model. The basic reproduction ratio is constructed from a deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.
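The deterministic/CTMC contrast described above can be sketched for a generic SIR-type outbreak: the deterministic model supplies R0 = beta/gamma, while repeated runs of the embedded jump chain of the CTMC estimate the probability of early extinction (branching-process theory predicts roughly 1/R0 when R0 > 1). All parameters are hypothetical, and this is not the paper's Zika model, which includes mosquito compartments:

```python
import random

def outbreak_dies_out(beta=0.4, gamma=0.2, n=500, threshold=50, rng=None):
    """One realization of the embedded jump chain of the SIR CTMC,
    started from a single infective. Event probabilities (not waiting
    times) suffice for extinction questions. Returns True if the
    infection goes extinct before `threshold` cumulative infections."""
    rng = rng or random.Random()
    s, i = n - 1, 1
    while i > 0:
        rate_inf = beta * s * i / n      # S + I -> 2I
        rate_rec = gamma * i             # I -> R
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            s, i = s - 1, i + 1
        else:
            i -= 1
        if n - s >= threshold:           # call it a full outbreak
            return False
    return True

rng = random.Random(7)
runs = 2000
extinct = sum(outbreak_dies_out(rng=rng) for _ in range(runs)) / runs
# Deterministic threshold: R0 = beta/gamma = 2.0; branching theory
# predicts an extinction probability near 1/R0 = 0.5.
```

The deterministic ODE with R0 = 2 always predicts an outbreak from the same initial condition; the stochastic estimate shows that roughly half of introductions nevertheless fizzle out, which is exactly the extra information the CTMC formulation provides.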
Cell-to-Cell Communication Circuits: Quantitative Analysis of Synthetic Logic Gates
Hoffman-Sommer, Marta; Supady, Adriana; Klipp, Edda
2012-01-01
One of the goals in the field of synthetic biology is the construction of cellular computation devices that could function in a manner similar to electronic circuits. To this end, attempts are made to create biological systems that function as logic gates. In this work we present a theoretical quantitative analysis of a synthetic cellular logic-gates system, which has been implemented in cells of the yeast Saccharomyces cerevisiae (Regot et al., 2011). It exploits endogenous MAP kinase signaling pathways. The novelty of the system lies in the compartmentalization of the circuit where all basic logic gates are implemented in independent single cells that can then be cultured together to perform complex logic functions. We have constructed kinetic models of the multicellular IDENTITY, NOT, OR, and IMPLIES logic gates, using both deterministic and stochastic frameworks. All necessary model parameters are taken from literature or estimated based on published kinetic data, in such a way that the resulting models correctly capture important dynamic features of the included mitogen-activated protein kinase pathways. We analyze the models in terms of parameter sensitivity and we discuss possible ways of optimizing the system, e.g., by tuning the culture density. We apply a stochastic modeling approach, which simulates the behavior of whole populations of cells and allows us to investigate the noise generated in the system; we find that the gene expression units are the major sources of noise. Finally, the model is used for the design of system modifications: we show how the current system could be transformed to operate on three discrete values. PMID:22934039
Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate
NASA Astrophysics Data System (ADS)
Wang, Zhi-Gang; Gao, Rui-Mei; Fan, Xiao-Ming; Han, Qi-Xing
2014-09-01
We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate; the basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold which determines the persistence or extinction of the disease. By using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease will prevail: the infective persists and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, then the infective disappears and the disease dies out. In addition, stochastic noises around the endemic equilibrium will be added to the deterministic MSIR model in order that the deterministic model is extended to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis on the asymptotic behavior of the stochastic model. In addition, regarding the value of ℛ0, when the stochastic system obeys some conditions and ℛ0 is greater than 1, we deduce that the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations.
One shall become two: Separation of the esophagus and trachea from the common foregut tube
Billmyre, Katherine Kretovich; Hutson, Mary; Klingensmith, John
2016-01-01
The alimentary and respiratory organ systems arise from a common endodermal origin, the anterior foregut tube. Formation of the esophagus from the dorsal region and the trachea from the ventral region of the foregut primordium occurs via a poorly understood compartmentalization process. Disruption of this process can result in severe birth defects, such as esophageal atresia and tracheoesophageal fistula (EA/TEF), in which the lumina of the trachea and esophagus remain connected. Here we summarize the signaling networks known to be necessary for regulating dorso-ventral patterning within the common foregut tube and cellular behaviors that may occur during normal foregut compartmentalization. We propose that dorso-ventral patterning serves to establish a lateral region of the foregut tube that is capable of undergoing specialized cellular rearrangements, culminating in compartmentalization. We review established as well as new rodent models that may be useful in addressing this hypothesis. Finally, we discuss new experimental models that could help elucidate the mechanism behind foregut compartmentalization. An integrated approach to future foregut morphogenesis research will allow for a better understanding of this complex process. PMID:25329576
Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model
Nené, Nuno R.; Dunham, Alistair S.; Illingworth, Christopher J. R.
2018-01-01
A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. PMID:29500183
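The inference task above can be sketched with a generic deterministic haploid selection model and a grid search over the selection coefficient. This is the plain deterministic baseline the authors critique, not their delay-deterministic correction; the recursion and the numbers are standard textbook assumptions.

```python
def propagate(x0, s, generations):
    """Deterministic single-locus haploid selection: variant frequency
    follows x' = x(1+s) / (1 + s*x) each generation."""
    xs = [x0]
    for _ in range(generations):
        x = xs[-1]
        xs.append(x * (1 + s) / (1 + s * x))
    return xs

def infer_s(observed, x0):
    """Grid-search the selection coefficient minimizing squared error
    between the observed and modeled frequency trajectories."""
    grid = [i / 1000 for i in range(-200, 201)]
    def sse(s):
        model = propagate(x0, s, len(observed) - 1)
        return sum((m - o) ** 2 for m, o in zip(model, observed))
    return min(grid, key=sse)

# noiseless synthetic trajectory generated with s = 0.05
data = propagate(0.02, 0.05, 30)
```

On noiseless data the grid search recovers the generating coefficient exactly; the paper's point is that this clean picture breaks down when new mutants arise stochastically in a finite population.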
Razzaq, Misbah; Ahmad, Jamil
2015-01-01
Internet worms are analogous to biological viruses since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm, or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why, in this study, we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. Model Checking results agree well with the simulation results, which fully supports the proposed framework. PMID:26713449
Guymon, Gary L.; Yen, Chung-Cheng
1990-01-01
The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis, reducing the total number of uncertain variables to three: hydraulic conductivity, storage coefficient or specific yield, and source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position are small in most of the basin, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.
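A two-point probability method of the kind cascaded with the deterministic model above can be sketched as a Rosenblueth-style point estimate: with three retained uncertain inputs, only 2^3 = 8 deterministic model runs are needed. The helper below and the linear test function are illustrative assumptions, not the authors' groundwater code.

```python
from itertools import product

def two_point_estimate(f, means, stds):
    """Rosenblueth-style two-point estimate: run the deterministic model f
    once at every mean±std corner (2^n runs, equal weights, assuming
    uncorrelated, symmetric input distributions) and summarize the
    resulting spread of the output."""
    corners = [[m + sgn * s for m, s, sgn in zip(means, stds, signs)]
               for signs in product((-1.0, 1.0), repeat=len(means))]
    vals = [f(*c) for c in corners]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean, var ** 0.5
```

For a linear model the corner average reproduces the mean response exactly, which is why lumping the uncertainty into a few variables keeps the method cheap: the run count grows as 2^n in the number of uncertain inputs.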
Schnell, Gretja; Spudich, Serena; Harrington, Patrick; Price, Richard W; Swanstrom, Ronald
2009-04-01
Human immunodeficiency virus type 1 (HIV-1) invades the central nervous system (CNS) shortly after systemic infection and can result in the subsequent development of HIV-1-associated dementia (HAD) in a subset of infected individuals. Genetically compartmentalized virus in the CNS is associated with HAD, suggesting autonomous viral replication as a factor in the disease process. We examined the source of compartmentalized HIV-1 in the CNS of subjects with HIV-1-associated neurological disease and in asymptomatic subjects who were initiating antiretroviral therapy. The heteroduplex tracking assay (HTA), targeting the variable regions of env, was used to determine which HIV-1 genetic variants in the cerebrospinal fluid (CSF) were compartmentalized and which variants were shared with the blood plasma. We then measured the viral decay kinetics of individual variants after the initiation of antiretroviral therapy. Compartmentalized HIV-1 variants in the CSF of asymptomatic subjects decayed rapidly after the initiation of antiretroviral therapy, with a mean half-life of 1.57 days. Rapid viral decay was also measured for CSF-compartmentalized variants in four HAD subjects (t(1/2) mean = 2.27 days). However, slow viral decay was measured for CSF-compartmentalized variants from an additional four subjects with neurological disease (t(1/2) range = 9.85 days to no initial decay). The slow decay detected for CSF-compartmentalized variants was not associated with poor CNS drug penetration, drug resistant virus in the CSF, or the presence of X4 virus genotypes. We found that the slow decay measured for CSF-compartmentalized variants in subjects with neurological disease was correlated with low peripheral CD4 cell count and reduced CSF pleocytosis. We propose a model in which infiltrating macrophages replace CD4(+) T cells as the primary source of productive viral replication in the CNS to maintain high viral loads in the CSF in a substantial subset of subjects with HAD.
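The per-variant decay kinetics reported above reduce to fitting a single-exponential decay and converting the rate to a half-life via t1/2 = ln 2 / k. A minimal log-linear least-squares sketch, using synthetic data built around the 1.57-day mean half-life quoted for asymptomatic subjects (sampling days and the starting load are invented for illustration):

```python
import math

def fitted_half_life(days, loads):
    """Least-squares slope of log(viral load) versus time; assuming
    single-phase exponential decay, the half-life is ln 2 / decay rate."""
    ys = [math.log(v) for v in loads]
    n = len(days)
    xbar, ybar = sum(days) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(days, ys))
             / sum((x - xbar) ** 2 for x in days))
    return -math.log(2) / slope

# synthetic noiseless series with t1/2 = 1.57 days (the reported mean
# for CSF-compartmentalized variants in asymptomatic subjects)
rate = math.log(2) / 1.57
days = [0.0, 1.0, 2.0, 4.0, 7.0]
loads = [1e5 * math.exp(-rate * t) for t in days]
```

The same fit applied to a slowly decaying variant (e.g. t1/2 near 10 days) would return a shallow slope, which is how the fast- and slow-decay groups in the study are distinguished.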
Furubayashi, Taro
2018-01-01
The emergence and dominance of parasitic replicators are among the major hurdles for the proliferation of primitive replicators. Compartmentalization of replicators is proposed to relieve the parasite dominance; however, it remains unclear under what conditions simple compartmentalization uncoupled with internal reaction secures the long-term survival of a population of primitive replicators against incessant parasite emergence. Here, we investigate the sustainability of a compartmentalized host-parasite replicator (CHPR) system undergoing periodic washout-mixing cycles, by constructing a mathematical model and performing extensive simulations. We describe sustainable landscapes of the CHPR system in the parameter space and elucidate the mechanism of phase transitions between sustainable and extinct regions. Our findings revealed that a large population size of compartments, a high mixing intensity, and a modest amount of nutrients are important factors for the robust survival of replicators. We also found two distinctive sustainable phases with different mixing intensities. These results suggest that a population of simple host–parasite replicators assumed before the origin of life can be sustained by a simple compartmentalization with periodic washout-mixing processes. PMID:29373536
Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.
Kang, Yun; Lanchier, Nicolas
2011-06-01
We investigate the impact of Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density population can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of the global extinction and the basin of attraction of the global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts.
For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the probability of extinction for the stochastic model.
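The weak-versus-strong dispersal dichotomy described above can be sketched with a caricature two-patch ODE: cubic Allee growth f(u) = u(u - a)(1 - u) in each patch plus symmetric dispersal D. This is an illustrative toy, not the authors' model; the threshold a = 0.3 and initial condition (occupied, empty) are assumptions.

```python
def simulate_patches(D, a=0.3, u0=(1.0, 0.0), dt=0.01, T=200.0):
    """Forward-Euler integration of two patches with cubic Allee-effect
    growth f(u) = u(u - a)(1 - u) and symmetric dispersal rate D.
    Returns the two patch densities at time T."""
    u, v = u0
    f = lambda x: x * (x - a) * (1.0 - x)
    for _ in range(int(T / dt)):
        du = f(u) + D * (v - u)
        dv = f(v) + D * (u - v)
        u, v = u + dt * du, v + dt * dv
    return u, v
```

With weak dispersal (D = 0.01) the occupied patch stays near carrying capacity while the empty patch is pinned below its Allee threshold, the high/low coexistence equilibrium in the abstract; with strong dispersal (D = 0.5) the patches synchronize near their average 0.5, which here exceeds a = 0.3, so both expand.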
The relationship between stochastic and deterministic quasi-steady state approximations.
Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R
2015-11-23
The quasi steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi steady-state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of the QSSA and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using their deterministic counterparts, providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
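The deterministic QSSA that the abstract builds on can be illustrated with the classical Michaelis-Menten reduction: the full mass-action system E+S ⇌ C → E+P is compared against the reduced law dS/dt = -Vmax S/(Km + S). The rate constants below are illustrative assumptions chosen so the usual validity condition E_T ≪ S0 + Km holds.

```python
def full_model(E_T=0.02, S0=1.0, k1=10.0, km1=1.0, k2=1.0, dt=1e-3, T=30.0):
    """Forward-Euler integration of the full mass-action Michaelis-Menten
    system E + S <-> C -> E + P. Returns free substrate S at time T."""
    S, C = S0, 0.0
    for _ in range(int(T / dt)):
        E = E_T - C
        dS = -k1 * E * S + km1 * C
        dC = k1 * E * S - (km1 + k2) * C
        S, C = S + dt * dS, C + dt * dC
    return S

def qssa_model(E_T=0.02, S0=1.0, k1=10.0, km1=1.0, k2=1.0, dt=1e-3, T=30.0):
    """Reduced deterministic QSSA model dS/dt = -Vmax*S/(Km+S)."""
    Km, Vmax = (km1 + k2) / k1, k2 * E_T
    S = S0
    for _ in range(int(T / dt)):
        S += dt * (-Vmax * S / (Km + S))
    return S
```

Here the reduced trajectory tracks the full one closely, exactly the regime where, by the paper's conjecture, the corresponding stochastic QSSA (propensities built from the Hill/Michaelis-Menten function) can also be trusted.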
The mechanics of cellular compartmentalization as a model for tumor spreading
NASA Astrophysics Data System (ADS)
Fritsch, Anatol; Pawlizak, Steve; Zink, Mareike; Kaes, Josef A.
2012-02-01
Based on a recently developed surgical method of Michael Höckel, which makes use of cellular confinement to compartments in the human body, we study the mechanics of the process of cell segregation. Compartmentalization is a fundamental process of cellular organization and occurs during embryonic development. A simple model system can demonstrate the process of compartmentalization: when two populations of suspended cells are mixed, this mixture will eventually segregate into two phases, whereas mixtures of the same cell type will not. In the 1960s, Malcolm S. Steinberg formulated the so-called differential adhesion hypothesis, which explains the segregation in the model system and the process of compartmentalization by differences in surface tension and adhesiveness of the interacting cells. We are interested in the extent to which the same physical principles affect tumor growth and spreading between compartments. For our studies, we use healthy and cancerous breast cell lines of different malignancy as well as primary cells from human cervix carcinoma. We apply a set of techniques to study their mechanical properties and interactions. The Optical Stretcher is used for whole-cell rheology, while cell-cell adhesion forces are directly measured with a modified AFM. In combination with 3D segregation experiments in droplet cultures, we try to clarify the role of surface tension in tumor spreading.
Analytical properties of a three-compartmental dynamical demographic model
NASA Astrophysics Data System (ADS)
Postnikov, E. B.
2015-07-01
The three-compartmental demographic model by Korotayev-Malkov-Khaltourina, connecting population size, economic surplus, and education level, is considered from the point of view of dynamical systems theory. It is shown that there exist two integrals of motion, which enables the system to be reduced to one nonlinear ordinary differential equation. The study of its structure provides analytical criteria for the dominance ranges of the dynamics of Malthus and Kremer. Additionally, particular ranges of parameters enable the derived general ordinary differential equation to be reduced to the models of Gompertz and Tsoularis-Wallace.
Co-Compartmentation of Terpene Biosynthesis and Storage via Synthetic Droplet.
Zhao, Cheng; Kim, YongKyoung; Zeng, Yining; Li, Man; Wang, Xin; Hu, Cheng; Gorman, Connor; Dai, Susie Y; Ding, Shi-You; Yuan, Joshua S
2018-03-16
Traditional bioproduct engineering focuses on pathway optimization, yet is often complicated by product inhibition, downstream consumption, and the toxicity of certain products. Here, we present the co-compartmentation of biosynthesis and storage via a synthetic droplet as an effective new strategy to improve the bioproduct yield, with squalene as a model compound. A hydrophobic protein was designed and introduced into the tobacco chloroplast to generate a synthetic droplet for terpene storage. Simultaneously, squalene biosynthesis enzymes were introduced to chloroplasts together with the droplet-forming protein to co-compartmentalize the biosynthesis and storage of squalene. The strategy has enabled a record yield of squalene at 2.6 mg/g fresh weight without compromising plant growth. Confocal fluorescent microscopy imaging, stimulated Raman scattering microscopy, and droplet composition analysis confirmed the formation of synthetic storage droplet in chloroplast. The co-compartmentation of synthetic storage droplet with a targeted metabolic pathway engineering represents a new strategy for enhancing bioproduct yield.
Stochasticity and determinism in models of hematopoiesis.
Kimmel, Marek
2014-01-01
This chapter represents a novel view of modeling in hematopoiesis, synthesizing both deterministic and stochastic approaches. Whereas the stochastic models work in situations where chance dominates, for example when the number of cells is small, or under random mutations, the deterministic models are more important for large-scale, normal hematopoiesis. New types of models are on the horizon. These models attempt to account for distributed environments such as hematopoietic niches and their impact on dynamics. Mixed effects of such structures and chance events are largely unknown and constitute both a challenge and promise for modeling. Our discussion is presented under the separate headings of deterministic and stochastic modeling; however, the connections between both are frequently mentioned. Four case studies are included to elucidate important examples. We also include a primer of deterministic and stochastic dynamics for the reader's use.
Stochastic Petri Net extension of a yeast cell cycle model.
Mura, Ivan; Csikász-Nagy, Attila
2008-10-21
This paper presents the definition, solution and validation of a stochastic model of the budding yeast cell cycle, based on Stochastic Petri Nets (SPN). A specific family of SPNs is selected for building a stochastic version of a well-established deterministic model. We describe the procedure followed in defining the SPN model from the deterministic ODE model, a procedure that can be largely automated. The validation of the SPN model is conducted with respect to both the results provided by the deterministic one and the experimental results available in the literature. The SPN model captures the behavior of the wild type budding yeast cells and a variety of mutants. We show that the stochastic model matches some characteristics of budding yeast cells that cannot be found with the deterministic model. The SPN model fine-tunes the simulation results, enriching the breadth and the quality of its outcome.
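The ODE-to-stochastic conversion the paper automates can be illustrated on a toy birth-death module, where the stochastic propensities are read directly from the deterministic mass-action rates. This sketch uses a plain Gillespie simulation rather than a Petri-net formalism, and the module and rates are illustrative, not part of the yeast model.

```python
import random

def gillespie_birth_death(k=50.0, g=1.0, T=20.0, seed=0):
    """Gillespie SSA for the module  0 --k--> X,  X --g*x--> 0.
    The corresponding deterministic ODE dx/dt = k - g*x has the
    steady state k/g; the stochastic model fluctuates around it."""
    rng = random.Random(seed)
    t, x = 0.0, 0
    while True:
        a_birth, a_death = k, g * x            # propensities from the
        a_total = a_birth + a_death            # deterministic rate laws
        t += rng.expovariate(a_total)          # time to next reaction
        if t > T:
            return x
        if rng.random() * a_total < a_birth:
            x += 1
        else:
            x -= 1
```

Averaging many runs recovers the deterministic steady state of 50 molecules, while individual runs show the Poisson-like spread (standard deviation ≈ √50) that only the stochastic version can exhibit, the kind of behavior the SPN model adds on top of the ODE model.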
NASA Astrophysics Data System (ADS)
Venkatakrishnan, Vaidehi
1995-01-01
Physical and mathematical models provide a systematic means of looking at biological systems. Radioactive tracer kinetic studies open a unique window to study complex tracee systems such as protein metabolism in humans. This research deals with compartmental modeling of tracer kinetic data on leucine and apolipoprotein metabolism obtained using an endogenous tritiated leucine tracer administered as a bolus, and application of compartmental modeling techniques for dosimetric evaluation of metabolic studies of radioiodinated apolipoproteins. Dr. Waldo R. Fisher, Department of Medicine, was the coordinating research supervisor and the work was carried out in his laboratory. A compartmental model for leucine kinetics in humans has been developed that emphasizes its recycling pathways which were examined over two weeks. This model builds on a previously published model of Cobelli et al, that analyzed leucine kinetic data up to only eight hours. The proposed model includes different routes for re-entry of leucine from protein breakdown into plasma accounting for proteins which turn over at different rates. This new model successfully incorporates published models of three secretory proteins: albumin, apoA-I, and VLDL apoB, in toto thus increasing its validity and utility. The published model of apoA-I, based on an exogenous radioiodinated tracer, was examined with data obtained using an endogenous leucine tracer using compartmental techniques. The analysis concludes that the major portion of apoA-I enters plasma by a fast pathway but the major fraction of apoA-I in plasma resides with a second slow pathway; further the study is suggestive of a precursor-product relationship between the two plasma apoA-I pools. The possible relevance of the latter suggestion to the aberrant kinetics of apoA-I in Tangier disease is discussed. The analysis of apoA-II data resulted in similar conclusions. 
A methodology for evaluating the dosimetry of radioiodinated apolipoproteins by combining kinetic models of iodine and apolipoprotein metabolism has been developed. Residence times for source organs, whole body, thyroid, bladder, and red bone marrow obtained with this analysis, were used to calculate the cumulated activities and thus doses arising from these organs. The influence of the duration of the thyroid blocking period using stable iodine on the dose to the thyroid has been demonstrated.
Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...
2015-03-17
Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.
Nilpotent singularities and dynamics in an SIR type of compartmental model with hospital resources
NASA Astrophysics Data System (ADS)
Shan, Chunhua; Yi, Yingfei; Zhu, Huaiping
2016-03-01
An SIR type of compartmental model with a standard incidence rate and a nonlinear recovery rate was formulated to study the impact of the available resources of the public health system, especially the number of hospital beds. Cusp, focus and elliptic types of nilpotent singularities of codimension 3 are discovered and analyzed in this three-dimensional model. Complex dynamics of disease transmission, including multiple steady states and multi-periodicity, are revealed by bifurcation analysis. Large-amplitude oscillations found in our model provide a more reasonable explanation for disease recurrence. With clinical data, our studies have practical implications for the prevention and control of infectious diseases.
NASA Astrophysics Data System (ADS)
García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.
2018-07-01
In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics, but they often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. This method has been validated over simulated signals and over real signals of economic and biological origin. Real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems, and uncovering interesting patterns present in time series.
Drug delivery optimization through Bayesian networks.
Bellazzi, R.
1992-01-01
This paper describes how Bayesian Networks can be used in combination with compartmental models to plan Recombinant Human Erythropoietin (r-HuEPO) delivery in the treatment of anemia of chronic uremic patients. Past measurements of hematocrit or hemoglobin concentration in a patient during the therapy can be exploited to adjust the parameters of a compartmental model of the erythropoiesis. This adaptive process allows more accurate patient-specific predictions, and hence a more rational dosage planning. We describe a drug delivery optimization protocol, based on our approach. Some results obtained on real data are presented. PMID:1482938
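The adaptive step described above, using past measurements to adjust a patient-specific model parameter, can be sketched as a grid-based Bayesian update of a single slope in a linear dose-response surrogate. The linear model, the parameter range, and all numbers are illustrative assumptions; the paper's actual method couples a Bayesian network with a compartmental erythropoiesis model.

```python
import math

def posterior_mean_slope(doses, hcts, baseline=30.0, noise_sd=1.0):
    """Grid posterior (flat prior, Gaussian measurement noise) over a
    hypothetical hematocrit response slope theta in the surrogate model
    hct ≈ baseline + theta * dose. Returns the posterior mean of theta."""
    grid = [i / 1000.0 for i in range(0, 501)]       # theta in [0, 0.5]
    logpost = [sum(-0.5 * ((h - (baseline + th * d)) / noise_sd) ** 2
                   for d, h in zip(doses, hcts))
               for th in grid]
    top = max(logpost)                               # stabilize the exp
    w = [math.exp(lp - top) for lp in logpost]
    return sum(th * wi for th, wi in zip(grid, w)) / sum(w)
```

Once the posterior over the patient's response parameter has sharpened, dose planning reduces to inverting the predictive model for the target hematocrit, which is the "more rational dosage planning" the abstract refers to.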
Cognitive Diagnostic Analysis Using Hierarchically Structured Skills
ERIC Educational Resources Information Center
Su, Yu-Lan
2013-01-01
This dissertation proposes two modified cognitive diagnostic models (CDMs), the deterministic, inputs, noisy, "and" gate with hierarchy (DINA-H) model and the deterministic, inputs, noisy, "or" gate with hierarchy (DINO-H) model. Both models incorporate the hierarchical structures of the cognitive skills in the model estimation…
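The core ingredients of the DINA-H idea can be sketched in a few lines: the standard DINA item-response function plus a filter that keeps only the skill patterns consistent with a prerequisite hierarchy. The linear three-skill chain below is an invented example; the dissertation's estimation machinery is not reproduced here.

```python
from itertools import product

def dina_p(alpha, q, slip, guess):
    """DINA item response: the examinee answers correctly with probability
    1-slip when they master every skill the item's q-vector requires,
    and with probability guess otherwise."""
    masters_all = all(not req or has for has, req in zip(alpha, q))
    return (1.0 - slip) if masters_all else guess

def consistent_with_hierarchy(alpha, prereq):
    """A skill pattern is admissible under a hierarchy only if every
    mastered skill's prerequisite (prereq[j], or None for a root skill)
    is also mastered."""
    return all(p is None or not a or alpha[p]
               for a, p in zip(alpha, prereq))

# linear hierarchy skill0 -> skill1 -> skill2 shrinks the latent space
chain = [None, 0, 1]
classes = [a for a in product((0, 1), repeat=3)
           if consistent_with_hierarchy(a, chain)]
```

For three skills the hierarchy cuts the 2^3 = 8 latent classes down to 4 admissible ones, which is the structural saving the DINA-H and DINO-H models exploit during estimation (the DINO variant would replace the "and" mastery condition with an "or").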
Multi-Scale Computational Models for Electrical Brain Stimulation
Seo, Hyeon; Jun, Sung C.
2017-01-01
Electrical brain stimulation (EBS) is an appealing method to treat neurological disorders. To achieve optimal stimulation effects and a better understanding of the underlying brain mechanisms, neuroscientists have proposed computational modeling studies for a decade. Recently, multi-scale models that combine a volume conductor head model and multi-compartmental models of cortical neurons have been developed to predict stimulation effects on the macroscopic and microscopic levels more precisely. As the need for better computational models continues to increase, we overview here recent multi-scale modeling studies; we focused on approaches that coupled a simplified or high-resolution volume conductor head model and multi-compartmental models of cortical neurons, and constructed realistic fiber models using diffusion tensor imaging (DTI). Further implications for achieving better precision in estimating cellular responses are discussed. PMID:29123476
Co-Compartmentation of Terpene Biosynthesis and Storage via Synthetic Droplet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Cheng; Kim, YongKyoung; Zeng, Yining
2018-02-13
Traditional bioproduct engineering focuses on pathway optimization, yet is often complicated by product inhibition, downstream consumption, and the toxicity of certain products. Here, we present the co-compartmentation of biosynthesis and storage via a synthetic droplet as an effective new strategy to improve the bioproduct yield, with squalene as a model compound. A hydrophobic protein was designed and introduced into the tobacco chloroplast to generate a synthetic droplet for terpene storage. Simultaneously, squalene biosynthesis enzymes were introduced to chloroplasts together with the droplet-forming protein to co-compartmentalize the biosynthesis and storage of squalene. The strategy has enabled a record yield of squalene at 2.6 mg/g fresh weight without compromising plant growth. Confocal fluorescent microscopy imaging, stimulated Raman scattering microscopy, and droplet composition analysis confirmed the formation of the synthetic storage droplet in the chloroplast. The co-compartmentation of a synthetic storage droplet with targeted metabolic pathway engineering represents a new strategy for enhancing bioproduct yield.
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.
Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M
2016-12-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.
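The deterministic half of such a hybrid scheme is, at its core, a reaction-diffusion PDE integrator. As a minimal illustration only (not VCell's or Smoldyn's actual algorithm; the decay reaction and parameter values are assumptions), a 1-D explicit finite-difference step for du/dt = D·u_xx − k·u with reflecting boundaries can be sketched as:

```python
import numpy as np

def diffuse_react_step(u, D, dx, dt, rate):
    """One explicit Euler step of du/dt = D*u_xx - rate*u on a 1-D grid
    with no-flux (reflecting) boundaries."""
    lap = np.empty_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0] = (u[1] - u[0]) / dx**2      # reflecting left boundary
    lap[-1] = (u[-2] - u[-1]) / dx**2   # reflecting right boundary
    return u + dt * (D * lap - rate * u)

# explicit stability requires dt <= dx**2 / (2*D) for the diffusion term
u = np.zeros(50)
u[25] = 1.0                             # initial point source
for _ in range(1000):
    u = diffuse_react_step(u, D=1.0, dx=0.1, dt=0.004, rate=0.01)
```

In a hybrid solver this deterministic update would be interleaved with a particle-based stochastic step for the species treated discretely.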
A compartmental-spatial system dynamics approach to ground water modeling.
Roach, Jesse; Tidwell, Vince
2009-01-01
High-resolution, spatially distributed ground water flow models can prove unsuitable for the rapid, interactive analysis that is increasingly demanded to support a participatory decision environment. To address this shortcoming, we extend the idea of multiple cell (Bear 1979) and compartmental (Campana and Simpson 1984) ground water models developed within the context of spatial system dynamics (Ahmad and Simonovic 2004) for rapid scenario analysis. We term this approach compartmental-spatial system dynamics (CSSD). The goal is to balance the spatial aggregation necessary to achieve a real-time, integrative, and interactive decision environment while maintaining sufficient model complexity to yield a meaningful representation of the regional ground water system. As a test case, a 51-compartment CSSD model was built and calibrated from a 100,000-cell MODFLOW (McDonald and Harbaugh 1988) model of the Albuquerque Basin in central New Mexico (McAda and Barroll 2002). Seventy-seven percent of historical drawdowns predicted by the MODFLOW model were within 1 m of the corresponding CSSD estimates, and in 80% of the historical model run years the CSSD model estimates of river leakage, reservoir leakage, ground water flow to agricultural drains, and riparian evapotranspiration were within 30% of the corresponding estimates from McAda and Barroll (2002), with improved model agreement during the scenario period. Comparisons of model results demonstrate both advantages and limitations of the CSSD model approach.
ERIC Educational Resources Information Center
Chow, Meyrick; Chan, Lawrence
2010-01-01
Information technology (IT) has the potential to improve the clinical learning environment. The extent to which IT enhances or detracts from healthcare professionals' role performance can be expected to affect both student learning and patient outcomes. This study evaluated nursing students' satisfaction with a novel compartmental Picture…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapin, M.A.; Mahaffie, M.J.; Tiller, G.M.
1996-12-31
Economics of most deep-water development projects require large reservoir volumes to be drained with relatively few wells. The presence of reservoir compartments must therefore be detected and planned for in a pre-development stage. We have used 3-D seismic data to constrain large-scale, deterministic reservoir bodies in a 3-D architecture model of Pliocene-turbidite sands of the "E" or "Pink" reservoir, Prospect Mars, Mississippi Canyon Areas 763 and 807, Gulf of Mexico. Reservoir compartmentalization is influenced by stratigraphic shingling, which in turn is caused by low accommodation space present in the upper portion of a ponded seismic sequence within a salt withdrawal mini-basin. The accumulation is limited by updip onlap onto a condensed section marl, and by lateral truncation by a large-scale submarine erosion surface. Compartments were suggested by RFT pressure variations and by geochemical analysis of RFT fluid samples. A geological interpretation derived from high-resolution 3-D seismic and three wells was linked to 3-D architecture models through seismic inversion, resulting in a reservoir model consistent with all available data. Distinguishing subtle stratigraphic shingles from faults was accomplished by detailed, loop-level mapping, and was important to characterize the different types of reservoir compartments. Seismic inversion was used to detune the seismic amplitude, adjust sandbody thickness, and update the rock properties. Recent development wells confirm the architectural style identified. This modeling project illustrates how high-quality seismic data and architecture models can be combined in a pre-development phase of a prospect, in order to optimize well placement.
Ngonghala, Calistus N; Teboh-Ewungkem, Miranda I; Ngwa, Gideon A
2015-06-01
We derive and study a deterministic compartmental model for malaria transmission with varying human and mosquito populations. Our model considers disease-related deaths, asymptomatic immune humans who are also infectious, as well as mosquito demography, reproduction, and feeding habits. Analysis of the model reveals the existence of a backward bifurcation and persistent limit cycles whose period and size are determined by two threshold parameters: the vectorial basic reproduction number Rm and the disease basic reproduction number R0, whose size can be reduced by reducing Rm. We conclude that malaria dynamics are indeed oscillatory when the mosquito's demography, feeding, and reproductive patterns are explicitly incorporated in modeling the mosquito population dynamics. A sensitivity analysis reveals important control parameters that can affect the magnitudes of Rm and R0, threshold quantities to be taken into consideration when designing control strategies. Both Rm and the intrinsic period of oscillation are shown to be highly sensitive to the mosquito's birth constant λm and the mosquito's feeding success probability pw. Control of λm can be achieved by spraying, eliminating breeding sites, or moving them away from human habitats, while pw can be controlled via the use of mosquito repellent and insecticide-treated bed-nets. The disease threshold parameter R0 is shown to be highly sensitive to pw, and the intrinsic period of oscillation is also sensitive to the rate at which reproducing mosquitoes return to breeding sites. A global sensitivity and uncertainty analysis reveals that the ability of the mosquito to reproduce and uncertainties in the estimations of the rates at which exposed humans become infectious and infectious humans recover from malaria are critical in generating uncertainties in the disease classes.
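The simplest deterministic vector-host skeleton behind models of this kind is the classic Ross-Macdonald system. The sketch below is a generic illustration with assumed parameter values, not the authors' full model (which adds demography, disease-related death, and asymptomatic carriers):

```python
import numpy as np

def ross_macdonald(x, a=0.3, b=0.5, c=0.5, m=2.0, r=0.05, g=0.1):
    """Right-hand side of the classic Ross-Macdonald model.
    Ih, Iv: infected fractions of humans and mosquitoes.
    a: bite rate, b/c: transmission probabilities, m: mosquitoes per
    human, r: human recovery rate, g: mosquito death rate."""
    Ih, Iv = x
    dIh = m * a * b * Iv * (1 - Ih) - r * Ih
    dIv = a * c * Ih * (1 - Iv) - g * Iv
    return np.array([dIh, dIv])

# R0 = m * a**2 * b * c / (r * g) = 9 > 1 here, so the infection
# settles at a positive endemic equilibrium.
x = np.array([0.01, 0.0])
for _ in range(20000):          # forward Euler to t = 200
    x = x + 0.01 * ross_macdonald(x)
```

Sustained oscillations like those in the abstract require the richer mosquito demography the authors add; this two-equation core converges monotonically to its endemic state.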
Weinberg, Seth H.; Smith, Gregory D.
2012-01-01
Cardiac myocyte calcium signaling is often modeled using deterministic ordinary differential equations (ODEs) and mass-action kinetics. However, spatially restricted “domains” associated with calcium influx are small enough (e.g., 10⁻¹⁷ liters) that local signaling may involve 1–100 calcium ions. Is it appropriate to model the dynamics of subspace calcium using deterministic ODEs or, alternatively, do we require stochastic descriptions that account for the fundamentally discrete nature of these local calcium signals? To address this question, we constructed a minimal Markov model of a calcium-regulated calcium channel and associated subspace. We compared the expected value of fluctuating subspace calcium concentration (a result that accounts for the small subspace volume) with the corresponding deterministic model (an approximation that assumes large system size). When subspace calcium did not regulate calcium influx, the deterministic and stochastic descriptions agreed. However, when calcium binding altered channel activity in the model, the continuous deterministic description often deviated significantly from the discrete stochastic model, unless the subspace volume is unrealistically large and/or the kinetics of the calcium binding are sufficiently fast. This principle was also demonstrated using a physiologically realistic model of calmodulin regulation of L-type calcium channels introduced by Yue and coworkers. PMID:23509597
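The back-of-envelope calculation behind the question is simple: the expected ion count is concentration × volume × Avogadro's number, and at a typical 1 µM free calcium a 10⁻¹⁷-liter subspace holds only a handful of ions (the concentration value here is an assumed illustration):

```python
# Expected ion count in a 1e-17 L subspace at 1 uM free calcium
N_A = 6.022e23          # Avogadro's number, 1/mol
conc = 1e-6             # mol/L (assumed resting-to-active range value)
volume = 1e-17          # L
n_ions = conc * volume * N_A
print(round(n_ions, 1))  # ~6 ions, so Poisson-scale fluctuations dominate
```

With relative fluctuations of order 1/√N ≈ 40%, the discrete stochastic description can clearly matter in this regime.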
Documenting Models for Interoperability and Reusability (proceedings)
Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration be...
Documenting Models for Interoperability and Reusability
Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration be...
Zhang, Xinyu; Zhong, Lin; Romero-Severson, Ethan; Alam, Shah Jamal; Henry, Christopher J; Volz, Erik M; Koopman, James S
2012-11-01
A deterministic compartmental model was explored that relaxed the unrealistic assumption in most HIV transmission models that behaviors of individuals are constant over time. A simple model was formulated to better explain the effects observed. Individuals had a high and a low contact rate and went back and forth between them. This episodic risk behavior interacted with the short period of high transmissibility during acute HIV infection to cause dramatic increases in prevalence as the differences between high and low contact rates increased and as the duration of high risk better matched the duration of acute HIV infection. These same changes caused a considerable increase in the fraction of all transmissions that occurred during acute infection. These strong changes occurred despite a constant total number of contacts and a constant total transmission potential from acute infection. Two phenomena played a strong role in generating these effects. First, people were infected more often during their high contact rate phase and they remained with high contact rates during the highly contagious acute infection stage. Second, when individuals with previously low contact rates moved into an episodic high-risk period, they were more likely to be susceptible and thus provided more high contact rate susceptible individuals who could get infected. These phenomena make test and treat control strategies less effective and could cause some behavioral interventions to increase transmission. Signature effects on genetic patterns between HIV strains could make it possible to determine whether these episodic risk effects are acting in a population.
NASA Astrophysics Data System (ADS)
Falk, Martin; Naumova, Natasha; Fudenberg, Geoffrey; Feodorova, Yana; Imakaev, Maxim; Dekker, Job; Solovei, Irina; Mirny, Leonid
The organization of interphase nuclei differs dramatically across cell types in a functionally-relevant fashion. A striking example is found in the rod photoreceptors of nocturnal mammals, where the conventional nuclear organization is inverted. In particular, in murine rods, constitutive heterochromatin is packed into a single chromocenter in the nuclear center, which is encircled by a shell of facultative heterochromatin and then by an outermost shell of euchromatin. Surprisingly, Hi-C maps of conventional and inverted nuclei display remarkably similar compartmentalization between heterochromatin and euchromatin. Here, we simulate a de novo polymer model that is capable of replicating both conventional and inverted geometries while preserving the patterns of compartmentalization as observed by Hi-C. In this model, chromatin is a polymer composed of three classes of monomers arranged in blocks representing constitutive heterochromatin, facultative heterochromatin, and euchromatin. Different classes of monomers have different levels of attraction to each other and to the nuclear lamina. Our results indicate that preferential interactions between facultative heterochromatin and constitutive heterochromatin provide a possible mechanism to explain nuclear inversion when association with the lamina is lost.
Modelling the transmission of healthcare associated infections: a systematic review
2013-01-01
Background Dynamic transmission models are increasingly being used to improve our understanding of the epidemiology of healthcare-associated infections (HCAI). However, there has been no recent comprehensive review of this emerging field. This paper summarises how mathematical models have informed the field of HCAI and how methods have developed over time. Methods MEDLINE, EMBASE, Scopus, CINAHL plus and Global Health databases were systematically searched for dynamic mathematical models of HCAI transmission and/or the dynamics of antimicrobial resistance in healthcare settings. Results In total, 96 papers met the eligibility criteria. The main research themes considered were evaluation of infection control effectiveness (64%), variability in transmission routes (7%), the impact of movement patterns between healthcare institutes (5%), the development of antimicrobial resistance (3%), and strain competitiveness or co-colonisation with different strains (3%). Methicillin-resistant Staphylococcus aureus was the most commonly modelled HCAI (34%), followed by vancomycin resistant enterococci (16%). Other common HCAIs, e.g. Clostridium difficile, were rarely investigated (3%). Very few models have been published on HCAI from low or middle-income countries. The first HCAI models looked at antimicrobial resistance in hospital settings using deterministic compartmental approaches. Stochastic models (which include the role of chance in the transmission process) are becoming increasingly common. Model calibration (inference of unknown parameters by fitting models to data) and sensitivity analysis are comparatively uncommon, occurring in 35% and 36% of studies respectively, but their application is increasing. Only 5% of models compared their predictions to external data. Conclusions Transmission models have been used to understand complex systems and to predict the impact of control policies.
Methods have generally improved, with an increased use of stochastic models, and more advanced methods for formal model fitting and sensitivity analyses. Insights gained from these models could be broadened to a wider range of pathogens and settings. Improvements in the availability of data and statistical methods could enhance the predictive ability of models. PMID:23809195
Simons, Margaret; Saha, Rajib; Amiour, Nardjis; Kumar, Akhil; Guillard, Lenaïg; Clément, Gilles; Miquel, Martine; Li, Zhenni; Mouille, Gregory; Lea, Peter J.; Hirel, Bertrand; Maranas, Costas D.
2014-01-01
Maize (Zea mays) is an important C4 plant due to its widespread use as a cereal and energy crop. A second-generation genome-scale metabolic model for the maize leaf was created to capture C4 carbon fixation and investigate nitrogen (N) assimilation by modeling the interactions between the bundle sheath and mesophyll cells. The model contains gene-protein-reaction relationships, elemental and charge-balanced reactions, and incorporates experimental evidence pertaining to the biomass composition, compartmentalization, and flux constraints. Condition-specific biomass descriptions were introduced that account for amino acids, fatty acids, soluble sugars, proteins, chlorophyll, lignocellulose, and nucleic acids as experimentally measured biomass constituents. Compartmentalization of the model is based on proteomic/transcriptomic data and literature evidence. With the incorporation of information from the MetaCrop and MaizeCyc databases, this updated model spans 5,824 genes, 8,525 reactions, and 9,153 metabolites, an increase of approximately 4 times the size of the earlier iRS1563 model. Transcriptomic and proteomic data have also been used to introduce regulatory constraints in the model to simulate an N-limited condition and mutants deficient in glutamine synthetase, gln1-3 and gln1-4. Model-predicted results achieved 90% accuracy when comparing the wild type grown under an N-complete condition with the wild type grown under an N-deficient condition. PMID:25248718
Hybrid deterministic/stochastic simulation of complex biochemical systems.
Lecca, Paola; Bagagiolo, Fabio; Scarpa, Marina
2017-11-21
In a biological cell, cellular functions and the genetic regulatory apparatus are implemented and controlled by complex networks of chemical reactions involving genes, proteins, and enzymes. Accurate computational models are indispensable means for understanding the mechanisms behind the evolution of a complex system, not always explored with wet lab experiments. To serve their purpose, computational models, however, should be able to describe and simulate the complexity of a biological system in many of its aspects. Moreover, they should be implemented with efficient algorithms requiring the shortest possible execution time, to avoid excessively lengthening the time elapsing between data analysis and any subsequent experiment. Besides the features of their topological structure, the complexity of biological networks also refers to their dynamics, which is often non-linear and stiff. The stiffness is due to the presence of molecular species whose abundance fluctuates by many orders of magnitude. A fully stochastic simulation of a stiff system is computationally time-expensive. On the other hand, continuous models are less costly, but they fail to capture the stochastic behaviour of small populations of molecular species. We introduce a new efficient hybrid stochastic-deterministic computational model and the software tool MoBioS (MOlecular Biology Simulator) implementing it. The mathematical model of MoBioS uses continuous differential equations to describe the deterministic reactions and a Gillespie-like algorithm to describe the stochastic ones. Unlike the majority of current hybrid methods, the MoBioS algorithm divides the reactions' set into fast reactions, moderate reactions, and slow reactions and implements a hysteresis switching between the stochastic model and the deterministic model. Fast reactions are approximated as continuous-deterministic processes and modelled by deterministic rate equations.
Moderate reactions are those whose reaction waiting time is greater than the fast reaction waiting time but smaller than the slow reaction waiting time. A moderate reaction is approximated as a stochastic (deterministic) process if it was classified as a stochastic (deterministic) process at the time at which it crosses the threshold of low (high) waiting time. A Gillespie First Reaction Method is implemented to select and execute the slow reactions. The performance of MoBioS was tested on a typical example of hybrid dynamics, DNA transcription regulation. The simulated dynamic profile of the reagents' abundance and the estimate of the error introduced by the fully deterministic approach were used to evaluate the consistency of the computational model and that of the software tool.
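Gillespie's First Reaction Method, which the abstract applies to the slow reactions, draws a tentative exponential firing time for each reaction and executes the earliest. The sketch below is a generic illustration on a birth-death process (not the MoBioS implementation; rate constants are assumed):

```python
import math
import random

def first_reaction_step(state, rates, stoich):
    """One step of Gillespie's First Reaction Method: draw a tentative
    firing time ~ Exp(a_j) for each reaction j, fire the earliest."""
    times = []
    for rate in rates:
        a = rate(state)
        # 1 - random() lies in (0, 1], avoiding log(0)
        t = math.inf if a == 0 else -math.log(1.0 - random.random()) / a
        times.append(t)
    j = min(range(len(times)), key=times.__getitem__)
    return state + stoich[j], times[j]

# birth-death process: 0 -> X at rate k; X -> 0 at rate d*X
k, d = 10.0, 0.1
rates = [lambda x: k, lambda x: d * x]
stoich = [+1, -1]
x, t = 0, 0.0
while t < 100.0:
    x, dt = first_reaction_step(x, rates, stoich)
    t += dt
# the stationary copy number is Poisson with mean k/d = 100
```

In a hybrid scheme, the deterministic rate equations would be advanced between these stochastic firings.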
Deterministic models for traffic jams
NASA Astrophysics Data System (ADS)
Nagel, Kai; Herrmann, Hans J.
1993-10-01
We study several deterministic one-dimensional traffic models. For integer positions and velocities we find the typical high- and low-density phases separated by a simple transition. If positions and velocities are continuous variables, the model shows self-organized criticality driven by the slowest car.
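A deterministic lattice traffic model of this kind updates every car in parallel: accelerate by one unit, brake down to the headway, then move. The exact rule below (essentially a Nagel-Schreckenberg update with the random slowdown removed) is an illustrative assumption, not necessarily the paper's precise model:

```python
def step(pos, vel, vmax=5, length=100):
    """One parallel update of a deterministic single-lane traffic
    cellular automaton on a ring: accelerate, brake to gap, move."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_pos, new_vel = list(pos), list(vel)
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % length  # empty cells ahead
        v = min(vel[i] + 1, gap, vmax)
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % length
    return new_pos, new_vel

# low density: cars accelerate to vmax and stay in free flow
pos, vel = [0, 10, 20, 30], [0, 0, 0, 0]
for _ in range(50):
    pos, vel = step(pos, vel)
```

At low density every car reaches vmax (free flow); packing more cars onto the ring forces gaps below vmax and produces the jammed phase.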
Abrams, Robert H.; Loague, Keith
2000-01-01
This paper, the first of two parts [see Abrams and Loague, this issue], takes the compartmentalized approach for the geochemical evolution of redox zones presented by Abrams et al. [1998] and embeds it within a solute transport framework. In this paper the compartmentalized approach is generalized to facilitate the description of its incorporation into a solute transport simulator. An equivalent formulation is developed which removes any discontinuities that may occur when switching compartments. Rate‐limited redox reactions are modeled with a modified Monod relationship that allows either the organic substrate or the electron acceptor to be the rate‐limiting reactant. Thermodynamic constraints are used to inhibit lower‐energy redox reactions from occurring under infeasible geochemical conditions without imposing equilibrium on the lower‐energy reactions. The procedure used allows any redox reaction to be simulated as being kinetically limited or thermodynamically limited, depending on local geochemical conditions. Empirical reaction inhibition methods are not needed. The sequential iteration approach (SIA), a technique which allows the number of solute transport equations to be reduced, is adopted to solve the coupled geochemical/solute transport problem. When the compartmentalized approach is embedded within the SIA, with the total analytical concentration of each component as the dependent variable in the transport equation, it is possible to reduce the number of transport equations even further than with the unmodified SIA. A one‐dimensional, coupled geochemical/solute transport simulation is presented in which redox zones evolve dynamically in time and space. The compartmentalized solute transport (COMPTRAN) model described in this paper enables the development of redox zones to be simulated under both kinetic and thermodynamic constraints. The modular design of COMPTRAN facilitates the use of many different, preexisting solute transport and geochemical codes. 
The companion paper [Abrams and Loague, this issue] presents examples of the application of COMPTRAN to field‐scale problems.
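One common form of a rate law in which either the substrate or the electron acceptor can limit the reaction is the dual-Monod expression, where the rate is the product of two saturation terms. This is a generic sketch (parameter names assumed; the modified relationship of Abrams et al. may differ in detail):

```python
def dual_monod_rate(k_max, biomass, S, Ks, A, Ka):
    """Dual-Monod rate law: the redox reaction slows when either the
    organic substrate S or the electron acceptor A becomes scarce.
    Ks, Ka are the respective half-saturation constants."""
    return k_max * biomass * (S / (Ks + S)) * (A / (Ka + A))

# with abundant substrate, the electron-acceptor term controls the rate
r_high = dual_monod_rate(1.0, 2.0, S=10.0, Ks=0.1, A=5.0, Ka=0.5)
r_low = dual_monod_rate(1.0, 2.0, S=10.0, Ks=0.1, A=0.05, Ka=0.5)
```

Because each factor saturates at 1, the expression reduces to single-Monod kinetics in whichever reactant is abundant, which is what lets one formula cover both limiting regimes.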
Model Hierarchies in Edge-Based Compartmental Modeling for Infectious Disease Spread
Miller, Joel C.; Volz, Erik M.
2012-01-01
We consider the family of edge-based compartmental models for epidemic spread developed in [11]. These models allow for a range of complex behaviors, and in particular allow us to explicitly incorporate duration of a contact into our mathematical models. Our focus here is to identify conditions under which simpler models may be substituted for more detailed models, and in so doing we define a hierarchy of epidemic models. In particular we provide conditions under which it is appropriate to use the standard mass action SIR model, and we show what happens when these conditions fail. Using our hierarchy, we provide a procedure leading to the choice of the appropriate model for a given population. Our result about the convergence of models to the Mass Action model gives clear, rigorous conditions under which the Mass Action model is accurate. PMID:22911242
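The standard mass-action SIR model at the bottom of such a hierarchy takes only a few lines to write down and integrate (a generic sketch with assumed parameter values):

```python
import numpy as np

def sir_rhs(y, beta=0.3, gamma=0.1):
    """Mass-action SIR in population fractions:
    S' = -beta*S*I,  I' = beta*S*I - gamma*I,  R' = gamma*I."""
    S, I, R = y
    return np.array([-beta * S * I, beta * S * I - gamma * I, gamma * I])

# R0 = beta/gamma = 3: a large outbreak whose final size satisfies
# R_inf = 1 - exp(-R0 * R_inf), i.e. about 94% of the population
y = np.array([0.999, 0.001, 0.0])
for _ in range(20000):          # forward Euler to t = 1000
    y = y + 0.05 * sir_rhs(y)
```

The edge-based models in the paper reduce to exactly this system when contact durations are negligible and mixing is homogeneous, which is the substitution condition the authors formalize.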
Alam, Shah Jamal; Zhang, Xinyu; Romero-Severson, Ethan Obie; Henry, Christopher; Zhong, Lin; Volz, Erik M.; Brenner, Bluma G.; Koopman, James S.
2013-01-01
Episodic high-risk sexual behavior is common and can have a profound effect on HIV transmission. In a model of HIV transmission among men who have sex with men (MSM), changing the frequency, duration and contact rates of high-risk episodes can take endemic prevalence from zero to 50% and more than double transmissions during acute HIV infection (AHI). Undirected test and treat could be inefficient in the presence of strong episodic risk effects. Partner services approaches that use a variety of control options will be likely to have better effects under these conditions, but the question remains: What data will reveal if a population is experiencing episodic risk effects? HIV sequence data from Montreal reveals genetic clusters whose size distribution stabilizes over time and reflects the size distribution of acute infection outbreaks (AIOs). Surveillance provides complementary behavioral data. In order to use both types of data efficiently, it is essential to examine aspects of models that affect both the episodic risk effects and the shape of transmission trees. As a demonstration, we use a deterministic compartmental model of episodic risk to explore the determinants of the fraction of transmissions during acute HIV infection (AHI) at the endemic equilibrium. We use a corresponding individual-based model to observe AIO size distributions and patterns of transmission within AIO. Episodic risk parameters determining whether AHI transmission trees had longer chains, more clustered transmissions from single individuals, or different mixes of these were explored. Encouragingly for parameter estimation, AIO size distributions reflected the frequency of transmissions from acute infection across divergent parameter sets. Our results show that episodic risk dynamics influence both the size and duration of acute infection outbreaks, thus providing a possible link between genetic cluster size distributions and episodic risk dynamics. PMID:23438430
Population profiling in China by gender and age: implication for HIV incidences.
Pan, Yuanyi; Wu, Jianhong
2009-11-18
With the world's largest population, HIV spread in China has been closely watched and widely studied by its government and the international community. One important factor that might contribute to the epidemic is China's large surplus of men, due to its imbalanced sex ratio in newborns. However, the sex ratio in the human population is often assumed to be 1:1 in most studies of sexually transmitted diseases (STDs). Here, a mathematical model is proposed to estimate the population size in each gender and within different stages of reproduction and sexual activities. This population profiling by age and gender will assist in more precise prediction of HIV incidences. The total population is divided into 6 subgroups by gender and age. A deterministic compartmental model is developed to describe birth, death, age and the interactions among different subgroups, with a focus on the preference for newborn boys and its impact on the sex ratios. Data from 2003 to 2007 is used to estimate model parameters, and simulations predict short-term and long-term population profiles. China's population will enter a declining trajectory around 2030. Despite the possibly underestimated number of newborns in the last couple of years, model-based simulations show that there will be about 28 million male individuals in 2055 without female partners during their sexually active stages. The birth rate in China must be increased to keep the population viable. But increasing the birth rate without balancing the sex ratio in newborns is problematic, as this will generate a large number of surplus males. Besides other social, economic and psychological issues, the impact of this surplus of males on STD incidences, including HIV infections, must be dealt with as early as possible.
Estimation of pharmacokinetic parameters from non-compartmental variables using Microsoft Excel.
Dansirikul, Chantaratsamon; Choi, Malcolm; Duffull, Stephen B
2005-06-01
This study was conducted to develop a method, termed 'back analysis (BA)', for converting non-compartmental variables to compartment-model-dependent pharmacokinetic parameters for both one- and two-compartment models. A Microsoft Excel spreadsheet was implemented with the use of Solver and Visual Basic functions. The performance of the BA method in estimating pharmacokinetic parameter values was evaluated by comparing the parameter values obtained with those from a standard modelling software program, NONMEM, using simulated data. The results show that the BA method was reasonably precise and provided low bias in estimating fixed and random effect parameters for both one- and two-compartment models. The pharmacokinetic parameters estimated by the BA method were similar to those of the NONMEM estimation.
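The simplest instance of such a back-calculation can be sketched as follows. This is a minimal one-compartment IV-bolus illustration of the general idea; the function and variable names are assumptions for this sketch, not the paper's spreadsheet layout, and the two-compartment case requires fitting additional exponential terms:

```python
import math

def one_compartment_from_nca(dose, auc, t_half):
    """Back-calculate one-compartment parameters from
    non-compartmental variables after an IV bolus dose.
    dose in mg, auc in mg*h/L, t_half in h (illustrative units)."""
    ke = math.log(2) / t_half   # elimination rate constant (1/h)
    cl = dose / auc             # clearance: CL = Dose / AUC (L/h)
    v = cl / ke                 # volume of distribution (L)
    return {"ke": ke, "CL": cl, "V": v}

# Example: 100 mg dose, AUC = 50 mg*h/L, terminal half-life = 4 h
params = one_compartment_from_nca(dose=100.0, auc=50.0, t_half=4.0)
```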
Self-Concept Structure and the Quality of Self-Knowledge.
Showers, Carolin J; Ditzfeld, Christopher P; Zeigler-Hill, Virgil
2015-10-01
This article explores the hidden vulnerability of individuals with compartmentalized self-concept structures by linking research on self-organization to related models of self-functioning. Across three studies, college students completed self-descriptive card sorts as a measure of self-concept structure and either the Contingencies of Self-Worth Scale, Likert ratings of perceived authenticity of self-aspects, or a response latency measure of self-esteem accessibility. In all, there were 382 participants (247 females; 77% White, 6% Hispanic, 5% Black, 5% Asian, 4% Native American, and 3% other). Consistent with their unstable self-evaluations, compartmentalized individuals report greater contingencies of self-worth and describe their experience of multiple self-aspects as less authentic than do individuals with integrative self-organization. Compartmentalized individuals also make global self-evaluations more slowly than do integrative individuals. Together with previous findings on self-clarity, these results suggest that compartmentalized individuals may experience difficulties in how they know the self, whereas individuals with integrative self-organization may display greater continuity and evaluative consistency across self-aspects, with easier access to evaluative self-knowledge. © 2014 Wiley Periodicals, Inc.
Realistic Simulation for Body Area and Body-To-Body Networks
Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D’Errico, Raffaele
2016-01-01
In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed, which is further characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The full deterministic approach is particularly targeted to enhance IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. It is also important to add that the communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is developed for BAN and BBN, and the IEEE 802.15.6 compliance standard is provided as a benchmark for future research works of the community. Finally, the two approaches are compared in terms of the successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option; however, for the diversity of the mobility patterns and scenarios applicable, biomechanical modeling and the deterministic approach are better choices. PMID:27104537
Korostil, Igor A; Peters, Gareth W; Law, Matthew G; Regan, David G
2013-04-08
Deterministic dynamic compartmental transmission models (DDCTMs) of human papillomavirus (HPV) transmission have been used in a number of studies to estimate the potential impact of HPV vaccination programs. In most cases, the models were built under the assumption that an individual who cleared HPV infection develops (life-long) natural immunity against re-infection with the same HPV type (known as the SIR scenario). This assumption was also made by two Australian modelling studies evaluating the impact of the National HPV Vaccination Program to assist in the health-economic assessment of male vaccination. An alternative view denying natural immunity after clearance (the SIS scenario) was only presented in one study, although neither scenario has been supported by strong evidence. Some recent findings, however, provide arguments in favour of SIS. We developed HPV transmission models implementing life-long (SIR), limited, and non-existent (SIS) natural immunity. For each model we estimated the herd immunity effect of the ongoing Australian HPV vaccination program and its extension to cover males. Given the Australian setting, we aimed to clarify the extent to which the choice of model structure would influence estimation of this effect. A statistically robust and efficient calibration methodology was applied to ensure credibility of our results. We observed that for non-SIR models the herd immunity effect measured in relative reductions in HPV prevalence in the unvaccinated population was much more pronounced than for the SIR model. For example, with vaccine efficacy of 95% for females and 90% for males, the reductions for HPV-16 were 3% in females and 28% in males for the SIR model, and at least 30% (females) and 60% (males) for non-SIR models. The magnitude of these differences implies that evaluations of the impact of vaccination programs using DDCTMs should incorporate several model structures until our understanding of natural immunity is improved.
Copyright © 2013 Elsevier Ltd. All rights reserved.
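The structural difference between the SIR and SIS scenarios discussed in the abstract above can be illustrated with a toy simulation. The parameter values and forward-Euler scheme below are arbitrary assumptions for illustration, not calibrated HPV estimates:

```python
def simulate(model, beta=0.3, gamma=0.1, mu=0.01, days=20000, dt=0.1):
    """Forward-Euler integration of toy SIS and SIR-with-demography
    models; parameters are illustrative, not calibrated to HPV data."""
    s, i, r = 0.99, 0.01, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i
        if model == "SIS":
            ds = gamma * i - new_inf   # recovered return to susceptible
            di = new_inf - gamma * i
            dr = 0.0
        else:  # SIR with births/deaths so an endemic state can exist
            ds = mu - new_inf - mu * s
            di = new_inf - (gamma + mu) * i
            dr = gamma * i - mu * r
        s += ds * dt
        i += di * dt
        r += dr * dt
    return i  # endemic infected fraction

sis_prev = simulate("SIS")
sir_prev = simulate("SIR")
```

With these rates the SIS model settles near the classic endemic fraction 1 - gamma/beta, while the SIR variant settles far lower, mirroring the paper's point that the immunity assumption strongly affects predicted prevalence.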
Branovets, Jelena; Sepp, Mervi; Kotlyarova, Svetlana; Jepihhina, Natalja; Sokolova, Niina; Aksentijevic, Dunja; Lygate, Craig A.; Neubauer, Stefan; Birkedal, Rikke
2013-01-01
Disruption of the creatine kinase (CK) system in hearts of CK-deficient mice leads to changes in the ultrastructure and regulation of mitochondrial respiration. We expected to see similar changes in creatine-deficient mice, which lack the enzyme guanidinoacetate methyltransferase (GAMT) to produce creatine. The aim of this study was to characterize the changes in cardiomyocyte mitochondrial organization, regulation of respiration, and intracellular compartmentation associated with GAMT deficiency. Three-dimensional mitochondrial organization was assessed by confocal microscopy. On populations of permeabilized cardiomyocytes, we recorded ADP and ATP kinetics of respiration, competition between mitochondria and pyruvate kinase for ADP produced by ATPases, ADP kinetics of endogenous pyruvate kinase, and ATP kinetics of ATPases. These data were analyzed by mathematical models to estimate intracellular compartmentation. Quantitative analysis of morphological and kinetic data as well as derived model fits showed no difference between GAMT-deficient and wild-type mice. We conclude that inactivation of the CK system by GAMT deficiency does not alter mitochondrial organization and intracellular compartmentation in relaxed cardiomyocytes. Thus, our results suggest that the healthy heart is able to preserve cardiac function at a basal level in the absence of CK-facilitated energy transfer without compromising intracellular organization and the regulation of mitochondrial energy homeostasis. This raises questions on the importance of the CK system as a spatial energy buffer in unstressed cardiomyocytes. PMID:23792673
The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing daily activities of eating, drinking, showering, and bathing. There were 250 time histories generated, by subject with activities, for the three exposure ro...
Individual-based modelling and control of bovine brucellosis
NASA Astrophysics Data System (ADS)
Nepomuceno, Erivelton G.; Barbosa, Alípio M.; Silva, Marcos X.; Perc, Matjaž
2018-05-01
We present a theoretical approach to control bovine brucellosis. We have used individual-based modelling, which is a network-type alternative to compartmental models. Our model thus considers heterogeneous populations, and spatial aspects such as migration among herds and control actions described as pulse interventions are also easily implemented. We show that individual-based modelling reproduces the mean field behaviour of an equivalent compartmental model. Details of this process, as well as flowcharts, are provided to facilitate the reproduction of the presented results. We further investigate three numerical examples using real parameters of herds in the São Paulo state of Brazil, in scenarios which explore eradication, continuous and pulsed vaccination and meta-population effects. The obtained results are in good agreement with the expected behaviour of this disease, which ultimately showcases the effectiveness of our theory.
Using stochastic models to incorporate spatial and temporal variability [Exercise 14
Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke
2003-01-01
To this point, our analysis of population processes and viability in the western prairie fringed orchid has used only deterministic models. In this exercise, we conduct a similar analysis, using a stochastic model instead. This distinction is of great importance to population biology in general and to conservation biology in particular. In deterministic models,...
Stochastic and deterministic models for agricultural production networks.
Bai, P; Banks, H T; Dediu, S; Govan, A Y; Last, M; Lloyd, A L; Nguyen, H K; Olufsen, M S; Rempala, G; Slenning, B D
2007-07-01
An approach to modeling the impact of disturbances in an agricultural production network is presented. A stochastic model and its approximate deterministic model for averages over sample paths of the stochastic system are developed. Simulations, sensitivity and generalized sensitivity analyses are given. Finally, it is shown how diseases may be introduced into the network and corresponding simulations are discussed.
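The relationship between a stochastic model and its deterministic approximation for sample-path averages can be demonstrated with a minimal birth-death process. The toy system below (constant production, linear loss) is an assumption for illustration, not the paper's agricultural production network:

```python
import math
import random

def ctmc_mean(a, b, t_end, runs=500, seed=7):
    """Average sample path (at t_end) of a linear birth-death process:
    production at rate a, per-capita loss at rate b. For linear rates
    the stochastic mean obeys the deterministic ODE dX/dt = a - b*X."""
    rng = random.Random(seed)
    total = 0
    for _ in range(runs):
        x, t = 0, 0.0
        while True:
            rate = a + b * x                # total event rate
            t += rng.expovariate(rate)      # time to next event
            if t > t_end:
                break
            if rng.random() < a / rate:
                x += 1                      # production event
            else:
                x -= 1                      # loss event
        total += x
    return total / runs

a, b, t_end = 10.0, 1.0, 8.0
ode_value = (a / b) * (1.0 - math.exp(-b * t_end))  # deterministic solution
mc_value = ctmc_mean(a, b, t_end)
```

Averaging a few hundred stochastic sample paths recovers the deterministic trajectory, which is the sense in which the approximate deterministic model describes "averages over sample paths."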
Mechanisms underlying subunit independence in pyramidal neuron dendrites
Behabadi, Bardia F.; Mel, Bartlett W.
2014-01-01
Pyramidal neuron (PN) dendrites compartmentalize voltage signals and can generate local spikes, which has led to the proposal that their dendrites act as independent computational subunits within a multilayered processing scheme. However, when a PN is strongly activated, back-propagating action potentials (bAPs) sweeping outward from the soma synchronize dendritic membrane potentials many times per second. How PN dendrites maintain the independence of their voltage-dependent computations, despite these repeated voltage resets, remains unknown. Using a detailed compartmental model of a layer 5 PN, and an improved method for quantifying subunit independence that incorporates a more accurate model of dendritic integration, we first established that the output of each dendrite can be almost perfectly predicted by the intensity and spatial configuration of its own synaptic inputs, and is nearly invariant to the rate of bAP-mediated “cross-talk” from other dendrites over a 100-fold range. Then, through an analysis of conductance, voltage, and current waveforms within the model cell, we identify three biophysical mechanisms that together help make independent dendritic computation possible in a firing neuron, suggesting that a major subtype of neocortical neuron has been optimized for layered, compartmentalized processing under in-vivo–like spiking conditions. PMID:24357611
Characterizing Uncertainty and Variability in PBPK Models ...
Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state-of-the-science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretic and practical methodological impro
Estimating the epidemic threshold on networks by deterministic connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Kezan, E-mail: lkzzr@sohu.com; Zhu, Guanghu; Fu, Xinchu
2014-12-15
For many epidemic networks some connections between nodes are treated as deterministic, while the remainder are random and have different connection probabilities. By applying spectral analysis to several constructed models, we find that one can estimate the epidemic thresholds of these networks by investigating information from only the deterministic connections. Nonetheless, in these models, generic nonuniform stochastic connections and heterogeneous community structure are also considered. The estimation of epidemic thresholds is achieved via inequalities with upper and lower bounds, which are found to be in very good agreement with numerical simulations. Since these deterministic connections are easier to detect than those stochastic connections, this work provides a feasible and effective method to estimate the epidemic thresholds in real epidemic networks.
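The spectral estimate mentioned above can be sketched on a small deterministic contact graph: for an SIS-type process on a network, the epidemic threshold is approximately gamma / lambda_max, where lambda_max is the spectral radius of the adjacency matrix. The 4-node ring below is an illustrative stand-in, not one of the paper's constructed models:

```python
def spectral_radius(adj, iters=500):
    """Power iteration for the largest eigenvalue of a
    symmetric nonnegative adjacency matrix."""
    n = len(adj)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return lam

# Deterministic part of a 4-node contact network (a ring, degree 2);
# the SIS epidemic threshold is approximately gamma / lambda_max.
ring = [[0, 1, 0, 1],
        [1, 0, 1, 0],
        [0, 1, 0, 1],
        [1, 0, 1, 0]]
lam_max = spectral_radius(ring)
gamma = 1.0
beta_c = gamma / lam_max
```

For a ring every node has degree 2, so lambda_max = 2 and the threshold transmission rate is gamma/2; adding stochastic connections on top of this deterministic backbone shifts the true threshold within bounds of this estimate.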
Effect of sample volume on metastable zone width and induction time
NASA Astrophysics Data System (ADS)
Kubota, Noriaki
2012-04-01
The metastable zone width (MSZW) and the induction time, measured for a large sample (say, >0.1 L), are reproducible and deterministic, while for a small sample (say, <1 mL) these values are irreproducible and stochastic. Such behaviors of MSZW and induction time were theoretically discussed with both stochastic and deterministic models. Equations for the distribution of stochastic MSZW and induction time were derived. The average values of stochastic MSZW and induction time both decreased with an increase in sample volume, while the deterministic MSZW and induction time remained unchanged. Such different behaviors with variation in sample volume were explained in terms of the detection sensitivity of crystallization events. The average values of MSZW and induction time in the stochastic model were compared with the deterministic MSZW and induction time, respectively. Literature data reported for paracetamol aqueous solution were explained theoretically with the presented models.
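A common single-nucleation sketch of this volume effect treats nucleation as a Poisson process, so the probability that at least one nucleus has formed by time t is P(t) = 1 - exp(-J*V*t). The rate and volumes below are illustrative assumptions, not the paper's fitted paracetamol values:

```python
import math

def median_induction_time(nucleation_rate, volume):
    """Median stochastic induction time when detection requires a
    single nucleus: P(t) = 1 - exp(-J*V*t), so t_med = ln 2 / (J*V).
    nucleation_rate J in nuclei/(m^3*s), volume V in m^3."""
    return math.log(2) / (nucleation_rate * volume)

small_sample = median_induction_time(nucleation_rate=1e6, volume=1e-6)  # 1 mL
large_sample = median_induction_time(nucleation_rate=1e6, volume=1e-4)  # 100 mL
```

The median induction time shrinks in inverse proportion to sample volume, matching the qualitative trend that the averages of the stochastic quantities decrease as volume grows while their deterministic counterparts stay fixed.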
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case where they were equal. The failed region patterns between models are similar; however, there are differences that arise due to stress reduced from element elimination that cause probabilistic failed regions to continue to rise after no deterministic failed region would be predicted. 
Both the probabilistic and deterministic methods indicate similar trends with regards to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...
2015-06-30
The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class supercomputer.
Zanzonico, Pat; Carrasquillo, Jorge A; Pandit-Taskar, Neeta; O'Donoghue, Joseph A; Humm, John L; Smith-Jones, Peter; Ruan, Shutian; Divgi, Chaitanya; Scott, Andrew M; Kemeny, Nancy E; Fong, Yuman; Wong, Douglas; Scheinberg, David; Ritter, Gerd; Jungbluth, Achem; Old, Lloyd J; Larson, Steven M
2015-10-01
The molecular specificity of monoclonal antibodies (mAbs) directed against tumor antigens has proven effective for targeted therapy of human cancers, as shown by a growing list of successful antibody-based drug products. We describe a novel, nonlinear compartmental model using PET-derived data to determine the "best-fit" parameters and model-derived quantities for optimizing biodistribution of intravenously injected (124)I-labeled antitumor antibodies. As an example of this paradigm, quantitative image and kinetic analyses of anti-A33 humanized mAb (also known as "A33") were performed in 11 colorectal cancer patients. Serial whole-body PET scans of (124)I-labeled A33 and blood samples were acquired and the resulting tissue time-activity data for each patient were fit to a nonlinear compartmental model using the SAAM II computer code. Excellent agreement was observed between fitted and measured parameters of tumor uptake, "off-target" uptake in bowel mucosa, blood clearance, tumor antigen levels, and percent antigen occupancy. This approach should be generally applicable to antibody-antigen systems in human tumors for which the masses of antigen-expressing tumor and of normal tissues can be estimated and for which antibody kinetics can be measured with PET. Ultimately, based on each patient's resulting "best-fit" nonlinear model, a patient-specific optimum mAb dose (in micromoles, for example) may be derived.
Rotenone and paraquat perturb dopamine metabolism: a computational analysis of pesticide toxicity
Qi, Zhen; Miller, Gary W.; Voit, Eberhard O.
2014-01-01
Pesticides, such as rotenone and paraquat, are suspected in the pathogenesis of Parkinson’s disease (PD), whose hallmark is the progressive loss of dopaminergic neurons in the substantia nigra pars compacta. Thus, compounds expected to play a role in the pathogenesis of PD will likely impact the function of dopaminergic neurons. To explore the relationship between pesticide exposure and dopaminergic toxicity, we developed a custom-tailored mathematical model of dopamine metabolism and utilized it to infer potential mechanisms underlying the toxicity of rotenone and paraquat, asking how these pesticides perturb specific processes. We performed two types of analyses, which are conceptually different and complement each other. The first analysis, a purely algebraic reverse engineering approach, analytically and deterministically computes the altered profile of enzyme activities that characterize the effects of a pesticide. The second method consists of large-scale Monte Carlo simulations that statistically reveal possible mechanisms of pesticides. The results from the reverse engineering approach show that rotenone and paraquat exposures lead to distinctly different flux perturbations. Rotenone seems to affect all fluxes associated with dopamine compartmentalization, whereas paraquat exposure perturbs fluxes associated with dopamine and its breakdown metabolites. The statistical results of the Monte-Carlo analysis suggest several specific mechanisms. The findings are interesting, because no a priori assumptions are made regarding specific pesticide actions, and all parameters characterizing the processes in the dopamine model are treated in an unbiased manner. Our results show how approaches from computational systems biology can help identify mechanisms underlying the toxicity of pesticide exposure. PMID:24269752
Where to deploy pre-exposure prophylaxis (PrEP) in sub-Saharan Africa?
Verguet, Stéphane; Stalcup, Meg; Walsh, Julia A
2013-12-01
Two randomised controlled trials showed that pre-exposure prophylaxis (PrEP) reduces HIV transmission between heterosexual men and women. We model the potential impact on transmission and cost-effectiveness of providing PrEP in sub-Saharan Africa. We use a deterministic, compartmental model of HIV transmission to evaluate the potential of a 5-year PrEP intervention targeting the adult population of 42 sub-Saharan African countries. We examine the incremental impact of adding PrEP at pre-existing levels of male circumcision and antiretroviral therapy (ART). The base case assumes efficacy of 68%; adherence at 80%; country coverage at 10% of the HIV-uninfected adult population; and annual costs of PrEP and ART at US$200 and US$880 per person, respectively. After 5 years, 390,000 HIV infections (95% UR 190,000 to 630,000) would be prevented, 24% of these in South Africa. HIV infections averted per 100 000 people (adult) would range from 500 in Lesotho to 10 in Somalia. Incremental cost-effectiveness would be US$5800/disability-adjusted life year (DALY) (95% UR 3100 to 13500). Cost-effectiveness would range from US$500/DALY in Lesotho to US$44 600/DALY in Eritrea. In a general adult population, PrEP is a high-cost intervention which will have maximum impact and be cost-effective only in countries that have high levels of HIV burden and low levels of male circumcision in the population. Hence, PrEP will likely be most effective in Southern Africa as a targeted intervention added to existing strategies to control the HIV pandemic.
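The headline cost-effectiveness quantity in evaluations like the one above is the incremental cost-effectiveness ratio (ICER): incremental cost divided by DALYs averted. The numbers below are hypothetical round figures chosen for illustration, not the paper's country-level estimates:

```python
def icer(added_cost_usd, dalys_averted):
    """Incremental cost-effectiveness ratio, in US$ per DALY averted."""
    return added_cost_usd / dalys_averted

# Hypothetical program: US$200/person/year of PrEP for 1,000,000 people
# over 5 years, averting 10,000 infections at ~17 DALYs averted each
# (all numbers illustrative).
total_cost = 200.0 * 1_000_000 * 5
ratio = icer(total_cost, 10_000 * 17.0)
```

The ratio scales inversely with infections averted, which is why the paper finds PrEP cost-effective only where HIV burden is high.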
Plipat, Nottasorn; Spicknall, Ian H; Koopman, James S; Eisenberg, Joseph Ns
2013-12-17
Methicillin-resistant Staphylococcus aureus (MRSA) is a major cause of healthcare-associated infections. An important control strategy is hand hygiene; however, non-compliance has been a major problem in healthcare settings. Furthermore, modeling studies have suggested that the law of diminishing returns applies to hand hygiene. Other additional control strategies such as environmental cleaning may be warranted, given that MRSA-positive individuals constantly shed contaminated desquamated skin particles into the environment. We constructed and analyzed a deterministic environmental compartmental model of MRSA fate, transport, and exposure between two hypothetical hospital rooms: one with a colonized patient, shedding MRSA; another with an uncolonized patient, susceptible to exposure. Healthcare workers (HCWs), acting solely as vectors, spread MRSA from one patient room to the other. Although porous surfaces became highly contaminated, their low transfer efficiency limited the exposure dose to HCWs and the uncolonized patient. Conversely, the high transfer efficiency of nonporous surfaces allows greater MRSA transfer when touched. In the colonized patient's room, HCW exposure occurred predominantly through the indirect (patient to surfaces to HCW) mode rather than the direct (patient to HCW) mode. In contrast, in the uncolonized patient's room, patient exposure occurred predominantly through the direct (HCW to patient) mode rather than the indirect (HCW to surfaces to patient) mode. Surface wiping decreased MRSA exposure to the uncolonized patient more than daily surface decontamination did. This was because wiping allowed a higher cleaning frequency and cleaned more total surface area per day. Environmental cleaning should be considered an integral component of MRSA infection control in hospitals.
Given the previously under-appreciated role of surface contamination in MRSA transmission, this intervention mode can contribute to an effective multiple barrier approach in concert with hand hygiene.
Deterministic and stochastic models for middle east respiratory syndrome (MERS)
NASA Astrophysics Data System (ADS)
Suryani, Dessy Rizki; Zevika, Mona; Nuraini, Nuning
2018-03-01
World Health Organization (WHO) data state that since September 2012 there have been 1,733 cases of Middle East Respiratory Syndrome (MERS), with 628 deaths, occurring in 27 countries. MERS was first identified in Saudi Arabia in 2012, and the largest MERS outbreak outside Saudi Arabia occurred in South Korea in 2015. MERS is a disease that attacks the respiratory system and is caused by infection with MERS-CoV. MERS-CoV transmission occurs directly, through contact between an infected individual and a non-infected individual, or indirectly, through objects contaminated by free virus. It is suspected that MERS can spread quickly because of free virus in the environment. Mathematical modeling is used to illustrate the transmission of MERS using a deterministic model and a stochastic model. The deterministic model is used to investigate the temporal dynamics of the system and analyze the steady-state condition. The stochastic model, a Continuous Time Markov Chain (CTMC) approach, is used to predict future states using random variables. From the models that were built, the threshold values for the deterministic and stochastic models are obtained in the same form, and the probability of disease extinction can be computed with the stochastic model. Simulations of both models using several different parameter sets are shown, and the probability of disease extinction is compared across several initial conditions.
Fausett, Sarah R; Brunet, Lisa J; Klingensmith, John
2014-07-01
Esophageal atresia with tracheoesophageal fistula (EA/TEF) is a serious human birth defect, in which the esophagus ends before reaching the stomach, and is aberrantly connected with the trachea. Several mouse models of EA/TEF have recently demonstrated that proper dorsal/ventral (D/V) patterning of the primitive anterior foregut endoderm is essential for correct compartmentalization of the trachea and esophagus. Here we elucidate the pathogenic mechanisms underlying the EA/TEF that occurs in mice lacking the BMP antagonist Noggin, which display correct dorsal/ventral patterning. To clarify the mechanism of this malformation, we use spatiotemporal manipulation of Noggin and BMP receptor 1A conditional alleles during foregut development. Surprisingly, we find that the expression of Noggin in the compartmentalizing endoderm is not required to generate distinct tracheal and esophageal tubes. Instead, we show that Noggin and BMP signaling attenuation are required in the early notochord to correctly resolve notochord cells from the dorsal foregut endoderm, which in turn, appears to be a prerequisite for foregut compartmentalization. Collectively, our findings support an emerging model for a mechanism underlying EA/TEF in which impaired notochord resolution from the early endoderm causes the foregut to be hypo-cellular just prior to the critical period of compartmentalization. Our further characterizations suggest that Noggin may regulate a cell rearrangement process that involves reciprocal E-cadherin and Zeb1 expression in the resolving notochord cells. Copyright © 2014. Published by Elsevier Inc.
Aspen succession in the Intermountain West: A deterministic model
Dale L. Bartos; Frederick R. Ward; George S. Innis
1983-01-01
A deterministic model of succession in aspen forests was developed using existing data and intuition. The degree of uncertainty, which was determined by allowing the parameter values to vary at random within limits, was larger than desired. This report presents results of an analysis of model sensitivity to changes in parameter values. These results have indicated...
Rewiring and regulation of cross-compartmentalized metabolism in protists
Ginger, Michael L.; McFadden, Geoffrey I.; Michels, Paul A. M.
2010-01-01
Plastid acquisition, endosymbiotic associations, lateral gene transfer, organelle degeneracy or even organelle loss influence metabolic capabilities in many different protists. Thus, metabolic diversity is sculpted through the gain of new metabolic functions and moderation or loss of pathways that are often essential in the majority of eukaryotes. What is perhaps less apparent to the casual observer is that the sub-compartmentalization of ubiquitous pathways has been repeatedly remodelled during eukaryotic evolution, and the textbook pictures of intermediary metabolism established for animals, yeast and plants are not conserved in many protists. Moreover, metabolic remodelling can strongly influence the regulatory mechanisms that control carbon flux through the major metabolic pathways. Here, we provide an overview of how core metabolism has been reorganized in various unicellular eukaryotes, focusing in particular on one near universal catabolic pathway (glycolysis) and one ancient anabolic pathway (isoprenoid biosynthesis). For the example of isoprenoid biosynthesis, the compartmentalization of this process in protists often appears to have been influenced by plastid acquisition and loss, whereas for glycolysis several unexpected modes of compartmentalization have emerged. Significantly, the example of trypanosomatid glycolysis illustrates nicely how mathematical modelling and systems biology can be used to uncover or understand novel modes of pathway regulation. PMID:20124348
Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow
NASA Astrophysics Data System (ADS)
Gupta, Atma Ram; Kumar, Ashwani
2017-12-01
Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and on temporal factors such as the season of the year, the day of the week, or the time of day. For deterministic radial distribution load flow studies the load is taken as constant, but load varies continually and with a high degree of uncertainty, so there is a need to model a probable realistic load. Monte Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and by solving a deterministic radial load flow with these values. The probabilistic solution is reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: (i) finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; (ii) finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and (iii) comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
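The Monte Carlo procedure described here can be sketched on a toy two-bus radial feeder: sample active and reactive load from their mean and standard deviation, solve a deterministic load flow for each sample, and reconstruct the voltage statistics. The fixed-point solver and per-unit line parameters below are illustrative stand-ins, not the authors' test system.

```python
import numpy as np

rng = np.random.default_rng(0)

def radial_load_flow(P, Q, R=0.05, X=0.10):
    """Toy deterministic load flow for a two-bus radial feeder
    (slack bus at 1.0 pu): fixed-point iteration on the receiving-end
    voltage magnitude using the approximate drop (R*P + X*Q)/V."""
    V = 1.0
    for _ in range(50):
        V = 1.0 - (R * P + X * Q) / V
    return V

# Probable realistic load: normally distributed P, Q (per unit)
mean_P, std_P = 0.30, 0.03
mean_Q, std_Q = 0.10, 0.01
samples = [radial_load_flow(rng.normal(mean_P, std_P),
                            rng.normal(mean_Q, std_Q))
           for _ in range(5000)]
V_mean, V_std = float(np.mean(samples)), float(np.std(samples))
```

The resulting distribution of receiving-end voltages is what the probabilistic study reports in place of the single deterministic value.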
Santermans, Eva; Robesyn, Emmanuel; Ganyani, Tapiwa; Sudre, Bertrand; Faes, Christel; Quinten, Chantal; Van Bortel, Wim; Haber, Tom; Kovac, Thomas; Van Reeth, Frank; Testa, Marco; Hens, Niel; Plachouras, Diamantis
2016-01-01
The Ebola outbreak in West Africa has infected at least 27,443 individuals and killed 11,207, based on data until 24 June 2015 released by the World Health Organization (WHO). This outbreak has been characterised by extensive geographic spread across the affected countries Guinea, Liberia and Sierra Leone, and by localised hotspots within these countries. The rapid recognition and quantitative assessment of localised areas of higher transmission can inform the optimal deployment of public health resources. A variety of mathematical models have been used to estimate the evolution of this epidemic, and some have pointed out the importance of the spatial heterogeneity apparent from incidence maps. However, little is known about district-level transmission. Given that many response decisions are taken at the sub-national level, the current study aimed to investigate the spatial heterogeneity by using a different modelling framework, built on publicly available data at district level. Furthermore, we assessed whether this model could quantify the effect of intervention measures and provide predictions at a local level to guide public health action. We used a two-stage modelling approach: a) a flexible spatiotemporal growth model across all affected districts and b) a deterministic SEIR compartmental model per district whenever deemed appropriate. Our estimates show substantial differences in the evolution of the outbreak in the various regions of Guinea, Liberia and Sierra Leone, illustrating the importance of monitoring the outbreak at district level. We also provide an estimate of the time-dependent district-specific effective reproduction number, as a quantitative measure to compare transmission between districts and as input for informed decisions on control measures and resource allocation. Predicting the outbreak and assessing the impact of control measures proved to be difficult without more accurate data. 
In conclusion, this study provides a useful tool at district level for public health, and illustrates the importance of collecting and sharing data.
The Transcriptional Regulator CBP Has Defined Spatial Associations within Interphase Nuclei
McManus, Kirk J; Stephens, David A; Adams, Niall M; Islam, Suhail A; Freemont, Paul S; Hendzel, Michael J
2006-01-01
It is becoming increasingly clear that nuclear macromolecules and macromolecular complexes are compartmentalized through binding interactions into an apparent three-dimensionally ordered structure. This ordering, however, does not appear to be deterministic to the extent that chromatin and nonchromatin structures maintain a strict 3-D arrangement. Rather, spatial ordering within the cell nucleus appears to conform to stochastic rather than deterministic spatial relationships. The stochastic nature of organization becomes particularly problematic when any attempt is made to describe the spatial relationship between proteins involved in the regulation of the genome. The CREB–binding protein (CBP) is one such transcriptional regulator that, when visualised by confocal microscopy, reveals a highly punctate staining pattern comprising several hundred individual foci distributed within the nuclear volume. Markers for euchromatic sequences have similar patterns. Surprisingly, in most cases, the predicted one-to-one relationship between transcription factor and chromatin sequence is not observed. Consequently, to understand whether spatial relationships that are not coincident are nonrandom and potentially biologically important, it is necessary to develop statistical approaches. In this study, we report on the development of such an approach and apply it to understanding the role of CBP in mediating chromatin modification and transcriptional regulation. We have used nearest-neighbor distance measurements and probability analyses to study the spatial relationship between CBP and other nuclear subcompartments enriched in transcription factors, chromatin, and splicing factors. Our results demonstrate that CBP has an order of spatial association with other nuclear subcompartments. We observe closer associations between CBP and RNA polymerase II–enriched foci and SC35 speckles than nascent RNA or specific acetylated histones. 
Furthermore, we find that CBP has a significantly higher probability of being close to its known in vivo substrate histone H4 lysine 5 compared with the closely related H4 lysine 12. This study demonstrates that complex relationships not described by colocalization exist in the interphase nucleus and can be characterized and quantified. The subnuclear distribution of CBP is difficult to reconcile with a model where chromatin organization is the sole determinant of the nuclear organization of proteins that regulate transcription but is consistent with a close link between spatial associations and nuclear functions. PMID:17054391
Dosimetric assessment from 212Pb inhalation at a thorium purification plant.
Campos, M P; Pecequilo, B R S
2004-01-01
At the Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, Brazil, there is a facility (a thorium purification plant) where materials with high thorium concentrations are manipulated. In order to subsequently estimate the lung cancer risk for the workers, the thoron daughter (212Pb) levels were assessed, along with the committed effective dose and lung committed equivalent dose, for workers on site. A total of 28 air filter samples were measured by total alpha counting using the modified Kusnetz method to determine the 212Pb concentration. The committed effective dose and lung committed equivalent dose due to 212Pb inhalation were derived from compartmental analysis following the ICRP 66 lung compartmental model and the ICRP 67 lead metabolic model.
Neuromuscular junction in a microfluidic device.
Park, Hyun Sung; Liu, Su; McDonald, John; Thakor, Nitish; Yang, In Hong
2013-01-01
Malfunction at the site of the neuromuscular junction (NMJ) following injury or disease is a major barrier to recovery of function. The ability to efficiently derive motor neurons (MNs) from embryonic stem cells has shown promise toward the development of new therapies for increasing functional outcomes post injury. Recent advances in micro-technologies have provided advanced culture platforms allowing compartmentalization of sub-cellular components of neurons. In this study, we combined these advances in science and technology to develop a compartmentalized in vitro NMJ model. The developed NMJ system forms between mouse embryonic stem cell (mESC)-derived MNs and c2c12 myotubes cultured in a compartmentalized polydimethylsiloxane (PDMS) microfluidic device. While some functional in vitro NMJ systems have been reported, this system further contributes to research on NMJ-related diseases by providing a means to study the site of action of the NMJ, aimed at promoting better functional recovery.
The MATCHIT Automaton: Exploiting Compartmentalization for the Synthesis of Branched Polymers
Weyland, Mathias S.; Fellermann, Harold; Hadorn, Maik; Sorek, Daniel; Lancet, Doron; Rasmussen, Steen; Füchslin, Rudolf M.
2013-01-01
We propose an automaton, a theoretical framework that demonstrates how to improve the yield of reactions synthesizing branched chemical polymers. This is achieved by separating substeps of the synthesis path into compartments. We use chemical containers (chemtainers) to carry the substances through a sequence of fixed successive compartments. We describe the automaton in mathematical terms and show how it can be configured automatically in order to synthesize a given branched polymer target. The algorithm we present finds an optimal path of synthesis in linear time. We discuss how the automaton models compartmentalized structures found in cells, such as the endoplasmic reticulum and the Golgi apparatus, and we show how this compartmentalization can be exploited for the synthesis of branched polymers such as oligosaccharides. Lastly, we show examples of artificial branched polymers and discuss how the automaton can be configured to synthesize them with maximal yield. PMID:24489601
Price-Dynamics of Shares and Bohmian Mechanics: Deterministic or Stochastic Model?
NASA Astrophysics Data System (ADS)
Choustova, Olga
2007-02-01
We apply the mathematical formalism of Bohmian mechanics to describe the dynamics of share prices. The main distinguishing feature of the financial Bohmian model is the possibility of taking market psychology into account by describing the expectations of traders with the pilot wave. We also discuss some objections (coming from the conventional financial mathematics of stochastic processes) against the deterministic Bohmian model: in particular, the objection that such a model contradicts the efficient market hypothesis, which is the cornerstone of modern market ideology. Another objection is of a purely mathematical nature: it is related to the quadratic variation of price trajectories. One possible reply to this critique is to consider the stochastic Bohm-Vigier model instead of the deterministic one. We do this in the present note.
Koldsø, Heidi; Reddy, Tyler; Fowler, Philip W; Duncan, Anna L; Sansom, Mark S P
2016-09-01
The cytoskeleton underlying cell membranes may influence the dynamic organization of proteins and lipids within the bilayer by immobilizing certain transmembrane (TM) proteins and forming corrals within the membrane. Here, we present coarse-grained resolution simulations of a biologically realistic membrane model of asymmetrically organized lipids and TM proteins. We determine the effects of a model of cytoskeletal immobilization of selected membrane proteins using long time scale coarse-grained molecular dynamics simulations. By introducing compartments with varying degrees of restraints within the membrane models, we are able to reveal how compartmentalization caused by cytoskeletal immobilization leads to reduced and anomalous diffusional mobility of both proteins and lipids. This in turn results in a reduced rate of protein dimerization within the membrane and of hopping of membrane proteins between compartments. These simulations provide a molecular realization of hierarchical models often invoked to explain single-molecule imaging studies of membrane proteins.
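The hop-diffusion picture described above, where cytoskeletal corrals reduce mobility and inter-compartment hopping, can be caricatured with a one-dimensional random walk in which crossings of evenly spaced barriers succeed only with a small hop probability. The corral size and hop probability below are arbitrary illustration values, not parameters from the coarse-grained simulations.

```python
import random

random.seed(0)

def corral_walk(steps, corral=20, hop_prob=1.0):
    """1-D random walk with evenly spaced 'cytoskeletal' barriers:
    a step that would cross a barrier succeeds only with hop_prob."""
    x = 0
    for _ in range(steps):
        step = random.choice((-1, 1))
        crosses = (x // corral) != ((x + step) // corral)
        if not crosses or random.random() < hop_prob:
            x += step
    return x

def msd(hop_prob, walkers=500, steps=1000):
    """Mean squared displacement over an ensemble of walkers."""
    return sum(corral_walk(steps, hop_prob=hop_prob) ** 2
               for _ in range(walkers)) / walkers

msd_free = msd(hop_prob=1.0)     # no barriers: ordinary diffusion
msd_corral = msd(hop_prob=0.01)  # strong corralling: hindered diffusion
```

Comparing the two mean squared displacements reproduces, in cartoon form, the reduced long-range mobility that the membrane simulations attribute to compartmentalization.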
Fractional kinetics of compartmental systems: first approach with use digraph-based method
NASA Astrophysics Data System (ADS)
Markowski, Konrad Andrzej
2017-08-01
In the last two decades, integral and differential calculus of fractional order has become a subject of great interest in different areas of physics, biology, economics and other sciences. The idea of such a generalization was mentioned in 1695 by Leibniz and L'Hospital. The first definition of the fractional derivative was introduced by Liouville and Riemann at the end of the 19th century. Fractional calculus has been found to be a very useful tool for modelling the behaviour of many materials and systems. In this paper, fractional calculus is applied to a pharmacokinetic compartmental model. For the introduced model, all possible quasi-positive realisations are determined based on one-dimensional digraph theory. The proposed method is discussed and illustrated in detail with some numerical examples.
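A minimal numerical sketch of fractional pharmacokinetics is the one-compartment elimination model D^alpha C = -k C discretized with Grünwald-Letnikov weights. This is an illustrative scheme only (it glosses over the Riemann-Liouville versus Caputo initial-condition subtlety) and is not the realisation method of the paper; the order alpha and rate k below are arbitrary.

```python
def gl_fractional_decay(alpha, k, C0, h, n_steps):
    """Gruenwald-Letnikov scheme for the fractional one-compartment
    elimination model D^alpha C = -k C; alpha = 1 recovers implicit
    Euler for classical exponential decay."""
    # Weights w_j = (-1)^j * binom(alpha, j), via the standard recurrence
    w = [1.0]
    for j in range(1, n_steps + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / j))
    C = [float(C0)]
    ha = h ** alpha
    for n in range(1, n_steps + 1):
        # Memory term: the fractional derivative weighs the full history
        history = sum(w[j] * C[n - j] for j in range(1, n + 1))
        C.append(-history / (1.0 + k * ha))
    return C

# Hypothetical sub-exponential elimination with order alpha = 0.8
C = gl_fractional_decay(alpha=0.8, k=0.1, C0=1.0, h=0.1, n_steps=200)
```

Setting alpha = 1 makes all weights beyond w_1 vanish, so the scheme collapses to the classical first-order kinetics, a convenient consistency check.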
ADP Compartmentation Analysis Reveals Coupling between Pyruvate Kinase and ATPases in Heart Muscle
Sepp, Mervi; Vendelin, Marko; Vija, Heiki; Birkedal, Rikke
2010-01-01
Abstract Cardiomyocytes have intracellular diffusion restrictions, which spatially compartmentalize ADP and ATP. However, the models that predict diffusion restrictions have used data sets generated in rat heart permeabilized fibers, where diffusion distances may be heterogeneous. This is avoided by using isolated, permeabilized cardiomyocytes. The aim of this work was to analyze the intracellular diffusion of ATP and ADP in rat permeabilized cardiomyocytes. To do this, we measured respiration rate, ATPase rate, and ADP concentration in the surrounding solution. The data were analyzed using mathematical models that reflect different levels of cell compartmentalization. In agreement with previous studies, we found significant diffusion restriction by the mitochondrial outer membrane and confirmed a functional coupling between mitochondria and a fraction of ATPases in the cell. In addition, our experimental data show that considerable activity of endogenous pyruvate kinase (PK) remains in the cardiomyocytes after permeabilization. A fraction of ATPases were inactive without ATP feedback by this endogenous PK. When analyzing the data, we were able to reproduce the measurements only with the mathematical models that include a tight coupling between the fraction of endogenous PK and ATPases. To our knowledge, this is the first time such a strong coupling of PK to ATPases has been demonstrated in permeabilized cardiomyocytes. PMID:20550890
Modeling the hepatitis A epidemiological transition in Brazil and Mexico
Van Effelterre, Thierry; Guignard, Adrienne; Marano, Cinzia; Rojas, Rosalba; Jacobsen, Kathryn H.
2017-01-01
Background: Many low- to middle-income countries have completed or are in the process of transitioning from high or intermediate to low endemicity for hepatitis A virus (HAV). Because the risk of severe hepatitis A disease increases with age at infection, decreased incidence that leaves older children and adults susceptible to HAV infection may actually increase the population-level burden of disease from HAV. Mathematical models can be helpful for projecting future epidemiological profiles for HAV. Methods: An age-specific deterministic, dynamic compartmental transmission model with stratification by setting (rural versus urban) was calibrated with country-specific data on demography, urbanization, and seroprevalence of anti-HAV antibodies. HAV transmission was modeled as a function of setting-specific access to safe water. The model was then used to project various HAV-related epidemiological outcomes in Brazil and in Mexico from 1950 to 2050. Results: The projected epidemiological outcomes were qualitatively similar in the 2 countries. The age at the midpoint of population immunity (AMPI) increased considerably and the mean age of symptomatic HAV cases shifted from childhood to early adulthood. The projected overall incidence rate of HAV infections decreased by about two thirds as safe water access improved. However, the incidence rate of symptomatic HAV infections remained roughly the same over the projection period. The incidence rates of HAV infections (all and symptomatic alone) were projected to become similar in rural and urban settings in the next decades. Conclusion: This model featuring population age structure, urbanization and access to safe water as key contributors to the epidemiological transition for HAV was previously validated with data from Thailand and fits equally well with data from Latin American countries. 
Assuming no introduction of a vaccination program over the projection period, both Brazil and Mexico were projected to experience a continued decrease in HAV incidence rates without any substantial decrease in the incidence rates of symptomatic HAV infections. PMID:28481680
Tularosa Basin Play Fairway Analysis Data and Models
Nash, Greg
2017-07-11
This submission includes raster datasets for each layer of evidence used for weights of evidence analysis as well as the deterministic play fairway analysis (PFA). Data representative of heat, permeability and groundwater comprises some of the raster datasets. Additionally, the final deterministic PFA model is provided along with a certainty model. All of these datasets are best used with an ArcGIS software package, specifically Spatial Data Modeler.
Compartmental model of 18F-choline
NASA Astrophysics Data System (ADS)
Janzen, T.; Tavola, F.; Giussani, A.; Cantone, M. C.; Uusijärvi, H.; Mattsson, S.; Zankl, M.; Petoussi-Henß, N.; Hoeschen, C.
2010-03-01
The MADEIRA Project (Minimizing Activity and Dose with Enhanced Image quality by Radiopharmaceutical Administrations) aims to improve the efficacy and safety of 3D functional imaging by optimizing, among other things, knowledge of the temporal variation of a radiopharmaceutical's uptake in, and clearance from, tumor and healthy tissues. With the help of compartmental modeling, the intent is to optimize the time schedule for data collection and improve the evaluation of the organ doses to patients. Administration of 18F-choline to screen for recurrence or the occurrence of metastases in prostate cancer patients is one of the diagnostic applications under consideration in the frame of the project. PET and CT images were acquired up to four hours after injection of 18F-choline. Additionally, blood and urine samples were collected and measured in a gamma counter. The radioactivity concentration in different organs and the data on plasma clearance and elimination into urine were used to set up a compartmental model of the biokinetics of the radiopharmaceutical. It features a central compartment (blood) exchanging with the organs. The structure explicitly describes the liver, kidneys, spleen, plasma and bladder as separate units, with a forcing-function approach. The model is presented together with an evaluation of the individual and population kinetic parameters, and a revised time schedule for data collection is proposed. This optimized time schedule will be validated in a further set of patient studies.
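A stripped-down version of such a biokinetic model is a linear system of mass balances between a central blood compartment and peripheral organs. The sketch below uses a simple bolus initial condition in blood rather than the paper's forcing-function approach, and the transfer rates and compartment list are invented for illustration, not fitted 18F-choline parameters.

```python
import numpy as np

# Hypothetical transfer rates (1/min) between compartments;
# bladder is the only absorbing (elimination) compartment here.
k_blood_liver, k_liver_blood = 0.08, 0.02
k_blood_kidney, k_kidney_bladder = 0.05, 0.03

def simulate_biokinetics(q0, t_end, dt=0.01):
    """Forward-Euler integration of linear first-order exchange:
    q = [blood, liver, kidney, bladder] activity fractions."""
    q = np.array(q0, dtype=float)
    for _ in range(int(t_end / dt)):
        dq = np.array([
            -(k_blood_liver + k_blood_kidney) * q[0] + k_liver_blood * q[1],
            k_blood_liver * q[0] - k_liver_blood * q[1],
            k_blood_kidney * q[0] - k_kidney_bladder * q[2],
            k_kidney_bladder * q[2],
        ])
        q += dt * dq
    return q

q = simulate_biokinetics([1.0, 0.0, 0.0, 0.0], t_end=240.0)  # 4 h bolus
```

Because every outflow from one compartment appears as an inflow to another, total activity (ignoring radioactive decay) is conserved, and at late times the absorbing bladder compartment dominates.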
A Stochastic Tick-Borne Disease Model: Exploring the Probability of Pathogen Persistence.
Maliyoni, Milliward; Chirove, Faraimunashe; Gaff, Holly D; Govinder, Keshlan S
2017-09-01
We formulate and analyse a stochastic epidemic model for the transmission dynamics of a tick-borne disease in a single population using a continuous-time Markov chain approach. The stochastic model is based on an existing deterministic metapopulation tick-borne disease model. We compare the disease dynamics of the deterministic and stochastic models in order to determine the effect of randomness in tick-borne disease dynamics. The probability of disease extinction and that of a major outbreak are computed and approximated using the multitype Galton-Watson branching process and numerical simulations, respectively. Analytical and numerical results show some significant differences in model predictions between the stochastic and deterministic models. In particular, we find that a disease outbreak is more likely if the disease is introduced by infected deer as opposed to infected ticks. These insights demonstrate the importance of host movement in the expansion of tick-borne diseases into new geographic areas.
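The extinction probability of a branching process is the smallest fixed point of the offspring probability-generating function, found by simple iteration from zero. The sketch below is a single-type simplification of the paper's multitype Galton-Watson process, with hypothetical Poisson offspring means standing in for introduction by infected deer versus infected ticks.

```python
import math

def extinction_probability(pgf, tol=1e-12, max_iter=100000):
    """Smallest root of q = pgf(q) on [0, 1], via fixed-point
    iteration from q0 = 0 (monotone convergence for any pgf)."""
    q = 0.0
    for _ in range(max_iter):
        q_next = pgf(q)
        if abs(q_next - q) < tol:
            return q_next
        q = q_next
    return q

# Hypothetical Poisson offspring means for the two introduction routes:
# pgf(s) = exp(m * (s - 1)); a larger mean m means a likelier outbreak.
q_deer = extinction_probability(lambda s: math.exp(1.8 * (s - 1.0)))
q_tick = extinction_probability(lambda s: math.exp(1.2 * (s - 1.0)))
```

With these placeholder means, introduction by the host with more secondary infections (here the "deer" route) gives a lower extinction probability, mirroring the qualitative finding that outbreaks are more likely when the disease is introduced by infected deer.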
O'Sullivan, Finbarr; Muzi, Mark; Spence, Alexander M; Mankoff, David M; O'Sullivan, Janet N; Fitzgerald, Niall; Newman, George C; Krohn, Kenneth A
2009-06-01
Kinetic analysis is used to extract metabolic information from dynamic positron emission tomography (PET) uptake data. The theory of indicator dilutions, developed in the seminal work of Meier and Zierler (1954), provides a probabilistic framework for representation of PET tracer uptake data in terms of a convolution between an arterial input function and a tissue residue. The residue is a scaled survival function associated with tracer residence in the tissue. Nonparametric inference for the residue, a deconvolution problem, provides a novel approach to kinetic analysis, critically one that is not reliant on specific compartmental modeling assumptions. A practical computational technique based on regularized cubic B-spline approximation of the residence time distribution is proposed. Nonparametric residue analysis allows formal statistical evaluation of specific parametric models to be considered. This analysis needs to properly account for the increased flexibility of the nonparametric estimator. The methodology is illustrated using data from a series of cerebral studies with PET and fluorodeoxyglucose (FDG) in normal subjects. Comparisons are made between key functionals of the residue (tracer flux, flow, etc.) resulting from a parametric analysis (the standard two-compartment model of Phelps et al. 1979) and a nonparametric analysis. Strong statistical evidence against the compartment model is found. Primarily these differences relate to the representation of the early temporal structure of the tracer residence, largely a function of the vascular supply network. There are convincing physiological arguments against the representations implied by the compartmental approach, but this is the first time that a rigorous statistical confirmation using PET data has been reported. The compartmental analysis produces suspect values for flow but, notably, the impact on the metabolic flux, though statistically significant, is limited to deviations on the order of 3%-4%. 
The general advantage of the nonparametric residue analysis is the ability to provide a valid kinetic quantitation in the context of studies where there may be heterogeneity or other uncertainty about the accuracy of a compartmental model approximation of the tissue residue.
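The deconvolution idea can be sketched with ordinary ridge-regularized least squares standing in for the paper's regularized cubic B-spline machinery: discretize the convolution of the arterial input with the residue as a lower-triangular matrix and penalize curvature of the recovered residue. The arterial input, true residue, and regularization weight below are synthetic and purely illustrative.

```python
import numpy as np

def conv_matrix(input_fn, dt):
    """Lower-triangular matrix implementing C = dt * (input * R)."""
    n = len(input_fn)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1):
            A[i, j] = input_fn[i - j] * dt
    return A

def deconvolve_residue(input_fn, tissue, dt, lam=1e-4):
    """Ridge-regularized least-squares estimate of the tissue residue,
    with a second-difference (curvature) smoothness penalty; a crude
    stand-in for a regularized B-spline estimator."""
    n = len(tissue)
    A = conv_matrix(input_fn, dt)
    D = np.diff(np.eye(n), n=2, axis=0)  # discrete second differences
    return np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ tissue)

# Synthetic check: recover a known mono-exponential residue
dt = 0.5
t = np.arange(50) * dt
arterial = np.exp(-0.2 * t)                  # hypothetical input function
R_true = np.exp(-0.1 * t)                    # true residue (survival curve)
tissue = conv_matrix(arterial, dt) @ R_true  # noise-free uptake data
R_est = deconvolve_residue(arterial, tissue, dt)
```

On noise-free data the estimate recovers the survival-curve shape of the residue without assuming any compartmental structure, which is the core appeal of the nonparametric approach.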
Abrams , Robert H.; Loague, Keith; Kent, Douglas B.
1998-01-01
The work reported here is the first part of a larger effort focused on efficient numerical simulation of redox zone development in contaminated aquifers. The sequential use of various electron acceptors, which is governed by the energy yield of each reaction, gives rise to redox zones. The large difference in energy yields between the various redox reactions leads to systems of equations that are extremely ill-conditioned. These equations are very difficult to solve, especially in the context of coupled fluid flow, solute transport, and geochemical simulations. We have developed a general, rational method to solve such systems where we focus on the dominant reactions, compartmentalizing them in a manner that is analogous to the redox zones that are often observed in the field. The compartmentalized approach allows us to easily solve a complex geochemical system as a function of time and energy yield, laying the foundation for our ongoing work in which we couple the reaction network, for the development of redox zones, to a model of subsurface fluid flow and solute transport. Our method (1) solves the numerical system without invoking a redox parameter, (2) improves the numerical stability of redox systems by choosing which compartment, and thus which reaction network, to use based upon the concentration ratios of key constituents, (3) simulates the development of redox zones as a function of time without the use of inhibition factors or switching functions, and (4) can reduce the number of transport equations that need to be solved in space and time. We show through the use of various model performance evaluation statistics that the appropriate compartment choice under different geochemical conditions leads to numerical solutions without significant error. The compartmentalized approach described here facilitates the next phase of this effort, in which we couple the redox zone reaction network to models of fluid flow and solute transport.
Mongiat, Lucas Alberto; Schwarzacher, Stephan Wolfgang
2017-01-01
Compartmental models are the theoretical tool of choice for understanding single-neuron computations. However, many models are incomplete, built ad hoc, and require tuning for each novel condition, rendering them of limited usability. Here, we present T2N, a powerful interface for controlling NEURON with Matlab and the TREES toolbox, which supports generating models that are stable over a broad range of reconstructed and synthetic morphologies. We illustrate this with a novel, highly detailed active model of dentate granule cells (GCs) replicating a wide palette of experiments from various labs. By implementing known differences in ion channel composition and morphology, our model reproduces data from mouse or rat, mature or adult-born GCs, as well as pharmacological interventions and epileptic conditions. This work sets a new benchmark for detailed compartmental modeling. T2N is suitable for creating robust models useful for large-scale networks that could lead to novel predictions. We discuss possible T2N applications in degeneracy studies. PMID:29165247
Combining Deterministic structures and stochastic heterogeneity for transport modeling
NASA Astrophysics Data System (ADS)
Zech, Alraune; Attinger, Sabine; Dietrich, Peter; Teutsch, Georg
2017-04-01
Contaminant transport in highly heterogeneous aquifers is extremely challenging and a subject of current scientific debate. Tracer plumes often show non-symmetric, highly skewed shapes. Predicting such transport behavior using the classical advection-dispersion equation (ADE) in combination with a stochastic description of aquifer properties requires a dense measurement network. This is in contrast to the available information for most aquifers. A new conceptual aquifer structure model is presented which combines large-scale deterministic information and the stochastic approach for incorporating sub-scale heterogeneity. The conceptual model is designed to allow for a goal-oriented, site-specific transport analysis making use of as few data as possible. The basic idea is to reproduce highly skewed tracer plumes in heterogeneous media by incorporating deterministic contrasts and effects of connectivity instead of using unimodal heterogeneous models with high variances. The conceptual model consists of deterministic blocks of mean hydraulic conductivity, which might be measured by pumping tests indicating values differing by orders of magnitude. A sub-scale heterogeneity is introduced within every block. This heterogeneity can be modeled as bimodal or log-normally distributed. The impact of input parameters, structure, and conductivity contrasts is investigated in a systematic manner. Furthermore, a first successful implementation of the model was achieved for the well-known MADE site.
Modeling the within-host dynamics of cholera: bacterial-viral interaction.
Wang, Xueying; Wang, Jin
2017-08-01
Novel deterministic and stochastic models are proposed in this paper for the within-host dynamics of cholera, with a focus on the bacterial-viral interaction. The deterministic model is a system of differential equations describing the interaction between the two types of vibrios and the viruses. The stochastic model is a system of Markov jump processes that is derived based on the dynamics of the deterministic model. The multitype branching process approximation is applied to estimate the extinction probability of bacteria and viruses within a human host during the early stage of the bacterial-viral infection. Accordingly, a closed-form expression is derived for the disease extinction probability, and analytic estimates are validated with numerical simulations. The local and global dynamics of the bacterial-viral interaction are analysed using the deterministic model, and the result indicates that there is a sharp disease threshold characterized by the basic reproduction number R0: if R0 ≤ 1, vibrios ingested from the environment into the human body will not cause cholera infection; if R0 > 1, vibrios will grow with increased toxicity and persist within the host, leading to human cholera. In contrast, the stochastic model indicates, more realistically, that there is always a positive probability of disease extinction within the human host.
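The branching-process threshold described in this abstract can be illustrated with a minimal sketch (not the authors' multitype formulation): for a single-type linear birth-death process with birth rate b and death rate d, each founder's lineage dies out with probability min(1, d/b), so extinction starting from n0 independent vibrios has probability min(1, d/b)^n0. The rates and counts below are purely illustrative.

```python
def extinction_probability(birth_rate, death_rate, n0):
    """Extinction probability of a linear birth-death branching process
    started from n0 independent individuals: min(1, d/b) ** n0."""
    if birth_rate <= death_rate:
        return 1.0  # subcritical or critical: extinction is certain
    return (death_rate / birth_rate) ** n0

# Supercritical case: each founder lineage still dies out with prob. d/b
assert abs(extinction_probability(2.0, 1.0, 3) - 0.125) < 1e-12
# Subcritical case: the infection always goes extinct
assert extinction_probability(1.0, 2.0, 5) == 1.0
```

The paper's multitype version couples vibrio and viral offspring distributions; the single-type case shown here only conveys why a positive extinction probability persists even when the deterministic threshold predicts persistence.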
Parallel Stochastic discrete event simulation of calcium dynamics in neuron.
Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W
2017-09-26
The intracellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small, and calcium concentrations so low, that one extra molecule diffusing in by chance can make a nontrivial difference in concentration (percentage-wise). These rare events can affect dynamics discretely, in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding than existing deterministic models because they capture its behavior at a molecular level. Our research focuses on the development of a high-performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for use in the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer and a calcium wave model. The calcium buffer model is employed in order to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.
Design and validation of diffusion MRI models of white matter
NASA Astrophysics Data System (ADS)
Jelescu, Ileana O.; Budde, Matthew D.
2017-11-01
Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. 
This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open questions and converge towards consensus.
Developing a physiologically based approach for modeling plutonium decorporation therapy with DTPA.
Kastl, Manuel; Giussani, Augusto; Blanchardon, Eric; Breustedt, Bastian; Fritsch, Paul; Hoeschen, Christoph; Lopez, Maria Antonia
2014-11-01
To develop a physiologically based compartmental approach for modeling plutonium decorporation therapy with the chelating agent Diethylenetriaminepentaacetic acid (Ca-DTPA/Zn-DTPA). Model calculations were performed using the software package SAAM II (©The Epsilon Group, Charlottesville, Virginia, USA). The Luciani/Polig compartmental model with age-dependent description of the bone recycling processes was used for the biokinetics of plutonium. The Luciani/Polig model was slightly modified in order to account for the speciation of plutonium in blood and for the different affinities of the chemical species present for DTPA. The introduction of two separate blood compartments, describing low-molecular-weight complexes of plutonium (Pu-LW) and transferrin-bound plutonium (Pu-Tf), respectively, and one additional compartment describing plutonium in the interstitial fluids was performed successfully. The next step of the work is the modeling of the chelation process, coupling the physiologically modified structure with the biokinetic model for DTPA. Results of animal studies performed under controlled conditions will enable a better understanding of the principles of the mechanisms involved.
Physiologically based pharmacokinetic (PBPK) models are compartmental models that describe the uptake and distribution of drugs and chemicals throughout the body. They can be structured so that model parameters (i.e., physiological and chemical-specific) reflect biological charac...
Modeling the Ebola zoonotic dynamics: Interplay between enviroclimatic factors and bat ecology
Johnson, Kaylynn
2017-01-01
Understanding Ebola necessarily requires the characterization of the ecology of its main enzootic reservoir, i.e. bats, and its interplay with seasonal and enviroclimatic factors. Here we present a SIR compartmental model where we implement a bidirectional coupling between the available resources and the dynamics of the bat population in order to understand their migration patterns. Our compartmental modeling approach and simulations include transport terms to account for bats' mobility and spatiotemporal climate variability. We hypothesize that environmental pressure is the main driving force for bats' migration, and our results reveal the appearance of sustained migratory waves of Ebola virus infected bats coupled to resource availability. Ultimately, our study can be relevant to predict hot spots of Ebola outbreaks in space and time and suggest conservation policies to mitigate the risk of spillovers. PMID:28604813
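As a reminder of the compartmental core such models build on (before adding transport terms and resource coupling), a minimal forward-Euler integration of the classic SIR equations might look as follows; beta, gamma, and the step size are illustrative, not values from the study.

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I,
    dR/dt = gamma*I, with S, I, R as fractions of a closed population."""
    new_inf = beta * s * i   # new infections this step
    rec = gamma * i          # recoveries this step
    return s - new_inf * dt, i + (new_inf - rec) * dt, r + rec * dt

s, i, r = 0.99, 0.01, 0.0
for _ in range(1000):
    s, i, r = sir_step(s, i, r, beta=0.5, gamma=0.1, dt=0.1)
assert abs((s + i + r) - 1.0) < 1e-9  # the dynamics conserve the population
```

The paper's model adds spatial transport and a resource equation on top of this skeleton, so the compartments additionally move and respond to the environment.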
Parameter Estimation in Epidemiology: from Simple to Complex Dynamics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Ballesteros, Sebastién; Boto, João Pedro; Kooi, Bob W.; Mateus, Luís; Stollenwerk, Nico
2011-09-01
We revisit the parameter estimation framework for population biological dynamical systems, and apply it to calibrate various models in epidemiology with empirical time series, namely influenza and dengue fever. When it comes to more complex models like multi-strain dynamics to describe the virus-host interaction in dengue fever, even most recently developed parameter estimation techniques, like maximum likelihood iterated filtering, come to their computational limits. However, the first results of parameter estimation with data on dengue fever from Thailand indicate a subtle interplay between stochasticity and deterministic skeleton. The deterministic system on its own already displays complex dynamics up to deterministic chaos and coexistence of multiple attractors.
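The simplest form of the trajectory-matching idea behind such calibrations can be sketched as follows; this is a hypothetical grid search over a transmission parameter against synthetic data, not the maximum-likelihood iterated-filtering machinery the abstract refers to.

```python
def epidemic_curve(beta, gamma=0.1, s0=0.99, i0=0.01, dt=0.1, steps=200):
    """Euler-integrated SIR infection curve used as a toy 'observable'."""
    s, i, curve = s0, i0, []
    for _ in range(steps):
        new_inf = beta * s * i
        s, i = s - new_inf * dt, i + (new_inf - gamma * i) * dt
        curve.append(i)
    return curve

observed = epidemic_curve(0.5)  # synthetic data with known beta = 0.5
sse = lambda b: sum((x - y) ** 2 for x, y in zip(epidemic_curve(b), observed))
best_beta = min([0.3, 0.4, 0.5, 0.6], key=sse)
assert best_beta == 0.5  # the grid search recovers the generating parameter
```

Real calibrations replace the grid with likelihood-based optimization and the synthetic curve with noisy case counts, which is where the computational limits discussed in the abstract arise.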
Walter C. Shortle
2000-01-01
CODIT is an acronym for compartmentalization of decay in trees. It is a simple model system originally designed to help forest managers understand the patterns of discoloration and decay in living trees.
The Diffusion Model Is Not a Deterministic Growth Model: Comment on Jones and Dzhafarov (2014)
Smith, Philip L.; Ratcliff, Roger; McKoon, Gail
2015-01-01
Jones and Dzhafarov (2014) claim that several current models of speeded decision making in cognitive tasks, including the diffusion model, can be viewed as special cases of other general models or model classes. The general models can be made to match any set of response time (RT) distribution and accuracy data exactly by a suitable choice of parameters and so are unfalsifiable. The implication of their claim is that models like the diffusion model are empirically testable only by artificially restricting them to exclude unfalsifiable instances of the general model. We show that Jones and Dzhafarov’s argument depends on enlarging the class of “diffusion” models to include models in which there is little or no diffusion. The unfalsifiable models are deterministic or near-deterministic growth models, from which the effects of within-trial variability have been removed or in which they are constrained to be negligible. These models attribute most or all of the variability in RT and accuracy to across-trial variability in the rate of evidence growth, which is permitted to be distributed arbitrarily and to vary freely across experimental conditions. In contrast, in the standard diffusion model, within-trial variability in evidence is the primary determinant of variability in RT. Across-trial variability, which determines the relative speed of correct responses and errors, is theoretically and empirically constrained. Jones and Dzhafarov’s attempt to include the diffusion model in a class of models that also includes deterministic growth models misrepresents and trivializes it and conveys a misleading picture of cognitive decision-making research. PMID:25347314
Fatigue in isometric contraction in a single muscle fibre: a compartmental calcium ion flow model.
Kothiyal, K P; Ibramsha, M
1986-01-01
Fatigue in muscle is a complex biological phenomenon which has so far eluded a definite explanation. Many biochemical and physiological models have been suggested in the literature to account for the decrement in the ability of muscle to sustain a given level of force for a long time. Some of these models are critically analysed in this paper and are shown to be unable to explain all the experimental observations. A new compartmental model based on intracellular calcium ion movement in muscle is proposed to study the mechanical responses of a muscle fibre. Computer simulation is performed to obtain model responses in isometric contraction to an impulse and to a train of stimuli of long duration. The simulated curves have been compared with experimentally observed mechanical responses of the semitendinosus muscle fibre of Rana pipiens. The comparison of computed and observed responses indicates that the proposed calcium ion model indeed accounts very well for muscle fatigue.
BREATH MEASUREMENT AND MODELS TO ASSESS VOC DERMAL ABSORPTION IN WATER
Dermal exposure to volatile organic compounds (VOCs) in water results from environmental contamination of surface, ground-, and drinking waters. This exposure occurs both in occupational and residential settings. Compartmental models incorporating body burden measurements have ...
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
The Solar System Large Planets' Influence on a New Maunder Minimum
NASA Astrophysics Data System (ADS)
Yndestad, Harald; Solheim, Jan-Erik
2016-04-01
In the 1890s, G. Spörer and E. W. Maunder (1890) reported that solar activity stopped for a period of 70 years, from 1645 to 1715. Later reconstructions of solar activity confirm the grand minima of Maunder (1640-1720), Spörer (1390-1550), and Wolf (1270-1340), and the minima of Oort (1010-1070) and Dalton (1785-1810), since the year 1000 A.D. (Usoskin et al. 2007). These minimum periods have been associated with less irradiation from the Sun and cold climate periods on Earth. The identification of three grand Maunder-type periods and two Dalton-type periods over a thousand years indicates that sooner or later there will be a colder climate on Earth from a new Maunder- or Dalton-type period. The causes of these minimum periods are not well understood. An expected new Maunder-type period is based on the properties of solar variability. If the solar variability has a deterministic element, we can better estimate a new Maunder grand minimum. A random solar variability can only explain the past. This investigation is based on the simple idea that if the solar variability has a deterministic property, it must have a deterministic source as a first cause. If this deterministic source is known, we can compute better estimates of the next expected Maunder grand minimum period. The study is based on a TSI ACRIM data series from 1700, a TSI ACRIM data series from 1000 A.D., a sunspot data series from 1611, and a solar barycenter orbit data series from 1000. The analysis method is based on wavelet spectrum analysis, to identify stationary periods, coincidence periods, and their phase relations. The results show that the TSI variability and the sunspot variability have deterministic oscillations, controlled by the large planets Jupiter, Uranus, and Neptune as the first cause. A deterministic model of TSI variability and sunspot variability confirms the known minimum and grand minimum periods since 1000.
From this deterministic model we may expect a new Maunder-type sunspot minimum period from about 2018 to 2055. The deterministic model of the TSI ACRIM data series from 1700 computes a new Maunder-type grand minimum period from 2015 to 2071. A model of the longer TSI ACRIM data series from 1000 computes a new Dalton- to Maunder-type minimum irradiation period from 2047 to 2068.
Moran, Nancy E; Cichon, Morgan J; Riedl, Kenneth M; Grainger, Elizabeth M; Schwartz, Steven J; Novotny, Janet A; Erdman, John W; Clinton, Steven K
2015-12-01
Lycopene, which is a red carotenoid in tomatoes, has been hypothesized to mediate disease-preventive effects associated with tomato consumption. Lycopene is consumed primarily as the all-trans geometric isomer in foods, whereas human plasma and tissues show greater proportions of cis isomers. With the use of compartmental modeling and stable isotope technology, we determined whether endogenous all-trans-to-cis-lycopene isomerization or isomeric-bioavailability differences underlie the greater proportion of lycopene cis isomers in human tissues than in tomato foods. Healthy men (n = 4) and women (n = 4) consumed (13)C-lycopene (10.2 mg; 82% all-trans and 18% cis), and plasma was collected over 28 d. Unlabeled and (13)C-labeled total lycopene and lycopene-isomer plasma concentrations, which were measured with the use of high-performance liquid chromatography-mass spectrometry, were fit to a 7-compartment model. Subjects absorbed a mean ± SEM of 23% ± 6% of the lycopene. The proportion of plasma cis-(13)C-lycopene isomers increased over time, and all-trans had a shorter half-life than that of cis isomers (5.3 ± 0.3 and 8.8 ± 0.6 d, respectively; P < 0.001) and an earlier time to reach maximal plasma concentration than that of cis isomers (28 ± 7 and 48 ± 9 h, respectively). A compartmental model that allowed for interindividual differences in cis- and all-trans-lycopene bioavailability and endogenous trans-to-cis-lycopene isomerization was predictive of plasma (13)C and unlabeled cis- and all-trans-lycopene concentrations. Although the bioavailability of cis (24.5% ± 6%) and all-trans (23.2% ± 8%) isomers did not differ, endogenous isomerization (0.97 ± 0.25 μmol/d in the fast-turnover tissue lycopene pool) drove tissue and plasma isomeric profiles. 
(13)C-Lycopene combined with physiologic compartmental modeling provides a strategy for following complex in vivo metabolic processes in humans and reveals that postabsorptive trans-to-cis-lycopene isomerization, and not the differential bioavailability of isomers, drives tissue and plasma enrichment of cis-lycopene. This trial was registered at clinicaltrials.gov as NCT01692340. © 2015 American Society for Nutrition.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaption and stochastic model modification of the deterministic model adaptation. Deterministic model adaption involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
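The sequential probability ratio tests mentioned for validation can be sketched in a few lines for the textbook case of two Gaussian hypotheses with known variance (Wald's thresholds); the means, sigma, and error rates below are illustrative, not values from the patent.

```python
import math

def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: mean = mu0 vs H1: mean = mu1, known sigma.
    Returns 'H0', 'H1', or 'continue' if the data are inconclusive."""
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this bound
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this bound
    llr = 0.0
    for x in samples:
        # Gaussian log-likelihood-ratio contribution of one observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr <= lower:
            return "H0"
        if llr >= upper:
            return "H1"
    return "continue"

assert sprt([1.0] * 20, mu0=0.0, mu1=1.0, sigma=0.5) == "H1"
assert sprt([0.0] * 20, mu0=0.0, mu1=1.0, sigma=0.5) == "H0"
```

In an on-line monitoring setting, residuals between measured and model-predicted output would play the role of the samples, and crossing the upper bound flags a process change.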
Cost-effectiveness of vaccination against herpes zoster in adults aged over 60 years in Belgium.
Bilcke, Joke; Marais, Christiaan; Ogunjimi, Benson; Willem, Lander; Hens, Niel; Beutels, Philippe
2012-01-11
To assess the cost-effectiveness of vaccinating all or subgroups of adults aged 60 to 85 years against herpes zoster. A deterministic compartmental static model was developed (in freeware R), in which cohorts can acquire herpes zoster according to their age in years. Surveys and database analyses were conducted to obtain as much as possible Belgian age-specific estimates for input parameters. Direct costs and Quality-Adjusted Life-Year (QALY) losses were estimated as a function of standardised Severity Of Illness (SOI) scores (i.e. as a function of the duration and severity of herpes zoster disease). Uncertainty about the average SOI score for a person with herpes zoster, the duration of protection from the vaccine, and the population that can benefit from the vaccine, exerts a major impact on the results: under assumptions least in favour of vaccination, vaccination is not cost-effective (i.e. incremental cost per QALY gained >€48,000 for all ages considered) at the expected vaccine price of €90 per dose. At the same price, but under assumptions most in favour of vaccination, vaccination is found to be cost-effective (i.e. incremental cost per QALY gained <€5500 for all ages considered). Vaccination of age cohort 60 seems more cost-effective than vaccination of any older age cohort in Belgium. If the vaccine price per dose drops to €45, HZ vaccination of adults aged 60-64 years is likely to be cost-effective in Belgium, even under assumptions least in favour of vaccination. Unlike previous studies, our analysis acknowledged major methodological and model uncertainties simultaneously and presented outcomes for 26 different target ages at which vaccination can be considered (ages 60-85). Copyright © 2011 Elsevier Ltd. All rights reserved.
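The headline quantity in such analyses, the incremental cost per QALY gained, is a one-line computation; the numbers below are purely illustrative, not the Belgian estimates from the study.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    when switching from the old strategy to the new one."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical example: vaccination adds 90 in cost and 0.002 QALYs per
# person, giving 45,000 per QALY gained.
assert abs(icer(90.0, 0.002, 0.0, 0.0) - 45000.0) < 1e-6
```

Comparing the ICER against a willingness-to-pay threshold (the abstract's €48,000 and €5,500 figures bracket such thresholds) is what determines whether a strategy is judged cost-effective.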
NASA Astrophysics Data System (ADS)
Liu, Xiangdong; Li, Qingze; Pan, Jianxin
2018-06-01
Modern medical studies show that chemotherapy can help most cancer patients, especially those diagnosed early, to stabilize their disease conditions from months to years, which means the population of tumor cells remains nearly unchanged for quite a long time after fighting against the immune system and drugs. In order to better understand the dynamics of tumor-immune responses under chemotherapy, deterministic and stochastic differential equation models are constructed in this paper to characterize the dynamical change of tumor cells and immune cells. The basic dynamical properties, such as boundedness and the existence and stability of equilibrium points, are investigated in the deterministic model. Extended stochastic models include a stochastic differential equation (SDE) model and a continuous-time Markov chain (CTMC) model, which account for the variability in cellular reproduction, growth and death, interspecific competition, and immune response to chemotherapy. The CTMC model is harnessed to estimate the extinction probability of tumor cells. Numerical simulations are performed, which confirm the obtained theoretical results.
Impacts of Considering Climate Variability on Investment Decisions in Ethiopia
NASA Astrophysics Data System (ADS)
Strzepek, K.; Block, P.; Rosegrant, M.; Diao, X.
2005-12-01
In Ethiopia, climate extremes, inducing droughts or floods, are not unusual. Monitoring the effects of these extremes, and climate variability in general, is critical for economic prediction and assessment of the country's future welfare. The focus of this study involves adding climate variability to a deterministic, mean climate-driven agro-economic model, in an attempt to understand its effects and degree of influence on general economic prediction indicators for Ethiopia. Four simulations are examined, including a baseline simulation and three investment strategies: simulations of irrigation investment, roads investment, and a combination investment of both irrigation and roads. The deterministic model is transformed into a stochastic model by dynamically adding year-to-year climate variability through climate-yield factors. Nine sets of actual, historic, variable climate data are individually assembled and implemented into the 12-year stochastic model simulation, producing an ensemble of economic prediction indicators. This ensemble allows for a probabilistic approach to planning and policy making, allowing decision makers to consider risk. The economic indicators from the deterministic and stochastic approaches, including rates of return to investments, are significantly different. The predictions of the deterministic model appreciably overestimate the future welfare of Ethiopia; the predictions of the stochastic model, utilizing actual climate data, tend to give a better semblance of what may be expected. Inclusion of climate variability is vital for proper analysis of the predictor values from this agro-economic model.
Abstract Trichloroethylene (TCE) is an industrial chemical and an environmental contaminant. TCE and its metabolites may be carcinogenic and affect human health. Physiologically based pharmacokinetic (PBPK) models that differ in compartmentalization are developed for TCE metabo...
Collaborative Understanding of Cyanobacteria in Lake Ecosystems
ERIC Educational Resources Information Center
Greer, Meredith L.; Ewing, Holly A.; Cottingham, Kathryn L.; Weathers, Kathleen C.
2013-01-01
We describe a collaboration between mathematicians and ecologists studying the cyanobacterium "Gloeotrichia echinulata" and its possible role in eutrophication of New England lakes. The mathematics includes compartmental modeling, differential equations, difference equations, and testing models against high-frequency data. The ecology…
NASA Astrophysics Data System (ADS)
Ramos, José A.; Mercère, Guillaume
2016-12-01
In this paper, we present an algorithm for identifying two-dimensional (2D) causal, recursive and separable-in-denominator (CRSD) state-space models in the Roesser form with deterministic-stochastic inputs. The algorithm implements the N4SID, PO-MOESP and CCA methods, which are well known in the literature on 1D system identification, but here we do so for the 2D CRSD Roesser model. The algorithm solves the 2D system identification problem by maintaining the constraint structure imposed by the problem (i.e. Toeplitz and Hankel) and computes the horizontal and vertical system orders, system parameter matrices and covariance matrices of a 2D CRSD Roesser model. From a computational point of view, the algorithm has been presented in a unified framework, where the user can select which of the three methods to use. Furthermore, the identification task is divided into three main parts: (1) computing the deterministic horizontal model parameters, (2) computing the deterministic vertical model parameters and (3) computing the stochastic components. Specific attention has been paid to the computation of a stabilised Kalman gain matrix and a positive real solution when required. The efficiency and robustness of the unified algorithm have been demonstrated via a thorough simulation example.
NASA Astrophysics Data System (ADS)
Fischer, P.; Jardani, A.; Lecoq, N.
2018-02-01
In this paper, we present a novel inverse modeling method called Discrete Network Deterministic Inversion (DNDI) for mapping the geometry and properties of the discrete network of conduits and fractures in karstified aquifers. The DNDI algorithm is based on a coupled discrete-continuum concept to numerically simulate water flows in a model, and a deterministic optimization algorithm to invert a set of observed piezometric data recorded during multiple pumping tests. In this method, the model is partitioned into subspaces piloted by a set of parameters (matrix transmissivity, and the geometry and equivalent transmissivity of the conduits) that are considered unknown. In this way, the deterministic optimization process can iteratively correct the geometry of the network and the values of the properties until it converges to a global network geometry in a solution model able to reproduce the set of data. An uncertainty analysis of this result can be performed from the maps of posterior uncertainties on the network geometry or on the property values. This method has been successfully tested on three different theoretical and simplified study cases with hydraulic response data generated from hypothetical karstic models with increasing complexity of the network geometry and of the matrix heterogeneity.
Stochastic simulations on a model of circadian rhythm generation.
Miura, Shigehiro; Shimokawa, Tetsuya; Nomura, Taishin
2008-01-01
Biological phenomena are often modeled by differential equations, where states of a model system are described by continuous real values. When we consider concentrations of molecules as dynamical variables for a set of biochemical reactions, we implicitly assume that the numbers of molecules are large enough that their changes can be regarded as continuous and described deterministically. However, for a system with small numbers of molecules, changes in their numbers are discrete and molecular noise becomes significant. In such cases, models with deterministic differential equations may be inappropriate, and the reactions must be described by stochastic equations. In this study, we focus on clock gene expression for circadian rhythm generation, which is known to involve small numbers of molecules. The system is thus appropriately modeled by stochastic equations and analyzed by stochastic simulation methodologies. The interlocked feedback model proposed by Ueda et al. as a set of deterministic ordinary differential equations provides the basis of our analyses. We apply two stochastic simulation methods, namely Gillespie's direct method and Gillespie's stochastic differential equation method, to the interlocked feedback model. To this end, we first reformulated the original differential equations back into elementary chemical reactions. With those reactions, we simulate and analyze the dynamics of the model using the two methods, compare the results with the dynamics obtained from the original deterministic model, and characterize how the dynamics depend on the simulation methodologies.
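Gillespie's direct method named above can be sketched on a toy birth-death gene expression scheme (0 → X at rate k_prod, X → 0 with propensity k_deg·x); the rates and seed are illustrative, not the interlocked feedback model itself:

```python
import random

def gillespie_direct(x0, k_prod, k_deg, t_end, rng):
    """Gillespie's direct method for the birth-death scheme
    0 -> X (rate k_prod) and X -> 0 (propensity k_deg * x)."""
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while True:
        a1 = k_prod                     # production propensity
        a2 = k_deg * x                  # degradation propensity
        a0 = a1 + a2
        t += rng.expovariate(a0)        # exponential waiting time to next event
        if t > t_end:
            break
        if rng.random() * a0 < a1:      # pick which reaction fires
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))
    return trajectory

rng = random.Random(1)
traj = gillespie_direct(x0=0, k_prod=10.0, k_deg=1.0, t_end=50.0, rng=rng)
# The copy number fluctuates around the deterministic steady state k_prod/k_deg.
tail = [x for t, x in traj if t > 10.0]
mean_x = sum(tail) / len(tail)
```

For large copy numbers these fluctuations shrink relative to the mean, which is the regime where the deterministic ODE description becomes adequate.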
NASA Astrophysics Data System (ADS)
Lye, Ribin; Tan, James Peng Lung; Cheong, Siew Ann
2012-11-01
We describe a bottom-up framework, based on the identification of appropriate order parameters and determination of phase diagrams, for understanding progressively refined agent-based models and simulations of financial markets. We illustrate this framework by starting with a deterministic toy model, whereby N independent traders buy and sell M stocks through an order book that acts as a clearing house. The price of a stock increases whenever it is bought and decreases whenever it is sold. Price changes are updated by the order book before the next transaction takes place. In this deterministic model, all traders base their buy decisions on a call utility function and their sell decisions on a put utility function. We then make the agent-based model more realistic, by either having a fraction fb of traders buy a random stock on offer, or a fraction fs of traders sell a random stock in their portfolio. Based on our simulations, we find that it is possible to identify useful order parameters from the steady-state price distributions of all three models. Using these order parameters as a guide, we find three phases in the phase diagram of the deterministic model: (i) the dead market; (ii) the boom market; and (iii) the jammed market. Comparing the phase diagrams of the stochastic models against that of the deterministic model, we realize that the primary effect of stochasticity is to eliminate the dead market phase.
Distinguishing between stochasticity and determinism: Examples from cell cycle duration variability.
Pearl Mizrahi, Sivan; Sandler, Oded; Lande-Diner, Laura; Balaban, Nathalie Q; Simon, Itamar
2016-01-01
We describe a recent approach for distinguishing between stochastic and deterministic sources of variability, focusing on the mammalian cell cycle. Variability between cells is often attributed to stochastic noise, although it may be generated by deterministic components. Interestingly, lineage information can be used to distinguish between variability and determinism. Analysis of correlations within a lineage of the mammalian cell cycle duration revealed its deterministic nature. Here, we discuss the sources of such variability and the possibility that the underlying deterministic process is due to the circadian clock. Finally, we discuss the "kicked cell cycle" model and its implication on the study of the cell cycle in healthy and cancerous tissues. © 2015 WILEY Periodicals, Inc.
Tuszyński, Paweł K.; Polak, Sebastian; Jachowicz, Renata; Mendyk, Aleksander; Dohnal, Jiří
2015-01-01
Different batches of atorvastatin, represented by two immediate-release formulation designs, were studied using a novel dynamic dissolution apparatus simulating the stomach and small intestine. A universal dissolution method was employed which simulated the physiology of the human gastrointestinal tract, including the precise chyme transit behavior and biorelevant conditions. The multicompartmental dissolution data allowed direct observation and qualitative discrimination of the differences resulting from the highly pH-dependent dissolution behavior of the tested batches. Further evaluation of results was performed using IVIVC/IVIVR development. While satisfactory correlation could not be achieved using a conventional deconvolution-based model, promising results were obtained through the use of a nonconventional approach exploiting the complex compartmental dissolution data. PMID:26120580
NASA Astrophysics Data System (ADS)
Aumiller, William M.; Keating, Christine D.
2016-02-01
Biological cells are highly organized, with numerous subcellular compartments. Phosphorylation has been hypothesized as a means to control the assembly/disassembly of liquid-like RNA- and protein-rich intracellular bodies, or liquid organelles, that lack delimiting membranes. Here, we demonstrate that charge-mediated phase separation, or complex coacervation, of RNAs with cationic peptides can generate simple model liquid organelles capable of reversibly compartmentalizing biomolecules. Formation and dissolution of these liquid bodies was controlled by changes in peptide phosphorylation state using a kinase/phosphatase enzyme pair. The droplet-generating phase transition responded to modification of even a single serine residue. Electrostatic interactions between the short cationic peptides and the much longer polyanionic RNAs drove phase separation. Coacervates were also formed on silica beads, a primitive model for localization at specific intracellular sites. This work supports phosphoregulation of complex coacervation as a viable mechanism for dynamic intracellular compartmentalization in membraneless organelles.
Billard, L; Dayananda, P W A
2014-03-01
Stochastic population processes have received a lot of attention over the years. One approach focuses on compartmental modeling. Billard and Dayananda (2012) developed one such multi-stage model for epidemic processes in which the possibility that individuals can die at any stage from non-disease related causes was also included. This extra feature is of particular interest to the insurance and health-care industries among others especially when the epidemic is HIV/AIDS. Rather than working with numbers of individuals in each stage, they obtained distributional results dealing with the waiting time any one individual spent in each stage given the initial stage. In this work, the impact of the HIV/AIDS epidemic on several functions relevant to these industries (such as adjustments to premiums) is investigated. Theoretical results are derived, followed by a numerical study. Copyright © 2014 Elsevier Inc. All rights reserved.
Winskill, Peter; Harrison, Wendy E; French, Michael D; Dixon, Matthew A; Abela-Ridder, Bernadette; Basáñez, María-Gloria
2017-02-09
The pork tapeworm, Taenia solium, and associated human infections, taeniasis, cysticercosis and neurocysticercosis, are serious public health problems, especially in developing countries. The World Health Organization (WHO) has set goals for having a validated strategy for control and elimination of T. solium taeniasis/cysticercosis by 2015 and interventions scaled-up in selected countries by 2020. Timely achievement of these internationally-endorsed targets requires that the relative benefits and effectiveness of potential interventions be explored rigorously within a quantitative framework. A deterministic, compartmental transmission model (EPICYST) was developed to capture the dynamics of the taeniasis/cysticercosis disease system in the human and pig hosts. Cysticercosis prevalence in humans, an outcome of high epidemiological and clinical importance, was explicitly modelled. A next-generation matrix approach was used to derive an expression for the basic reproduction number, R0. A full sensitivity analysis was performed using a methodology based on Latin hypercube sampling and the partial rank correlation coefficient. EPICYST outputs indicate that chemotherapeutic intervention targeted at humans or pigs would be highly effective at reducing taeniasis and cysticercosis prevalence when applied singly, with annual chemotherapy of humans and pigs resulting, respectively, in 94 and 74% of human cysticercosis cases averted. Improved sanitation, meat inspection and animal husbandry are less effective but are still able to reduce prevalence singly or in combination. The value of R0 for taeniasis was estimated at 1.4 (95% credible interval: 0.5-3.6). Human- and pig-targeted drug-focussed interventions appear to be the most efficacious approach from the options currently available. The model presented is a forward step towards developing an informed control and elimination strategy for cysticercosis.
Together with its validation against field data, EPICYST will be a valuable tool to help reach the WHO goals and to conduct economic evaluations of interventions in varying epidemiological settings.
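The next-generation matrix approach mentioned above can be sketched for a hypothetical two-group (human-pig) toy system; all rates below are made-up placeholders, not EPICYST parameters, and R0 is the spectral radius of F·V⁻¹:

```python
def spectral_radius_2x2(m):
    """Largest absolute eigenvalue of a 2x2 matrix [[a, b], [c, d]]
    (assumes real eigenvalues, as holds for the nonnegative F V^-1 here)."""
    a, b, c, d = m[0][0], m[0][1], m[1][0], m[1][1]
    tr, det = a + d, a * d - b * c
    disc = (tr * tr - 4.0 * det) ** 0.5
    return max(abs((tr + disc) / 2.0), abs((tr - disc) / 2.0))

# Hypothetical two-group system: F holds new-infection terms,
# V holds transition/removal terms.
beta_hp, beta_ph = 0.8, 0.9        # placeholder cross-transmission rates
mu_h, mu_p = 0.5, 0.7              # placeholder removal rates
F = [[0.0, beta_hp], [beta_ph, 0.0]]
V_inv = [[1.0 / mu_h, 0.0], [0.0, 1.0 / mu_p]]
FV = [[sum(F[i][k] * V_inv[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
R0 = spectral_radius_2x2(FV)       # here equals sqrt(beta_hp*beta_ph/(mu_h*mu_p))
```

With zero diagonal in F, the spectral radius reduces to the familiar square-root form of two-host transmission cycles.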
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorooshian, S.; Bales, R.C.; Gupta, V.K.
1992-02-01
In order to better understand the implications of acid deposition in watershed systems in the Sierra Nevada, the California Air Resources Board (CARB) initiated an intensive integrated watershed study at Emerald Lake in Sequoia National Park. The comprehensive nature of the data obtained from these studies provided an opportunity to develop a quantitative description of how watershed characteristics and inputs to the watershed influence within-watershed fluxes, chemical composition of streams and lakes, and, therefore, biotic processes. Two different but closely-related modeling approaches were followed. In the first, the emphasis was placed on the development of systems-theoretic models. In the second approach, development of a compartmental model was undertaken. The systems-theoretic effort results in simple time-series models that allow the consideration of the stochastic properties of model errors. The compartmental model (the University of Arizona Alpine Hydrochemical Model (AHM)) is a comprehensive and detailed description of the various interacting physical and chemical processes occurring on the watershed.
The threshold of a stochastic delayed SIR epidemic model with temporary immunity
NASA Astrophysics Data System (ADS)
Liu, Qun; Chen, Qingmei; Jiang, Daqing
2016-05-01
This paper is concerned with the asymptotic properties of a stochastic delayed SIR epidemic model with temporary immunity. Sufficient conditions for extinction and persistence in the mean of the epidemic are established. The threshold between persistence in the mean and extinction of the epidemic is obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R0 of the deterministic system.
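The effect described above, white noise lowering an epidemic threshold below the deterministic R0, can be sketched with Euler-Maruyama integration of a noise-perturbed SIR model; the parameters and the noise-adjusted threshold formula below are illustrative stand-ins, not the paper's exact expressions (which also include a delay and temporary immunity):

```python
import math
import random

def sir_euler_maruyama(beta, gamma, sigma, S0, I0, dt, steps, rng):
    """Euler-Maruyama integration of an SIR model whose infection term is
    perturbed by white noise: dI = (beta*S*I - gamma*I) dt + sigma*I dW."""
    S, I = S0, I0
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))          # Brownian increment
        S_new = S - beta * S * I * dt
        I_new = I + (beta * S * I - gamma * I) * dt + sigma * I * dW
        S, I = max(S_new, 0.0), max(I_new, 0.0)     # clip to keep fractions valid
    return S, I

rng = random.Random(42)
beta, gamma, sigma, S0 = 0.5, 0.25, 0.1, 0.99
r0 = beta * S0 / gamma                              # deterministic threshold quantity
r0_noise = r0 - sigma ** 2 * S0 / (2.0 * gamma)     # illustrative noise-lowered threshold
S_end, I_end = sir_euler_maruyama(beta, gamma, sigma, S0, 0.01,
                                  dt=0.01, steps=5000, rng=rng)
```

The qualitative point matches the abstract: the noise correction is subtracted from R0, so sufficiently strong noise can drive the epidemic extinct even when the deterministic model predicts persistence.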
TRIM.FaTE is a spatially explicit, compartmental mass balance model that describes the movement and transformation of pollutants over time, through a user-defined, bounded system that includes both biotic and abiotic compartments.
A queuing model for road traffic simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.
We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
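The density-flow fundamental diagram that the variant model works with can be sketched using the classical Greenshields relation (the section parameters below are hypothetical, and the paper does not commit to this particular diagram):

```python
def greenshields_flow(density, v_free, rho_jam):
    """Greenshields density-flow fundamental diagram:
    q(rho) = v_free * rho * (1 - rho/rho_jam)."""
    return v_free * density * (1.0 - density / rho_jam)

v_free, rho_jam = 90.0, 180.0   # km/h and veh/km: hypothetical section parameters
densities = range(0, 181, 10)
flows = [greenshields_flow(rho, v_free, rho_jam) for rho in densities]
q_max = max(flows)                              # section capacity
rho_crit = list(densities)[flows.index(q_max)]  # critical density at capacity
```

Flow vanishes at zero density and at jam density, and peaks at the critical density; in a Godunov-type scheme this diagram is what defines each section's demand and supply functions.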
Effect of nonlinearity in hybrid kinetic Monte Carlo-continuum models.
Balter, Ariel; Lin, Guang; Tartakovsky, Alexandre M
2012-01-01
Recently there has been interest in developing efficient ways to model heterogeneous surface reactions with hybrid computational models that couple a kinetic Monte Carlo (KMC) model for a surface to a finite-difference model for bulk diffusion in a continuous domain. We consider two representative problems that validate a hybrid method and show that this method captures the combined effects of nonlinearity and stochasticity. We first validate a simple deposition-dissolution model with a linear rate showing that the KMC-continuum hybrid agrees with both a fully deterministic model and its analytical solution. We then study a deposition-dissolution model including competitive adsorption, which leads to a nonlinear rate, and show that in this case the KMC-continuum hybrid and fully deterministic simulations do not agree. However, we are able to identify the difference as a natural result of the stochasticity coming from the KMC surface process. Because KMC captures inherent fluctuations, we consider it to be more realistic than a purely deterministic model. Therefore, we consider the KMC-continuum hybrid to be more representative of a real system.
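The linear deposition-dissolution case, where the abstract reports agreement between KMC and the deterministic model, can be sketched as a lattice KMC whose steady-state coverage approaches the deterministic value k_a/(k_a + k_d); the rates and lattice size are illustrative, and no bulk-diffusion coupling is included:

```python
import random

def kmc_coverage(n_sites, k_a, k_d, t_end, rng):
    """Kinetic Monte Carlo for deposition (empty site -> filled, rate k_a per
    site) and dissolution (filled site -> empty, rate k_d per site);
    returns the final surface coverage."""
    filled, t = 0, 0.0
    while True:
        r_dep = k_a * (n_sites - filled)
        r_dis = k_d * filled
        r_tot = r_dep + r_dis
        t += rng.expovariate(r_tot)          # time to next event
        if t > t_end:
            return filled / n_sites
        if rng.random() * r_tot < r_dep:     # choose deposition vs dissolution
            filled += 1
        else:
            filled -= 1

rng = random.Random(3)
k_a, k_d = 1.0, 1.0
theta_kmc = kmc_coverage(n_sites=2000, k_a=k_a, k_d=k_d, t_end=20.0, rng=rng)
theta_det = k_a / (k_a + k_d)   # steady state of d(theta)/dt = k_a*(1-theta) - k_d*theta
```

Because all propensities are linear in the occupation numbers, the KMC mean obeys the same ODE as the deterministic model, which is why the two agree in this case but not in the nonlinear competitive-adsorption case.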
Effect of Nonlinearity in Hybrid Kinetic Monte Carlo-Continuum Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balter, Ariel I.; Lin, Guang; Tartakovsky, Alexandre M.
2012-04-23
Recently there has been interest in developing efficient ways to model heterogeneous surface reactions with hybrid computational models that couple a KMC model for a surface to a finite difference model for bulk diffusion in a continuous domain. We consider two representative problems that validate a hybrid method and also show that this method captures the combined effects of nonlinearity and stochasticity. We first validate a simple deposition/dissolution model with a linear rate showing that the KMC-continuum hybrid agrees with both a fully deterministic model and its analytical solution. We then study a deposition/dissolution model including competitive adsorption, which leads to a nonlinear rate, and show that, in this case, the KMC-continuum hybrid and fully deterministic simulations do not agree. However, we are able to identify the difference as a natural result of the stochasticity coming from the KMC surface process. Because KMC captures inherent fluctuations, we consider it to be more realistic than a purely deterministic model. Therefore, we consider the KMC-continuum hybrid to be more representative of a real system.
Hahl, Sayuri K; Kremling, Andreas
2016-01-01
In the mathematical modeling of biochemical reactions, a convenient standard approach is to use ordinary differential equations (ODEs) that follow the law of mass action. However, this deterministic ansatz is based on simplifications; in particular, it neglects noise, which is inherent to biological processes. In contrast, the stochasticity of reactions is captured in detail by the discrete chemical master equation (CME). Therefore, the CME is frequently applied to mesoscopic systems, where copy numbers of involved components are small and random fluctuations are thus significant. Here, we compare those two common modeling approaches, aiming at identifying parallels and discrepancies between deterministic variables and possible stochastic counterparts like the mean or modes of the state space probability distribution. To that end, a mathematically flexible reaction scheme of autoregulatory gene expression is translated into the corresponding ODE and CME formulations. We show that in the thermodynamic limit, deterministic stable fixed points usually correspond well to the modes in the stationary probability distribution. However, this connection might be disrupted in small systems. The discrepancies are characterized and systematically traced back to the magnitude of the stoichiometric coefficients and to the presence of nonlinear reactions. These factors are found to synergistically promote large and highly asymmetric fluctuations. As a consequence, bistable but unimodal, and monostable but bimodal systems can emerge. This clearly challenges the role of ODE modeling in the description of cellular signaling and regulation, where some of the involved components usually occur in low copy numbers. Nevertheless, systems whose bimodality originates from deterministic bistability are found to sustain a more robust separation of the two states compared to bimodal, but monostable systems. 
In regulatory circuits that require precise coordination, ODE modeling is thus still expected to provide relevant indications on the underlying dynamics.
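The ODE/CME discrepancy driven by nonlinear propensities can be sketched on a toy scheme with production (0 → X) and pairwise annihilation (X + X → 0), where the stochastic time-averaged mean need not coincide with the deterministic fixed point at low copy numbers; the rates are illustrative, not the autoregulatory gene expression scheme of the paper:

```python
import random

def ssa_time_avg(k, c, t_end, rng):
    """SSA for 0 -> X (rate k) and X + X -> 0 (propensity c*x*(x-1)/2);
    returns the time-weighted average copy number over [0, t_end]."""
    t, x, acc = 0.0, 0, 0.0
    while t < t_end:
        a1 = k
        a2 = c * x * (x - 1) / 2.0      # discrete pair-counting propensity
        a0 = a1 + a2
        t_new = t + rng.expovariate(a0)
        acc += x * (min(t_new, t_end) - t)
        t = t_new
        if t >= t_end:
            break
        if rng.random() * a0 < a1:
            x += 1
        else:
            x -= 2                      # annihilation removes two molecules
    return acc / t_end

rng = random.Random(7)
k, c = 2.0, 1.0
ode_fixed_point = (k / c) ** 0.5   # deterministic steady state of dx/dt = k - c*x**2
ssa_mean = ssa_time_avg(k, c, t_end=2000.0, rng=rng)
```

The discrete propensity c·x·(x−1)/2 differs from the mass-action term c·x² precisely when x is small, which is the mechanism by which stoichiometry and nonlinearity open a gap between the CME statistics and the ODE fixed point.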
Deterministic Mean-Field Ensemble Kalman Filtering
Law, Kody J. H.; Tembine, Hamidou; Tempone, Raul
2016-05-03
The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. In this paper, a density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence κ between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d
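For contrast with the density-based deterministic approximation, the standard stochastic (perturbed-observation) EnKF that the DMFEnKF is compared against can be sketched for a scalar linear-Gaussian model; all parameters and the seed are illustrative:

```python
import random

def enkf_cycle(y_obs, a, h, q, r, ensemble, rng):
    """One forecast/analysis cycle of the perturbed-observation EnKF for the
    scalar model x_k = a*x_{k-1} + N(0, q), y_k = h*x_k + N(0, r)."""
    n = len(ensemble)
    forecast = [a * x + rng.gauss(0.0, q ** 0.5) for x in ensemble]
    m = sum(forecast) / n
    p = sum((x - m) ** 2 for x in forecast) / (n - 1)   # sample forecast variance
    gain = p * h / (h * h * p + r)                      # ensemble Kalman gain
    # Each member assimilates a perturbed copy of the observation.
    return [x + gain * (y_obs + rng.gauss(0.0, r ** 0.5) - h * x) for x in forecast]

rng = random.Random(0)
a, h, q, r = 0.9, 1.0, 0.1, 0.5
truth = 2.0
ensemble = [rng.gauss(0.0, 1.0) for _ in range(500)]
for _ in range(20):
    truth = a * truth + rng.gauss(0.0, q ** 0.5)
    y = h * truth + rng.gauss(0.0, r ** 0.5)
    ensemble = enkf_cycle(y, a, h, q, r, ensemble, rng)
est = sum(ensemble) / len(ensemble)
```

The Monte Carlo sampling error of this ensemble approximation is what a deterministic PDE-solver-plus-quadrature approximation of the mean-field limit avoids.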
Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay
2017-11-01
Data-driven fault detection plays an important role in industrial systems due to its applicability in case of unknown physical models. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs a JITL scheme for process description with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
Chao, Lin; Rang, Camilla Ulla; Proenca, Audrey Menegaz; Chao, Jasper Ubirajara
2016-01-01
Non-genetic phenotypic variation is common in biological organisms. The variation is potentially beneficial if the environment is changing. If the benefit is large, selection can favor the evolution of genetic assimilation, the process by which the expression of a trait is transferred from environmental to genetic control. Genetic assimilation is an important evolutionary transition, but it is poorly understood because the fitness costs and benefits of variation are often unknown. Here we show that the partitioning of damage by a mother bacterium to its two daughters can evolve through genetic assimilation. Bacterial phenotypes are also highly variable. Because gene-regulating elements can have low copy numbers, the variation is attributed to stochastic sampling. Extant Escherichia coli partition asymmetrically and deterministically more damage to the old daughter, the one receiving the mother’s old pole. By modeling in silico damage partitioning in a population, we show that deterministic asymmetry is advantageous because it increases fitness variance and hence the efficiency of natural selection. However, we find that symmetrical but stochastic partitioning can be similarly beneficial. To examine why bacteria evolved deterministic asymmetry, we modeled the effect of damage anchored to the mother’s old pole. While anchored damage strengthens selection for asymmetry by creating additional fitness variance, it has the opposite effect on symmetry. The difference results because anchored damage reinforces the polarization of partitioning in asymmetric bacteria. In symmetric bacteria, it dilutes the polarization. Thus, stochasticity alone may have protected early bacteria from damage, but deterministic asymmetry has evolved to be equally important in extant bacteria. We estimate that 47% of damage partitioning is deterministic in E. coli. We suggest that the evolution of deterministic asymmetry from stochasticity offers an example of Waddington’s genetic assimilation. 
Our model is able to quantify the evolution of the assimilation because it characterizes the fitness consequences of variation. PMID:26761487
A hybrid model for predicting carbon monoxide from vehicular exhausts in urban environments
NASA Astrophysics Data System (ADS)
Gokhale, Sharad; Khare, Mukesh
Several deterministic-based air quality models evaluate and predict the frequently occurring pollutant concentrations well but, in general, are incapable of predicting the 'extreme' concentrations. In contrast, statistical distribution models overcome this limitation of the deterministic models and predict the 'extreme' concentrations. However, environmental damage is caused both by the extremes and by the sustained average concentration of pollutants. Hence, a model should predict not only the 'extreme' ranges but also the 'middle' ranges of pollutant concentrations, i.e. the entire range. Hybrid modelling is one technique that estimates/predicts the 'entire range' of the distribution of pollutant concentrations by combining deterministic-based models with suitable statistical distribution models (Jakeman et al., 1988). In the present paper, a hybrid model has been developed to predict the carbon monoxide (CO) concentration distributions at one of the traffic intersections, Income Tax Office (ITO), in the city of Delhi, where the traffic is heterogeneous in nature and the meteorology is 'tropical'. The model combines the general finite line source model (GFLSM) as its deterministic component and the log-logistic distribution (LLD) model as its statistical component. The hybrid (GFLSM-LLD) model is then applied at the ITO intersection. The results show that the hybrid model predictions match the observed CO concentration data within the 5-99 percentile range. The model is further validated at a different street location, the Sirifort roadway. The validation results show that the model predicts CO concentrations fairly well (d = 0.91) in the 10-95 percentile range. A regulatory compliance measure is also developed to estimate the probability of hourly CO concentrations exceeding the National Ambient Air Quality Standards (NAAQS) of India.
The heterogeneous traffic consists of light vehicles, heavy vehicles, three-wheelers (auto rickshaws) and two-wheelers (scooters, motorcycles, etc.).
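The exceedance-probability idea behind the regulatory compliance step can be sketched with the log-logistic CDF; the fitted parameters and the 4.0-unit hourly standard below are hypothetical, not values from the ITO data:

```python
def loglogistic_cdf(x, alpha, beta):
    """CDF of the log-logistic distribution with scale alpha > 0 and shape beta > 0:
    F(x) = 1 / (1 + (x/alpha)**(-beta)) for x > 0."""
    return 1.0 / (1.0 + (x / alpha) ** (-beta))

# Hypothetical fitted parameters and a hypothetical hourly standard of 4.0 units.
alpha, beta = 2.0, 3.0
p_exceed = 1.0 - loglogistic_cdf(4.0, alpha, beta)   # P(hourly CO > standard)
```

Once the LLD parameters are fitted to the hybrid model's concentration distribution, this tail probability is the compliance estimate against the standard.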
Dini-Andreote, Francisco; Stegen, James C; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-03-17
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independent of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages--which provide a larger spatiotemporal scale relative to within stage analyses--revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended--and experimentally testable--conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems.
Nuclear envelope rupture: little holes, big openings.
Hatch, Emily M
2018-06-01
The nuclear envelope (NE), which is a critical barrier between the DNA and the cytosol, is capable of extensive dynamic membrane remodeling events in interphase. One of these events, interphase NE rupture and repair, can occur in both normal and disease states and results in the loss of nucleus compartmentalization. NE rupture is not lethal, but new research indicates that it could have broad impacts on genome stability and activate innate immune responses. These observations suggest a new model for how changes in NE structure could be pathogenic in cancer, laminopathies, and autoinflammatory syndromes, and redefine the functions of nucleus compartmentalization. Copyright © 2018 Elsevier Ltd. All rights reserved.
A Compartmental Model for Zika Virus with Dynamic Human and Vector Populations
Lee, Eva K; Liu, Yifan; Pietz, Ferdinand H
2016-01-01
The Zika virus (ZIKV) outbreak in South American countries and its potential association with microcephaly in newborns and Guillain-Barré Syndrome led the World Health Organization to declare a Public Health Emergency of International Concern. To understand the ZIKV disease dynamics and evaluate the effectiveness of different containment strategies, we propose a compartmental model with a vector-host structure for ZIKV. The model utilizes logistic growth in human population and dynamic growth in vector population. Using this model, we derive the basic reproduction number to gain insight on containment strategies. We contrast the impact and influence of different parameters on the virus trend and outbreak spread. We also evaluate different containment strategies and their combination effects to achieve early containment by minimizing total infections. This result can help decision makers select and invest in the strategies most effective to combat the infection spread. The decision-support tool demonstrates the importance of “digital disease surveillance” in response to waves of epidemics including ZIKV, Dengue, Ebola and cholera. PMID:28269870
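The vector-host compartment structure can be sketched with a minimal toy model; this is not the paper's full model (logistic human growth and dynamic vector growth are omitted, vectors here are simply born susceptible and die at rate mu), and all parameters are illustrative:

```python
def step(state, p, dt):
    """One Euler step of a toy vector-host model. Sh, Ih: susceptible and
    infectious humans; Sv, Iv: susceptible and infectious vectors (fractions).
    Vector births balance deaths, so the vector population stays constant."""
    Sh, Ih, Sv, Iv = state
    new_h = p["beta_hv"] * Sh * Iv              # human infections caused by vectors
    new_v = p["beta_vh"] * Sv * Ih              # vector infections caused by humans
    dSh = -new_h
    dIh = new_h - p["gamma"] * Ih               # humans recover at rate gamma
    dSv = p["mu"] * (Sv + Iv) - new_v - p["mu"] * Sv
    dIv = new_v - p["mu"] * Iv
    return (Sh + dSh * dt, Ih + dIh * dt, Sv + dSv * dt, Iv + dIv * dt)

p = {"beta_hv": 0.5, "beta_vh": 0.4, "gamma": 0.2, "mu": 0.1}
# Ross-Macdonald-type basic reproduction number for this toy model.
R0 = (p["beta_hv"] * p["beta_vh"] / (p["gamma"] * p["mu"])) ** 0.5

state = (0.99, 0.01, 0.95, 0.05)
for _ in range(2000):                           # Euler integration to t = 20
    state = step(state, p, dt=0.01)
```

Containment strategies map onto these parameters: vector control lowers the vector terms, while reducing human-vector contact lowers both transmission rates, and either pushes R0 below one.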
Stochastic Multi-Timescale Power System Operations With Variable Wind Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hongyu; Krad, Ibrahim; Florita, Anthony
This paper describes a novel set of stochastic unit commitment and economic dispatch models that consider stochastic loads and variable generation at multiple operational timescales. The stochastic model includes four distinct stages: stochastic day-ahead security-constrained unit commitment (SCUC), stochastic real-time SCUC, stochastic real-time security-constrained economic dispatch (SCED), and deterministic automatic generation control (AGC). These sub-models are integrated together such that they are continually updated with decisions passed from one to another. The progressive hedging algorithm (PHA) is applied to solve the stochastic models to maintain the computational tractability of the proposed models. Comparative case studies with two deterministic approaches, one with perfect forecasts and the other with current state-of-the-art but imperfect forecasts, are conducted in low wind and high wind penetration scenarios to highlight the advantages of the proposed methodology. The effectiveness of the proposed method is evaluated with sensitivity tests using both economic and reliability metrics to provide a broader view of its impact.
Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.
Gomez, Christophe; Hartung, Niklas
2018-01-01
Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
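The first framework above, metastatic emission events at random times formalized via a Poisson process, can be illustrated with a short simulation. A constant emission rate is assumed here for simplicity; the models discussed in the chapter allow more general, tumor-size-dependent rates:

```python
import random

def emission_times(lam, horizon, rng):
    """Event times of a homogeneous Poisson process on [0, horizon].

    Inter-arrival times of a Poisson process with rate lam are
    independent exponentials with mean 1/lam.
    """
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(42)
# Over many realizations, the mean event count approaches lam * horizon = 5.
counts = [len(emission_times(0.5, 10.0, rng)) for _ in range(2000)]
mean_count = sum(counts) / len(counts)
```

Secondary emission, as in the extended model, would be obtained by letting each emitted metastasis spawn its own Poisson process from its emission time onward.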
Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models
2002-03-01
such as weighted sum method, weighted product method, and the Analytic Hierarchy Process (AHP). This research focuses on only weighted sum...different groups. They can be termed as deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the...weighted product model (WPM), and analytic hierarchy process (AHP). His method attempts to identify the most important criteria weight and the most
Dynamic analysis of a stochastic rumor propagation model
NASA Astrophysics Data System (ADS)
Jia, Fangju; Lv, Guangying
2018-01-01
The rapid development of the Internet, especially the emergence of the social networks, leads rumor propagation into a new media era. In this paper, we are concerned with a stochastic rumor propagation model. Sufficient conditions for extinction and persistence in the mean of the rumor are established. The threshold between persistence in the mean and extinction of the rumor is obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R0 of the deterministic system.
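A toy analogue of such a stochastic propagation model can be sketched with Euler-Maruyama integration of a logistic-type spreader equation under multiplicative white noise. The equation and all parameters below are illustrative stand-ins, not the paper's model; they only demonstrate the stochastic-versus-deterministic comparison described in the abstract:

```python
import math
import random

def simulate(beta, gamma, sigma, I0, T, dt, rng):
    """Euler-Maruyama for dI = [beta*I*(1-I) - gamma*I] dt + sigma*I dW."""
    I, n = I0, int(T / dt)
    for _ in range(n):
        drift = beta * I * (1.0 - I) - gamma * I
        dW = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment
        I += drift * dt + sigma * I * dW
        I = min(max(I, 0.0), 1.0)            # keep the fraction in [0, 1]
    return I

rng = random.Random(1)
# Noisy realizations scatter around (and below) the deterministic equilibrium.
noisy = [simulate(0.8, 0.2, 0.5, 0.1, 50.0, 0.01, rng) for _ in range(50)]
# sigma = 0 recovers the deterministic model, equilibrium I* = 1 - gamma/beta.
quiet = simulate(0.8, 0.2, 0.0, 0.1, 50.0, 0.01, rng)
```

Setting sigma to zero recovers the deterministic limit, which is the comparison the abstract draws when it notes that the noise-affected threshold is smaller than the deterministic R0.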
Dermal exposure to volatile organic compounds (VOCs) in water results from environmental contamination of surface, ground-, and drinking waters. This exposure occurs both in occupational and residential settings. Compartmental models incorporating body burden measurements have ...
NASA Astrophysics Data System (ADS)
Sinner, K.; Teasley, R. L.
2016-12-01
Groundwater models serve as integral tools for understanding flow processes and informing stakeholders and policy makers in management decisions. Historically, these models tended towards a deterministic nature, relying on historical data to predict and inform future decisions based on model outputs. This research works towards developing a stochastic method of modeling recharge inputs from pipe main break predictions in an existing groundwater model, which subsequently generates desired outputs incorporating future uncertainty rather than deterministic data. The case study for this research is the Barton Springs segment of the Edwards Aquifer near Austin, Texas. Researchers and water resource professionals have modeled the Edwards Aquifer for decades due to its high water quality, fragile ecosystem, and stakeholder interest. The original case study and model that this research is built upon was developed as a co-design problem with regional stakeholders, and the model outcomes are generated specifically for communication with policy makers and managers. Recently, research in the Barton Springs segment demonstrated, using deterministic data sets, a significant contribution of urban, or anthropogenic, recharge to the aquifer, particularly during dry periods. Due to the social and ecological importance of urban water loss to recharge, this study develops an evaluation method to help predict pipe breaks and their related recharge contributions within the Barton Springs segment of the Edwards Aquifer. To benefit groundwater management decision processes, the performance measures captured in the model results, such as springflow, head levels, storage, and others, were determined by previous work in elicitation of problem framing to determine stakeholder interests and concerns. The results of the previous deterministic model and the stochastic model are compared to determine gains to stakeholder knowledge through the additional modeling.
Saito, Hiroshi; Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato
2014-01-01
The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been one landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of respective alternatives and decide stochastically with the probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy or not. To answer this question, we propose several deterministic Bayesian decision making models that have certain incorrect beliefs about an environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that the model that has a belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, which is a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar
2015-06-01
Current toxicity protocols relate measures of systemic exposure (i.e. AUC, Cmax) as obtained by non-compartmental analysis to observed toxicity. A complicating factor in this practice is the potential bias in the estimates defining safe drug exposure. Moreover, it prevents the assessment of variability. The objective of the current investigation was therefore (a) to demonstrate the feasibility of applying nonlinear mixed effects modelling for the evaluation of toxicokinetics and (b) to assess the bias and accuracy in summary measures of systemic exposure for each method. Here, simulation scenarios that mimic toxicology protocols in rodents were evaluated. To ensure differences in pharmacokinetic properties are accounted for, hypothetical drugs with varying disposition properties were considered. Data analysis was performed using non-compartmental methods and nonlinear mixed effects modelling. Exposure levels were expressed as area under the concentration versus time curve (AUC), peak concentrations (Cmax) and time above a predefined threshold (TAT). Results were then compared with the reference values to assess the bias and precision of parameter estimates. Higher accuracy and precision were observed for model-based estimates (i.e. AUC, Cmax and TAT), irrespective of group or treatment duration, as compared with non-compartmental analysis. Despite the focus of guidelines on establishing safety thresholds for the evaluation of new molecules in humans, current methods neglect uncertainty, lack of precision and bias in parameter estimates. The use of nonlinear mixed effects modelling for the analysis of toxicokinetics provides insight into variability and should be considered for predicting safe exposure in humans.
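The non-compartmental exposure measures referred to above (AUC, Cmax) can be computed directly from sampled concentrations, for example with the linear trapezoidal rule. The concentration profile below is synthetic, generated from an assumed mono-exponential decay, not data from the study:

```python
import math

# Synthetic sampling schedule (h) and concentrations from an assumed
# mono-exponential profile C(t) = 10 * exp(-0.3 t); illustration only.
times = [0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0]
conc = [10.0 * math.exp(-0.3 * t) for t in times]

# Non-compartmental summaries: AUC by the linear trapezoidal rule, and Cmax.
auc = sum((times[i + 1] - times[i]) * (conc[i] + conc[i + 1]) / 2.0
          for i in range(len(times) - 1))
cmax = max(conc)
```

The abstract's point is that such summary statistics carry bias and hide variability; a nonlinear mixed effects model would instead estimate the underlying kinetic parameters and their distribution across animals.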
Models to capture the potential for disease transmission in domestic sheep flocks.
Schley, David; Whittle, Sophie; Taylor, Michael; Kiss, Istvan Zoltan
2012-09-15
Successful control of livestock diseases requires an understanding of how they spread amongst animals and between premises. Mathematical models can offer important insight into the dynamics of disease, especially when built upon experimental and/or field data. Here the dynamics of a range of epidemiological models are explored in order to determine which models perform best in capturing real-world heterogeneities at sufficient resolution. Individual based network models are considered together with one- and two-class compartmental models, for which the final epidemic size is calculated as a function of the probability of disease transmission occurring during a given physical contact between two individuals. For numerical results the special cases of a viral disease with a fast recovery rate (foot-and-mouth disease) and a bacterial disease with a slow recovery rate (brucellosis) amongst sheep are considered. Quantitative results from observational studies of physical contact amongst domestic sheep are applied and results from the differently structured flocks (ewes with newborn lambs, ewes with nearly weaned lambs and ewes only) compared. These indicate that the breeding cycle leads to significant changes in the expected basic reproduction ratio of diseases. The observed heterogeneity of contacts amongst animals is best captured by full network simulations, although simple compartmental models describe the key features of an outbreak; they do, however, as expected, often overestimate the speed of an outbreak. Here the weights of contacts are heterogeneous, with many low weight links. However, due to the well-connected nature of the networks, this has little effect and differences between models remain small. These results indicate that simple compartmental models can be a useful tool for modelling real-world flocks; their applicability will be greater still for more homogeneously mixed livestock, which could be promoted by higher intensity farming practices.
Copyright © 2012 Elsevier B.V. All rights reserved.
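The final-size calculation mentioned above can be illustrated, in its simplest homogeneous-mixing SIR special case, by solving the classical relation z = 1 - exp(-R0*z) with fixed-point iteration; the paper's one- and two-class and network models generalize this:

```python
import math

def final_epidemic_size(R0, tol=1e-12):
    """Fraction ever infected in a homogeneous SIR epidemic.

    Solves z = 1 - exp(-R0 * z) by fixed-point iteration; for R0 <= 1
    the only solution is z = 0 (no major outbreak).
    """
    z = 0.9 if R0 > 1.0 else 0.0   # start near the nontrivial root
    for _ in range(200):
        z_new = 1.0 - math.exp(-R0 * z)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z

z2 = final_epidemic_size(2.0)   # known value is roughly 0.797
```

The iteration converges because the map's slope at the nontrivial root is below one whenever R0 > 1.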
Proposing a Compartmental Model for Leprosy and Parameterizing Using Regional Incidence in Brazil.
Smith, Rebecca Lee
2016-08-01
Hansen's disease (HD), or leprosy, is still considered a public health risk in much of Brazil. Understanding the dynamics of the infection at a regional level can aid in identification of targets to improve control. A compartmental continuous-time model for leprosy dynamics was designed based on understanding of the biology of the infection. The transmission coefficients for the model and the rate of detection were fit for each region using Approximate Bayesian Computation applied to paucibacillary and multibacillary incidence data over the period of 2000 to 2010, and model fit was validated on incidence data from 2011 to 2012. Regional variation was noted in detection rate, with cases in the Midwest estimated to be infectious for 10 years prior to detection compared to 5 years for most other regions. Posterior predictions for the model estimated that elimination of leprosy as a public health risk would require, on average, 44-45 years in the three regions with the highest prevalence. The model is easily adaptable to other settings, and can be studied to determine the efficacy of improved case finding on leprosy control.
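The Approximate Bayesian Computation fitting described above follows a simple rejection pattern: draw parameters from a prior, simulate, and accept draws whose simulated summary is close to the observed data. The loop is sketched here with a stand-in toy simulator, not the paper's leprosy model:

```python
import random

def toy_simulator(beta, rng):
    """Stand-in for a disease model: pretend incidence summary for rate beta."""
    return beta * 100.0 + rng.gauss(0.0, 1.0)

observed = 50.0   # hypothetical observed incidence summary
rng = random.Random(0)

accepted = []
while len(accepted) < 200:
    beta = rng.uniform(0.0, 1.0)                         # draw from the prior
    if abs(toy_simulator(beta, rng) - observed) < 2.0:   # tolerance check
        accepted.append(beta)

# Accepted draws approximate the posterior; here it concentrates near 0.5.
posterior_mean = sum(accepted) / len(accepted)
```

Tightening the tolerance sharpens the posterior approximation at the cost of a lower acceptance rate, which is the usual tuning trade-off in rejection ABC.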
TRIM.FaTE Public Reference Library Documentation
TRIM.FaTE is a spatially explicit, compartmental mass balance model that describes the movement and transformation of pollutants over time, through a user-defined, bounded system that includes both biotic and abiotic compartments.
Sturge-Apple, Melissa L; Davies, Patrick T; Cicchetti, Dante; Fittoria, Michael G
2014-11-01
The present study incorporates a person-based approach to identify spillover and compartmentalization patterns of interpartner conflict and maternal parenting practices in an ethnically diverse sample of 192 2-year-old children and their mothers who had experienced higher levels of socioeconomic risk. In addition, we tested whether sociocontextual variables were differentially predictive of these profiles and examined how interpartner-parenting profiles were associated with children's physiological and psychological adjustment over time. As expected, latent class analyses extracted three primary profiles of functioning: adequate functioning, spillover, and compartmentalizing families. Furthermore, interpartner-parenting profiles were differentially associated with both sociocontextual predictors and children's adjustment trajectories. The findings highlight the developmental utility of incorporating person-based approaches to models of interpartner conflict and maternal parenting practices.
Transit-time and age distributions for nonlinear time-dependent compartmental systems.
Metzler, Holger; Müller, Markus; Sierra, Carlos A
2018-02-06
Many processes in nature are modeled using compartmental systems (reservoir/pool/box systems). Usually, they are expressed as a set of first-order differential equations describing the transfer of matter across a network of compartments. The concepts of age of matter in compartments and the time required for particles to transit the system are important diagnostics of these models with applications to a wide range of scientific questions. Until now, explicit formulas for transit-time and age distributions of nonlinear time-dependent compartmental systems were not available. We compute densities for these types of systems under the assumption of well-mixed compartments. Assuming that a solution of the nonlinear system is available at least numerically, we show how to construct a linear time-dependent system with the same solution trajectory. We demonstrate how to exploit this solution to compute transit-time and age distributions in dependence on given start values and initial age distributions. Furthermore, we derive equations for the time evolution of quantiles and moments of the age distributions. Our results generalize available density formulas for the linear time-independent case and mean-age formulas for the linear time-dependent case. As an example, we apply our formulas to a nonlinear and a linear version of a simple global carbon cycle model driven by a time-dependent input signal which represents fossil fuel additions. We derive time-dependent age distributions for all compartments and calculate the time it takes to remove fossil carbon in a business-as-usual scenario.
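For the linear time-independent special case that the formulas above generalize, the mean transit time at steady state reduces to total stock divided by total input flux. A two-pool serial sketch with hypothetical rate constants makes this concrete:

```python
# Two-pool serial compartmental model at steady state (hypothetical rates):
# input u enters pool 1; a fraction a12 of pool-1 output transfers to pool 2.
u = 1.0            # input flux (mass per unit time)
k1, k2 = 0.5, 0.1  # first-order decay rates of pools 1 and 2
a12 = 0.4          # transfer fraction from pool 1 to pool 2

x1 = u / k1              # steady-state stock of pool 1
x2 = a12 * k1 * x1 / k2  # steady-state stock of pool 2 (inflow / decay rate)

# Mean transit time = total stock / total input flux at steady state.
mean_transit_time = (x1 + x2) / u
```

The paper's contribution is the nonlinear, time-dependent generalization, where no such closed-form shortcut exists and the full age densities must be constructed along the solution trajectory.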
Abrams, Robert H.; Loague, Keith
2000-01-01
This paper, the second of two parts [see Abrams and Loague, this issue], reports the field‐scale application of COMPTRAN (compartmentalized solute transport model) for simulating the development of redox zones. COMPTRAN is fully developed and described in the companion paper. Redox zones, which are often delineated by the relative concentrations of dissolved oxygen, have been observed around the globe. The distribution of other redox‐sensitive species is affected by redox zonation. At the U.S. Geological Survey's Cape Cod research site, an anoxic zone containing high concentrations of dissolved iron has been observed. Field data were abstracted from the Cape Cod site for the one‐dimensional and two‐dimensional COMPTRAN simulations reported in this paper. The purpose of the concept‐development simulations was to demonstrate that the compartmentalized approach reported by Abrams et al. [1998] can be linked with a solute transport model to simulate field‐scale phenomena. The results presented in this paper show that COMPTRAN successfully simulated the development of redox zones at the field scale, including trends in pH and alkalinity. Thermodynamic constraints were used to prevent lower‐energy redox reactions from occurring under infeasible geochemical conditions without imposing equilibrium among all redox species. Empirical methods of reaction inhibition were not needed for the simulations conducted for this study. COMPTRAN can be extended easily to include additional compartments and reactions and is capable of handling complex velocity fields in more than one dimension.
Field Testing of Compartmentalization Methods for Multifamily Construction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ueno, K.; Lstiburek, J. W.
2015-03-01
The 2012 International Energy Conservation Code (IECC) has an airtightness requirement of 3 air changes per hour at 50 Pascals test pressure (3 ACH50) for single-family and multifamily construction (in climate zones 3–8). The Leadership in Energy & Environmental Design certification program and ASHRAE Standard 189 have comparable compartmentalization requirements. ASHRAE Standard 62.2 will soon be responsible for all multifamily ventilation requirements (low rise and high rise); it has an exceptionally stringent compartmentalization requirement. These code and program requirements are driving the need for easier and more effective methods of compartmentalization in multifamily buildings.
Semiconductor nanostructures for artificial photosynthesis
NASA Astrophysics Data System (ADS)
Yang, Peidong
2012-02-01
Nanowires, with their unique capability to bridge the nanoscopic and macroscopic worlds, have already been demonstrated as important materials for a variety of energy conversion applications. One emerging and exciting direction is their application to solar-to-fuel conversion. The generation of fuels by the direct conversion of solar energy in a fully integrated system is an attractive goal, but no such system has been demonstrated that shows the required efficiency, is sufficiently durable, or can be manufactured at reasonable cost. One of the most critical issues in solar water splitting is the development of a suitable photoanode with high efficiency and long-term durability in an aqueous environment. Semiconductor nanowires represent an important class of nanostructure building blocks for direct solar-to-fuel applications because of their high surface area, tunable bandgap and efficient charge transport and collection. Nanowires can be readily designed and synthesized to deterministically incorporate heterojunctions with improved light absorption, charge separation and vectorial transport. Meanwhile, it is also possible to selectively decorate different oxidation or reduction catalysts onto specific segments of the nanowires to mimic the compartmentalized reactions in natural photosynthesis. In this talk, I will highlight several recent examples from this lab using semiconductor nanowires and their heterostructures for the purpose of direct solar water splitting.
Role of demographic stochasticity in a speciation model with sexual reproduction
NASA Astrophysics Data System (ADS)
Lafuerza, Luis F.; McKane, Alan J.
2016-03-01
Recent theoretical studies have shown that demographic stochasticity can greatly increase the tendency of asexually reproducing phenotypically diverse organisms to spontaneously evolve into localized clusters, suggesting a simple mechanism for sympatric speciation. Here we study the role of demographic stochasticity in a model of competing organisms subject to assortative mating. We find that in models with sexual reproduction, noise can also lead to the formation of phenotypic clusters in parameter ranges where deterministic models would lead to a homogeneous distribution. In some cases, noise can have a sizable effect, rendering the deterministic modeling insufficient to understand the phenotypic distribution.
KINETIC MODEL OF FLUORIDE METABOLISM IN THE RABBIT
Sodium fluoride, in small doses, was given to rabbits intravenously or by stomach tube, and the appearance of fluoride in the blood and urine was then monitored frequently over the next 10 hours. Compartmental analysis of the data yielded a kinetic model of fluoride metabolism co...
Audi, Said; Poellmann, Michael; Zhu, Xiaoguang; Li, Zhixin; Zhao, Ming
2007-11-01
It was recently demonstrated that the radiolabeled C2A domain of synaptotagmin I accumulates avidly in the area at risk after ischemia and reperfusion. The objective was to quantitatively characterize the dynamic uptake of radiolabeled C2A in normal and ischemically injured myocardia using a compartmental model. To induce acute myocardial infarction, the left descending coronary artery was ligated for 18 min, followed by reperfusion. [99mTc]C2A-GST or its inactivated form, [99mTc]C2A-GST-NHS, was injected intravenously at 2 h after reperfusion. Groups of four rats were sacrificed at 10, 30, 60 and 180 min after injection. Uptake of [99mTc]C2A-GST and [99mTc]C2A-GST-NHS in the area at risk and in the normal myocardium were determined by gamma counting. A compartmental model was developed to quantitatively interpret myocardial uptake kinetic data. The model consists of two physical spaces (vascular space and tissue space), with plasma activity as input. The model allows for [99mTc]C2A-GST and [99mTc]C2A-GST-NHS diffusion between vascular and tissue spaces, as well as for [99mTc]C2A-GST sequestration in vascular and tissue spaces via specific binding. [99mTc]C2A-GST uptake in the area at risk was significantly higher than that for [99mTc]C2A-GST-NHS at all time points. The compartmental model separated [99mTc]C2A-GST uptake in the area at risk due to passive retention from that due to specific binding. The maximum amount of [99mTc]C2A-GST that could be sequestered in the area at risk due to specific binding was estimated at a total of 0.048 nmol/g tissue. The rate of [99mTc]C2A-GST sequestration within the tissue space of the area at risk was 0.012 ml/min. Modeling results also revealed that the diffusion rate of radiotracer between vascular and tissue spaces is the limiting factor of [99mTc]C2A-GST sequestration within the tissue space of the area at risk. [99mTc]C2A-GST is sequestered in the ischemically injured myocardium in a well-defined dynamic profile.
Model parameters will be valuable indicators for gauging and guiding the development of future-generation molecular probes.
The construction of next-generation matrices for compartmental epidemic models.
Diekmann, O; Heesterbeek, J A P; Roberts, M G
2010-06-06
The basic reproduction number ℛ0 is arguably the most important quantity in infectious disease epidemiology. The next-generation matrix (NGM) is the natural basis for the definition and calculation of ℛ0 where finitely many different categories of individuals are recognized. We clear up confusion that has been around in the literature concerning the construction of this matrix, specifically for the most frequently used so-called compartmental models. We present a detailed easy recipe for the construction of the NGM from basic ingredients derived directly from the specifications of the model. We show that two related matrices exist which we define to be the NGM with large domain and the NGM with small domain. The three matrices together reflect the range of possibilities encountered in the literature for the characterization of ℛ0. We show how they are connected and how their construction follows from the basic model ingredients, and establish that they have the same non-zero eigenvalues, the largest of which is the basic reproduction number ℛ0. Although we present formal recipes based on linear algebra, we encourage the construction of the NGM by way of direct epidemiological reasoning, using the clear interpretation of the elements of the NGM and of the model ingredients. We present a selection of examples as a practical guide to our methods. In the appendix we present an elementary but complete proof that ℛ0 defined as the dominant eigenvalue of the NGM for compartmental systems and the Malthusian parameter r, the real-time exponential growth rate in the early phase of an outbreak, are connected by the properties that ℛ0 > 1 if and only if r > 0, and ℛ0 = 1 if and only if r = 0. PMID:19892718
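The recipe can be made concrete with the standard SEIR special case: the infected subsystem (E, I) yields a transmission matrix F and a transition matrix V, and R0 is the dominant eigenvalue of the NGM K = F V^{-1}. The rates below are illustrative only:

```python
# SEIR infected subsystem (E, I): new infections enter E at rate beta*I,
# E progresses to I at rate sigma, I recovers at rate gamma.
beta, sigma, gamma = 0.6, 0.2, 0.3   # illustrative rates

F = [[0.0, beta],      # transmissions: infections caused by I enter E
     [0.0, 0.0]]
V = [[sigma, 0.0],     # transitions: outflow from E ...
     [-sigma, gamma]]  # ... inflow to I from E, outflow from I

# Invert the 2x2 matrix V.
det = V[0][0] * V[1][1] - V[0][1] * V[1][0]
Vinv = [[V[1][1] / det, -V[0][1] / det],
        [-V[1][0] / det, V[0][0] / det]]

# NGM: K = F @ Vinv.
K = [[sum(F[i][k] * Vinv[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]

# Dominant eigenvalue of the 2x2 NGM via the quadratic formula.
tr = K[0][0] + K[1][1]
detK = K[0][0] * K[1][1] - K[0][1] * K[1][0]
R0 = (tr + (tr * tr - 4.0 * detK) ** 0.5) / 2.0
```

For this model the result reproduces the textbook value R0 = beta/gamma, which is the kind of direct epidemiological cross-check the authors encourage.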
Robust planning of dynamic wireless charging infrastructure for battery electric buses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zhaocai; Song, Ziqi
Battery electric buses with zero tailpipe emissions have great potential in improving environmental sustainability and livability of urban areas. However, the problems of high cost and limited range associated with on-board batteries have substantially limited the popularity of battery electric buses. The technology of dynamic wireless power transfer (DWPT), which provides bus operators with the ability to charge buses while in motion, may be able to effectively alleviate the drawbacks of electric buses. In this paper, we address the problem of simultaneously selecting the optimal location of the DWPT facilities and designing the optimal battery sizes of electric buses for a DWPT electric bus system. The problem is first constructed as a deterministic model in which the uncertainty of energy consumption and travel time of electric buses is neglected. The methodology of robust optimization (RO) is then adopted to address the uncertainty of energy consumption and travel time. The affinely adjustable robust counterpart (AARC) of the deterministic model is developed, and its equivalent tractable mathematical programming is derived. Both the deterministic model and the robust model are demonstrated with a real-world bus system. The results of our study demonstrate that the proposed deterministic model can effectively determine the allocation of DWPT facilities and the battery sizes of electric buses for a DWPT electric bus system; and the robust model can further provide optimal designs that are robust against the uncertainty of energy consumption and travel time for electric buses.
Multi-parametric variational data assimilation for hydrological forecasting
NASA Astrophysics Data System (ADS)
Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.
2017-12-01
Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at the representation of meteorological uncertainty but neglects uncertainty of the hydrological model as well as its initial conditions. Complementary approaches use probabilistic data assimilation techniques to obtain a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on the lead-time performance of perfect forecasts (i.e. observed data as forcing variables) as well as deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% for CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
Martinez, Alexander S.; Faist, Akasha M.
2016-01-01
Background Understanding patterns of biodiversity is a longstanding challenge in ecology. Similar to other biotic groups, arthropod community structure can be shaped by deterministic and stochastic processes, with limited understanding of what moderates the relative influence of these processes. Disturbances have been noted to alter the relative influence of deterministic and stochastic processes on community assembly in various study systems, implicating ecological disturbances as a potential moderator of these forces. Methods Using a disturbance gradient along a 5-year chronosequence of insect-induced tree mortality in a subalpine forest of the southern Rocky Mountains, Colorado, USA, we examined changes in community structure and relative influences of deterministic and stochastic processes in the assembly of aboveground (surface and litter-active species) and belowground (species active in organic and mineral soil layers) arthropod communities. Arthropods were sampled for all years of the chronosequence via pitfall traps (aboveground community) and modified Winkler funnels (belowground community) and sorted to morphospecies. The structure of both communities was assessed via comparisons of morphospecies abundance, diversity, and composition. Assembly processes were inferred from a mixture of linear models and matrix correlations testing for community associations with environmental properties, and from null-deviation models comparing observed vs. expected levels of species turnover (Beta diversity) among samples. Results Tree mortality altered community structure in both aboveground and belowground arthropod communities, but null models suggested that aboveground communities experienced greater relative influences of deterministic processes, while the relative influence of stochastic processes increased for belowground communities.
Additionally, Mantel tests and linear regression models revealed significant associations between the aboveground arthropod communities and vegetation and soil properties, but no significant association among belowground arthropod communities and environmental factors. Discussion Our results suggest context-dependent influences of stochastic and deterministic community assembly processes across different fractions of a spatially co-occurring ground-dwelling arthropod community following disturbance. This variation in assembly may be linked to contrasting ecological strategies and dispersal rates within above- and below-ground communities. Our findings add to a growing body of evidence indicating concurrent influences of stochastic and deterministic processes in community assembly, and highlight the need to consider potential variation across different fractions of biotic communities when testing community ecology theory and considering conservation strategies. PMID:27761333
Modern Workflows for Fracture Rock Hydrogeology
NASA Astrophysics Data System (ADS)
Doe, T.
2015-12-01
Discrete Fracture Network (DFN) modeling is a numerical simulation approach that represents a conducting fracture network using geologically realistic geometries and single-conductor hydraulic and transport properties. In terms of diffusion analogues, equivalent porous media derive from heat conduction in continuous media, while DFN simulation is more similar to electrical flow and diffusion in circuits with discrete pathways. DFN modeling grew out of the pioneering work of David Snow in the late 1960s, with additional impetus in the 1970s from the development of stochastic approaches for describing fracture geometric and hydrologic properties. Research in underground test facilities for radioactive waste disposal developed the necessary linkages between characterization technologies and simulation, as well as bringing about a hybrid deterministic-stochastic approach. Over the past 40 years, DFN simulation and characterization methods have moved from the research environment into practical, commercial application. The key geologic, geophysical, and hydrologic tools provide the required DFN inputs of conductive fracture intensity, orientation, and transmissivity. Flow logging, either using downhole tools or detailed packer testing, identifies the locations of conducting features in boreholes, and image logging provides information on the geology and geometry of the conducting features. Multi-zone monitoring systems isolate the individual conductors, and perturbations from subsequent drilling and characterization help to recognize connectivity and compartmentalization in the fracture network. Tracer tests and core analysis provide critical information on transport properties, especially matrix diffusion and unidentified conducting pathways. Well test analyses incorporating flow dimension and boundary effects provide further constraint on the conducting geometry of the fracture network.
NASA Astrophysics Data System (ADS)
Wang, Fengyu
Traditional deterministic reserve requirements rely on ad-hoc, rule-of-thumb methods to determine adequate reserve in order to ensure a reliable unit commitment. Since congestion and uncertainties exist in the system, both the quantity and the location of reserves are essential to ensure system reliability and market efficiency. Existing deterministic reserve requirements acquire operating reserves on a zonal basis and do not fully capture the impact of congestion. The purpose of a reserve zone is to ensure that operating reserves are spread across the network. Operating reserves are shared inside each reserve zone, but intra-zonal congestion may block the deliverability of operating reserves within a zone. Thus, improving reserve policies such as reserve zones may improve the location and deliverability of reserves. As more non-dispatchable renewable resources are integrated into the grid, it will become increasingly difficult to predict the transfer capabilities and the network congestion. At the same time, renewable resources require operators to acquire more operating reserves. With existing deterministic reserve requirements unable to ensure optimal reserve locations, the importance of reserve location and reserve deliverability will increase. While stochastic programming can be used to determine reserves by explicitly modelling uncertainties, there are still scalability and pricing issues. Therefore, new methods to improve existing deterministic reserve requirements are desired. One key barrier to improving existing deterministic reserve requirements is their potential market impacts. A metric, quality of service, is proposed in this thesis to evaluate the price signal and market impacts of proposed hourly reserve zones.
The three main goals of this thesis are: 1) to develop a theoretical and mathematical model to better locate reserves while maintaining the deterministic unit commitment and economic dispatch structure, especially with the consideration of renewables; 2) to develop a market settlement scheme for the proposed dynamic reserve policies such that market efficiency is improved; and 3) to evaluate the market impacts and price signal of the proposed dynamic reserve policies.
Moran, Nancy E; Cichon, Morgan J; Riedl, Kenneth M; Grainger, Elizabeth M; Schwartz, Steven J; Novotny, Janet A; Erdman, John W; Clinton, Steven K
2015-01-01
Background: Lycopene, which is a red carotenoid in tomatoes, has been hypothesized to mediate disease-preventive effects associated with tomato consumption. Lycopene is consumed primarily as the all-trans geometric isomer in foods, whereas human plasma and tissues show greater proportions of cis isomers. Objective: With the use of compartmental modeling and stable isotope technology, we determined whether endogenous all-trans-to-cis-lycopene isomerization or isomeric-bioavailability differences underlie the greater proportion of lycopene cis isomers in human tissues than in tomato foods. Design: Healthy men (n = 4) and women (n = 4) consumed 13C-lycopene (10.2 mg; 82% all-trans and 18% cis), and plasma was collected over 28 d. Unlabeled and 13C-labeled total lycopene and lycopene-isomer plasma concentrations, which were measured with the use of high-performance liquid chromatography–mass spectrometry, were fit to a 7-compartment model. Results: Subjects absorbed a mean ± SEM of 23% ± 6% of the lycopene. The proportion of plasma cis-13C-lycopene isomers increased over time, and all-trans had a shorter half-life than that of cis isomers (5.3 ± 0.3 and 8.8 ± 0.6 d, respectively; P < 0.001) and an earlier time to reach maximal plasma concentration than that of cis isomers (28 ± 7 and 48 ± 9 h, respectively). A compartmental model that allowed for interindividual differences in cis- and all-trans-lycopene bioavailability and endogenous trans-to-cis-lycopene isomerization was predictive of plasma 13C and unlabeled cis- and all-trans-lycopene concentrations. Although the bioavailability of cis (24.5% ± 6%) and all-trans (23.2% ± 8%) isomers did not differ, endogenous isomerization (0.97 ± 0.25 μmol/d in the fast-turnover tissue lycopene pool) drove tissue and plasma isomeric profiles. 
Conclusion: 13C-Lycopene combined with physiologic compartmental modeling provides a strategy for following complex in vivo metabolic processes in humans and reveals that postabsorptive trans-to-cis-lycopene isomerization, and not the differential bioavailability of isomers, drives tissue and plasma enrichment of cis-lycopene. This trial was registered at clinicaltrials.gov as NCT01692340. PMID:26561629
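The reported isomer half-lives translate into first-order rate constants in the usual way. A minimal one-pool sketch (not the authors' fitted 7-compartment model) illustrating why the slower-turning-over cis pool retains more label at a given time:

```python
import math

def decay_rate(half_life_days):
    # First-order elimination: k = ln(2) / t_half
    return math.log(2) / half_life_days

def remaining_fraction(k, t_days):
    # Fraction of label remaining after t days of exponential decay
    return math.exp(-k * t_days)

k_trans = decay_rate(5.3)   # all-trans-lycopene, reported t1/2 = 5.3 d
k_cis   = decay_rate(8.8)   # cis isomers, reported t1/2 = 8.8 d

# After 10 days, the cis pool retains a larger fraction of the 13C label,
# consistent with plasma enrichment shifting toward cis isomers over time.
f_trans = remaining_fraction(k_trans, 10.0)
f_cis   = remaining_fraction(k_cis, 10.0)
```

This ignores the endogenous trans-to-cis isomerization flux that the paper identifies as the main driver; the sketch only shows the kinetic consequence of the two half-lives.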
Study on the evaluation method for fault displacement based on characterized source model
NASA Astrophysics Data System (ADS)
Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.
2016-12-01
IAEA Specific Safety Guide (SSG) 9 states that probabilistic methods for evaluating fault displacement should be used if no sufficient basis is provided to decide conclusively, by the deterministic methodology, that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an ANNEX on realizing the seismic hazard assessment for nuclear facilities described in SSG-9, showing the utility of deterministic and probabilistic evaluation methods for fault displacement. In Japan, important nuclear facilities are required to be established on ground where fault displacement will not arise when earthquakes occur in the future. Under these circumstances, and based on these requirements, we need to develop evaluation methods for fault displacement to enhance safety in nuclear facilities. We are studying deterministic and probabilistic methods with tentative analyses using observed records, such as surface fault displacements and near-fault strong ground motions, of inland crustal earthquakes in which fault displacements arose. In this study, we introduce the concept of the evaluation methods for fault displacement. We then show parts of the tentative analysis results for the deterministic method as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) which can explain the observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method, which combines the particle method and the distinct element method. Finally, we suggest a deterministic method to evaluate fault displacement based on a characterized source model. This research was part of the 2015 research project `Development of evaluating method for fault displacement` by the Secretariat of the Nuclear Regulation Authority (S/NRA), Japan.
Yampolsky, Maya A.; Amiot, Catherine E.; de la Sablonnière, Roxane
2013-01-01
Understanding the experiences of multicultural individuals is vital in our diverse populations. Multicultural people often need to navigate the different norms and values associated with their multiple cultural identities. Recent research on multicultural identification has focused on how individuals belonging to multiple cultural groups manage these different identities within the self, and how this process predicts well-being. The current study built on this research by using a qualitative method to examine the process of configuring one's identities within the self. The present study employed three of the four multiple identity configurations from Amiot et al.'s (2007) cognitive-developmental model of social identity integration: categorization, where people identify with one of their cultural groups over others; compartmentalization, where individuals maintain multiple, separate identities within themselves; and integration, where people link their multiple cultural identities. Life narratives were used to investigate the relationship between each of these configurations and well-being, as indicated by narrative coherence. It was expected that individuals with integrated cultural identities would report greater narrative coherence than individuals who compartmentalized and categorized their cultural identities. For all twenty-two participants, identity integration was significantly and positively related to narrative coherence, while compartmentalization was significantly and negatively related to narrative coherence. ANOVAs revealed that integrated and categorized participants reported significantly greater narrative coherence than compartmentalized participants. These findings are discussed in light of previous research on multicultural identity integration. PMID:23504407
Effect of Uncertainty on Deterministic Runway Scheduling
NASA Technical Reports Server (NTRS)
Gupta, Gautam; Malik, Waqar; Jung, Yoon C.
2012-01-01
Active runway scheduling involves scheduling departures for takeoffs and arrivals for runway crossing subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler. The evaluation is done against a first-come-first-serve (FCFS) scheme. In particular, the sequence from the deterministic scheduler is frozen and the times adjusted to satisfy all separation criteria; this approach is tested against FCFS. The comparison is done for both system performance (throughput and system delay) and predictability, and varying levels of congestion are considered. Uncertainty is modeled in two ways: as equal uncertainty in runway availability for all aircraft, and as increasing uncertainty for later aircraft. Results indicate that the deterministic approach consistently performs better than first-come-first-serve in both system performance and predictability.
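The first-come-first-serve baseline is simple to state: serve aircraft in order of readiness, each no earlier than the minimum separation after its predecessor. A minimal sketch with hypothetical ready times and a single uniform separation (the paper's scheduler handles richer pairwise separation criteria and sequence optimization):

```python
def fcfs_schedule(ready_times, separation):
    """First-come-first-serve runway schedule: aircraft are served in
    order of readiness, each at least `separation` after the previous one."""
    times = []
    prev = float("-inf")
    for r in sorted(ready_times):
        t = max(r, prev + separation)
        times.append(t)
        prev = t
    return times

# Hypothetical ready times (seconds) with a uniform 60 s separation requirement.
ready = [0, 10, 15, 200]
sched = fcfs_schedule(ready, 60)
delay = sum(t - r for t, r in zip(sched, sorted(ready)))  # total system delay
```

A deterministic scheduler would instead search over sequences to minimize such a delay metric; the paper's experiment freezes that optimized sequence and re-times it under uncertainty.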
Total Risk Integrated Methodology (TRIM) - TRIM.FaTE
TRIM.FaTE is a spatially explicit, compartmental mass balance model that describes the movement and transformation of pollutants over time, through a user-defined, bounded system that includes both biotic and abiotic compartments.
Stochastic and deterministic multiscale models for systems biology: an auxin-transport case study.
Twycross, Jamie; Band, Leah R; Bennett, Malcolm J; King, John R; Krasnogor, Natalio
2010-03-26
Stochastic and asymptotic methods are powerful tools in developing multiscale systems biology models; however, little has been done in this context to compare the efficacy of these methods. The majority of current systems biology modelling research, including that of auxin transport, uses numerical simulations to study the behaviour of large systems of deterministic ordinary differential equations, with little consideration of alternative modelling frameworks. In this case study, we solve an auxin-transport model using analytical methods, deterministic numerical simulations and stochastic numerical simulations. Although the three approaches in general predict the same behaviour, the approaches provide different information that we use to gain distinct insights into the modelled biological system. We show in particular that the analytical approach readily provides straightforward mathematical expressions for the concentrations and transport speeds, while the stochastic simulations naturally provide information on the variability of the system. Our study provides a constructive comparison which highlights the advantages and disadvantages of each of the considered modelling approaches. This will prove helpful to researchers when weighing up which modelling approach to select. In addition, the paper goes some way to bridging the gap between these approaches, which in the future we hope will lead to integrative hybrid models.
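The deterministic/stochastic comparison can be illustrated on the simplest possible system, a linear birth-death process: the ODE gives the mean behaviour, while an exact Gillespie simulation additionally exposes the variability of the system. This is a toy sketch with made-up rates, not the auxin-transport model itself:

```python
import numpy as np

rng = np.random.default_rng(1)
k_in, k_out = 10.0, 0.5   # hypothetical production and removal rates

def ode_steady_state():
    # Deterministic ODE dn/dt = k_in - k_out * n settles at k_in / k_out.
    return k_in / k_out

def gillespie(t_end=50.0):
    """Exact stochastic simulation of the same birth-death process."""
    t, n = 0.0, 0
    while t < t_end:
        total = k_in + k_out * n        # combined event rate
        t += rng.exponential(1.0 / total)
        if rng.random() < k_in / total: # birth vs. death, proportional to rates
            n += 1
        else:
            n -= 1
    return n

# The stochastic samples scatter around the deterministic steady state,
# which is the extra information the stochastic framework provides.
samples = [gillespie() for _ in range(100)]
```

Here the stochastic mean tracks the ODE prediction while the sample spread quantifies intrinsic noise, mirroring the paper's observation that the approaches agree in general behaviour but yield distinct insights.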
Cao, Pengxing; Tan, Xiahui; Donovan, Graham; Sanderson, Michael J; Sneyd, James
2014-08-01
The inositol trisphosphate receptor (IP3R) is one of the most important cellular components responsible for oscillations in the cytoplasmic calcium concentration. Over the past decade, two major questions about the IP3R have arisen. Firstly, how best should the IP3R be modeled? In other words, what fundamental properties of the IP3R allow it to perform its function, and what are their quantitative properties? Secondly, although calcium oscillations are caused by the stochastic opening and closing of small numbers of IP3Rs, is it possible for a deterministic model to be a reliable predictor of calcium behavior? Here, we answer these two questions, using airway smooth muscle cells (ASMC) as a specific example. Firstly, we show that periodic calcium waves in ASMC, as well as the statistics of calcium puffs in other cell types, can be quantitatively reproduced by a two-state model of the IP3R, and thus the behavior of the IP3R is essentially determined by its modal structure. The structure within each mode is irrelevant for function. Secondly, we show that, although calcium waves in ASMC are generated by a stochastic mechanism, IP3R stochasticity is not essential for a qualitative prediction of how oscillation frequency depends on model parameters, and thus deterministic IP3R models demonstrate the same level of predictive capability as do stochastic models. We conclude that, firstly, calcium dynamics can be accurately modeled using simplified IP3R models, and, secondly, to obtain qualitative predictions of how oscillation frequency depends on parameters it is sufficient to use a deterministic model.
Density of septic systems in watersheds has been identified as a contributor to pathogen loading in streams. At present, little work has been done to provide simple models to assist in evaluating groundwater loading for pathogen TMDLs. A compartmental model is being developed for...
Transmission dynamics and elimination potential of zoonotic tuberculosis in Morocco
Abakar, Mahamat Fayiz; Yahyaoui Azami, Hind; Justus Bless, Philipp; Crump, Lisa; Lohmann, Petra; Laager, Mirjam; Chitnis, Nakul; Zinsstag, Jakob
2017-01-01
Bovine tuberculosis (BTB) is an endemic zoonosis in Morocco caused by Mycobacterium bovis, which infects many domestic animals and is transmitted to humans through consumption of raw milk or from contact with infected animals. The prevalence of BTB in Moroccan cattle is estimated at 18% at the individual level and 33% at the herd level, but the human M. bovis burden needs further clarification. The current control strategy based on test and slaughter should be improved through local context adaptation, taking into account a suitable compensation, in order to reduce BTB prevalence in Morocco and decrease the disease burden in humans and animals. We established a simple compartmental deterministic mathematical model for BTB transmission in cattle and humans to provide a general understanding of BTB, in particular regarding transmission to humans. Differential equations were used to model the different pathways between the compartments for cattle and humans. Scenarios of test and slaughter were simulated to determine the effects of varying the proportion of tested animals (p) on the time to elimination of BTB (individual animal prevalence of less than one in a thousand) in cattle and humans. The time to freedom from disease ranged from 75 years for p = 20% to 12 years for p = 100%. For p > 60% the time to elimination was less than 20 years. The cumulated cost was largely stable: for p values higher than 40%, cost ranged from 1.47 to 1.60 billion euros with a time frame of 12 to 32 years to reach freedom from disease. The model simulations also suggest that using a 2-mm cutoff instead of a 4-mm cutoff in the Single Intradermal Comparative Cervical Tuberculin skin test (SICCT) would result in cheaper and quicker elimination programs. This analysis informs Moroccan bovine tuberculosis control policy regarding time frame, range of cost and levels of intervention. However, further research is needed to clarify the national human-bovine tuberculosis ratio in Morocco.
PMID:28152056
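The scenario analysis above hinges on a simple structure: within-herd infection versus detection-and-slaughter at a rate proportional to the testing coverage p. A heavily simplified sketch with invented rates (not the published Moroccan calibration or its human compartments) that reproduces the qualitative finding that higher coverage shortens the time to elimination:

```python
def years_to_elimination(p, beta=0.3, test_rate=1.0, dt=0.01, horizon=200.0):
    """Euler integration of a minimal SI cattle model with test and slaughter.

    p: proportion of animals tested per round; detected animals are slaughtered
    and replaced by susceptibles (constant herd size). Parameters are
    illustrative assumptions, not the paper's fitted values."""
    S, I = 0.82, 0.18                  # start near the reported 18% prevalence
    t = 0.0
    while I >= 1e-3 and t < horizon:   # elimination: prevalence < 1 per 1000
        new_inf = beta * S * I         # mass-action transmission
        removed = test_rate * p * I    # detected, slaughtered, replaced
        S += dt * (-new_inf + removed)
        I += dt * (new_inf - removed)
        t += dt
    return t

# Higher testing coverage eliminates faster, echoing the model's scenarios.
t_low, t_high = years_to_elimination(0.5), years_to_elimination(1.0)
```

The monotone dependence of elimination time on p is the same qualitative behaviour the paper quantifies (75 years at p = 20% down to 12 years at p = 100%).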
Organs-on-a-chip: a focus on compartmentalized microdevices.
Moraes, Christopher; Mehta, Geeta; Lesher-Perez, Sasha Cai; Takayama, Shuichi
2012-06-01
Advances in microengineering technologies have enabled a variety of insights into biomedical sciences that would not have been possible with conventional techniques. Engineering microenvironments that simulate in vivo organ systems may provide critical insight into the cellular basis for pathophysiologies, development, and homeostasis in various organs, while curtailing the high experimental costs and complexities associated with in vivo studies. In this article, we aim to survey recent attempts to extend tissue-engineered platforms toward simulating organ structure and function, and discuss the various approaches and technologies utilized in these systems. We specifically focus on microtechnologies that exploit phenomena associated with compartmentalization to create model culture systems that better represent the in vivo organ microenvironment.
Observations, theoretical ideas and modeling of turbulent flows: Past, present and future
NASA Technical Reports Server (NTRS)
Chapman, G. T.; Tobak, M.
1985-01-01
Turbulence was analyzed in a historical context featuring the interactions between observations, theoretical ideas, and modeling within three successive movements. These are identified as predominantly statistical, structural and deterministic. The statistical movement is criticized for its failure to deal with the structural elements observed in turbulent flows. The structural movement is criticized for its failure to embody observed structural elements within a formal theory. The deterministic movement is described as having the potential of overcoming these deficiencies by allowing structural elements to exhibit chaotic behavior that is nevertheless embodied within a theory. Four major ideas of this movement are described: bifurcation theory, strange attractors, fractals, and the renormalization group. A framework for the future study of turbulent flows is proposed, based on the premises of the deterministic movement.
Transmission Models of Historical Ebola Outbreaks
Drake, John M.; Bakach, Iurii; Just, Matthew R.; O’Regan, Suzanne M.; Gambhir, Manoj
2015-01-01
To guide the collection of data under emergent epidemic conditions, we reviewed compartmental models of historical Ebola outbreaks to determine their implications and limitations. We identified future modeling directions and propose that the minimal epidemiologic dataset for Ebola model construction comprises duration of incubation period and symptomatic period, distribution of secondary cases by infection setting, and compliance with intervention recommendations. PMID:26196358
Malaria transmission rates estimated from serological data.
Burattini, M. N.; Massad, E.; Coutinho, F. A.
1993-01-01
A mathematical model was used to estimate malaria transmission rates based on serological data. The model is minimally stochastic and assumes an age-dependent force of infection for malaria. The estimated transmission rates were applied to a simple compartmental model in order to mimic malaria transmission. The model showed a good capacity to reproduce serological and parasite-prevalence data. PMID:8270011
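Estimating transmission rates from serology rests on the catalytic model, in which seroprevalence rises with age at a rate set by the force of infection. A minimal sketch assuming a constant force of infection (the paper uses an age-dependent one) and hypothetical survey data:

```python
import math

def force_of_infection(age, seroprevalence):
    """Invert the simple catalytic model P(a) = 1 - exp(-lambda * a)
    for a constant force of infection lambda (per year)."""
    return -math.log(1.0 - seroprevalence) / age

# Hypothetical data point: 39% seropositive at age 10.
lam = force_of_infection(10.0, 0.39)           # annual infection hazard
predicted_20 = 1.0 - math.exp(-lam * 20.0)     # model-implied prevalence at 20
```

Fitting such curves across age strata yields the transmission rates that are then fed into the compartmental transmission model.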
Two Strain Dengue Model with Temporary Cross Immunity and Seasonality
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Ballesteros, Sebastien; Stollenwerk, Nico
2010-09-01
Models on dengue fever epidemiology have previously shown critical fluctuations with power law distributions and also deterministic chaos in some parameter regions due to the multi-strain structure of the disease pathogen. In our first model including well known biological features, we found a rich dynamical structure including limit cycles, symmetry breaking bifurcations, torus bifurcations, coexisting attractors including isola solutions and deterministic chaos (as indicated by positive Lyapunov exponents) in a much larger parameter region, which is also biologically more plausible than the previous results of other researchers. Based on these findings we will investigate the model structures further, including seasonality.
The threshold of a stochastic delayed SIR epidemic model with vaccination
NASA Astrophysics Data System (ADS)
Liu, Qun; Jiang, Daqing
2016-11-01
In this paper, we study the threshold dynamics of a stochastic delayed SIR epidemic model with vaccination. We obtain sufficient conditions for extinction and persistence in the mean of the epidemic. The threshold between persistence in the mean and extinction of the stochastic system is also obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R̄0 of the deterministic system. Results show that time delay has important effects on the persistence and extinction of the epidemic.
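The qualitative claim, that white noise lowers the epidemic threshold below the deterministic reproduction number, can be sketched with the kind of closed-form correction obtained for related stochastic epidemic models (e.g. the stochastic SIS model of Gray et al.); the delayed SIR threshold in the paper is analogous in spirit but not this exact formula, and the parameter values here are invented:

```python
def sir_r0(beta, gamma, mu):
    # Basic reproduction number of a deterministic SIR model
    # with transmission beta, recovery gamma, and mortality mu.
    return beta / (gamma + mu)

def noisy_threshold(beta, gamma, mu, sigma):
    """Illustrative noise-adjusted threshold R0 - sigma^2 / (2 (gamma + mu)),
    where sigma is the white-noise intensity on transmission."""
    return sir_r0(beta, gamma, mu) - sigma**2 / (2.0 * (gamma + mu))

r0 = sir_r0(0.5, 0.2, 0.05)                     # deterministic threshold
r0_noisy = noisy_threshold(0.5, 0.2, 0.05, 0.3) # pulled below r0 by noise
```

For any positive noise intensity the stochastic threshold sits strictly below the deterministic R̄0, which is the direction of the effect reported in the abstract.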
Álvarez-Yela, Astrid Catalina; Gómez-Cano, Fabio; Zambrano, María Mercedes; Husserl, Johana; Danies, Giovanna; Restrepo, Silvia; González-Barrios, Andrés Fernando
2017-01-01
Soil microbial communities are responsible for a wide range of ecological processes and have an important economic impact in agriculture. Determining the metabolic processes performed by microbial communities is crucial for understanding and managing ecosystem properties. Metagenomic approaches allow the elucidation of the main metabolic processes that determine the performance of microbial communities under different environmental conditions and perturbations. Here we present the first compartmentalized metabolic reconstruction at a metagenomics scale of a microbial ecosystem. This systematic approach conceives a meta-organism without boundaries between individual organisms and allows the in silico evaluation of the effect of agricultural intervention on soils at a metagenomics level. To characterize the microbial ecosystems, topological properties, taxonomic and metabolic profiles, as well as a Flux Balance Analysis (FBA) were considered. Furthermore, topological and optimization algorithms were implemented to carry out the curation of the models, to ensure the continuity of the fluxes between the metabolic pathways, and to confirm the metabolite exchange between subcellular compartments. The proposed models provide specific information about ecosystems that is generally overlooked in non-compartmentalized or non-curated networks, such as the influence of transport reactions in the metabolic processes, especially the important effect on mitochondrial processes, and provide more accurate estimates of the fluxes that optimize the metabolic processes within the microbial community. PMID:28767679
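At the heart of the Flux Balance Analysis mentioned above is the steady-state constraint S v = 0, and transport reactions between compartments enter the stoichiometric matrix S as columns of their own, which is why curating them changes the feasible fluxes. A toy sketch with an invented three-reaction network, not the reconstructed metagenome-scale model:

```python
import numpy as np

# Toy compartmentalized network: uptake -> A (cytosol),
# A -> B (transport into a second compartment), B -> biomass.
# Rows are metabolites, columns are reactions.
S = np.array([
    [1.0, -1.0,  0.0],   # A: produced by uptake, consumed by transport
    [0.0,  1.0, -1.0],   # B: produced by transport, consumed by biomass
])

v = np.array([2.0, 2.0, 2.0])       # candidate flux distribution
steady = bool(np.allclose(S @ v, 0.0))  # FBA requires S v = 0 at steady state

# If the transport flux cannot keep up, the steady-state constraint fails,
# illustrating how transport reactions limit the whole flux solution.
unbalanced = bool(np.allclose(S @ np.array([2.0, 1.0, 1.0]), 0.0))
```

Full FBA then maximizes a biomass flux over all v satisfying S v = 0 plus bounds, typically via linear programming.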
On the deterministic and stochastic use of hydrologic models
Farmer, William H.; Vogel, Richard M.
2016-01-01
Environmental simulation models, such as precipitation-runoff watershed models, are increasingly used in a deterministic manner for environmental and water resources design, planning, and management. In operational hydrology, simulated responses are now routinely used to plan, design, and manage a very wide class of water resource systems. However, all such models are calibrated to existing data sets and retain some residual error. This residual, typically unknown in practice, is often ignored, implicitly trusting simulated responses as if they are deterministic quantities. In general, ignoring the residuals will result in simulated responses with distributional properties that do not mimic those of the observed responses. This discrepancy has major implications for the operational use of environmental simulation models as is shown here. Both a simple linear model and a distributed-parameter precipitation-runoff model are used to document the expected bias in the distributional properties of simulated responses when the residuals are ignored. The systematic reintroduction of residuals into simulated responses in a manner that produces stochastic output is shown to improve the distributional properties of the simulated responses. Every effort should be made to understand the distributional behavior of simulation residuals and to use environmental simulation models in a stochastic manner.
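The paper's remedy, reintroducing residuals so that simulated output is stochastic, can be sketched on synthetic data. Everything below is invented for illustration (a linear rainfall-runoff stand-in, not the authors' distributed-parameter model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observed" runoff driven by precipitation plus unmodeled noise.
precip = rng.gamma(2.0, 5.0, size=500)
runoff = 0.6 * precip + rng.normal(0.0, 2.0, size=500)

# Calibrate a simple linear model and keep its residuals.
coef = np.polyfit(precip, runoff, 1)
deterministic = np.polyval(coef, precip)
residuals = runoff - deterministic

# Deterministic output understates the spread of the observations;
# resampling residuals back into the simulation restores it.
stochastic = deterministic + rng.choice(residuals, size=residuals.size)

var_obs, var_det, var_sto = runoff.var(), deterministic.var(), stochastic.var()
```

The deterministic variance falls short of the observed variance by roughly the residual variance; the stochastic version recovers distributional properties much closer to the observations, which is the paper's central point.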
Self-priming compartmentalization digital LAMP for point-of-care.
Zhu, Qiangyuan; Gao, Yibo; Yu, Bingwen; Ren, Hao; Qiu, Lin; Han, Sihai; Jin, Wei; Jin, Qinhan; Mu, Ying
2012-11-21
Digital nucleic acid amplification provides unprecedented opportunities for absolute nucleic acid quantification by counting single molecules. This technique is useful for molecular genetic analysis in cancer, stem cell, and bacterial research, as well as non-invasive prenatal diagnosis, fields of wide interest to biologists. This paper describes a self-priming compartmentalization (SPC) microfluidic chip platform for performing digital loop-mediated isothermal amplification (LAMP). The energy for the pumping is pre-stored in the degassed bulk PDMS by exploiting the high gas solubility of PDMS; therefore, no additional structures other than channels and reservoirs are required. The sample and oil are sequentially drawn into the channels, and the pressure difference of gas dissolved in PDMS allows sample self-compartmentalization without the need for further chip manipulation such as pneumatic microvalves and control systems. The SPC digital LAMP chip can be used like a 384-well plate, so world-to-chip fluidic interconnections are avoided. The microfluidic chip contains 4 separate panels, each containing 1200 independent 6 nL chambers, and can be used to detect 4 samples simultaneously. Digital LAMP on the microfluidic chip was tested quantitatively using human β-actin DNA. The self-priming compartmentalization behavior is roughly predictable using a two-dimensional model. The uniformity of compartmentalization was analyzed by fluorescent intensity and fraction of volume. The results demonstrated the feasibility and flexibility of the microfluidic chip platform for amplifying single nucleic acid molecules in separate chambers by diluting and distributing sample solutions. The SPC chip has the potential to meet the requirements of a general laboratory: it is power-free, valve-free, operated at isothermal temperature, inexpensive, sensitive, and economical in labour time and reagents. The disposable analytical devices, with appropriate air-tight packaging, should be useful for point-of-care testing, enabling the chip to become a common tool for biological research.
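Absolute quantification in digital amplification typically uses the standard Poisson correction for chambers that receive more than one molecule. A minimal sketch using the 6 nL chamber volume quoted above (the chamber counts in the example are hypothetical):

```python
import math

def copies_per_microliter(positive, total, chamber_nl=6.0):
    """Poisson-corrected absolute quantification for digital amplification.

    positive/total: counts of positive and total chambers;
    lambda = -ln(1 - p) is the mean number of molecules per chamber.
    """
    p = positive / total
    lam = -math.log(1.0 - p)           # corrects for multi-occupancy
    return lam / (chamber_nl * 1e-3)   # copies per microliter of sample

# e.g. 300 positives out of the 1200 six-nanoliter chambers of one panel
print(copies_per_microliter(300, 1200))   # ~48 copies per microliter
```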
Theory and applications of a deterministic approximation to the coalescent model
Jewett, Ethan M.; Rosenberg, Noah A.
2014-01-01
Under the coalescent model, the random number n_t of lineages ancestral to a sample is nearly deterministic as a function of time when n_t is moderate to large in value, and it is well approximated by its expectation E[n_t]. In turn, this expectation is well approximated by simple deterministic functions that are easy to compute. Such deterministic functions have been applied to estimate allele age, effective population size, and genetic diversity, and they have been used to study properties of models of infectious disease dynamics. Although a number of simple approximations of E[n_t] have been derived and applied to problems of population-genetic inference, the theoretical accuracy of the formulas and the inferences obtained using these approximations is not known, and the range of problems to which they can be applied is not well understood. Here, we demonstrate general procedures by which the approximation n_t ≈ E[n_t] can be used to reduce the computational complexity of coalescent formulas, and we show that the resulting approximations converge to their true values under simple assumptions. Such approximations provide alternatives to exact formulas that are computationally intractable or numerically unstable when the number of sampled lineages is moderate or large. We also extend an existing class of approximations of E[n_t] to the case of multiple populations of time-varying size with migration among them. Our results facilitate the use of the deterministic approximation n_t ≈ E[n_t] for deriving functionally simple, computationally efficient, and numerically stable approximations of coalescent formulas under complicated demographic scenarios. PMID:24412419
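One common form of the deterministic approximation is the ODE dn/dt = -n(n-1)/2 in coalescent time units for the expected number of ancestral lineages. The sketch below integrates it numerically and compares with its closed-form solution; the discretization and notation choices are ours, not the authors'.

```python
import math

def n_lineages(t, n0):
    """Closed-form solution of dn/dt = -n(n-1)/2 (coalescent time units),
    a deterministic approximation to the expected number of lineages."""
    r = (n0 - 1) / n0 * math.exp(-t / 2.0)
    return 1.0 / (1.0 - r)

# Forward-Euler check of the ODE against the closed form, starting from 50
n, dt = 50.0, 1e-4
for _ in range(int(1.0 / dt)):          # integrate to t = 1
    n -= dt * n * (n - 1) / 2.0
print(n, n_lineages(1.0, 50))           # the two agree closely
```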
Ordinal optimization and its application to complex deterministic problems
NASA Astrophysics Data System (ADS)
Yang, Mike Shang-Yu
1998-10-01
We present in this thesis a new perspective on a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve high levels of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in computing cost.
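Goal softening and ordinal comparison can be sketched as selecting the observed top-g designs under noisy evaluation and checking how often that soft selection "aligns" with the true best designs. All problem sizes and noise levels below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_designs, noise = 1000, 5.0
true_perf = np.arange(n_designs, dtype=float)     # design 999 is truly best

trials, hits, g = 200, 0, 20
for _ in range(trials):
    observed = true_perf + rng.normal(scale=noise, size=n_designs)
    selected = np.argsort(observed)[-g:]          # observed top-g (goal softening)
    hits += np.any(selected >= n_designs - 10)    # any of the true top-10 caught?
print(hits / trials)   # alignment probability stays near 1 despite the noise
```

The point of ordinal optimization is exactly this robustness: ranking survives noise that would ruin cardinal value estimates.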
Design of a dynamic optical tissue phantom to model extravasation pharmacokinetics
NASA Astrophysics Data System (ADS)
Zhang, Jane Y.; Ergin, Aysegul; Andken, Kerry Lee; Sheng, Chao; Bigio, Irving J.
2010-02-01
We describe an optical tissue phantom that enables the simulation of drug extravasation from microvessels and validates computational compartmental models of drug delivery. The phantom consists of a microdialysis tubing bundle to simulate the permeable blood vessels, immersed in either an aqueous suspension of titanium dioxide (TiO2) or a TiO2-mixed agarose scattering medium. Drug administration is represented by a dye circulated through this porous microdialysis tubing bundle. Optical pharmacokinetic (OP) methods are used to measure changes in the absorption coefficient of the scattering medium due to the arrival and diffusion of the dye. We have established particle size-dependent concentration profiles over time of phantom drug delivery by intravenous (IV) and intra-arterial (IA) routes. Additionally, pharmacokinetic compartmental models are implemented in computer simulations for the conditions studied within the phantom. The simulated concentration-time profiles agree well with measurements from the phantom. The results are encouraging for future optical pharmacokinetic method development, both physical and computational, to understand drug extravasation under various physiological conditions.
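A two-compartment extravasation model of the kind such a phantom validates can be sketched as a pair of coupled ODEs, vessel and tissue, with first-order exchange and clearance. The rate constants below are hypothetical, not fitted to the phantom data.

```python
# Hypothetical rate constants (1/min): k12 vessel->tissue, k21 tissue->vessel,
# k10 clearance from the vessel compartment
k12, k21, k10 = 0.10, 0.05, 0.15
dt, t_end = 0.01, 60.0
c1, c2 = 1.0, 0.0          # normalized dye concentration: vessel, tissue (bolus)

for _ in range(int(t_end / dt)):   # forward-Euler integration
    dc1 = -(k12 + k10) * c1 + k21 * c2
    dc2 = k12 * c1 - k21 * c2
    c1 += dt * dc1
    c2 += dt * dc2
print(c1, c2)   # tissue concentration rises, peaks, then washes out
```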
NASA Astrophysics Data System (ADS)
Wang, Xin; Zhang, Yanqi; Zhang, Limin; Li, Jiao; Zhou, Zhongxing; Zhao, Huijuan; Gao, Feng
2016-04-01
We present a generalized strategy for direct reconstruction in pharmacokinetic diffuse fluorescence tomography (DFT) with a CT-analogous scanning mode, which can accomplish one-step reconstruction of indocyanine-green pharmacokinetic-rate images within in vivo small animals by incorporating the compartmental kinetic model into an adaptive extended Kalman filtering scheme and using an instantaneous sampling dataset. This scheme, compared with the established indirect and direct methods, eliminates the interim error of the DFT inversion and relaxes the expensive instrumentation requirement of obtaining highly time-resolved data-sets of complete 360 deg projections. The scheme is validated by two-dimensional simulations for the two-compartment model and pilot phantom experiments for the one-compartment model, suggesting that the proposed method can estimate the compartmental concentrations and the pharmacokinetic-rates simultaneously with fair quantitative and localization accuracy, and is well suited to cost-effective and dense-sampling instrumentation based on the highly sensitive photon counting technique.
Rift Valley fever transmission dynamics described by compartmental models.
Danzetta, Maria Luisa; Bruno, Rossana; Sauro, Francesca; Savini, Lara; Calistri, Paolo
2016-11-01
Rift Valley fever (RVF) is one of the most important zoonotic Transboundary Animal Diseases, able to cross international borders and cause devastating effects on animal health and food security. Climate change and the presence of competent vectors in most of the currently RVF-free temperate countries strongly support the inclusion of RVF virus (RVFV) among the most significant emerging viral threats for public and animal health. The transmission of RVFV is driven by complex eco-climatic factors, making the epidemiology of RVF infection difficult to study and to understand. Mathematical, statistical and spatial models are often used to explain the mechanisms underlying these biological processes, providing new and effective tools to plan measures for public health protection. In this paper we performed a systematic literature review of published RVF papers with the aim of identifying and describing the most recent papers developing compartmental models for the study of RVFV transmission dynamics. Copyright © 2016 Elsevier B.V. All rights reserved.
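The compartmental models surveyed in such reviews build on the classic SIR template. A minimal single-host sketch follows; real RVF models add vector (mosquito) compartments and eco-climatic forcing, and the rates below are illustrative only.

```python
# Minimal SIR compartmental model (fractions of a closed population)
beta, gamma = 0.5, 0.25        # illustrative transmission / recovery rates (1/day)
S, I, R = 0.999, 0.001, 0.0    # susceptible, infectious, recovered
dt = 0.1

for _ in range(int(200 / dt)):     # forward-Euler integration over 200 days
    new_inf = beta * S * I
    S, I, R = S - dt * new_inf, I + dt * (new_inf - gamma * I), R + dt * gamma * I
print(R)   # final epidemic size for R0 = beta/gamma = 2 is roughly 0.8
```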
Palmer, Tim N.; O’Shea, Michael
2015-01-01
How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete. PMID:26528173
Heart rate variability as determinism with jump stochastic parameters.
Zheng, Jiongxuan; Skufca, Joseph D; Bollt, Erik M
2013-08-01
We use measured heart rate information (RR intervals) to develop a one-dimensional nonlinear map that describes short-term deterministic behavior in the data. Our study suggests that there is a stochastic parameter with persistence which causes the heart rate and rhythm system to wander about a bifurcation point. We propose a modified circle map with a jump-process noise term as a model which can qualitatively capture this behavior of low-dimensional transient determinism with occasional (stochastically defined) jumps from one deterministic system to another within a one-parameter family of deterministic systems.
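A circle map whose frequency parameter occasionally jumps, in the spirit of the model described, can be sketched as below; the map form, jump probability, and parameter range are our illustrative choices, not the fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Standard circle map; Omega jumps rarely, mimicking a persistent stochastic
# parameter that moves the system within a family of deterministic maps.
K, omega, theta = 1.0, 0.30, 0.1
trajectory = []
for _ in range(2000):
    if rng.random() < 0.01:                 # rare jump of the parameter
        omega = rng.uniform(0.25, 0.35)
    theta = (theta + omega - K / (2 * np.pi) * np.sin(2 * np.pi * theta)) % 1.0
    trajectory.append(theta)
print(min(trajectory), max(trajectory))     # the orbit stays on the circle [0, 1)
```

Between jumps the dynamics are purely deterministic; the jumps produce the intermittent regime changes the abstract describes.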
Diffusion processes of fragmentary information on scale-free networks
NASA Astrophysics Data System (ADS)
Li, Xun; Cao, Lang
2016-05-01
Compartmental models of diffusion over contact networks have proven representative of real-life propagation phenomena among interacting individuals. However, there is a broad class of collective spreading mechanisms departing from compartmental representations, including those for diffusive objects capable of fragmentation and not necessarily transmitted as a whole. Here, we consider a continuous-state susceptible-infected-susceptible (SIS) model as an ideal limit-case of the diffusion of fragmentary information on networks, where individuals possess fractions of the information content and update them by selectively exchanging messages with partners in the vicinity. Specifically, we incorporate local information, such as neighbors' node degrees and carried contents, into the individual partner choice, and examine the roles of a variety of such strategies in the information diffusion process, both qualitatively and quantitatively. Our method provides an effective and flexible route for modulating continuous-state diffusion dynamics on networks and has potential in a wide array of practical applications.
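A continuous-state SIS-style exchange of information fractions can be sketched on a ring network; uniform random partner choice stands in for the degree- and content-based strategies studied in the paper, and all rates are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Ring network; x[i] in [0, 1] is the fraction of the information node i holds
n, beta, delta = 50, 0.6, 0.1        # exchange strength and slow decay rate
x = np.zeros(n)
x[0] = 1.0                           # one seed node holds the full content
neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

for _ in range(500):
    i = rng.integers(n)
    j = rng.choice(neighbors[i])     # partner choice (uniform here)
    x[i] = min(1.0, x[i] + beta * x[j] * (1 - x[i]))   # receive a fragment
    x *= (1 - delta / n)             # recovery: held content decays slowly
print(x.mean())                      # average information level in the network
```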
Tag-mediated cooperation with non-deterministic genotype-phenotype mapping
NASA Astrophysics Data System (ADS)
Zhang, Hong; Chen, Shu
2016-01-01
Tag-mediated cooperation provides a helpful framework for resolving evolutionary social dilemmas. However, most previous studies have not taken into account the genotype-phenotype distinction in tags, which may play an important role in the process of evolution. To take this into consideration, we introduce non-deterministic genotype-phenotype mapping into a tag-based model with a spatial prisoner's dilemma. By our definition, similarity between genotypic tags does not directly imply similarity between phenotypic tags. We find that the non-deterministic mapping from genotypic tag to phenotypic tag has non-trivial effects on tag-mediated cooperation. Although we observe that high levels of cooperation can be established under a wide variety of conditions, especially when the decisiveness is moderate, the uncertainty in the determination of phenotypic tags may have a detrimental effect on the tag mechanism by disturbing the homophilic interaction structure which explains the promotion of cooperation in tag systems. Furthermore, the non-deterministic mapping may undermine the robustness of the tag mechanism with respect to various factors such as the structure of the tag space and the tag flexibility. This observation warns against applying classical tag-based models to the analysis of empirical phenomena if the genotype-phenotype distinction is significant in the real world. Non-deterministic genotype-phenotype mapping thus provides a new perspective on the understanding of tag-mediated cooperation.
Counteracting structural errors in ensemble forecast of influenza outbreaks.
Pei, Sen; Shaman, Jeffrey
2017-10-13
For influenza forecasts generated using dynamical models, forecast inaccuracy is partly attributable to the nonlinear growth of error. As a consequence, quantification of the nonlinear error structure in current forecast models is needed so that this growth can be corrected and forecast skill improved. Here, we inspect the error growth of a compartmental influenza model and find that a robust error structure arises naturally from the nonlinear model dynamics. By counteracting these structural errors, diagnosed using error breeding, we develop a new forecast approach that combines dynamical error correction and statistical filtering techniques. In retrospective forecasts of historical influenza outbreaks for 95 US cities from 2003 to 2014, overall forecast accuracy for outbreak peak timing, peak intensity and attack rate are substantially improved for predicted lead times up to 10 weeks. This error growth correction method can be generalized to improve the forecast accuracy of other infectious disease dynamical models.
Compartmental modelling of the pharmacokinetics of a breast cancer resistance protein.
Grandjean, Thomas R B; Chappell, Mike J; Yates, James T W; Jones, Kevin; Wood, Gemma; Coleman, Tanya
2011-11-01
A mathematical model for the pharmacokinetics of Hoechst 33342 following administration into a culture medium containing a population of transfected cells (HEK293 hBCRP) with a potent breast cancer resistance protein inhibitor, Fumitremorgin C (FTC), present is described. FTC is reported to almost completely annul resistance mediated by BCRP in vitro. This non-linear compartmental model has seven macroscopic sub-units, with 14 rate parameters. It describes the relationship between the concentration of Hoechst 33342 and FTC, initially spiked in the medium, and the observed change in fluorescence due to Hoechst 33342 binding to DNA. Structural identifiability analysis has been performed using two methods, one based on the similarity transformation/exhaustive modelling approach and the other based on the differential algebra approach. The analyses demonstrated that all models derived are uniquely identifiable for the experiments/observations available. A kinetic modelling software package, namely FACSIMILE (MPCA Software, UK), was used for parameter fitting and to obtain numerical solutions for the system equations. Model fits gave very good agreement with in vitro data provided by AstraZeneca across a variety of experimental scenarios. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
A recycling model of the biokinetics of systemic tellurium.
Giussani, Augusto
2014-11-01
The aim of this work was to develop a compartmental model of the systemic biokinetics of tellurium, required for calculating the internal dose and interpreting bioassay measurements after incorporation of radioactive tellurium. The compartmental model for tellurium was developed with the software SAAM II v. 2.0 (©The Epsilon Group, Charlottesville, Virginia, USA). Model parameters were determined on the basis of published retention and excretion data in humans and animals. The model consists of two blood compartments, one compartment each for liver, kidneys and thyroid, four compartments for bone tissues, and a generic compartment for the soft tissues. The model predicts a rapid urinary excretion of systemic tellurium: 45% in the first 24 h and 84% after 50 d. Faecal excretion amounts to 0.4% after 3 d and 9% after 50 d. Whole-body retention is 55% after one day and 2.8% after 100 d. These values, as well as the retained fractions in the single organs, are reasonably consistent with the available human and animal data (studies with swine and guinea pigs). The proposed model gives a realistic description of the available biokinetic data for tellurium and will be adopted by the International Commission on Radiological Protection for applications in internal dosimetry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2015-01-01
The 2012 IECC has an airtightness requirement of 3 air changes per hour at 50 Pascals test pressure for both single family and multifamily construction in Climate Zones 3-8. Other programs (LEED, ASHRAE 189, ASHRAE 62.2) have similar or tighter compartmentalization requirements, thus driving the need for easier and more effective methods of compartmentalization in multifamily buildings.
Li, W B; Karpas, Z; Salonen, L; Kurttio, P; Muikku, M; Wahl, W; Höllriegl, V; Hoeschen, C; Oeh, U
2009-06-01
To predict uranium in human hair due to chronic exposure through drinking water, a compartment representing human hair was added to the uranium biokinetic model developed by the International Commission on Radiological Protection (ICRP). The hair compartmental model was used to predict uranium excretion in human hair as a bioassay indicator of elevated uranium intakes. Two excretion pathways, one starting from the compartment of plasma and the other from the compartment of intermediate-turnover soft tissue, are assumed to transfer uranium to the hair compartment. The transfer rate was determined from reported uranium contents in urine and in hair, taking into account the hair growth rate of 0.1 g d⁻¹. A fractional absorption in the gastrointestinal tract of 0.6% was found to best describe the measured uranium levels among the users of drilled wells in Finland. The ingestion dose coefficient for ²³⁸U, which includes its progeny ²³⁴Th, ²³⁴ᵐPa, and ²³⁴Pa, was calculated to equal 1.3 × 10⁻⁸ Sv Bq⁻¹ according to the hair compartmental model. This estimate is smaller than the value of 4.5 × 10⁻⁸ Sv Bq⁻¹ published by ICRP for members of the public. In this new model, excretion of uranium through urine is better represented when excretion to the hair compartment is accounted for, and hair analysis can provide a means of assessing the internal body burden of uranium. The model is applicable for chronic exposure as well as for an acute exposure incident. In the latter case, the hair sample can be collected and analyzed even several days after the incident, whereas urinalysis requires sample collection shortly after the exposure. The model developed in this study applies to ingestion intakes of uranium.
Time Domain and Frequency Domain Deterministic Channel Modeling for Tunnel/Mining Environments.
Zhou, Chenming; Jacksha, Ronald; Yan, Lincan; Reyes, Miguel; Kovalchik, Peter
2017-01-01
Understanding wireless channels in complex mining environments is critical for designing optimized wireless systems operated in these environments. In this paper, we propose two physics-based, deterministic ultra-wideband (UWB) channel models for characterizing wireless channels in mining/tunnel environments - one in the time domain and the other in the frequency domain. For the time domain model, a general Channel Impulse Response (CIR) is derived and the result is expressed in the classic UWB tapped delay line model. The derived time domain channel model takes into account major propagation controlling factors including tunnel or entry dimensions, frequency, polarization, electrical properties of the four tunnel walls, and transmitter and receiver locations. For the frequency domain model, a complex channel transfer function is derived analytically. Based on the proposed physics-based deterministic channel models, channel parameters such as delay spread, multipath component number, and angular spread are analyzed. It is found that, despite the presence of heavy multipath, both channel delay spread and angular spread for tunnel environments are relatively smaller compared to that of typical indoor environments. The results and findings in this paper have application in the design and deployment of wireless systems in underground mining environments.
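Given a tapped-delay-line CIR, the delay spread analyzed in the paper is the power-weighted RMS spread of the tap delays. A small sketch with hypothetical multipath taps (the delays and powers below are invented, not measured tunnel data):

```python
import numpy as np

def rms_delay_spread(delays_ns, powers):
    """RMS delay spread of a tapped-delay-line channel impulse response:
    square root of the power-weighted variance of the tap delays."""
    t = np.asarray(delays_ns, dtype=float)
    p = np.asarray(powers, dtype=float)
    mean_tau = np.sum(p * t) / np.sum(p)
    return np.sqrt(np.sum(p * (t - mean_tau) ** 2) / np.sum(p))

# hypothetical taps: (delay in ns, linear power)
print(rms_delay_spread([0, 10, 25, 60], [1.0, 0.5, 0.2, 0.05]))  # ~12 ns
```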
Dual Roles for Spike Signaling in Cortical Neural Populations
Ballard, Dana H.; Jehee, Janneke F. M.
2011-01-01
A prominent feature of signaling in cortical neurons is that of randomness in the action potential. The output of a typical pyramidal cell can be well fit with a Poisson model, and variations in the Poisson rate repeatedly have been shown to be correlated with stimuli. However while the rate provides a very useful characterization of neural spike data, it may not be the most fundamental description of the signaling code. Recent data showing γ frequency range multi-cell action potential correlations, together with spike timing dependent plasticity, are spurring a re-examination of the classical model, since precise timing codes imply that the generation of spikes is essentially deterministic. Could the observed Poisson randomness and timing determinism reflect two separate modes of communication, or do they somehow derive from a single process? We investigate in a timing-based model whether the apparent incompatibility between these probabilistic and deterministic observations may be resolved by examining how spikes could be used in the underlying neural circuits. The crucial component of this model draws on dual roles for spike signaling. In learning receptive fields from ensembles of inputs, spikes need to behave probabilistically, whereas for fast signaling of individual stimuli, the spikes need to behave deterministically. Our simulations show that this combination is possible if deterministic signals using γ latency coding are probabilistically routed through different members of a cortical cell population at different times. This model exhibits standard features characteristic of Poisson models such as orientation tuning and exponential interval histograms. In addition, it makes testable predictions that follow from the γ latency coding. PMID:21687798
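The Poisson description of spiking implies exponential inter-spike intervals and a Fano factor near one for spike counts, which is easy to check numerically; the firing rate and window choices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Poisson spiking at 20 Hz: inter-spike intervals are exponential
rate = 20.0                                  # spikes per second
isi = rng.exponential(1.0 / rate, size=20000)
spikes = np.cumsum(isi)                      # spike times in seconds

# Fano factor of spike counts in 1 s windows is ~1 for a Poisson process
counts = np.histogram(spikes, bins=np.arange(0.0, spikes[-1], 1.0))[0]
print(isi.mean(), counts.var() / counts.mean())   # mean ISI ~0.05 s, Fano ~1
```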
Guidelines 13 and 14—Prediction uncertainty
Hill, Mary C.; Tiedeman, Claire
2005-01-01
An advantage of using optimization for model development and calibration is that optimization provides methods for evaluating and quantifying prediction uncertainty. Both deterministic and statistical methods can be used. Guideline 13 discusses using regression and post-audits, which we classify as deterministic methods. Guideline 14 discusses inferential statistics and Monte Carlo methods, which we classify as statistical methods.
Optimal Vaccination in a Stochastic Epidemic Model of Two Non-Interacting Populations
2015-02-17
of diminishing returns from vaccination will generally take place at smaller vaccine allocations V compared to the deterministic model. Optimal... take place and small r0 values where it does not is illustrated in Fig. 4C. As r0 is decreased, the region between the two instances of switching... approximately distribute vaccine in proportion to population size. For large r0 (r0 ≳ 2.9), two switches take place. In the deterministic optimal solution, a
Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity
NASA Astrophysics Data System (ADS)
Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján
2017-06-01
It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines the advantages offered by modelling the system dynamics with a deterministic model and the deterministic forecasting error series with a data-driven model in parallel. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting, applicable to daily river discharge forecast error data, using the GARCH family of time series models. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; then we fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again provided comparisons of the models' performance.
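A GARCH(1,1) error series of the type fitted in such studies can be simulated directly from its recursion; the coefficients below are hypothetical, not the fitted Hron/Morava values.

```python
import numpy as np

rng = np.random.default_rng(5)

# GARCH(1,1): sigma_t^2 = w + a*e_{t-1}^2 + b*sigma_{t-1}^2, e_t = sigma_t * z_t
w, a, b = 0.1, 0.2, 0.7            # hypothetical coefficients, a + b < 1
n = 5000
e = np.zeros(n)
sig2 = w / (1 - a - b)             # start at the unconditional variance (= 1.0)
for t in range(1, n):
    sig2 = w + a * e[t - 1] ** 2 + b * sig2
    e[t] = np.sqrt(sig2) * rng.normal()

# Volatility clustering: squared errors are autocorrelated although e_t is white
r = np.corrcoef(e[1:] ** 2, e[:-1] ** 2)[0, 1]
print(e.var(), r)   # variance near w/(1-a-b) = 1, positive lag-1 autocorrelation
```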
A Compartmentalized Out-of-Equilibrium Enzymatic Reaction Network for Sustained Autonomous Movement
2016-01-01
Every living cell is a compartmentalized out-of-equilibrium system exquisitely able to convert chemical energy into function. In order to maintain homeostasis, the flux of metabolites is tightly controlled by regulatory enzymatic networks. A crucial prerequisite for the development of lifelike materials is the construction of synthetic systems with compartmentalized reaction networks that maintain out-of-equilibrium function. Here, we aim for autonomous movement as an example of the conversion of feedstock molecules into function. The flux of the conversion is regulated by a rationally designed enzymatic reaction network with multiple feedforward loops. By compartmentalizing the network into bowl-shaped nanocapsules the output of the network is harvested as kinetic energy. The entire system shows sustained and tunable microscopic motion resulting from the conversion of multiple external substrates. The successful compartmentalization of an out-of-equilibrium reaction network is a major first step in harnessing the design principles of life for construction of adaptive and internally regulated lifelike systems. PMID:27924313
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. 
These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
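As a rough illustration of the normal-linear BPF mentioned in this abstract, a Gaussian prior on the predictand combined with a linear-Gaussian likelihood for the forecast yields a closed-form Gaussian posterior. A minimal Python sketch; the function and parameter names are illustrative, not from the paper:

```python
import math

def normal_linear_bpf(x, prior_mean, prior_sd, a, b, resid_sd):
    """Posterior N(mu, sd^2) for predictand W given a deterministic
    forecast x, with prior W ~ N(prior_mean, prior_sd^2) and likelihood
    X | W = w ~ N(a + b*w, resid_sd^2) fitted from historical
    forecast-observation pairs."""
    prec = 1.0 / prior_sd ** 2 + b ** 2 / resid_sd ** 2
    mu = (prior_mean / prior_sd ** 2 + b * (x - a) / resid_sd ** 2) / prec
    return mu, math.sqrt(1.0 / prec)
```

The posterior standard deviation is always below the prior's whenever b is nonzero, which is how the historical skill of the forecasting system quantifies (and reduces) the forecast uncertainty.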
From Weakly Chaotic Dynamics to Deterministic Subdiffusion via Copula Modeling
NASA Astrophysics Data System (ADS)
Nazé, Pierre
2018-03-01
Copula modeling consists of finding a probabilistic distribution, called a copula, whereby its coupling with the marginal distributions of a set of random variables produces their joint distribution. The present work aims to use this technique to connect the statistical distributions of weakly chaotic dynamics and deterministic subdiffusion. More precisely, we decompose the jump distribution of the Geisel-Thomae map into a bivariate one and determine the marginal and copula distributions respectively by infinite ergodic theory and statistical inference techniques. We therefore verify that the characteristic tail distribution of subdiffusion is an extreme value copula coupling Mittag-Leffler distributions. We also present a method to calculate the exact copula and joint distributions in the case where the weakly chaotic dynamics and deterministic subdiffusion statistical distributions are already known. Numerical simulations and consistency with the dynamical aspects of the map support our results.
Compartmental analysis of the disposition of benzo[a]pyrene in rats.
Bevan, D R; Weyand, E H
1988-11-01
We have previously reported the disposition of benzo[a]pyrene (B[a]P) and its metabolites in male Sprague-Dawley rats following intratracheal instillation of [3H]B[a]P [Weyand, E.H. and Bevan, D.R. (1986) Cancer Res., 46, 5655-5661]. In some experiments, cannulas were implanted in the bile duct of the animals prior to administration of [3H]B[a]P [Weyand, E.H. and Bevan, D.R. (1987) Drug Metab. Disposition, 15, 442-448]. Based on these data, we have developed a compartmental model of the distribution of radioactivity to provide a quantitative description of the fate of B[a]P and its metabolites in rats. Modeling of the distribution of radioactivity was performed using the Simulation, Analysis and Modeling (SAAM) and conversational SAAM (CONSAM) computer programs. Compartments in the model included organs into which the largest amounts of radioactivity were distributed as well as pathways for excretion of radioactivity from the animals. Data from animals with and without cannulas implanted in the bile duct were considered simultaneously during modeling. Radioactivity was so rapidly absorbed from the lungs that an absorption phase into blood was not apparent at the earliest sampling times. Using the model to extrapolate to shorter times, it was predicted that the maximum amount of radioactivity was present in blood within 2 min after administration. In addition, considerable recycling of radioactivity back to lungs from blood was predicted by the model. Transfer of radioactivity from blood to liver and carcass (skin, muscle, bones, fat and associated blood) also was extensive. Carcass was modeled as the sum of two compartments to obtain agreement between the model and experimental data. The model accounted for enterohepatic circulation of B[a]P metabolites; data also required that intestinal secretion be included in the model.
Quantitative data obtained from compartmental analysis included rate constants for transfer of radioactivity among compartments as well as statistical parameters indicating the identifiability of the rate constants. That the model is consistent with two sets of data, those obtained in animals with and without a biliary cannula, indicates its potential utility in predicting the disposition of B[a]P and its metabolites in vivo.
Natural analogs in the petroleum industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, J.R.
1995-09-01
This article describes the use of natural analogues in petroleum exploration and includes numerous geologic model descriptions which have historically been used in predicting the geometries and locations of oil and gas accumulations. These geologic models have been passed down to and used by succeeding generations of petroleum geologists. Some examples of these geologic models include the Allan fault-plane model, porosity prediction, basin modelling, prediction of basin compartmentalization, and diagenesis.
Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.
Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O
2006-03-01
The analysis of complex biochemical networks is conducted in two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, generating counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
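The two frameworks contrasted in this abstract can be sketched outside MATLAB as well. Below is a hedged Python analogue (not the paper's MATLAB functions) for the simplest reaction A -> 0: forward Euler for the reaction-rate ODE and Gillespie's direct method for one realisation of the CME:

```python
import math
import random

def ode_decay(a0, k, t_end, dt=1e-4):
    """Deterministic reaction-rate equation dA/dt = -k*A, forward Euler."""
    a, t = float(a0), 0.0
    while t < t_end:
        a -= k * a * dt
        t += dt
    return a

def ssa_decay(a0, k, t_end, rng):
    """One stochastic realisation of the CME for A -> 0
    (Gillespie's direct method): exponential waiting times with
    total propensity k*A, one molecule removed per event."""
    a, t = a0, 0.0
    while a > 0:
        t += rng.expovariate(k * a)
        if t > t_end:
            break
        a -= 1
    return a
```

The ODE tracks the exact mean A0*exp(-k*t), while averaging many SSA realisations recovers that mean together with the molecular-level fluctuations around it.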
A deterministic model of electron transport for electron probe microanalysis
NASA Astrophysics Data System (ADS)
Bünger, J.; Richter, S.; Torrilhon, M.
2018-01-01
Within the last decades significant improvements in the spatial resolution of electron probe microanalysis (EPMA) were obtained by instrumental enhancements. In contrast, the quantification procedures essentially remained unchanged. As the classical procedures assume either homogeneity or a multi-layered structure of the material, they limit the spatial resolution of EPMA. The possibilities of improving the spatial resolution through more sophisticated quantification procedures are therefore almost untouched. We investigate a new analytical model (M1-model) for the quantification procedure based on fast and accurate modelling of electron-X-ray-matter interactions in complex materials using a deterministic approach to solve the electron transport equations. We outline the derivation of the model from the Boltzmann equation for electron transport using the method of moments with a minimum entropy closure and present first numerical results for three different test cases (homogeneous, thin film and interface). Taking Monte Carlo as a reference, the results for the three test cases show that the M1-model is able to reproduce the electron dynamics in EPMA applications very well. Compared to classical analytical models like XPP and PAP, the M1-model is more accurate and far more flexible, which indicates the potential of deterministic models of electron transport to further increase the spatial resolution of EPMA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Decker, A.D.; Kuuskraa, V.A.; Klawitter, A.L.
Recurrent basement faulting is the primary controlling mechanism for aligning and compartmentalizing upper Cretaceous aged tight gas reservoirs of the San Juan and Piceance Basins. Northwest trending structural lineaments that formed in conjunction with the Uncompahgre Highlands have profoundly influenced sedimentation trends and created boundaries for gas migration, sealing and compartmentalizing sedimentary packages in both basins. Fractures which formed over the structural lineaments provide permeability pathways which allow gas recovery from otherwise tight gas reservoirs. Structural alignments and associated reservoir compartments have been accurately targeted by integrating advanced remote sensing imagery, high resolution aeromagnetics, seismic interpretation, stratigraphic mapping and dynamic structural modelling. This unifying methodology is a powerful tool for exploration geologists and is also a systematic approach to tight gas resource assessment in frontier basins.
Transcriptional repression mediated by repositioning of genes to the nuclear lamina.
Reddy, K L; Zullo, J M; Bertolino, E; Singh, H
2008-03-13
Nuclear compartmentalization seems to have an important role in regulating metazoan genes. Although studies on immunoglobulin and other loci have shown a correlation between positioning at the nuclear lamina and gene repression, the functional consequences of this compartmentalization remain untested. We devised an approach for inducible tethering of genes to the inner nuclear membrane (INM), and tested the consequences of such repositioning on gene activity in mouse fibroblasts. Here, using three-dimensional DNA-immunoFISH, we demonstrate repositioning of chromosomal regions to the nuclear lamina that is dependent on breakdown and reformation of the nuclear envelope during mitosis. Moreover, tethering leads to the accumulation of lamin and INM proteins, but not to association with pericentromeric heterochromatin or nuclear pore complexes. Recruitment of genes to the INM can result in their transcriptional repression. Finally, we use targeted adenine methylation (DamID) to show that, as is the case for our model system, inactive immunoglobulin loci at the nuclear periphery are contacted by INM and lamina proteins. We propose that these molecular interactions may be used to compartmentalize and to limit the accessibility of immunoglobulin loci to transcription and recombination factors.
NASA Astrophysics Data System (ADS)
Soltanzadeh, I.; Azadi, M.; Vakili, G. A.
2011-07-01
Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM the initial and boundary conditions come from analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
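The BMA construction described above combines the members into a weighted mixture of predictive densities, with the weighted ensemble mean serving as the deterministic-style forecast. A minimal Python illustration; the Gaussian-mixture form is the standard BMA setup for temperature, and the member values below are hypothetical, not the paper's fitted weights:

```python
import math

def bma_mean(means, weights):
    """Deterministic-style BMA forecast: the weighted ensemble mean."""
    return sum(w * m for w, m in zip(weights, means))

def bma_pdf(x, means, sigmas, weights):
    """BMA predictive density: a mixture of Gaussians centred on the
    member forecasts; in practice the weights (summing to 1) and sigmas
    are estimated by EM over a sliding training window."""
    return sum(
        w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))
        for m, s, w in zip(means, sigmas, weights))
```
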
NASA Astrophysics Data System (ADS)
Yan, Y.; Barth, A.; Beckers, J. M.; Candille, G.; Brankart, J. M.; Brasseur, P.
2015-07-01
Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histogram before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations, in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement of the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analyzed jointly. The consistency and complementarity between both validations are highlighted.
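The CRPS used above for probabilistic validation has a convenient kernel form for a finite ensemble. A small Python sketch of the standard ensemble estimator (an assumption; the authors' exact implementation is not given in the abstract):

```python
def crps_ensemble(members, obs):
    """CRPS of an ensemble forecast against a scalar observation,
    via the kernel identity CRPS = E|X - y| - 0.5 * E|X - X'|,
    where X, X' are independent draws from the ensemble."""
    m = len(members)
    error = sum(abs(x - obs) for x in members) / m
    spread = sum(abs(a - b) for a in members for b in members) / (m * m)
    return error - 0.5 * spread
```

For a single member the score collapses to the absolute error, which is why the CRPS is often described as a probabilistic generalization of the mean absolute error.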
A Random Variable Approach to Nuclear Targeting and Survivability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Undem, Halvor A.
We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
Classification and unification of the microscopic deterministic traffic models.
Yang, Bo; Monterola, Christopher
2015-10-01
We identify a universal mathematical structure in microscopic deterministic traffic models (with identical drivers), and thus we show that all such existing models in the literature, including both the two-phase and three-phase models, can be understood as special cases of a master model by expansion around a set of well-defined ground states. This allows any two traffic models to be properly compared and identified. The three-phase models are characterized by the vanishing of leading orders of expansion within a certain density range, and as an example the popular intelligent driver model is shown to be equivalent to a generalized optimal velocity (OV) model. We also explore the diverse solutions of the generalized OV model that can be important both for understanding human driving behaviors and algorithms for autonomous driverless vehicles.
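As a concrete instance of the optimal velocity (OV) family discussed in this abstract, here is a minimal Python sketch of a Bando-type OV model on a ring road; the OV function and parameter values are illustrative, not the generalized model of the paper:

```python
import math

def optimal_velocity(h, hc=2.0):
    """Bando-type OV function: desired speed for headway h,
    with hc the inflection (safe-distance) parameter."""
    return math.tanh(h - hc) + math.tanh(hc)

def ring_step(xs, vs, length, a=1.0, dt=0.05):
    """One Euler step for identical cars on a ring road: each car
    relaxes its speed toward the optimal velocity of its headway."""
    n = len(xs)
    acc = [a * (optimal_velocity((xs[(i + 1) % n] - xs[i]) % length) - vs[i])
           for i in range(n)]
    new_xs = [(x + v * dt) % length for x, v in zip(xs, vs)]
    new_vs = [v + g * dt for v, g in zip(vs, acc)]
    return new_xs, new_vs
```

A uniform-flow state (equal headways h, all speeds equal to V(h)) is a fixed point of this update; the paper's classification proceeds by expanding models of this kind around such ground states.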
Workstation Table Engineering Model Design, Development, Fabrication, and Testing
DOT National Transportation Integrated Search
2012-05-01
This research effort is focused on providing a workstation table design that will reduce the risk of occupant injuries due to secondary impacts and to compartmentalize the occupants to prevent impacts with other objects and/or passengers seated acros...
A stochastic model for correlated protein motions
NASA Astrophysics Data System (ADS)
Karain, Wael I.; Qaraeen, Nael I.; Ajarmah, Basem
2006-06-01
A one-dimensional Langevin-type stochastic difference equation is used to find the deterministic and Gaussian contributions of time series representing the projections of a Bovine Pancreatic Trypsin Inhibitor (BPTI) protein molecular dynamics simulation along different eigenvector directions determined using principal component analysis. The deterministic part shows a distinct nonlinear behavior only for eigenvectors contributing significantly to the collective protein motion.
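The decomposition described above, separating a deterministic drift from Gaussian noise in a time series, can be approximated by conditionally averaging increments. A hedged Python sketch of such a binned drift estimator (an illustration of the idea, not the authors' exact procedure):

```python
from collections import defaultdict

def drift_from_series(x, n_bins=20):
    """Estimate the deterministic part f(x) of a Langevin-type
    difference equation x[t+1] = x[t] + f(x[t]) + noise, by averaging
    the increments conditioned on bins of x (a binned first
    Kramers-Moyal coefficient). Returns {bin_center: drift}."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_bins or 1.0
    sums = defaultdict(float)
    counts = defaultdict(int)
    for a, b in zip(x, x[1:]):
        i = min(int((a - lo) / width), n_bins - 1)
        sums[i] += b - a
        counts[i] += 1
    return {lo + (i + 0.5) * width: sums[i] / counts[i] for i in sums}
```

Applied to a projection of a molecular dynamics trajectory, a nonlinear shape of the recovered drift curve is the kind of signature the abstract reports for the dominant eigenvector directions.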
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models. The solutions are then compared with NASA experimental tests and deterministic results. MCS with probabilistic material data provide a better perspective on the results than a single deterministic simulation does. The next part of the research is to implement the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, in a deterministic optimization problem, even though the structure is cost effective, it becomes highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, in addition to structural performance constraints, reliability constraints are also considered.
This part of the research begins with an introduction to reliability analysis, covering first-order and second-order reliability methods, followed by the simulation techniques performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation including sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, implementations of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem are presented and discussed.
Gamma time-dependency in Blaxter's compartmental model.
NASA Technical Reports Server (NTRS)
Matis, J. H.
1972-01-01
A new two-compartment model for the passage of particles through the gastro-intestinal tract of ruminants is proposed. In this model, a gamma distribution of lifetimes is introduced in the first compartment; thereby, passage from that compartment becomes time-dependent. This modification is strongly suggested by the physical alteration which certain substances, e.g. hay particles, undergo in the digestive process. The proposed model is applied to experimental data.
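The gamma distribution of lifetimes in the first compartment can be simulated with the standard "linear chain trick", since a gamma with integer shape (Erlang) is a sum of exponential stages. A minimal Python sketch; the stage counts and rates below are illustrative, not fitted to the paper's data:

```python
import random

def transit_time(n_stages, k_stage, k2, rng):
    """Sample a particle's total transit time: an Erlang (gamma with
    integer shape n_stages, rate k_stage per stage) residence in
    compartment 1, built as a chain of exponential sub-stages,
    followed by an exponential residence (rate k2) in compartment 2."""
    t1 = sum(rng.expovariate(k_stage) for _ in range(n_stages))
    return t1 + rng.expovariate(k2)
```

The mean transit time is n_stages/k_stage + 1/k2; making compartment 1 gamma-distributed rather than exponential is what introduces the time-dependent outflow the abstract describes.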
Understanding post-operative temperature drop in cardiac surgery: a mathematical model.
Tindall, M J; Peletier, M A; Severens, N M W; Veldman, D J; de Mol, B A J M
2008-12-01
A mathematical model is presented to understand heat transfer processes during the cooling and re-warming of patients during cardiac surgery. Our compartmental model is able to account for many of the qualitative features observed in the cooling of various regions of the body including the central core containing the majority of organs, the rectal region containing the intestines and the outer peripheral region of skin and muscle. In particular, we focus on the issue of afterdrop: a drop in core temperature following patient re-warming, which can lead to serious post-operative complications. Model results for a typical cooling and re-warming procedure during surgery are in qualitative agreement with experimental data in producing the afterdrop effect and the observed dynamical variation in temperature between the core, rectal and peripheral regions. The influence of heat transfer processes and the volume of each compartmental region on the afterdrop effect is discussed. We find that excess fat on the peripheral and rectal regions leads to an increase in the afterdrop effect. Our model predicts that, by allowing constant re-warming after the core temperature has been raised, the afterdrop effect will be reduced.
A physiology-based parametric imaging method for FDG-PET data
NASA Astrophysics Data System (ADS)
Scussolini, Mara; Garbarino, Sara; Sambuceti, Gianmario; Caviglia, Giacomo; Piana, Michele
2017-12-01
Parametric imaging is a compartmental approach that processes nuclear imaging data to estimate the spatial distribution of the kinetic parameters governing tracer flow. The present paper proposes a novel and efficient computational method for parametric imaging which is potentially applicable to several compartmental models of diverse complexity and which is effective in the determination of the parametric maps of all kinetic coefficients. We consider applications to [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) data and analyze the two-compartment catenary model describing the standard FDG metabolization by a homogeneous tissue and the three-compartment non-catenary model representing the renal physiology. We show uniqueness theorems for both models. The proposed imaging method starts from the reconstructed FDG-PET images of tracer concentration and preliminarily applies image processing algorithms for noise reduction and image segmentation. The optimization procedure solves pixel-wise the non-linear inverse problem of determining the kinetic parameters from dynamic concentration data through a regularized Gauss-Newton iterative algorithm. The reliability of the method is validated against synthetic data, for the two-compartment system, and experimental real data of murine models, for the renal three-compartment system.
A network-based approach for resistance transmission in bacterial populations.
Gehring, Ronette; Schumm, Phillip; Youssef, Mina; Scoglio, Caterina
2010-01-07
Horizontal transfer of mobile genetic elements (conjugation) is an important mechanism whereby resistance is spread through bacterial populations. The aim of our work is to develop a mathematical model that quantitatively describes this process, and to use this model to optimize antimicrobial dosage regimens to minimize resistance development. The bacterial population is conceptualized as a compartmental mathematical model to describe changes in susceptible, resistant, and transconjugant bacteria over time. This model is combined with a compartmental pharmacokinetic model to explore the effect of different plasma drug concentration profiles. An agent-based simulation tool is used to account for resistance transfer occurring when two bacteria are adjacent or in close proximity. In addition, a non-linear programming optimal control problem is introduced to minimize bacterial populations as well as the drug dose. Simulation and optimization results suggest that the rapid death of susceptible individuals in the population is pivotal in minimizing the number of transconjugants in a population. This supports the use of potent antimicrobials that rapidly kill susceptible individuals and development of dosage regimens that maintain effective antimicrobial drug concentrations for as long as needed to kill off the susceptible population. Suggestions are made for experiments to test the hypotheses generated by these simulations.
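The compartmental susceptible/resistant/transconjugant dynamics described above can be caricatured with a simple Euler step. All parameter names and values here are illustrative assumptions, not the authors' fitted model:

```python
def srt_step(S, R, T, dt, r=0.7, K=1e9, beta=1e-9, kill=0.0):
    """One Euler step of a hedged S/R/T conjugation model:
    logistic growth for all classes, drug kill acting on
    susceptibles, and mass-action plasmid transfer from
    donors (R + T) converting susceptibles to transconjugants."""
    N = S + R + T
    growth = r * (1.0 - N / K)
    conj = beta * S * (R + T)
    dS = growth * S - kill * S - conj
    dR = growth * R
    dT = growth * T + conj
    return S + dS * dt, R + dR * dt, T + dT * dt
```

In this toy setting, raising the kill term shrinks the susceptible pool that the conjugation term feeds on, echoing the abstract's conclusion that rapid killing of susceptibles minimizes transconjugant numbers.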
Compartmental transport model of microbicide delivery by an intravaginal ring
Geonnotti, Anthony R.; Katz, David F.
2010-01-01
Topical antimicrobials, or microbicides, are being developed to prevent HIV transmission through local, mucosal delivery of antiviral compounds. While hydrogel vehicles deliver the majority of current microbicide products, intravaginal rings (IVRs) are an alternative microbicide modality in preclinical development. IVRs provide a long-term dosing alternative to hydrogel use, and might provide improved user adherence. IVR efficacy requires sustained delivery of antiviral compounds to the entire vaginal compartment. A two-dimensional, compartmental vaginal drug transport model was created to evaluate the delivery of drugs from an intravaginal ring. The model utilized MRI-derived ring geometry and location, experimentally defined ring fluxes and vaginal fluid velocities, and biophysically relevant transport theory. Model outputs indicated the presence of potentially inhibitory concentrations of antiviral compounds along the entire vaginal canal within 24 hours following IVR insertion. Distributions of inhibitory concentrations of antiviral compounds were substantially influenced by vaginal fluid flow and production, while showing little change due to changes in diffusion coefficients or ring fluxes. Additionally, model results were predictive of in vivo concentrations obtained in clinical trials. Overall, this analysis initiates a mechanistic computational framework, heretofore missing, to understand and evaluate the potential of IVRs for effective delivery of antiviral compounds. PMID:20222027
NASA Astrophysics Data System (ADS)
Reynders, Edwin P. B.; Langley, Robin S.
2018-08-01
The hybrid deterministic-statistical energy analysis method has proven to be a versatile framework for modeling built-up vibro-acoustic systems. The stiff system components are modeled deterministically, e.g., using the finite element method, while the wave fields in the flexible components are modeled as diffuse. In the present paper, the hybrid method is extended such that not only the ensemble mean and variance of the harmonic system response can be computed, but also of the band-averaged system response. This variance represents the uncertainty that is due to the assumption of a diffuse field in the flexible components of the hybrid system. The developments start with a cross-frequency generalization of the reciprocity relationship between the total energy in a diffuse field and the cross spectrum of the blocked reverberant loading at the boundaries of that field. By making extensive use of this generalization in a first-order perturbation analysis, explicit expressions are derived for the cross-frequency and band-averaged variance of the vibrational energies in the diffuse components and for the cross-frequency and band-averaged variance of the cross spectrum of the vibro-acoustic field response of the deterministic components. These expressions are extensively validated against detailed Monte Carlo analyses of coupled plate systems in which diffuse fields are simulated by randomly distributing small point masses across the flexible components, and good agreement is found.
Hybrid stochastic and deterministic simulations of calcium blips.
Rüdiger, S; Shuai, J W; Huisinga, W; Nagaiah, C; Warnecke, G; Parker, I; Falcke, M
2007-09-15
Intracellular calcium release is a prime example for the role of stochastic effects in cellular systems. Recent models consist of deterministic reaction-diffusion equations coupled to stochastic transitions of calcium channels. The resulting dynamics is of multiple time and spatial scales, which complicates far-reaching computer simulations. In this article, we introduce a novel hybrid scheme that is especially tailored to accurately trace events with essential stochastic variations, while deterministic concentration variables are efficiently and accurately traced at the same time. We use finite elements to efficiently resolve the extreme spatial gradients of concentration variables close to a channel. We describe the algorithmic approach and we demonstrate its efficiency compared to conventional methods. Our single-channel model matches experimental data and results in intriguing dynamics if calcium is used as charge carrier. Random openings of the channel accumulate in bursts of calcium blips that may be central for the understanding of cellular calcium dynamics.
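The hybrid idea above, stochastic channel transitions coupled to deterministic concentration dynamics, can be caricatured in a few lines. This is a zero-dimensional sketch with made-up rates; the paper's scheme is spatial and uses finite elements to resolve the gradients near the channel:

```python
import random

def hybrid_blip(t_end, dt, k_open, k_close, rng, J=10.0, k_decay=5.0):
    """Hedged sketch of a hybrid scheme: a two-state stochastic channel
    (exponential open/close waiting times) drives a deterministic
    concentration ODE dc/dt = J*open - k_decay*c, Euler in time."""
    c, state, t = 0.0, 0, 0.0
    t_switch = rng.expovariate(k_open)  # time of next stochastic transition
    trace = []
    while t < t_end:
        if t >= t_switch:
            state = 1 - state  # toggle closed <-> open
            t_switch = t + rng.expovariate(k_close if state == 1 else k_open)
        c += (J * state - k_decay * c) * dt
        t += dt
        trace.append(c)
    return trace
```

Random channel openings produce transient rises in c that relax deterministically between events, a minimal analogue of the calcium blips in the abstract.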
Detecting and disentangling nonlinear structure from solar flux time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.
1992-01-01
Interest in solar activity has grown in the past two decades for many reasons. Most importantly for flight dynamics, solar activity changes the atmospheric density, which has important implications for spacecraft trajectory and lifetime prediction. Building upon the previously developed Rayleigh-Benard nonlinear dynamic solar model, which exhibits many dynamic behaviors observed in the Sun, this work introduces new chaotic solar forecasting techniques. Our attempt to use recently developed nonlinear chaotic techniques to model and forecast solar activity has uncovered highly entangled dynamics. Numerical techniques for decoupling additive and multiplicative white noise from deterministic dynamics are presented, and the falloff of the power spectra at high frequencies is examined as a possible means of distinguishing deterministic chaos from noise that is spectrally white or colored. The power spectral techniques presented are less cumbersome than current methods for identifying deterministic chaos, which require more computationally intensive calculations, such as those involving Lyapunov exponents and attractor dimension.
Alex L Shigo
1983-01-01
This guide shows, in 110 photos, how discoloration and decay form in trees. An expanded concept of tree decay is given. After wounding, trees form boundaries to resist the spread of pathogens. The boundary-setting defense process is called compartmentalization, and a model of the process is CODIT. The expanded concept and the model are used to reexamine many other tree...
Stochastic Analysis and Probabilistic Downscaling of Soil Moisture
NASA Astrophysics Data System (ADS)
Deshon, J. P.; Niemann, J. D.; Green, T. R.; Jones, A. S.
2017-12-01
Soil moisture is a key variable for rainfall-runoff response estimation, ecological and biogeochemical flux estimation, and biodiversity characterization, each of which is useful for watershed condition assessment. These applications require not only accurate, fine-resolution soil-moisture estimates but also confidence limits on those estimates and soil-moisture patterns that exhibit realistic statistical properties (e.g., variance and spatial correlation structure). The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution (9-40 km) soil moisture from satellite remote sensing or land-surface models to produce fine-resolution (10-30 m) estimates. The model was designed to produce accurate deterministic soil-moisture estimates at multiple points, but the resulting patterns do not reproduce the variance or spatial correlation of observed soil-moisture patterns. The primary objective of this research is to generalize the EMT+VS model to produce a probability density function (pdf) for soil moisture at each fine-resolution location and time. Each pdf has a mean that is equal to the deterministic soil-moisture estimate, and the pdf can be used to quantify the uncertainty in the soil-moisture estimates and to simulate soil-moisture patterns. Different versions of the generalized model are hypothesized based on how uncertainty enters the model, whether the uncertainty is additive or multiplicative, and which distributions describe the uncertainty. These versions are then tested by application to four catchments with detailed soil-moisture observations (Tarrawarra, Satellite Station, Cache la Poudre, and Nerrigundah). The performance of the generalized models is evaluated by comparing the statistical properties of the simulated soil-moisture patterns to those of the observations and the deterministic EMT+VS model. 
The versions of the generalized EMT+VS model with normally distributed stochastic components produce soil-moisture patterns with more realistic statistical properties than the deterministic model. Additionally, the results suggest that the variance and spatial correlation of the stochastic soil-moisture variations do not vary consistently with the spatial-average soil moisture.
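The additive and multiplicative uncertainty structures described above can be sketched in a few lines (the function name, parameter values, and the choice of normal distributions are illustrative assumptions for this note, not the EMT+VS implementation):

```python
import numpy as np

def sample_soil_moisture(theta_det, sigma, rng, additive=True):
    """Draw one stochastic fine-resolution soil-moisture pattern around a
    deterministic estimate theta_det (hypothetical sketch).

    additive=True  : theta = theta_det + eps,  eps ~ N(0, sigma^2)
    additive=False : theta = theta_det * eta,  eta ~ N(1, sigma^2)
    In both cases E[theta] = theta_det, matching the requirement that each
    pdf's mean equal the deterministic estimate.
    """
    theta_det = np.asarray(theta_det, dtype=float)
    if additive:
        theta = theta_det + rng.normal(0.0, sigma, size=theta_det.shape)
    else:
        theta = theta_det * rng.normal(1.0, sigma, size=theta_det.shape)
    return np.clip(theta, 0.0, 1.0)  # keep volumetric moisture physical

rng = np.random.default_rng(0)
pattern = sample_soil_moisture(np.full(1000, 0.25), sigma=0.03, rng=rng)
```

Repeating the draw many times yields an ensemble from which confidence limits, variance, and spatial correlation of the simulated patterns can be estimated.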
Shin, Hwa Sung; Kim, Hyung Joon; Min, Seul Ki; Kim, Sung Hoon; Lee, Byung Man; Jeon, Noo Li
2010-08-01
Axonal pathology has been clearly implicated in neurodegenerative diseases, making the compartmental culture of neurons a useful research tool. Primary neurons have already been cultured in compartmental microfluidic devices, but deriving them from animals is time-consuming and difficult, and their sources are limited. Embryonic stem cell (ESC)-derived neurons (ESC_Ns) overcome this limitation, since ESCs can be renewed without limit and differentiated into ESC_Ns by robust and reproducible protocols. In this research, ESC_Ns were derived from mouse ESCs in compartmental microfluidic devices, and their axons were isolated from the somal cell bodies. Once embryoid bodies (EBs) were localized in the microfluidic culture chamber, ESC_Ns spread out from the EBs and occupied the cell culture chamber. Their axons traversed the microchannels and were finally isolated from the somata, providing an arrangement comparable to dissociated primary neurons. This ESC_N compartmental microfluidic culture system not only offers a substitute for the primary-neuron counterpart system but also enables direct comparisons between the two systems.
Compartmentalized Platforms for Neuro-pharmacological Research
Jadhav, Amol D.; Wei, Li; Shi, Peng
2016-01-01
Dissociated primary neuronal cell culture remains an indispensable approach in neurobiology research for investigating basic mechanisms underlying diverse neuronal functions, drug screening, and pharmacological investigation. Compartmentalization, a technique widely adopted since its emergence in the 1970s, enables spatial segregation of neuronal segments and detailed investigation that is otherwise limited with traditional culture methods. Although these compartmental chambers (e.g., the Campenot chamber) have proven valuable for the investigation of Peripheral Nervous System (PNS) neurons and, to some extent, Central Nervous System (CNS) neurons, their utility has remained limited given the arduous manufacturing process, incompatibility with high-resolution optical imaging, and limited throughput. Developments in microfabrication and microfluidics have enabled the creation of next-generation compartmentalized devices that are cheap and easy to manufacture, require reduced sample volumes, enable precise spatial and temporal control over the cellular microenvironment, and permit high-throughput testing. In this review we briefly evaluate the various compartmentalization tools used for neurobiological research and highlight applications of the emerging microfluidic platforms in in vitro single-cell neurobiology. PMID:26813122
Stochastic modelling of microstructure formation in solidification processes
NASA Astrophysics Data System (ADS)
Nastac, Laurentiu; Stefanescu, Doru M.
1997-07-01
To relax many of the assumptions used in continuum approaches, a general stochastic model has been developed. The stochastic model can be used not only for an accurate description of the fraction of solid evolution, and therefore accurate cooling curves, but also for simulation of microstructure formation in castings. The advantage of using the stochastic approach is to give a time- and space-dependent description of solidification processes. Time- and space-dependent processes can also be described by partial differential equations. Unlike a differential formulation which, in most cases, has to be transformed into a difference equation and solved numerically, the stochastic approach is essentially a direct numerical algorithm. The stochastic model is comprehensive, since the competition between various phases is considered. Furthermore, grain impingement is directly included through the structure of the model. In the present research, all grain morphologies are simulated with this procedure. The relevance of the stochastic approach is that the simulated microstructures can be directly compared with microstructures obtained from experiments. The computer becomes a 'dynamic metallographic microscope'. A comparison between deterministic and stochastic approaches has been performed. An important objective of this research was to answer the following general questions: (1) 'Would fully deterministic approaches continue to be useful in solidification modelling?' and (2) 'Would stochastic algorithms be capable of entirely replacing purely deterministic models?'
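The flavor of such a direct stochastic algorithm can be conveyed by a deliberately simple cellular sketch (grid size, seed count, and the fixed capture probability are illustrative assumptions; the paper's model of nucleation, growth kinetics, and phase competition is far richer):

```python
import numpy as np

def grow_grains(n=64, n_seeds=12, steps=500, seed=0):
    """Toy stochastic grain-growth sketch: nucleate n_seeds grains on an
    n x n periodic grid, then let each liquid cell (id 0) capture the grain
    id of a random solid neighbour with fixed probability 0.5 per step.
    Grain impingement is automatic: a cell, once solid, never changes id."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((n, n), dtype=int)
    ys, xs = rng.integers(0, n, n_seeds), rng.integers(0, n, n_seeds)
    grid[ys, xs] = np.arange(1, n_seeds + 1)  # distinct grain ids
    for _ in range(steps):
        solid = grid > 0
        for y, x in zip(*np.nonzero(~solid)):
            nbrs = [grid[(y + dy) % n, (x + dx) % n]
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            nbrs = [g for g in nbrs if g > 0]
            if nbrs and rng.random() < 0.5:   # stochastic capture event
                grid[y, x] = rng.choice(nbrs)
        if (grid > 0).all():                  # fully solidified
            break
    return grid

grid = grow_grains()
```

The returned array of grain ids is the kind of simulated micrograph that can be compared directly with experimental microstructures.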
Multi-Scale Modeling of the Gamma Radiolysis of Nitrate Solutions.
Horne, Gregory P; Donoclift, Thomas A; Sims, Howard E; Orr, Robin M; Pimblott, Simon M
2016-11-17
A multiscale modeling approach has been developed for the long-term radiolysis of aqueous systems over extended time scales. The approach uses a combination of stochastic track structure and track chemistry as well as deterministic homogeneous chemistry techniques and involves four key stages: radiation track structure simulation, the subsequent physicochemical processes, nonhomogeneous diffusion-reaction kinetic evolution, and homogeneous bulk chemistry modeling. The first three components model the physical and chemical evolution of an isolated radiation chemical track and provide radiolysis yields, within the extremely low dose isolated track paradigm, as the input parameters for a bulk deterministic chemistry model. This approach to radiation chemical modeling has been tested by comparison with the experimentally observed yield of nitrite from the gamma radiolysis of sodium nitrate solutions. This is a complex radiation chemical system which is strongly dependent on secondary reaction processes. The concentration of nitrite is not just dependent upon the evolution of radiation track chemistry and the scavenging of the hydrated electron and its precursors but also on the subsequent reactions of the products of these scavenging reactions with other water radiolysis products. Without the inclusion of intratrack chemistry, the deterministic component of the multiscale model is unable to correctly predict experimental data, highlighting the importance of intratrack radiation chemistry in the chemical evolution of the irradiated system.
Multicompartmentalized polymersomes for selective encapsulation of biomacromolecules.
Fu, Zhikang; Ochsner, Mirjam Andreasson; de Hoog, Hans-Peter M; Tomczak, Nikodem; Nallani, Madhavan
2011-03-14
Multicompartmentalized polymersomes are formed using block co-polymers PMOXA-PDMS-PMOXA and PS-PIAT, and are subsequently proven to be capable of selective encapsulation of biomacromolecules. This architecture mimics the compartmentalization found in cells and may serve as a simple, albeit robust, model system.
A deterministic width function model
NASA Astrophysics Data System (ADS)
Puente, C. E.; Sivakumar, B.
Use of a deterministic fractal-multifractal (FM) geometric method to model width functions of natural river networks, as derived distributions of simple multifractal measures via fractal interpolating functions, is reported. It is first demonstrated that the FM procedure may be used to simulate natural width functions, preserving their most relevant features like their overall shape and texture and their observed power-law scaling on their power spectra. It is then shown, via two natural river networks (Racoon and Brushy creeks in the United States), that the FM approach may also be used to closely approximate existing width functions.
Probabilistic Modeling of the Renal Stone Formation Module
NASA Technical Reports Server (NTRS)
Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.
2013-01-01
The Integrated Medical Model (IMM) is a probabilistic tool, used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led the Human Research Program (HRP) to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions to the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs to the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate.
The model will be driven by Monte Carlo simulations, repeatedly drawing random samples from the probability distributions of the electrolyte concentrations and system parameters that are inputs into the deterministic model. The total urine chemistry concentrations are used to determine the urine chemistry activity using the Joint Expert Speciation System (JESS), a biochemistry model. Information from JESS is then fed into the deterministic growth model. Outputs from JESS and the deterministic model are passed back to the probabilistic model, where a multivariate regression is used to assess the likelihood of a stone forming and the likelihood of a stone requiring clinical intervention. The parameters used to quantify these risks include: relative supersaturation (RS) of calcium oxalate, citrate/calcium ratio, crystal number density, total urine volume, pH, magnesium excretion, maximum stone width, and ureteral location. Methods and Validation: The RSFM is designed to perform a Monte Carlo simulation to generate probability distributions of clinically significant renal stones, as well as provide an associated uncertainty in the estimate. Initially, early versions will be used to test integration of the components and assess component validation and verification (V&V), with later versions used to address questions regarding design reference mission scenarios. Once integrated with the deterministic component, the credibility assessment of the integrated model will follow NASA STD 7009 requirements.
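The probabilistic-deterministic coupling described above can be sketched as a simple Monte Carlo pipeline (all distributions, rate expressions, and the logistic risk mapping below are illustrative placeholders, not the RSFM, JESS, or the Kassemi et al. growth model):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo draws

# Hypothetical input distributions for two urine-chemistry parameters
# (illustrative values; the RSFM samples many more, speciated via JESS).
calcium = rng.lognormal(mean=np.log(4.0), sigma=0.3, size=N)   # mmol/day
volume = rng.normal(loc=1.5, scale=0.4, size=N).clip(0.5)      # L/day

def stone_width(ca, vol):
    """Placeholder for the deterministic growth model: higher calcium and
    lower urine volume yield a larger maximum stone width (mm)."""
    return 0.5 * ca / vol

def p_clinical(width):
    """Placeholder regression step mapping stone width to the probability
    of a clinically significant presentation."""
    return 1.0 / (1.0 + np.exp(-(width - 2.0)))

widths = stone_width(calcium, volume)
p = p_clinical(widths)
risk = p.mean()                      # point estimate of the likelihood
ci = np.percentile(p, [2.5, 97.5])   # uncertainty from the input spread
```

Sampling the inputs rather than fixing them is what lets the module report both a likelihood estimate and its associated uncertainty.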
Dynamics of Zika virus outbreaks: an overview of mathematical modeling approaches.
Wiratsudakul, Anuwat; Suparit, Parinya; Modchang, Charin
2018-01-01
The Zika virus was first discovered in 1947. It was neglected until a major outbreak occurred on Yap Island, Micronesia, in 2007. Teratogenic effects resulting in microcephaly in newborn infants are the greatest public health threat. In 2016, the Zika virus epidemic was declared a Public Health Emergency of International Concern (PHEIC). Consequently, mathematical models were constructed to explicitly elucidate related transmission dynamics. In this review article, two steps of journal article searching were performed. First, we attempted to identify mathematical models previously applied to the study of vector-borne diseases using the search terms "dynamics," "mathematical model," "modeling," and "vector-borne" together with the names of vector-borne diseases including chikungunya, dengue, malaria, West Nile, and Zika. Then the identified types of model were further investigated. Second, we narrowed down our survey to focus on only Zika virus research. The terms we searched for were "compartmental," "spatial," "metapopulation," "network," "individual-based," "agent-based" AND "Zika." All relevant studies were included regardless of the year of publication. We have collected research articles that were published before August 2017 based on our search criteria. In this publication survey, we explored the Google Scholar and PubMed databases. We found five basic model architectures previously applied to vector-borne virus studies, particularly in Zika virus simulations. These include compartmental, spatial, metapopulation, network, and individual-based models. We found that Zika models carried out for early epidemics were mostly fit into compartmental structures and were less complicated compared to the more recent ones. Simple models are still commonly used for the timely assessment of epidemics. Nevertheless, due to the availability of large-scale real-world data and computational power, recently there has been growing interest in more complex modeling frameworks.
Mathematical models are employed to explore and predict how an infectious disease spreads in the real world, evaluate the disease importation risk, and assess the effectiveness of intervention strategies. As the trends in modeling of infectious diseases have been shifting towards data-driven approaches, simple and complex models should be exploited differently. Simple models can be produced in a timely fashion to provide an estimation of the possible impacts. In contrast, complex models integrating real-world data require more time to develop but are far more realistic. The preparation of complicated modeling frameworks prior to the outbreaks is recommended, including the case of future Zika epidemic preparation.
Pharmacokinetic modeling in aquatic animals. 1. Models and concepts
Barron, M.G.; Stehly, Guy R.; Hayton, W.L.
1990-01-01
While clinical and toxicological applications of pharmacokinetics have continued to evolve both conceptually and experimentally, pharmacokinetic modeling in aquatic animals has not progressed accordingly. In this paper we present methods and concepts of pharmacokinetic modeling in aquatic animals using multicompartmental, clearance-based, non-compartmental, and physiologically based pharmacokinetic models. These models should be considered as alternatives to traditional approaches, which assume that the animal acts as a single homogeneous compartment with apparent monoexponential elimination.
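As a point of reference for the multicompartmental alternative, the standard two-compartment IV-bolus model has a closed-form bi-exponential solution (the rate constants and dose below are illustrative numbers, not fish-specific data):

```python
import numpy as np

def two_compartment_conc(t, dose, V1, k10, k12, k21):
    """Central-compartment concentration after an IV bolus in the standard
    two-compartment model: k10 is elimination from the central compartment,
    k12/k21 are transfer rates to/from the peripheral compartment, V1 is
    the central volume of distribution.  This is the classic bi-exponential
    solution C(t) = A e^(-alpha t) + B e^(-beta t)."""
    ksum = k10 + k12 + k21
    disc = np.sqrt(ksum**2 - 4.0 * k10 * k21)
    alpha, beta = 0.5 * (ksum + disc), 0.5 * (ksum - disc)
    C0 = dose / V1
    A = C0 * (alpha - k21) / (alpha - beta)
    B = C0 * (k21 - beta) / (alpha - beta)
    return A * np.exp(-alpha * t) + B * np.exp(-beta * t)

times = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
conc = two_compartment_conc(times, dose=100.0, V1=10.0,
                            k10=0.3, k12=0.2, k21=0.1)
```

Fitting such a curve to concentration-time data, rather than a single exponential, is what distinguishes the multicompartmental approach from the single-homogeneous-compartment assumption criticized above.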
Stochastic Processes in Physics: Deterministic Origins and Control
NASA Astrophysics Data System (ADS)
Demers, Jeffery
Stochastic processes are ubiquitous in the physical sciences and engineering. While often used to model imperfections and experimental uncertainties in the macroscopic world, stochastic processes can attain deeper physical significance when used to model the seemingly random and chaotic nature of the underlying microscopic world. Nowhere is this notion more prevalent than in the field of stochastic thermodynamics - a modern systematic framework used to describe mesoscale systems in strongly fluctuating thermal environments which has revolutionized our understanding of, for example, molecular motors, DNA replication, far-from-equilibrium systems, and the laws of macroscopic thermodynamics as they apply to the mesoscopic world. With progress, however, come further challenges and deeper questions, most notably in the thermodynamics of information processing and feedback control. Here it is becoming increasingly apparent that, due to divergences and subtleties of interpretation, the deterministic foundations of the stochastic processes themselves must be explored and understood. This thesis presents a survey of stochastic processes in physical systems, the deterministic origins of their emergence, and the subtleties associated with controlling them. First, we study time-dependent billiards in the quivering limit - a limit where a billiard system is indistinguishable from a stochastic system, and where the simplified stochastic system allows us to view issues associated with deterministic time-dependent billiards in a new light and address some long-standing problems. Then, we embark on an exploration of the deterministic microscopic Hamiltonian foundations of non-equilibrium thermodynamics, and we find that important results from mesoscopic stochastic thermodynamics have simple microscopic origins which would not be apparent without the benefit of both the micro and meso perspectives.
Finally, we study the problem of stabilizing a stochastic Brownian particle with feedback control, and we find that in order to avoid paradoxes involving the first law of thermodynamics, we need a model for the fine details of the thermal driving noise. The underlying theme of this thesis is the argument that the deterministic microscopic perspective and stochastic mesoscopic perspective are both important and useful, and when used together, we can more deeply and satisfyingly understand the physics occurring over either scale.
Nanopore Current Oscillations: Nonlinear Dynamics on the Nanoscale.
Hyland, Brittany; Siwy, Zuzanna S; Martens, Craig C
2015-05-21
In this Letter, we describe theoretical modeling of an experimentally realized nanoscale system that exhibits the general universal behavior of a nonlinear dynamical system. In particular, we consider the description of voltage-induced current fluctuations through a single nanopore from the perspective of nonlinear dynamics. We briefly review the experimental system and its behavior observed and then present a simple phenomenological nonlinear model that reproduces the qualitative behavior of the experimental data. The model consists of a two-dimensional deterministic nonlinear bistable oscillator experiencing both dissipation and random noise. The multidimensionality of the model and the interplay between deterministic and stochastic forces are both required to obtain a qualitatively accurate description of the physical system.
Go big or go home: impact of screening coverage on syphilis infection dynamics.
Tuite, Ashleigh; Fisman, David
2016-02-01
Syphilis outbreaks in urban men who have sex with men (MSM) are an ongoing public health challenge in many high-income countries, despite intensification of efforts to screen and treat at-risk individuals. We sought to understand how population-level coverage of asymptomatic screening impacts the ability to control syphilis transmission. We developed a risk-structured deterministic compartmental mathematical model of syphilis transmission in a population of sexually active MSM. We assumed a baseline level of treatment of syphilis cases due to seeking medical care in all scenarios. We evaluated the impact of sustained annual population-wide screening coverage ranging from 0% to 90% on syphilis incidence over the short term (20 years) and at endemic equilibrium. The relationship between screening coverage and equilibrium syphilis incidence displayed an inverted U-shaped relationship, with peak equilibrium incidence occurring with 20-30% annual screening coverage. Annual screening of 62% of the population was required for local elimination (incidence <1 case per 100 000 population). Results were qualitatively similar in the face of differing programmatic, behavioural and natural history assumptions, although the screening thresholds for local elimination differed. With 6-monthly or 3-monthly screening, the population coverage required to achieve local elimination was reduced to 39% or 23%, respectively. Although screening has the potential to control syphilis outbreaks, suboptimal coverage may paradoxically lead to a higher equilibrium infection incidence than that observed in the absence of intervention. Suboptimal screening programme design should be considered as a possible contributor to unsuccessful syphilis control programmes in the context of the current epidemic. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
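The way screening enters such a deterministic compartmental model can be sketched with a toy SIS structure, treating annual screening coverage as an extra per-capita removal rate (all rates here are hypothetical; this stripped-down model lacks the risk structure and syphilis natural history that produce the paper's inverted U-shape, and its equilibrium incidence simply decreases with screening):

```python
import numpy as np

def endemic_incidence(beta, gamma, sigma):
    """Equilibrium incidence of a minimal SIS model with screening,
    dI/dt = beta*S*I - (gamma + sigma)*I with S + I = 1, where beta is the
    transmission rate, gamma the baseline care-seeking treatment rate, and
    sigma the screening-and-treatment rate."""
    I_star = max(0.0, 1.0 - (gamma + sigma) / beta)  # endemic prevalence
    return beta * (1.0 - I_star) * I_star            # new infections/time

coverages = np.linspace(0.0, 0.9, 10)
incidence = [endemic_incidence(beta=2.0, gamma=1.0, sigma=c)
             for c in coverages]
```

Comparing this monotone response with the paper's non-monotone result highlights why the richer risk-structured model, not screening alone, drives the paradoxical behavior.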
NASA Astrophysics Data System (ADS)
Contreras, Arturo Javier
This dissertation describes a novel Amplitude-versus-Angle (AVA) inversion methodology to quantitatively integrate pre-stack seismic data, well logs, geologic data, and geostatistical information. Deterministic and stochastic inversion algorithms are used to characterize flow units of deepwater reservoirs located in the central Gulf of Mexico. A detailed fluid/lithology sensitivity analysis was conducted to assess the nature of AVA effects in the study area. Standard AVA analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generate typical Class III AVA responses. Layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution, indicating that presence of light saturating fluids clearly affects the elastic response of sands. Accordingly, AVA deterministic and stochastic inversions, which combine the advantages of AVA analysis with those of inversion, have provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties and fluid-sensitive modulus attributes (P-Impedance, S-Impedance, density, and LambdaRho, in the case of deterministic inversion; and P-velocity, S-velocity, density, and lithotype (sand-shale) distributions, in the case of stochastic inversion). The quantitative use of rock/fluid information through AVA seismic data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, provides accurate 3D models of petrophysical properties such as porosity, permeability, and water saturation. Pre-stack stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques. Furthermore, 3D petrophysical models can be more accurately co-simulated from AVA stochastic inversion results. 
By combining AVA sensitivity analysis techniques with pre-stack stochastic inversion, geologic data, and awareness of inversion pitfalls, it is possible to substantially reduce the risk in exploration and development of conventional and non-conventional reservoirs. From the final integration of deterministic and stochastic inversion results with depositional models and analogous examples, the M-series reservoirs have been interpreted as stacked terminal turbidite lobes within an overall fan complex (the Miocene MCAVLU Submarine Fan System); this interpretation is consistent with previous core data interpretations and regional stratigraphic/depositional studies.
Programming chemistry in DNA-addressable bioreactors
Fellermann, Harold; Cardelli, Luca
2014-01-01
We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis. PMID:25121647
Liu, Huolong; Li, Mingzhong
2014-11-20
In this work a two-compartmental population balance model (TCPBM) was proposed to model a pulsed top-spray fluidized bed granulation. The proposed TCPBM captured the spatially heterogeneous granulation mechanisms of granule growth by dividing the granulator into two perfectly mixed zones, a wetting compartment and a drying compartment, in which the aggregation mechanism was assumed in the wetting compartment and the breakage mechanism was considered in the drying compartment. The sizes of the wetting and drying compartments were constant in the TCPBM, with 30% of the bed forming the wetting compartment and 70% the drying compartment. The exchange rate of particles between the wetting and drying compartments was determined by the flow properties and distribution of particles predicted by computational fluid dynamics (CFD) simulation. Experimental validation showed that the proposed TCPBM can accurately predict the evolution of the granule size distribution within the granulator under different binder spray operating conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
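The two-zone idea can be conveyed with a drastically reduced sketch that tracks only a mean granule size in each compartment rather than a full population balance (all rate constants, the saturating growth law, and the size bounds are illustrative assumptions, not the paper's model):

```python
def simulate_two_zones(steps=1000, dt=0.01,
                       k_agg=0.5, k_brk=0.3, k_ex=1.0,
                       x_max=5.0, x_min=1.0):
    """Toy two-compartment sketch: mean granule size grows by aggregation
    toward x_max in the wetting zone and relaxes by breakage toward x_min
    in the drying zone, with material exchanged between zones at rate k_ex
    (standing in for the CFD-derived exchange rate).  Forward Euler."""
    x_wet, x_dry = 1.0, 1.0   # mean granule size per zone, arbitrary units
    for _ in range(steps):
        dx_wet = k_agg * (x_max - x_wet) + k_ex * (x_dry - x_wet)
        dx_dry = k_brk * (x_min - x_dry) + k_ex * (x_wet - x_dry)
        x_wet += dt * dx_wet
        x_dry += dt * dx_dry
    return x_wet, x_dry

x_wet, x_dry = simulate_two_zones()
```

The steady state balances aggregation against breakage through the exchange term, which is the qualitative mechanism the TCPBM resolves with full size distributions.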
Pashut, Tamar; Magidov, Dafna; Ben-Porat, Hana; Wolfus, Shuki; Friedman, Alex; Perel, Eli; Lavidor, Michal; Bar-Gad, Izhar; Yeshurun, Yosef; Korngreen, Alon
2014-01-01
Although transcranial magnetic stimulation (TMS) is a popular tool for both basic research and clinical applications, its actions on nerve cells are only partially understood. We have previously predicted, using compartmental modeling, that magnetic stimulation of central nervous system neurons depolarized the soma followed by initiation of an action potential in the initial segment of the axon. The simulations also predict that neurons with low current threshold are more susceptible to magnetic stimulation. Here we tested these theoretical predictions by combining in vitro patch-clamp recordings from rat brain slices with magnetic stimulation and compartmental modeling. In agreement with the modeling, our recordings demonstrate the dependence of magnetic stimulation-triggered action potentials on the type and state of the neuron and its orientation within the magnetic field. Our results suggest that the observed effects of TMS are deeply rooted in the biophysical properties of single neurons in the central nervous system and provide a framework both for interpreting existing TMS data and developing new simulation-based tools and therapies. PMID:24917788
The Stochastic Multi-strain Dengue Model: Analysis of the Dynamics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Stollenwerk, Nico; Kooi, Bob W.
2011-09-01
Dengue dynamics is well known to be particularly complex, with large fluctuations of disease incidence. An epidemic multi-strain model motivated by dengue fever epidemiology shows deterministic chaos in wide parameter regions. The addition of seasonal forcing, mimicking the vectorial dynamics, and of a low import of infected individuals, which is realistic for infectious disease epidemics, produces complex dynamics and qualitatively good agreement between empirical DHF monitoring data and the model simulations. The addition of noise can explain the fluctuations observed in the empirical data, and for large enough population size the stochastic system is well described by the deterministic skeleton.
Ford, Jennifer Lynn; Green, Joanne Balmer; Lietz, Georg; Oxley, Anthony; Green, Michael H
2017-09-01
Background: Provitamin A carotenoids are an important source of dietary vitamin A for many populations. Thus, accurate and simple methods for estimating carotenoid bioefficacy are needed to evaluate the vitamin A value of test solutions and plant sources. β-Carotene bioefficacy is often estimated from the ratio of the areas under plasma isotope response curves after subjects ingest labeled β-carotene and a labeled retinyl acetate reference dose [isotope reference method (IRM)], but to our knowledge, the method has not yet been evaluated for accuracy. Objectives: Our objectives were to develop and test a physiologically based compartmental model that includes both absorptive and postabsorptive β-carotene bioconversion and to use the model to evaluate the accuracy of the IRM and a simple plasma retinol isotope ratio [(RIR), labeled β-carotene-derived retinol/labeled reference-dose-derived retinol in one plasma sample] for estimating relative bioefficacy. Methods: We used model-based compartmental analysis (Simulation, Analysis and Modeling software) to develop and apply a model that provided known values for β-carotene bioefficacy. Theoretical data for 10 subjects were generated by the model and used to determine bioefficacy by RIR and IRM; predictions were compared with known values. We also applied RIR and IRM to previously published data. Results: Plasma RIR accurately predicted β-carotene relative bioefficacy at 14 d or later. IRM also accurately predicted bioefficacy by 14 d, except that, when there was substantial postabsorptive bioconversion, IRM underestimated bioefficacy. Based on our model, 1-d predictions of relative bioefficacy include absorptive plus a portion of early postabsorptive conversion. Conclusion: The plasma RIR is a simple tracer method that accurately predicts β-carotene relative bioefficacy based on analysis of one blood sample obtained at ≥14 d after co-ingestion of labeled β-carotene and retinyl acetate. 
The method also provides information about the contributions of absorptive and postabsorptive conversion to total bioefficacy if an additional sample is taken at 1 d. © 2017 American Society for Nutrition.
Bull, Marta; Learn, Gerald; Genowati, Indira; McKernan, Jennifer; Hitti, Jane; Lockhart, David; Tapia, Kenneth; Holte, Sarah; Dragavon, Joan; Coombs, Robert; Mullins, James; Frenkel, Lisa
2009-09-22
Compartmentalization of HIV-1 between the genital tract and blood was noted in half of 57 women included in 12 studies primarily using cell-free virus. To further understand differences between genital tract and blood viruses of women with chronic HIV-1 infection, cell-free and cell-associated virus populations were sequenced from these tissues, reasoning that integrated viral DNA includes variants archived from earlier in infection and provides a greater array of genotypes for comparisons. Multiple sequences from single-genome amplification of HIV-1 RNA and DNA from the genital tract and blood of each woman were compared in a cross-sectional study. Maximum likelihood phylogenies were evaluated for evidence of compartmentalization using four statistical tests. Genital tract and blood HIV-1 appeared compartmentalized in 7 of 13 women by ≥2 statistical analyses. These subjects' phylograms were characterized by low-diversity genital-specific viral clades interspersed between clades containing both genital and blood sequences. Many of the genital-specific clades contained monotypic HIV-1 sequences. In 2 of 7 women, HIV-1 populations were significantly compartmentalized across all four statistical tests; both had low-diversity genital tract-only clades. Collapsing monotypic variants into a single sequence diminished the prevalence and extent of compartmentalization. Viral sequences did not demonstrate tissue-specific signature amino acid residues, differential immune selection, or co-receptor usage. In women with chronic HIV-1 infection, multiple identical sequences suggest proliferation of HIV-1-infected cells, and low-diversity tissue-specific phylogenetic clades are consistent with bursts of viral replication. These monotypic and tissue-specific viruses provide statistical support for compartmentalization of HIV-1 between the female genital tract and blood.
However, the intermingling of these clades with clades composed of both genital and blood sequences, and the absence of tissue-specific genetic features, suggest that compartmentalization between blood and genital tract may be due to viral replication and proliferation of infected cells, and call into question whether HIV-1 in the female genital tract is distinct from that in blood.
Identification of gene regulation models from single-cell data
NASA Astrophysics Data System (ADS)
Weber, Lisa; Raymond, William; Munsky, Brian
2018-09-01
In quantitative analyses of biological processes, one may use many different scales of models (e.g. spatial or non-spatial, deterministic or stochastic, time-varying or at steady-state) or many different approaches to match models to experimental data (e.g. model fitting or parameter uncertainty/sloppiness quantification with different experiment designs). These different analyses can lead to surprisingly different results, even when applied to the same data and the same model. We use a simplified gene regulation model to illustrate many of these concerns, especially for ODE analyses of deterministic processes, chemical master equation and finite state projection analyses of heterogeneous processes, and stochastic simulations. For each analysis, we employ MATLAB and Python software to consider a time-dependent input signal (e.g. a kinase nuclear translocation) and several model hypotheses, along with simulated single-cell data. We illustrate different approaches (e.g. deterministic and stochastic) to identify the mechanisms and parameters of the same model from the same simulated data. For each approach, we explore how uncertainty in parameter space varies with respect to the chosen analysis approach or specific experiment design. We conclude with a discussion of how our simulated results relate to the integration of experimental and computational investigations to explore signal-activated gene expression models in yeast (Neuert et al 2013 Science 339 584–7) and human cells (Senecal et al 2014 Cell Rep. 8 75–83).
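The deterministic-versus-stochastic contrast described above can be sketched for the simplest gene expression model, constitutive mRNA production with first-order degradation; this is an illustrative stand-in, not the paper's actual model, and all rate values are hypothetical:

```python
import random

# Constitutive mRNA production at rate k, degradation at rate g per molecule.
# The deterministic ODE steady state is m* = k/g; the chemical master
# equation's stationary distribution is Poisson with the same mean.
k, g = 10.0, 1.0

def ode_steady_state(k, g):
    """Fixed point of dm/dt = k - g*m."""
    return k / g

def gillespie_mean(k, g, t_end=2000.0, seed=1):
    """Time-averaged copy number from one long Gillespie (SSA) trajectory."""
    rng = random.Random(seed)
    t, m, area = 0.0, 0, 0.0
    while t < t_end:
        birth, death = k, g * m
        total = birth + death
        dt = rng.expovariate(total)
        area += m * min(dt, t_end - t)  # accumulate m over occupied time
        t += dt
        if rng.random() * total < birth:
            m += 1
        else:
            m -= 1
    return area / t_end
```

Both estimates agree on the mean (k/g), but only the stochastic trajectory carries the cell-to-cell variability that chemical master equation and finite state projection analyses exploit.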
Mabileau, Guillaume; Scutelniciuc, Otilia; Tsereteli, Maia; Konorazov, Ivan; Yelizaryeva, Alla; Popovici, Svetlana; Saifuddin, Karimov; Losina, Elena; Manova, Manoela; Saldanha, Vinay; Malkin, Jean-Elie; Yazdanpanah, Yazdan
2018-03-01
We evaluated the effectiveness and cost-effectiveness of interventions targeting hepatitis C virus (HCV) and HIV infections among people who inject drugs (PWID) in Eastern Europe/Central Asia. We specifically considered the needle-syringe program (NSP), opioid substitution therapy (OST), HCV and HIV diagnosis, antiretroviral therapy (ART), and/or new HCV treatment (direct acting antiviral [DAA]) in Belarus, Georgia, Kazakhstan, Republic of Moldova, and Tajikistan. We developed a deterministic dynamic compartmental model and evaluated the number of infections averted, costs, and incremental cost-effectiveness ratios (ICERs) of interventions. OST decreased frequencies of injecting by 85% and NSP needle sharing rates by 57%; ART was introduced at CD4 <350 and DAA at fibrosis stage ≥F2 at a $2370 to $23 280 cost. Increasing NSP+OST had a high impact on transmissions (infections averted in PWID: 42% in Tajikistan to 55% in Republic of Moldova for HCV; 30% in Belarus to 61% in Kazakhstan for HIV over 20 years). Increasing NSP+OST+ART was very cost-effective in Georgia (ICER = $910/year of life saved [YLS]), and was cost-saving in Kazakhstan and Republic of Moldova. NSP+OST+ART and HIV diagnosis was very cost-effective in Tajikistan (ICER = $210/YLS). Increasing the coverage of all interventions was always the most effective strategy and was cost-effective in Belarus and Kazakhstan (ICER = $12 960 and $21 850/YLS); it became cost-effective/cost-saving in all countries when we decreased DAA costs. Increasing NSP+OST coverage, in addition to ART and HIV diagnosis, had a high impact on both epidemics and was very cost-effective and even cost-saving. When HCV diagnosis was improved, increased DAA averted a high number of new infections if associated with NSP+OST.
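The cost-effectiveness comparisons above reduce to a simple incremental calculation; a minimal sketch, with hypothetical numbers rather than the study's model outputs:

```python
# ICER = incremental cost / incremental effectiveness, here in $ per year of
# life saved (YLS); a strategy is "cost-saving" when it is both cheaper and
# more effective than the comparator.
def icer(cost_new, effect_new, cost_old, effect_old):
    return (cost_new - cost_old) / (effect_new - effect_old)

def is_cost_saving(cost_new, effect_new, cost_old, effect_old):
    return cost_new < cost_old and effect_new > effect_old
```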
NASA Astrophysics Data System (ADS)
Wei, Y.; Thomas, S.; Zhou, H.; Arcas, D.; Titov, V. V.
2017-12-01
The increasing potential tsunami hazards pose great challenges for infrastructure along the coastlines of the U.S. Pacific Northwest. Tsunami impact at a coastal site is usually assessed from deterministic scenarios based on 10,000 years of geological records in the Cascadia Subduction Zone (CSZ). Aside from these deterministic methods, the new ASCE 7-16 tsunami provisions provide engineering design criteria for tsunami loads on buildings based on a probabilistic approach. This work develops a site-specific model near Newport, OR using high-resolution grids, and computes tsunami inundation depth and velocities at the study site resulting from credible probabilistic and deterministic earthquake sources in the Cascadia Subduction Zone. Three Cascadia scenarios, two deterministic scenarios (XXL1 and L1) and a 2,500-yr probabilistic scenario compliant with the new ASCE 7-16 standard, are simulated using a combination of a depth-averaged shallow water model for offshore propagation and a Boussinesq-type model for onshore inundation. We discuss the methods and procedure used to obtain the 2,500-year probabilistic scenario for Newport that is compliant with the ASCE 7-16 tsunami provisions. We provide details of model results, particularly the inundation depth and flow speed for a new building, which will also be designated as a tsunami vertical evacuation shelter, at Newport, Oregon. We show that the ASCE 7-16 consistent hazards are between those obtained from the deterministic L1 and XXL1 scenarios, and that the greatest impact on the building may come from later waves. As a further step, we utilize the inundation model results to numerically compute tracks of large vessels in the vicinity of the building site and estimate whether these vessels would impact the building site during the extreme XXL1 and ASCE 7-16 hazard-consistent scenarios.
A two-step study is carried out, first tracking massless particles and then large vessels with assigned mass, accounting for drag force, inertial force, ship grounding, and mooring. The simulation results show that none of the large vessels would impact the building site in any of the tested scenarios.
The Stochastic Modelling of Endemic Diseases
NASA Astrophysics Data System (ADS)
Susvitasari, Kurnia; Siswantining, Titin
2017-01-01
The study of epidemics has a long history, but genuine progress was hardly forthcoming until the end of the 19th century (Bailey, 1975). Both deterministic and stochastic models have been used to describe epidemics. Then, from 1927 to 1939, Kermack and McKendrick introduced a generalization of these models, incorporating variables such as the rates of infection and recovery. The purpose of this project is to investigate the behaviour of the models when we set the basic reproduction number, R0. This quantity is defined as the expected number of contacts made by a typical infective with susceptibles in the population. According to the epidemic threshold theory, when R0 ≤ 1, a minor epidemic occurs with probability one in both approaches, but when R0 > 1, the deterministic and stochastic models have different interpretations. In the deterministic approach, a major epidemic occurs with probability one when R0 > 1, and the model predicts that the disease will settle down to an endemic equilibrium. Stochastic models, on the other hand, allow that a minor epidemic can still occur; if it does, the epidemic dies out quickly. Moreover, if we let the population size be large and a major epidemic occurs, then it will take off, reach the endemic level, and move randomly around the deterministic equilibrium.
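The threshold behaviour described above can be made concrete; a small sketch assuming the standard SIR model with demography, where R0 = beta/(gamma + mu), the deterministic endemic susceptible fraction is 1/R0, and the branching-process approximation gives the minor-outbreak probability (all parameter values are illustrative):

```python
# SIR with demography: birth/death rate mu, transmission rate beta,
# recovery rate gamma.  R0 = beta / (gamma + mu).
def r0(beta, gamma, mu):
    return beta / (gamma + mu)

def endemic_susceptible_fraction(beta, gamma, mu):
    """Deterministic endemic equilibrium: s* = 1/R0 (disease-free below threshold)."""
    R0 = r0(beta, gamma, mu)
    return 1.0 / R0 if R0 > 1.0 else 1.0

def minor_outbreak_prob(beta, gamma, mu, a=1):
    """Branching-process approximation: probability the epidemic stays minor
    when started with a infectives, even though R0 may exceed one."""
    R0 = r0(beta, gamma, mu)
    return min(1.0, (1.0 / R0) ** a)
```

The last function captures the stochastic point in the abstract: even above threshold, extinction happens with probability (1/R0)^a.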
Economic analysis of interventions to improve village chicken production in Myanmar.
Henning, J; Morton, J; Pym, R; Hla, T; Sunn, K; Meers, J
2013-07-01
A cost-benefit analysis using deterministic and stochastic modelling was conducted to identify the net benefits for households that adopt (1) vaccination of individual birds against Newcastle disease (ND) or (2) improved management of chick rearing by providing coops for the protection of chicks from predation and chick starter feed inside a creep feeder to support chicks' nutrition in village chicken flocks in Myanmar. Partial budgeting was used to assess the additional costs and benefits associated with each of the two interventions tested relative to neither strategy. In the deterministic model, over the first 3 years after the introduction of the interventions, the cumulative sum of the net differences from neither strategy was 13,189Kyat for ND vaccination and 77,645Kyat for improved chick management (effective exchange rate in 2005: 1000Kyat=1$US). Both interventions were also profitable after discounting over a 10-year period; Net Present Values for ND vaccination and improved chick management were 30,791 and 167,825Kyat, respectively. The Benefit-Cost Ratio for ND vaccination was very high (28.8). This was lower for improved chick management, due to greater costs of the intervention, but still favourable at 4.7. Using both interventions concurrently yielded a Net Present Value of 470,543Kyat and a Benefit-Cost Ratio of 11.2 over the 10-year period in the deterministic model. Using the stochastic model, for the first 3 years following the introduction of the interventions, the mean cumulative sums of the net difference were similar to those values obtained from the deterministic model. Sensitivity analysis indicated that the cumulative net differences were strongly influenced by grower bird sale income, particularly under improved chick management. The effects of the strategies on odds of households selling and consuming birds after 7 months, and numbers of birds being sold or consumed after this period also influenced profitability. 
Cost variations for equipment used under improved chick management were not markedly associated with profitability. Net Present Values and Benefit-Cost Ratios discounted over a 10-year period were also similar to the deterministic model when mean values obtained through stochastic modelling were used. In summary, the study showed that ND vaccination and improved chick management can improve the viability and profitability of village chicken production in Myanmar. Copyright © 2013 Elsevier B.V. All rights reserved.
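The partial-budgeting quantities above (Net Present Value and Benefit-Cost Ratio) follow standard discounting formulas; a minimal sketch with hypothetical yearly cash flows, not the study's actual Kyat figures:

```python
# Partial-budget discounting: yearly extra benefits b_t and costs c_t of an
# intervention relative to "neither strategy", discounted at rate r.
def npv(benefits, costs, r):
    """Net Present Value of the benefit-minus-cost stream."""
    return sum((b - c) / (1.0 + r) ** t
               for t, (b, c) in enumerate(zip(benefits, costs), start=1))

def benefit_cost_ratio(benefits, costs, r):
    """Present value of benefits divided by present value of costs."""
    pv_b = sum(b / (1.0 + r) ** t for t, b in enumerate(benefits, start=1))
    pv_c = sum(c / (1.0 + r) ** t for t, c in enumerate(costs, start=1))
    return pv_b / pv_c

# Hypothetical stream: up-front equipment cost, then steady gains (per year).
benefits = [0.0, 5000.0, 8000.0, 8000.0, 8000.0]
costs = [3000.0, 500.0, 500.0, 500.0, 500.0]
```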
Spatio-Temporal Data Analysis at Scale Using Models Based on Gaussian Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Michael
Gaussian processes are the most commonly used statistical model for spatial and spatio-temporal processes that vary continuously. They are broadly applicable in the physical sciences and engineering and are also frequently used to approximate the output of complex computer models, deterministic or stochastic. We undertook research related to theory, computation, and applications of Gaussian processes as well as some work on estimating extremes of distributions for which a Gaussian process assumption might be inappropriate. Our theoretical contributions include the development of new classes of spatial-temporal covariance functions with desirable properties and new results showing that certain covariance models lead to predictions with undesirable properties. To understand how Gaussian process models behave when applied to deterministic computer models, we derived what we believe to be the first significant results on the large sample properties of estimators of parameters of Gaussian processes when the actual process is a simple deterministic function. Finally, we investigated some theoretical issues related to maxima of observations with varying upper bounds and found that, depending on the circumstances, standard large sample results for maxima may or may not hold. Our computational innovations include methods for analyzing large spatial datasets when observations fall on a partially observed grid and methods for estimating parameters of a Gaussian process model from observations taken by a polar-orbiting satellite. In our application of Gaussian process models to deterministic computer experiments, we carried out some matrix computations that would have been infeasible using even extended precision arithmetic by focusing on special cases in which all elements of the matrices under study are rational and using exact arithmetic.
The applications we studied include total column ozone as measured from a polar-orbiting satellite, sea surface temperatures over the Pacific Ocean, and annual temperature extremes at a site in New York City. In each of these applications, our theoretical and computational innovations were directly motivated by the challenges posed by analyzing these and similar types of data.
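The kind of continuously varying spatial model the abstract refers to can be illustrated with simple kriging under an assumed exponential covariance; the covariance choice, the pure-Python solver, and the data values are illustrative only:

```python
import math

# Simple kriging (zero-mean Gaussian process prediction) in 1-D with an
# exponential covariance C(d) = sill * exp(-|d| / scale).
def cov(d, sill=1.0, scale=1.0):
    return sill * math.exp(-abs(d) / scale)

def solve(A, b):
    """Gaussian elimination with partial pivoting (A is small and SPD here)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def krige(xs, ys, x0):
    """Best linear unbiased predictor of the process at x0 given data (xs, ys)."""
    K = [[cov(xi - xj) for xj in xs] for xi in xs]
    k = [cov(xi - x0) for xi in xs]
    w = solve(K, k)
    return sum(wi * yi for wi, yi in zip(w, ys))
```

With no nugget effect the predictor interpolates the data exactly, a property worth checking in any implementation.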
Moog, Daniel; Maier, Uwe G
2017-08-01
Is the spatial organization of membranes and compartments within cells subject to any rules? Cellular compartmentation differs between prokaryotic and eukaryotic life, because it is present to a high degree only in eukaryotes. In 1964, Prof. Eberhard Schnepf formulated the compartmentation rule (Schnepf theorem), which posits that a biological membrane, the main physical structure responsible for cellular compartmentation, usually separates a plasmatic from a non-plasmatic phase. Here we review and re-investigate the Schnepf theorem by applying the theorem to different cellular structures, from bacterial cells to eukaryotes with their organelles and compartments. In conclusion, we can confirm the general correctness of the Schnepf theorem, noting explicit exceptions only in special cases such as endosymbiosis and parasitism. © 2017 WILEY Periodicals, Inc.
Finney, Charles E.; Kaul, Brian C.; Daw, C. Stuart; ...
2015-02-18
Here we review developments in the understanding of cycle-to-cycle variability in internal combustion engines, with a focus on spark-ignited and premixed combustion conditions. Much of the research on cyclic variability has focused on stochastic aspects, that is, features that can be modeled as inherently random with no short-term predictability. In some cases, models of this type appear to work very well at describing experimental observations, but the lack of predictability limits control options. Also, even when the statistical properties of the stochastic variations are known, it can be very difficult to discern their underlying physical causes and thus mitigate them. Some recent studies have demonstrated that under some conditions, cyclic combustion variations can have a relatively high degree of low-dimensional deterministic structure, which implies some degree of predictability and potential for real-time control. These deterministic effects are typically more pronounced near critical stability limits (e.g. near tipping points associated with ignition or flame propagation), such as during highly dilute fueling or near the onset of homogeneous charge compression ignition. We review recent progress in experimental and analytical characterization of cyclic variability where low-dimensional, deterministic effects have been observed. We describe some theories about the sources of these dynamical features and discuss prospects for interactive control and improved engine designs. In conclusion, taken as a whole, the research summarized here implies that the deterministic component of cyclic variability will become a pivotal issue (and potential opportunity) as engine manufacturers strive to meet aggressive emissions and fuel economy regulations in the coming decades.
Variational principles for stochastic fluid dynamics
Holm, Darryl D.
2015-01-01
This paper derives stochastic partial differential equations (SPDEs) for fluid dynamics from a stochastic variational principle (SVP). The paper proceeds by taking variations in the SVP to derive stochastic Stratonovich fluid equations; writing their Itô representation; and then investigating the properties of these stochastic fluid models in comparison with each other, and with the corresponding deterministic fluid models. The circulation properties of the stochastic Stratonovich fluid equations are found to closely mimic those of the deterministic ideal fluid models. As with deterministic ideal flows, motion along the stochastic Stratonovich paths also preserves the helicity of the vortex field lines in incompressible stochastic flows. However, these Stratonovich properties are not apparent in the equivalent Itô representation, because they are disguised by the quadratic covariation drift term arising in the Stratonovich to Itô transformation. This term is a geometric generalization of the quadratic covariation drift term already found for scalar densities in Stratonovich's famous 1966 paper. The paper also derives motion equations for two examples of stochastic geophysical fluid dynamics; namely, the Euler–Boussinesq and quasi-geostrophic approximations. PMID:27547083
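The quadratic-covariation drift term can be seen numerically in the scalar case dX = aX ∘ dW, whose Itô form is dX = (a²/2)X dt + aX dW; the following sketch (not from the paper) compares a Heun scheme for the Stratonovich form with a drift-corrected Euler-Maruyama scheme on the same Brownian path, both against the exact solution X(t) = X₀ exp(aW(t)):

```python
import math
import random

def simulate(a=0.5, x0=1.0, T=1.0, n=20000, seed=3):
    """Integrate dX = a X o dW (Stratonovich) two ways on one Brownian path."""
    rng = random.Random(seed)
    dt = T / n
    x_strat = x_ito = x0
    w = 0.0
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))
        # Heun predictor-corrector: converges to the Stratonovich solution.
        pred = x_strat + a * x_strat * dw
        x_strat += 0.5 * a * (x_strat + pred) * dw
        # Euler-Maruyama on the Ito form, with the (a^2/2) X dt correction drift.
        x_ito += 0.5 * a * a * x_ito * dt + a * x_ito * dw
        w += dw
    exact = x0 * math.exp(a * w)  # exact solution of the Stratonovich SDE
    return x_strat, x_ito, exact
```

Dropping the correction drift from the Itô scheme makes it converge to a different process, which is exactly the disguise the abstract describes.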
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutjahr, A.L.; Kincaid, C.T.; Mercer, J.W.
1987-04-01
The objective of this report is to summarize the various modeling approaches that were used to simulate solute transport in a variably saturated medium. In particular, the technical strengths and weaknesses of each approach are discussed, and conclusions and recommendations for future studies are made. Five models are considered: (1) one-dimensional analytical and semianalytical solutions of the classical deterministic convection-dispersion equation (van Genuchten, Parker, and Kool, this report); (2) one-dimensional simulation using a continuous-time Markov process (Knighton and Wagenet, this report); (3) one-dimensional simulation using the time domain method and the frequency domain method (Duffy and Al-Hassan, this report); (4) a one-dimensional numerical approach that combines a solution of the classical deterministic convection-dispersion equation with a chemical equilibrium speciation model (Cederberg, this report); and (5) a three-dimensional numerical solution of the classical deterministic convection-dispersion equation (Huyakorn, Jones, Parker, Wadsworth, and White, this report). As part of the discussion, the input data and modeling results are summarized. The models were used in a data analysis mode, as opposed to a predictive mode. Thus, the following discussion will concentrate on the data analysis aspects of model use. Also, all the approaches were similar in that they were based on a convection-dispersion model of solute transport. Each discussion addresses the modeling approaches in the order listed above.
Demographic noise can reverse the direction of deterministic selection
Constable, George W. A.; Rogers, Tim; McKane, Alan J.; Tarnita, Corina E.
2016-01-01
Deterministic evolutionary theory robustly predicts that populations displaying altruistic behaviors will be driven to extinction by mutant cheats that absorb common benefits but do not themselves contribute. Here we show that when demographic stochasticity is accounted for, selection can in fact act in the reverse direction to that predicted deterministically, instead favoring cooperative behaviors that appreciably increase the carrying capacity of the population. Populations that exist in larger numbers experience a selective advantage by being more stochastically robust to invasions than smaller populations, and this advantage can persist even in the presence of reproductive costs. We investigate this general effect in the specific context of public goods production and find conditions for stochastic selection reversal leading to the success of public good producers. This insight, developed here analytically, is missed by the deterministic analysis as well as by standard game theoretic models that enforce a fixed population size. The effect is found to be amplified by space; in this scenario we find that selection reversal occurs within biologically reasonable parameter regimes for microbial populations. Beyond the public good problem, we formulate a general mathematical framework for models that may exhibit stochastic selection reversal. In this context, we describe a stochastic analog to r−K theory, by which small populations can evolve to higher densities in the absence of disturbance. PMID:27450085
Stochastic oscillations in models of epidemics on a network of cities
NASA Astrophysics Data System (ADS)
Rozhnova, G.; Nunes, A.; McKane, A. J.
2011-11-01
We carry out an analytic investigation of stochastic oscillations in a susceptible-infected-recovered model of disease spread on a network of n cities. In the model a fraction fjk of individuals from city k commute to city j, where they may infect, or be infected by, others. Starting from a continuous-time Markov description of the model, the deterministic equations, which are valid in the limit when the population of each city is infinite, are recovered. The stochastic fluctuations about the fixed point of these equations are derived by use of the van Kampen system-size expansion. The fixed point structure of the deterministic equations is remarkably simple: A unique nontrivial fixed point always exists and has the feature that the fraction of susceptible, infected, and recovered individuals is the same for each city irrespective of its size. We find that the stochastic fluctuations have an analogously simple dynamics: All oscillations have a single frequency, equal to that found in the one-city case. We interpret this phenomenon in terms of the properties of the spectrum of the matrix of the linear approximation of the deterministic equations at the fixed point.
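The single-city quantities that the network result reduces to, the endemic fixed point and the oscillation frequency from the Jacobian eigenvalues, can be computed directly; a sketch for the one-city SIR model with demography, with illustrative measles-like parameter values:

```python
import cmath

# One-city SIR with demography: ds/dt = mu - beta*s*i - mu*s,
# di/dt = beta*s*i - (gamma + mu)*i.  Stochastic oscillations concentrate
# at the frequency given by the imaginary part of the Jacobian eigenvalues
# at the endemic fixed point.
def endemic_fixed_point(beta, gamma, mu):
    s = (gamma + mu) / beta            # susceptible fraction = 1/R0
    i = mu * (1.0 - s) / (beta * s)    # from setting ds/dt = 0
    return s, i

def oscillation_frequency(beta, gamma, mu):
    s, i = endemic_fixed_point(beta, gamma, mu)
    # Jacobian of (ds/dt, di/dt) evaluated at the fixed point.
    a11 = -beta * i - mu
    a12 = -beta * s
    a21 = beta * i
    a22 = beta * s - (gamma + mu)      # zero at the fixed point
    tr = a11 + a22
    det = a11 * a22 - a12 * a21
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    return abs(((tr + disc) / 2.0).imag)  # angular frequency of damped oscillation
```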
From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities
NASA Astrophysics Data System (ADS)
Kunjwal, Ravi; Spekkens, Robert W.
2018-05-01
The Kochen-Specker theorem rules out models of quantum theory wherein projective measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly projective. For nonprojective measurements, therefore, one must drop the requirement that an outcome be assigned deterministically in the model and merely require that it be assigned a distribution over outcomes in a manner that is context-independent. By demanding context independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring an operational analog of the quantum notion of projectiveness. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It significantly extends previous techniques, which worked only for logical proofs, based on sets of projective measurements that fail to admit of any deterministic noncontextual assignment, to the case of statistical proofs, which are based on sets of projective measurements that do admit of some deterministic noncontextual assignments, but not enough to explain the quantum statistics.
Convergence studies of deterministic methods for LWR explicit reflector methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canepa, S.; Hursin, M.; Ferroukhi, H.
2013-07-01
The standard approach in modern 3-D core simulators, employed either for steady-state or transient simulations, is to use Albedo coefficients or explicit reflectors at the core axial and radial boundaries. In the latter approach, few-group homogenized nuclear data are a priori produced with lattice transport codes using 2-D reflector models. Recently, the explicit reflector methodology of the deterministic CASMO-4/SIMULATE-3 code system was identified to potentially constitute one of the main sources of errors for core analyses of the Swiss operating LWRs, which all belong to the GII design. Considering that some of the new GIII designs will rely on very different reflector concepts, a review and assessment of the reflector methodology for various LWR designs appeared relevant. Therefore, the purpose of this paper is to first recall the concepts of the explicit reflector modelling approach as employed by CASMO/SIMULATE. Then, for selected reflector configurations representative of both GII and GIII designs, a benchmarking of the few-group nuclear data produced with the deterministic lattice code CASMO-4 and its successor CASMO-5 is conducted. On this basis, a convergence study with regard to geometrical requirements when using deterministic methods with 2-D homogeneous models is conducted and the effect on the downstream 3-D core analysis accuracy is evaluated for a typical GII reflector design in order to assess the results against available plant measurements. (authors)
Is realistic neuronal modeling realistic?
Almog, Mara
2016-01-01
Scientific models are abstractions that aim to explain natural phenomena. A successful model shows how a complex phenomenon arises from relatively simple principles while preserving major physical or biological rules and predicting novel experiments. A model should not be a facsimile of reality; it is an aid for understanding it. Contrary to this basic premise, the 21st century has brought a surge in computational efforts to model biological processes in great detail. Here we discuss the oxymoronic, realistic modeling of single neurons. This rapidly advancing field is driven by the discovery that some neurons do not merely sum their inputs and fire if the sum exceeds some threshold. Thus researchers have asked what the computational abilities of single neurons are and have attempted to answer using realistic models. We briefly review the state of the art of compartmental modeling, highlighting recent progress and intrinsic flaws. We then attempt to address two fundamental questions. Practically, can we realistically model single neurons? Philosophically, should we realistically model single neurons? We use layer 5 neocortical pyramidal neurons as a test case to examine these issues. We subject three publicly available models of layer 5 pyramidal neurons to three simple computational challenges. Based on their performance and a partial survey of published models, we conclude that current compartmental models are ad hoc, unrealistic models functioning poorly once they are stretched beyond the specific problems for which they were designed. We then attempt to plot possible paths for generating realistic single neuron models. PMID:27535372
Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand
2007-07-01
This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse problem of design and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of the analytical models for describing the corresponding optimization problem and on an exact global optimization software, named IBBA and developed by the second author to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.
Rare event computation in deterministic chaotic systems using genealogical particle analysis
NASA Astrophysics Data System (ADS)
Wouters, J.; Bouchet, F.
2016-09-01
In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein-Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function.
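A minimal genealogical particle (cloning/selection) estimator of a small over-threshold probability can be sketched for an Ornstein-Uhlenbeck process; the exponential tilting weight, the parameter choices, and the resample-every-step schedule are illustrative simplifications of the algorithms discussed above, not the paper's implementation:

```python
import math
import random

# Ornstein-Uhlenbeck process dX = -X dt + sqrt(2) dW started at 0; estimate
# p = P(X_T > L) by weighting each increment with exp(alpha * increment) and
# resampling particles in proportion to their weights.  The running product
# of mean weights, times the untilted indicator average, unbiases the result.
def ou_threshold_prob(L=2.0, T=3.0, alpha=1.0, N=5000, steps=30, seed=7):
    rng = random.Random(seed)
    dt = T / steps
    decay = math.exp(-dt)
    noise = math.sqrt(1.0 - decay * decay)  # exact OU one-step transition
    xs = [0.0] * N
    z = 1.0  # running product of mean weights (normalizing constants)
    for _ in range(steps):
        new = [x * decay + noise * rng.gauss(0.0, 1.0) for x in xs]
        w = [math.exp(alpha * (xn - x)) for x, xn in zip(xs, new)]
        z *= sum(w) / N
        xs = rng.choices(new, weights=w, k=N)  # multinomial resampling
    # Along any surviving genealogy the tilting telescopes to exp(alpha * x_T),
    # so it is divided back out inside the indicator average.
    return z * sum(math.exp(-alpha * x) for x in xs if x > L) / N
```

Because X_T is Gaussian with variance 1 - exp(-2T), the estimate can be checked against the exact tail probability, which is about 0.023 for these settings.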
Pyrotechnic modeling for the NSI and pin puller
NASA Technical Reports Server (NTRS)
Powers, Joseph M.; Gonthier, Keith A.
1993-01-01
A discussion concerning the modeling of pyrotechnically driven actuators is presented in viewgraph format. The following topics are discussed: literature search, constitutive data for full-scale model, simple deterministic model, observed phenomena, and results from simple model.
Deterministic multi-zone ice accretion modeling
NASA Technical Reports Server (NTRS)
Yamaguchi, K.; Hansman, R. John, Jr.; Kazmierczak, Michael
1991-01-01
The focus here is on a deterministic model of the surface roughness transition behavior of glaze ice. The initial smooth/rough transition location, bead formation, and the propagation of the transition location are analyzed. Based on the hypothesis that the smooth/rough transition location coincides with the laminar/turbulent boundary layer transition location, a multizone model is implemented in the LEWICE code. In order to verify the effectiveness of the model, ice accretion predictions for simple cylinders calculated by the multizone LEWICE are compared to experimental ice shapes. The glaze ice shapes are found to be sensitive to the laminar surface roughness and bead thickness parameters controlling the transition location, while the ice shapes are found to be insensitive to the turbulent surface roughness.
Combining deterministic and stochastic velocity fields in the analysis of deep crustal seismic data
NASA Astrophysics Data System (ADS)
Larkin, Steven Paul
Standard crustal seismic modeling obtains deterministic velocity models which ignore the effects of wavelength-scale heterogeneity, known to exist within the Earth's crust. Stochastic velocity models are a means to include wavelength-scale heterogeneity in the modeling. These models are defined by statistical parameters obtained from geologic maps of exposed crystalline rock, and are thus tied to actual geologic structures. Combining both deterministic and stochastic velocity models into a single model allows a realistic full wavefield (2-D) to be computed. By comparing these simulations to recorded seismic data, the effects of wavelength-scale heterogeneity can be investigated. Combined deterministic and stochastic velocity models are created for two datasets, the 1992 RISC seismic experiment in southeastern California and the 1986 PASSCAL seismic experiment in northern Nevada. The RISC experiment was located in the transition zone between the Salton Trough and the southern Basin and Range province. A high-velocity body previously identified beneath the Salton Trough is constrained to pinch out beneath the Chocolate Mountains to the northeast. The lateral extent of this body is evidence for the ephemeral nature of rifting loci as a continent is initially rifted. Stochastic modeling of wavelength-scale structures above this body indicate that little more than 5% mafic intrusion into a more felsic continental crust is responsible for the observed reflectivity. Modeling of the wide-angle RISC data indicates that coda waves following PmP are initially dominated by diffusion of energy out of the near-surface basin as the wavefield reverberates within this low-velocity layer. At later times, this coda consists of scattered body waves and P to S conversions. Surface waves do not play a significant role in this coda. 
Modeling of the PASSCAL dataset indicates that a high-gradient crust-mantle transition zone or a rough Moho interface is necessary to reduce precritical PmP energy. Possibly related, inconsistencies in published velocity models are rectified by hypothesizing the existence of large, elongate, high-velocity bodies at the base of the crust, aligned with and of similar scale to the basins and ranges at the surface. This structure would result in an anisotropic lower crust.
Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente
2009-12-20
This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D. magna feeding test) were performed on grab samples from five wastewater treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements and precise distance measurements using Geographical Information Systems (GIS). The model was implemented in a spreadsheet and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassay results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, run for each month of one calendar year, showed a significant seasonal variation of risk while revealing that September represents the worst-case scenario, with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach using Monte Carlo analysis was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points, with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approximations reveal the presence of certain substances, which might be causing sublethal effects in the aquatic species present in the Henares River.
Yang, Jason H.; Polanowska-Grabowska, Renata K.; Smith, Jeffrey S.; Shields, Charles W.; Saucerman, Jeffrey J.
2014-01-01
β-adrenergic signaling is spatiotemporally heterogeneous in the cardiac myocyte, conferring exquisite control to sympathetic stimulation. Such heterogeneity drives the formation of protein kinase A (PKA) signaling microdomains, which regulate Ca2+ handling and contractility. Here, we test the hypothesis that the nucleus independently comprises a PKA signaling microdomain regulating myocyte hypertrophy. Spatially-targeted FRET reporters for PKA activity identified slower PKA activation and lower isoproterenol sensitivity in the nucleus (t50 = 10.60±0.68 min; EC50 = 89.00 nmol/L) than in the cytosol (t50 = 3.71±0.25 min; EC50 = 1.22 nmol/L). These differences were not explained by cAMP or AKAP-based compartmentation. A computational model of cytosolic and nuclear PKA activity was developed and predicted that differences in nuclear PKA dynamics and magnitude are regulated by slow PKA catalytic subunit diffusion, while differences in isoproterenol sensitivity are regulated by nuclear expression of protein kinase inhibitor (PKI). These were validated by FRET and immunofluorescence. The model also predicted differential phosphorylation of PKA substrates regulating cell contractility and hypertrophy. Ca2+ and cell hypertrophy measurements validated these predictions and identified higher isoproterenol sensitivity for contractile enhancements (EC50 = 1.84 nmol/L) over cell hypertrophy (EC50 = 85.88 nmol/L). Over-expression of spatially targeted PKA catalytic subunit to the cytosol or nucleus enhanced contractile and hypertrophic responses, respectively. We conclude that restricted PKA catalytic subunit diffusion is an important PKA compartmentation mechanism and the nucleus comprises a novel PKA signaling microdomain, insulating hypertrophic from contractile β-adrenergic signaling responses. PMID:24225179
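The restricted-diffusion mechanism described above can be caricatured with a minimal two-compartment rate model in which nuclear PKA activity is fed only by slow transfer of catalytic subunit from the cytosol. A Python sketch, with purely illustrative rate constants (not the paper's fitted values):

```python
def simulate_two_compartment(k_act=1.0, k_diff=0.05, t_end=30.0, dt=0.001):
    """Euler integration of a toy cytosol/nucleus PKA activation model.

    cyt: cytosolic PKA activity, driven directly by the stimulus.
    nuc: nuclear PKA activity, fed only by slow catalytic-subunit
    transfer from the cytosol (rate k_diff; value is hypothetical).
    Returns the half-activation times (t50) of both compartments.
    """
    cyt, nuc, t = 0.0, 0.0, 0.0
    t50_cyt = t50_nuc = None
    while t < t_end:
        cyt += dt * k_act * (1.0 - cyt)    # fast cytosolic activation
        nuc += dt * k_diff * (cyt - nuc)   # slow transfer to nucleus
        t += dt
        if t50_cyt is None and cyt >= 0.5:
            t50_cyt = t
        if t50_nuc is None and nuc >= 0.5:
            t50_nuc = t
    return t50_cyt, t50_nuc

t50_cyt, t50_nuc = simulate_two_compartment()
print(t50_cyt, t50_nuc)  # nuclear half-activation lags the cytosolic one
```

Even this crude sketch reproduces the qualitative finding: making the cytosol-to-nucleus transfer rate the bottleneck yields a much later nuclear t50 without any change to the upstream stimulus.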
NASA Astrophysics Data System (ADS)
Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra
2016-02-01
In this paper, two deterministic and stochastic multilateral, multi-issue, non-cooperative bargaining methodologies are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which have been defined considering several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties of some key parameters of SWMM are analyzed using the info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on utility functions of different stakeholders. Then, to find the best solution, the bargaining model is performed considering a combination of robustness and opportuneness criteria for each scenario based on utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed located in the northeastern part of Tehran, the capital city of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model considering the robustness and opportuneness criteria. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, could provide more reliable results.
Deterministic Stress Modeling of Hot Gas Segregation in a Turbine
NASA Technical Reports Server (NTRS)
Busby, Judy; Sondak, Doug; Staubach, Brent; Davis, Roger
1998-01-01
Simulation of unsteady viscous turbomachinery flowfields is presently impractical as a design tool due to the long run times required. Designers rely predominantly on steady-state simulations, but these simulations do not account for some of the important unsteady flow physics. Unsteady flow effects can be modeled as source terms in the steady flow equations. These source terms, referred to as Lumped Deterministic Stresses (LDS), can be used to drive steady flow solution procedures to reproduce the time-average of an unsteady flow solution. The goal of this work is to investigate the feasibility of using inviscid lumped deterministic stresses to model unsteady combustion hot streak migration effects on the turbine blade tip and outer air seal heat loads using a steady computational approach. The LDS model is obtained from an unsteady inviscid calculation. The LDS model is then used with a steady viscous computation to simulate the time-averaged viscous solution. Both two-dimensional and three-dimensional applications are examined. The inviscid LDS model produces good results for the two-dimensional case and requires less than 10% of the CPU time of the unsteady viscous run. For the three-dimensional case, the LDS model does a good job of reproducing the time-averaged viscous temperature migration and separation as well as heat load on the outer air seal at a CPU cost that is 25% of that of an unsteady viscous computation.
NASA Astrophysics Data System (ADS)
Choiri, S.; Ainurofiq, A.
2018-03-01
Drug release from a montmorillonite (MMT) matrix is a complex mechanism controlled by the swelling of MMT and the interaction between drug and MMT. The aim of this research was to identify a suitable model of the drug release mechanism from MMT and from its binary mixture with a hydrophilic polymer in a controlled-release formulation, based on a compartmental modelling approach. Theophylline was used as a model drug and incorporated, by a kneading method, into MMT and into a binary mixture with hydroxypropyl methylcellulose (HPMC) as a hydrophilic polymer. The dissolution test was performed and the modelling of drug release was assisted by the WinSAAM software. Two models were proposed based on the swelling capability and basal spacing of the MMT compartments. Model evaluation was carried out with goodness-of-fit and statistical parameters, and the models were validated by a cross-validation technique. Drug release from the MMT matrix was regulated by a burst release of unloaded drug, the swelling ability, the basal spacing of the MMT compartment, and the equilibrium between the basal-spacing and swelling compartments. Furthermore, the addition of HPMC to the MMT system altered the presence of the swelling compartment and the equilibrium between the swelling and basal-spacing compartment systems. In addition, the hydrophilic polymer reduced the burst release of unloaded drug.
Li, Longxiang; Xue, Donglin; Deng, Weijie; Wang, Xu; Bai, Yang; Zhang, Feng; Zhang, Xuejun
2017-11-10
In deterministic computer-controlled optical surfacing, accurate dwell time execution by computer numeric control machines is crucial in guaranteeing a high-convergence ratio for the optical surface error. It is necessary to consider the machine dynamics limitations in the numerical dwell time algorithms. In this paper, these constraints on dwell time distribution are analyzed, and a model of the equal extra material removal is established. A positive dwell time algorithm with minimum equal extra material removal is developed. Results of simulations based on deterministic magnetorheological finishing demonstrate the necessity of considering machine dynamics performance and illustrate the validity of the proposed algorithm. Indeed, the algorithm effectively facilitates the determinacy of sub-aperture optical surfacing processes.
Kucza, Witold
2013-07-25
Stochastic and deterministic simulations of dispersion in cylindrical channels on the Poiseuille flow have been presented. The random walk (stochastic) and the uniform dispersion (deterministic) models have been used for computations of flow injection analysis responses. These methods coupled with the genetic algorithm and the Levenberg-Marquardt optimization methods, respectively, have been applied for determination of diffusion coefficients. The diffusion coefficients of fluorescein sodium, potassium hexacyanoferrate and potassium dichromate have been determined by means of the presented methods and FIA responses that are available in literature. The best-fit results agree with each other and with experimental data thus validating both presented approaches. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
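The random-walk (stochastic) dispersion model lends itself to a short Monte Carlo sketch: particles are advected by the parabolic Poiseuille velocity profile while taking Gaussian radial diffusion steps, reflected at the wall. All parameter values below are illustrative, and the one-dimensional radial walk is a simplification of the full cylindrical geometry:

```python
import math
import random

def poiseuille_random_walk(n=2000, steps=500, dt=1e-3,
                           R=1.0, U=1.0, D=1e-2, seed=1):
    """Monte Carlo sketch of dispersion in Poiseuille flow.

    Each particle is advected axially with u(r) = 2U(1 - (r/R)^2)
    and takes Gaussian radial steps of std sqrt(2 D dt), reflected
    at the axis and at the wall. Returns the mean and variance of
    the final axial positions (the axial spread is the dispersion).
    """
    rng = random.Random(seed)
    sigma = math.sqrt(2 * D * dt)
    xs = []
    for _ in range(n):
        x, r = 0.0, rng.uniform(0.0, R)          # axial, radial position
        for _ in range(steps):
            x += 2 * U * (1 - (r / R) ** 2) * dt  # axial advection
            r += rng.gauss(0.0, sigma)            # radial diffusion
            r = abs(r)                            # reflect at the axis
            if r > R:
                r = 2 * R - r                     # reflect at the wall
        xs.append(x)
    mean = sum(xs) / n
    var = sum((v - mean) ** 2 for v in xs) / n
    return mean, var

mean, var = poiseuille_random_walk()
print(mean, var)
```

In a fitting loop of the kind the paper describes, D would be the free parameter adjusted (e.g. by a genetic algorithm) until the simulated axial spread matches the recorded flow injection analysis response.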
Self-Structure and Emotional Experience
Ditzfeld, Christopher P.; Showers, Carolin J.
2013-01-01
Two studies examine individual differences in affective reactivity by linking emotional experience to cognitive self-structure. Consistent with the view that individuals with an evaluatively compartmentalized self-structure are emotionally reactive, we find that evaluative compartmentalization is associated with the experience of, and desire for, high-arousal positive affect, whereas evaluative integration is associated with the experience of low-arousal positive and negative affect and the desire for low-arousal positive affect. Although compartmentalized individuals are less granular in their tendency to report experiencing both high- and low-arousal affect (cf. Feldman Barrett, 2004), they are strongly differentiated in their perceptions of high-arousal states as positive and low-arousal states as negative. Thus, compartmentalized individuals’ reactivity may be explained by their preference for high-arousal positive states and the “breadth” of their emotionality (e.g., the tendency to experience sadness and nervousness at the same time). PMID:24125479
Towards repurposing the yeast peroxisome for compartmentalizing heterologous metabolic pathways
DeLoache, William C.; Russ, Zachary N.; Dueber, John E.
2016-03-30
Compartmentalization of enzymes into organelles is a promising strategy for limiting metabolic crosstalk and improving pathway efficiency, but improved tools and design rules are needed to make this strategy available to more engineered pathways. Here we focus on the Saccharomyces cerevisiae peroxisome and develop a sensitive high-throughput assay for peroxisomal cargo import. We identify an enhanced peroxisomal targeting signal type 1 (PTS1) for rapidly sequestering non-native cargo proteins. Additionally, we perform the first systematic in vivo measurements of nonspecific metabolite permeability across the peroxisomal membrane using a polymer exclusion assay. Finally, we apply these new insights to compartmentalize a two-enzyme pathway in the peroxisome and characterize the expression regimes where compartmentalization leads to improved product titre. This work builds a foundation for using the peroxisome as a synthetic organelle, highlighting both promise and future challenges on the way to realizing this goal.
Cellular compartmentalization of secondary metabolism
Kistler, H. Corby; Broz, Karen
2015-01-01
Fungal secondary metabolism is often considered apart from the essential housekeeping functions of the cell. However, there are clear links between fundamental cellular metabolism and the biochemical pathways leading to secondary metabolite synthesis. Besides utilizing key biochemical precursors shared with the most essential processes of the cell (e.g., amino acids, acetyl CoA, NADPH), enzymes for secondary metabolite synthesis are compartmentalized at conserved subcellular sites that position pathway enzymes to use these common biochemical precursors. Co-compartmentalization of secondary metabolism pathway enzymes also may function to channel precursors, promote pathway efficiency and sequester pathway intermediates and products from the rest of the cell. In this review we discuss the compartmentalization of three well-studied fungal secondary metabolite biosynthetic pathways for penicillin G, aflatoxin and deoxynivalenol, and summarize evidence used to infer subcellular localization. We also discuss how these metabolites potentially are trafficked within the cell and may be exported. PMID:25709603
Stochastic von Bertalanffy models, with applications to fish recruitment.
Lv, Qiming; Pitchford, Jonathan W
2007-02-21
We consider three individual-based models describing growth in stochastic environments. Stochastic differential equations (SDEs) with identical von Bertalanffy deterministic parts are formulated, with a stochastic term which decreases, remains constant, or increases with organism size, respectively. Probability density functions for hitting times are evaluated in the context of fish growth and mortality. Solving the hitting time problem analytically or numerically shows that stochasticity can have a large positive impact on fish recruitment probability. It is also demonstrated that the observed mean growth rate of surviving individuals always exceeds the mean population growth rate, which itself exceeds the growth rate of the equivalent deterministic model. The consequences of these results in more general biological situations are discussed.
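The class of SDEs described above (identical von Bertalanffy drift, with a noise term that may decrease, stay constant, or increase with size) can be sketched with a simple Euler-Maruyama integrator. The parameter values and the particular noise functions below are illustrative assumptions, not the paper's calibrated choices:

```python
import math
import random

def von_bertalanffy_sde(sigma_of, L0=1.0, Linf=10.0, k=0.5,
                        T=10.0, dt=0.01, n=500, seed=0):
    """Euler-Maruyama sketch of dL = k(Linf - L) dt + sigma(L) dW.

    sigma_of: callable giving the noise amplitude as a function of
    organism size, so the three cases in the paper (decreasing,
    constant, increasing with size) can be plugged in. Returns the
    mean final size over n sample paths.
    """
    rng = random.Random(seed)
    sqdt = math.sqrt(dt)
    finals = []
    for _ in range(n):
        L, t = L0, 0.0
        while t < T:
            L += k * (Linf - L) * dt + sigma_of(L) * rng.gauss(0.0, sqdt)
            t += dt
        finals.append(L)
    return sum(finals) / n

# constant noise vs. noise that decreases with organism size
mean_const = von_bertalanffy_sde(lambda L: 0.5)
mean_decr = von_bertalanffy_sde(lambda L: 0.5 / (1 + L))
print(mean_const, mean_decr)
```

Hitting-time questions (e.g. recruitment probability) would be answered by recording, along each path, the first time L crosses a threshold size rather than the final size.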
Deterministic SLIR model for tuberculosis disease mapping
NASA Astrophysics Data System (ADS)
Aziz, Nazrina; Diah, Ijlal Mohd; Ahmad, Nazihah; Kasim, Maznah Mat
2017-11-01
Tuberculosis (TB) occurs worldwide. It can be transmitted to others directly through air when active TB persons sneeze, cough or spit. In Malaysia, it was reported that TB cases had been recognized as one of the most infectious disease that lead to death. Disease mapping is one of the methods that can be used as the prevention strategies since it can displays clear picture for the high-low risk areas. Important thing that need to be considered when studying the disease occurrence is relative risk estimation. The transmission of TB disease is studied through mathematical model. Therefore, in this study, deterministic SLIR models are used to estimate relative risk for TB disease transmission.
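A deterministic SLIR model of the kind used here splits the population into susceptible, latent (infected but not yet infectious), infectious, and removed compartments. A minimal Euler-integration sketch, with hypothetical rate constants (the abstract does not give the Malaysian TB parameters):

```python
def slir(beta=0.4, delta=0.2, gamma=0.1, N=1000.0,
         I0=1.0, T=200.0, dt=0.01):
    """Euler integration of a deterministic SLIR model.

    S: susceptible, L: latent, I: infectious, R: removed.
    Transmission rate beta, latent activation rate delta, and
    removal rate gamma are illustrative values, not fitted ones.
    """
    S, L, I, R = N - I0, 0.0, I0, 0.0
    t = 0.0
    while t < T:
        new_inf = beta * S * I / N       # new infections per unit time
        dS = -new_inf
        dL = new_inf - delta * L         # latent pool fills, then activates
        dI = delta * L - gamma * I
        dR = gamma * I
        S += dt * dS
        L += dt * dL
        I += dt * dI
        R += dt * dR
        t += dt
    return S, L, I, R

S, L, I, R = slir()
print(S + L + I + R)  # total population is conserved by construction
```

For disease mapping, a model like this would be run per district and the resulting prevalence used as the relative-risk estimate displayed on the map.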
Sampled-Data Consensus of Linear Multi-agent Systems With Packet Losses.
Zhang, Wenbing; Tang, Yang; Huang, Tingwen; Kurths, Jurgen
In this paper, the consensus problem is studied for a class of multi-agent systems with sampled data and packet losses, where random and deterministic packet losses are considered, respectively. For random packet losses, a Bernoulli-distributed white sequence is used to describe packet dropouts among agents in a stochastic way. For deterministic packet losses, a switched system with stable and unstable subsystems is employed to model packet dropouts in a deterministic way. The purpose of this paper is to derive consensus criteria, such that linear multi-agent systems with sampled data and packet losses can reach consensus. By means of the Lyapunov function approach and the decomposition method, the design problem of a distributed controller is solved in terms of convex optimization. The interplay among the allowable bound of the sampling interval, the probability of random packet losses, and the rate of deterministic packet losses is explicitly derived to characterize consensus conditions. The obtained criteria are closely related to the maximum eigenvalue of the Laplacian matrix versus the second minimum eigenvalue of the Laplacian matrix, which reveals the intrinsic effect of communication topologies on consensus performance. Finally, simulations are given to show the effectiveness of the proposed results.
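The Bernoulli packet-loss mechanism described above can be illustrated with a toy sampled-data consensus simulation on a ring graph, where each transmitted state is independently dropped with probability p_loss. The topology, gain, and loss rate below are illustrative assumptions, not the paper's setup:

```python
import random

def consensus_with_losses(x, steps=200, eps=0.2, p_loss=0.3, seed=0):
    """Average-consensus iteration over a ring with Bernoulli losses.

    At each sampling instant, every directed link independently drops
    its packet with probability p_loss; delivered packets drive the
    standard consensus update x_i += eps * (x_j - x_i).
    """
    rng = random.Random(seed)
    n = len(x)
    x = list(x)
    for _ in range(steps):
        nxt = list(x)
        for i in range(n):
            for j in ((i - 1) % n, (i + 1) % n):   # ring neighbours
                if rng.random() >= p_loss:          # packet delivered
                    nxt[i] += eps * (x[j] - x[i])
        x = nxt
    return x

x = consensus_with_losses([0.0, 1.0, 2.0, 3.0, 4.0])
print(max(x) - min(x))  # disagreement shrinks toward zero
```

Because each update is a convex combination of neighbouring states, all iterates stay within the hull of the initial conditions, and for a moderate loss rate the disagreement still contracts, which is the qualitative behaviour the paper's criteria quantify.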
2012-02-12
[Fragmentary record: the abstract describes computing Akaike weights for a set of candidate models (each model from the source's Table 2 fit to a data set), using an approximately unbiased estimate of the "expected relative Kullback-Leibler distance" (information loss), and compares an OLS best-fit model solution with deconvolutions of the data using normal curves.]
Interactive Reliability Model for Whisker-toughened Ceramics
NASA Technical Reports Server (NTRS)
Palko, Joseph L.
1993-01-01
Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.
Nonclassical point of view of the Brownian motion generation via fractional deterministic model
NASA Astrophysics Data System (ADS)
Gilardi-Velázquez, H. E.; Campos-Cantón, E.
In this paper, we present a dynamical system based on the Langevin equation without a stochastic term, using fractional derivatives, that exhibits properties of Brownian motion; i.e., a deterministic model to generate Brownian motion is proposed. The stochastic process is replaced by an additional degree of freedom in the second-order Langevin equation, which is thus transformed into a system of three first-order linear differential equations; additionally, α-fractional derivatives are considered, which allow us to obtain better statistical properties. Switching surfaces are established as the source of the fluctuating acceleration. The final system of three α-order linear differential equations contains no stochastic term, so the system generates motion in a deterministic way. Nevertheless, time series analysis shows that the behavior of the system exhibits statistical properties of Brownian motion, such as linear growth in time of the mean square displacement and a Gaussian distribution. Furthermore, we use detrended fluctuation analysis to confirm the Brownian character of this motion.
Murakami, Masayoshi; Shteingart, Hanan; Loewenstein, Yonatan; Mainen, Zachary F
2017-05-17
The selection and timing of actions are subject to determinate influences such as sensory cues and internal state as well as to effectively stochastic variability. Although stochastic choice mechanisms are assumed by many theoretical models, their origin and mechanisms remain poorly understood. Here we investigated this issue by studying how neural circuits in the frontal cortex determine action timing in rats performing a waiting task. Electrophysiological recordings from two regions necessary for this behavior, medial prefrontal cortex (mPFC) and secondary motor cortex (M2), revealed an unexpected functional dissociation. Both areas encoded deterministic biases in action timing, but only M2 neurons reflected stochastic trial-by-trial fluctuations. This differential coding was reflected in distinct timescales of neural dynamics in the two frontal cortical areas. These results suggest a two-stage model in which stochastic components of action timing decisions are injected by circuits downstream of those carrying deterministic bias signals. Copyright © 2017 Elsevier Inc. All rights reserved.
Modelling of different measures for improving removal in a stormwater pond.
German, J; Jansons, K; Svensson, G; Karlsson, D; Gustafsson, L G
2005-01-01
The effect of retrofitting an existing pond on removal efficiency and hydraulic performance was modelled using the commercial software Mike21 and compartmental modelling. The Mike21 model had previously been calibrated on the studied pond. Installation of baffles, the addition of culverts under a causeway and removal of an existing island were all studied as possible improvement measures in the pond. The subsequent effect on hydraulic performance and removal of suspended solids was then evaluated. Copper, cadmium, BOD, nitrogen and phosphorus removal were also investigated for that specific improvement measure showing the best results. Outcomes of this study reveal that all measures increase the removal efficiency of suspended solids. The hydraulic efficiency is improved for all cases, except for the case where the island is removed. Compartmental modelling was also used to evaluate hydraulic performance and facilitated a better understanding of the way each of the different measures affected the flow pattern and performance. It was concluded that the installation of baffles is the best of the studied measures resulting in a reduction in the annual load on the receiving lake by approximately 8,000 kg of suspended solids (25% reduction of the annual load), 2 kg of copper (10% reduction of the annual load) and 600 kg of BOD (10% reduction of the annual load).
Analysis of Functional Coupling: Mitochondrial Creatine Kinase and Adenine Nucleotide Translocase
Vendelin, Marko; Lemba, Maris; Saks, Valdur A.
2004-01-01
The mechanism of functional coupling between mitochondrial creatine kinase (MiCK) and adenine nucleotide translocase (ANT) in isolated heart mitochondria is analyzed. Two alternative mechanisms are studied: 1), dynamic compartmentation of ATP and ADP, which assumes the differences in concentrations of the substrates between intermembrane space and surrounding solution due to some diffusion restriction and 2), direct transfer of the substrates between MiCK and ANT. The mathematical models based on these possible mechanisms were composed and simulation results were compared with the available experimental data. The first model, based on a dynamic compartmentation mechanism, was not sufficient to reproduce the measured values of apparent dissociation constants of MiCK reaction coupled to oxidative phosphorylation. The second model, which assumes the direct transfer of substrates between MiCK and ANT, is shown to be in good agreement with experiments—i.e., the second model reproduced the measured constants and the estimated ADP flux, entering mitochondria after the MiCK reaction. This model is thermodynamically consistent, utilizing the free energy profiles of reactions. The analysis revealed the minimal changes in the free energy profile of the MiCK-ANT interaction required to reproduce the experimental data. A possible free energy profile of the coupled MiCK-ANT system is presented. PMID:15240503
Eradicating a Disease: Lessons from Mathematical Epidemiology
ERIC Educational Resources Information Center
Glomski, Matthew; Ohanian, Edward
2012-01-01
Smallpox remains the only human disease ever eradicated. In this paper, we consider the mathematics behind control strategies used in the effort to eradicate smallpox, from the life tables of Daniel Bernoulli, to the more modern susceptible-infected-removed (SIR)-type compartmental models. In addition, we examine the mathematical feasibility of…
Compartmentalized gene regulatory network of the pathogenic fungus Fusarium graminearum
USDA-ARS?s Scientific Manuscript database
Head blight caused by Fusarium graminearum (Fg) is a major limiting factor of wheat production with both yield loss and mycotoxin contamination. Here we report a model for global Fg gene regulatory networks (GRNs) inferred from a large collection of transcriptomic data using a machine-learning appro...
Brain Research Focuses on New Assays, Drugs
ERIC Educational Resources Information Center
Chemical and Engineering News, 1977
1977-01-01
Those attending the CIC/ACS (Chemical Institute of Canada /American Chemical Society) joint conference at Montreal heard about recent advances in brain chemistry research, the use of compartmental models for predicting pollution, the presence of carcinogens (N-Nitrosamines) in sidestream tobacco smoke, and the synthesis of sex attractants using…
USDA-ARS?s Scientific Manuscript database
Lycopene is a red carotenoid found in tomatoes hypothesized to mediate disease preventive effects associated with tomato consumption. Lycopene is consumed primarily as the all-trans geometric isomer in foods, while human plasma and tissues demonstrate greater proportions of cis isomers. The objecti...
Developing deterioration models for Wyoming bridges.
DOT National Transportation Integrated Search
2016-05-01
Deterioration models for the Wyoming Bridge Inventory were developed using both stochastic and deterministic models. The selection of explanatory variables is investigated, and a new method using LASSO regression to eliminate human bias in explana...
Short-range solar radiation forecasts over Sweden
NASA Astrophysics Data System (ADS)
Landelius, Tomas; Lindskog, Magnus; Körnich, Heiner; Andersson, Sandra
2018-04-01
In this article, the performance of short-range solar radiation forecasts from the global deterministic and ensemble models of the European Centre for Medium-Range Weather Forecasts (ECMWF) is compared with an ensemble of the regional mesoscale model HARMONIE-AROME used by the national meteorological services in Sweden, Norway and Finland. Note, however, that only the control members and the ensemble means are included in the comparison. The models' resolutions differ considerably: 18 km for the ECMWF ensemble, 9 km for the ECMWF deterministic model, and 2.5 km for the HARMONIE-AROME ensemble. The models share the same radiation code. It turns out that they all systematically underestimate the Direct Normal Irradiance (DNI) for clear-sky conditions. Except for this shortcoming, the HARMONIE-AROME ensemble model shows the best agreement with the distribution of observed Global Horizontal Irradiance (GHI) and DNI values. During mid-day the HARMONIE-AROME ensemble mean performs best. The control member of the HARMONIE-AROME ensemble also scores better than the global deterministic ECMWF model. This is an interesting result, since mesoscale models have so far not shown good results when compared to the ECMWF models. Three days with clear, mixed and cloudy skies are used to illustrate the possible added value of a probabilistic forecast. It is shown that in these cases the mesoscale ensemble could provide decision support to a grid operator in the form of forecasts of both the amount of solar power and its probabilities.
NASA Astrophysics Data System (ADS)
Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre
2015-04-01
Sea surface height, sea surface temperature and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to temperature. Before the assimilation experiments, the ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histograms. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics, with observations used in the assimilation experiments and independent observations, which goes further than most previous studies and constitutes one of the original points of this paper. Regarding the deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. Regarding the probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system in terms of reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement from the assimilation is demonstrated using these validation metrics. Finally, the deterministic and probabilistic validations are analysed jointly, and their consistency and complementarity are highlighted. Highly reliable situations, in which the RMS error and the CRPS give the same information, are identified for the first time in this paper.
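For a finite ensemble, the CRPS used in the probabilistic validation has a convenient kernel form, CRPS = E|X - y| - (1/2)E|X - X'|, where X and X' are independent draws from the ensemble and y is the observation. A short sketch of that standard formula (not code from the paper):

```python
def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble forecast against one observation.

    Kernel form: mean absolute error of the members against the
    observation, minus half the mean absolute pairwise member spread.
    Smaller is better; for a single member it reduces to the absolute
    error, so it generalizes the deterministic score.
    """
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(xi - xj)
                for xi in members
                for xj in members) / (2 * m * m)
    return term1 - term2

print(crps_ensemble([1.0, 2.0, 3.0], 2.0))  # centred ensemble: low score
print(crps_ensemble([5.0], 2.0))            # single member: absolute error
```

The second term rewards sharpness: a well-centred but overly wide ensemble is penalized less than a biased one, which is exactly the reliability/resolution trade-off the paper's validation decomposes.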
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; Draxl, Caroline; Hopson, Thomas
Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low-resolution NWP model, in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse-resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine-resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden, while providing accurate deterministic estimates and reliable probabilistic assessments.
Terçariol, César Augusto Sangaletti; Martinez, Alexandre Souto
2005-08-01
Consider a medium characterized by N points whose coordinates are randomly generated by a uniform distribution along the edges of a unitary d-dimensional hypercube. A walker leaves from each point of this disordered medium and moves according to the deterministic rule to go to the nearest point which has not been visited in the preceding mu steps (deterministic tourist walk). Each trajectory generated by this dynamics has an initial nonperiodic part of t steps (transient) and a final periodic part of p steps (attractor). The neighborhood rank probabilities are parametrized by the normalized incomplete beta function I_d = I_{1/4}[1/2, (d+1)/2]. The joint distribution S^{(N)}_{mu,d}(t,p) is relevant, and the marginal distributions previously studied are particular cases. We show that, for the memoryless deterministic tourist walk in Euclidean space, this distribution is S^{(infinity)}_{1,d}(t,p) = [Gamma(1 + I_d^{-1})(t + I_d^{-1})/Gamma(t + p + I_d^{-1})] delta_{p,2}, where t = 0, 1, 2, ..., infinity, Gamma(z) is the gamma function and delta_{i,j} is the Kronecker delta. The mean-field models are the random link models, which correspond to d -> infinity, and the random map model which, even for mu = 0, presents a nontrivial cycle distribution [S^{(N)}_{0,rm}(p) proportional to p^{-1}]: S^{(N)}_{0,rm}(t,p) = Gamma(N)/{Gamma[N+1-(t+p)] N^{t+p}}. The fundamental quantities are the number of explored points n_e = t + p and I_d. Although the obtained distributions are simple, they do not follow straightforwardly and they have been validated by numerical experiments.
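The transient length t and attractor period p described above are easy to measure by direct simulation. A minimal sketch, assuming one common memory-window convention (the walker avoids the last mu+1 visited sites including the current one; the paper's exact convention may differ):

```python
import numpy as np

def tourist_walk(points, start, mu):
    """Deterministic tourist walk: from the current site, jump to the nearest
    site not visited within the memory window; return (transient t, period p).

    Assumed convention (conventions vary): the window holds the last mu+1
    visited sites including the current one.  Suitable only for small mu,
    otherwise every neighbour may be forbidden.
    """
    pts = np.asarray(points, dtype=float)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    path = [start]
    first_seen = {}                          # memory-window state -> step index
    step = 0
    while True:
        window = tuple(path[-(mu + 1):])
        if window in first_seen:             # state repeats: attractor closed
            t = first_seen[window]
            return t, step - t
        first_seen[window] = step
        order = np.argsort(dist[path[-1]])   # neighbours ranked by distance
        path.append(next(j for j in order if j not in window))
        step += 1

# Three points on a line: the walk falls into a period-2 attractor (p = 2).
print(tourist_walk([[0.0], [1.0], [3.0]], start=2, mu=0))  # (1, 2)
```

Repeating this from every start point over many random point sets yields an empirical estimate of the joint (t, p) distribution for comparison with the closed forms above.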
What lies beneath? Diffusion EAP-based study of brain tissue microstructure.
Zucchelli, Mauro; Brusini, Lorenza; Andrés Méndez, C; Daducci, Alessandro; Granziera, Cristina; Menegaz, Gloria
2016-08-01
Diffusion weighted magnetic resonance signals convey information about tissue microstructure and cytoarchitecture. In recent years, many models have been proposed for recovering the diffusion signal and extracting information to constitute new families of numerical indices. Two main categories of reconstruction models can be identified in diffusion magnetic resonance imaging (DMRI): ensemble average propagator (EAP) models and compartmental models. From both, descriptors can be derived for elucidating the underlying microstructural architecture. While compartmental model indices directly quantify the fraction of different cell compartments in each voxel, EAP-derived indices are only a derivative measure and the effect of the different microstructural configurations on the indices is still unclear. In this paper, we analyze three EAP indices calculated using the 3D Simple Harmonic Oscillator based Reconstruction and Estimation (3D-SHORE) model and estimate their changes with respect to the principal microstructural configurations. We take advantage of state-of-the-art simulations to quantify the variations of the indices with the simulation parameters. Analysis of in vivo data correlates the EAP indices with the microstructural parameters obtained from the Neurite Orientation Dispersion and Density Imaging (NODDI) model as a pseudo ground truth for brain data. Results show that the EAP-derived indices convey information on the tissue microstructure and that their combined values directly reflect the configuration of the different compartments in each voxel. Copyright © 2016 Elsevier B.V. All rights reserved.
Biomarkers of environmental benzene exposure.
Weisel, C; Yu, R; Roy, A; Georgopoulos, P
1996-01-01
Environmental exposures to benzene result in increases in body burden that are reflected in various biomarkers of exposure, including benzene in exhaled breath, benzene in blood and urinary trans-trans-muconic acid and S-phenylmercapturic acid. A review of the literature indicates that these biomarkers can be used to distinguish populations with different levels of exposure (such as smokers from nonsmokers and occupationally exposed from environmentally exposed populations) and to determine differences in metabolism. Biomarkers in humans have shown that the percentage of benzene metabolized by the ring-opening pathway is greater at environmental exposures than that at higher occupational exposures, a trend similar to that found in animal studies. This suggests that the dose-response curve is nonlinear; that potential different metabolic mechanisms exist at high and low doses; and that the validity of a linear extrapolation of adverse effects measured at high doses to a population exposed to lower, environmental levels of benzene is uncertain. Time-series measurements of the biomarker, exhaled breath, were used to evaluate a physiologically based pharmacokinetic (PBPK) model. Biases were identified between the PBPK model predictions and experimental data that were adequately described using an empirical compartmental model. It is suggested that a mapping of the PBPK model to a compartmental model can be done to optimize the parameters in the PBPK model to provide a future framework for developing a population physiologically based pharmacokinetic model. PMID:9118884
Modelling ecosystem service flows under uncertainty with stochastic SPAN
Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.
2012-01-01
Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
Tveito, Aslak; Lines, Glenn T; Edwards, Andrew G; McCulloch, Andrew
2016-07-01
Markov models are ubiquitously used to represent the function of single ion channels. However, solving the inverse problem to construct a Markov model of single channel dynamics from bilayer or patch-clamp recordings remains challenging, particularly for channels involving complex gating processes. Methods for solving the inverse problem are generally based on data from voltage clamp measurements. Here, we describe an alternative approach to this problem based on measurements of voltage traces. The voltage traces define probability density functions of the functional states of an ion channel. These probability density functions can also be computed by solving a deterministic system of partial differential equations. The inversion is based on tuning the rates of the Markov models used in the deterministic system of partial differential equations such that the solution mimics the properties of the probability density function gathered from (pseudo) experimental data as well as possible. The optimization is done by defining a cost function to measure the difference between the deterministic solution and the solution based on experimental data. By invoking the properties of this function, it is possible to infer whether the rates of the Markov model are identifiable by our method. We present applications to Markov models well known from the literature. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Compartmentalized storage tank for electrochemical cell system
NASA Technical Reports Server (NTRS)
Piecuch, Benjamin Michael (Inventor); Dalton, Luke Thomas (Inventor)
2010-01-01
A compartmentalized storage tank is disclosed. The compartmentalized storage tank includes a housing, a first fluid storage section disposed within the housing, a second fluid storage section disposed within the housing, the first and second fluid storage sections being separated by a movable divider, and a constant force spring. The constant force spring is disposed between the housing and the movable divider to exert a constant force on the movable divider to cause a pressure P1 in the first fluid storage section to be greater than a pressure P2 in the second fluid storage section, thereby defining a pressure differential.
HIV Migration Between Blood and Cerebrospinal Fluid or Semen Over Time
Chaillon, Antoine; Gianella, Sara; Wertheim, Joel O.; Richman, Douglas D.; Mehta, Sanjay R.; Smith, David M.
2014-01-01
Previous studies reported associations between neuropathogenesis and human immunodeficiency virus (HIV) compartmentalization in cerebrospinal fluid (CSF) and between sexual transmission and human immunodeficiency virus type 1 (HIV) compartmentalization in semen. It remains unclear, however, how compartmentalization dynamics change over time. To address this, we used statistical methods and Bayesian phylogenetic approaches to reconstruct temporal dynamics of HIV migration between blood and CSF and between blood and the male genital tract. We investigated 11 HIV-infected individuals with paired semen and blood samples and 4 individuals with paired CSF and blood samples. Aligned partial HIV env sequences were analyzed by (1) phylogenetic reconstruction, using a Bayesian Markov-chain Monte Carlo approach; (2) evaluation of viral compartmentalization, using tree-based and distance-based methods; and (3) analysis of migration events, using a discrete Bayesian asymmetric phylogeographic approach of diffusion with Markov jump counts estimation. Finally, we evaluated potential correlates of viral gene flow across anatomical compartments. We observed bidirectional replenishment of viral compartments and asynchronous peaks of viral migration from and to blood over time, suggesting that disruption of viral compartment is transient and directionally selected. These findings imply that viral subpopulations in anatomical sites are an active part of the whole viral population and that compartmental reservoirs could have implications in future eradication studies. PMID:24302756
de Almeida, Sergio M; Rotta, Indianara; Ribeiro, Clea E; Oliveira, Michelli F; Chaillon, Antoine; de Pereira, Ana Paula; Cunha, Ana Paula; Zonta, Marise; Bents, Joao França; Raboni, Sonia M; Smith, Davey; Letendre, Scott; Ellis, Ronald J
2017-06-01
Despite the effective suppression of viremia with antiretroviral therapy, HIV can still replicate in the central nervous system (CNS). This was a longitudinal study of the cerebrospinal fluid (CSF) and serum dynamics of several biomarkers related to inflammation, the blood-brain barrier, neuronal injury, and IgG intrathecal synthesis in serial samples of CSF and serum from a patient infected with HIV-1 subtype C with CNS compartmentalization. The phylogenetic analyses of plasma and CSF samples in an acute phase using next-generation sequencing and F-statistics analysis of C2-V3 haplotypes revealed distinct compartmentalized CSF viruses in paired CSF and peripheral blood mononuclear cell samples. The CSF biomarker analysis in this patient showed that symptomatic CSF escape is accompanied by CNS inflammation, high levels of cell and humoral immune biomarkers, CNS barrier dysfunction, and an increase in neuronal injury biomarkers with demyelination. Independent and isolated HIV replication can occur in the CNS, even in HIV-1 subtype C, leading to compartmentalization and development of quasispecies distinct from the peripheral plasma. These immunological aspects of the HIV CNS escape have not been described previously. To our knowledge, this is the first report of CNS HIV escape and compartmentalization in HIV-1 subtype C.
Dynamics of Zika virus outbreaks: an overview of mathematical modeling approaches
Wiratsudakul, Anuwat; Suparit, Parinya
2018-01-01
Background: The Zika virus was first discovered in 1947. It was neglected until a major outbreak occurred on Yap Island, Micronesia, in 2007. Teratogenic effects resulting in microcephaly in newborn infants are the greatest public health threat. In 2016, the Zika virus epidemic was declared a Public Health Emergency of International Concern (PHEIC). Consequently, mathematical models were constructed to explicitly elucidate related transmission dynamics. Survey Methodology: In this review article, two steps of journal article searching were performed. First, we attempted to identify mathematical models previously applied to the study of vector-borne diseases using the search terms “dynamics,” “mathematical model,” “modeling,” and “vector-borne” together with the names of vector-borne diseases including chikungunya, dengue, malaria, West Nile, and Zika. Then the identified types of model were further investigated. Second, we narrowed down our survey to focus on only Zika virus research. The terms we searched for were “compartmental,” “spatial,” “metapopulation,” “network,” “individual-based,” “agent-based” AND “Zika.” All relevant studies were included regardless of the year of publication. We collected research articles that were published before August 2017 based on our search criteria. In this publication survey, we explored the Google Scholar and PubMed databases. Results: We found five basic model architectures previously applied to vector-borne virus studies, particularly in Zika virus simulations. These include compartmental, spatial, metapopulation, network, and individual-based models. We found that Zika models carried out for early epidemics were mostly fit into compartmental structures and were less complicated compared to the more recent ones. Simple models are still commonly used for the timely assessment of epidemics. Nevertheless, due to the availability of large-scale real-world data and computational power, recently there has been growing interest in more complex modeling frameworks. Discussion: Mathematical models are employed to explore and predict how an infectious disease spreads in the real world, evaluate the disease importation risk, and assess the effectiveness of intervention strategies. As the trends in modeling of infectious diseases have been shifting towards data-driven approaches, simple and complex models should be exploited differently. Simple models can be produced in a timely fashion to provide an estimation of the possible impacts. In contrast, complex models integrating real-world data require more time to develop but are far more realistic. The preparation of complicated modeling frameworks prior to outbreaks is recommended, including the case of future Zika epidemic preparation. PMID:29593941
Lai, C.; Tsay, T.-K.; Chien, C.-H.; Wu, I.-L.
2009-01-01
Researchers at the Hydroinformatic Research and Development Team (HIRDT) of the National Taiwan University undertook a project to create a real-time flood forecasting model, with the aim of predicting the current state of the Tamsui River Basin. The model was designed based on a deterministic approach, with mathematical modeling of complex phenomena and specific parameter values used to produce a discrete result. The project also devised a rainfall-stage model that relates the rate of rainfall upland directly to the change of the state of the river, and is further related to another typhoon-rainfall model. The geographic information system (GIS) data, based on a precise contour model of the terrain, estimate the regions that are perilous to flooding. As the project progressed, the HIRDT also applied a deterministic unsteady-flow model to help river authorities issue timely warnings and take other emergency measures.
Fuzzy linear model for production optimization of mining systems with multiple entities
NASA Astrophysics Data System (ADS)
Vujic, Slobodan; Benovic, Tomo; Miljanovic, Igor; Hudej, Marjan; Milutinovic, Aleksandar; Pavlovic, Petar
2011-12-01
Planning and production optimization within mining systems comprising multiple mines or several work sites (entities) by using fuzzy linear programming (LP) was studied. LP is one of the most commonly used operations research methods in mining engineering. After an introductory review of the properties and limitations of applying LP, short reviews of the general settings of deterministic and fuzzy LP models are presented. For the purpose of comparative analysis, the application of both LP models is presented using the example of the Bauxite Basin Niksic with five mines. The assessment shows that LP is an efficient mathematical modeling tool for production planning and for solving many other single-criteria optimization problems in mining engineering. After comparing the advantages and deficiencies of both the deterministic and fuzzy LP models, the conclusion presents the benefits of the fuzzy LP model, while also stating that seeking the optimal production plan requires an overall analysis that encompasses both LP modeling approaches.
Spatio-temporal modelling of rainfall in the Murray-Darling Basin
NASA Astrophysics Data System (ADS)
Nowak, Gen; Welsh, A. H.; O'Neill, T. J.; Feng, Lingbing
2018-02-01
The Murray-Darling Basin (MDB) is a large geographical region in southeastern Australia that contains many rivers and creeks, including Australia's three longest rivers, the Murray, the Murrumbidgee and the Darling. Understanding rainfall patterns in the MDB is very important due to the significant impact major events such as droughts and floods have on agricultural and resource productivity. We propose a model for a set of monthly rainfall data obtained from stations in the MDB that produces predictions in both the spatial and temporal dimensions. The model is a hierarchical spatio-temporal model fitted to geographical data that utilises both deterministic and data-derived components. Specifically, rainfall data at a given location are modelled as a linear combination of these deterministic and data-derived components. A key advantage of the model is that it is fitted in a step-by-step fashion, enabling appropriate empirical choices to be made at each step.
Lv, Qiming; Schneider, Manuel K; Pitchford, Jonathan W
2008-08-01
We study individual plant growth and size hierarchy formation in an experimental population of Arabidopsis thaliana, within an integrated analysis that explicitly accounts for size-dependent growth, size- and space-dependent competition, and environmental stochasticity. It is shown that a Gompertz-type stochastic differential equation (SDE) model, involving asymmetric competition kernels and a stochastic term which decreases with the logarithm of plant weight, efficiently describes individual plant growth, competition, and variability in the studied population. The model is evaluated within a Bayesian framework and compared to its deterministic counterpart, and to several simplified stochastic models, using distributional validation. We show that stochasticity is an important determinant of size hierarchy and that SDE models outperform the deterministic model if and only if structural components of competition (asymmetry; size- and space-dependence) are accounted for. Implications of these results are discussed in the context of plant ecology and in more general modelling situations.
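A Gompertz-type SDE of the general kind described above can be integrated with the Euler-Maruyama scheme. The drift and noise terms below are a simplified stand-in (the paper's asymmetric competition kernels and log-size-dependent noise are omitted; function name and parameters are illustrative):

```python
import numpy as np

def gompertz_sde_path(w0, a, b, sigma, dt, n_steps, rng):
    """Euler-Maruyama path for the Gompertz-type SDE
        dW_t = W_t (a - b ln W_t) dt + sigma dB_t.
    With sigma = 0 the path relaxes to the deterministic equilibrium exp(a/b).
    """
    w = np.empty(n_steps + 1)
    w[0] = w0
    for i in range(n_steps):
        drift = w[i] * (a - b * np.log(w[i]))
        noise = sigma * np.sqrt(dt) * rng.standard_normal()
        w[i + 1] = max(w[i] + drift * dt + noise, 1e-12)  # keep weight positive
    return w
```

Simulating many such paths and comparing the resulting size distribution against data is the essence of the distributional validation the authors describe.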
Stochastic Watershed Models for Risk Based Decision Making
NASA Astrophysics Data System (ADS)
Vogel, R. M.
2017-12-01
Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed `stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
Application of in Vitro Biotransformation Data and ...
The adverse biological effects of toxic substances are dependent upon the exposure concentration and the duration of exposure. Pharmacokinetic models can quantitatively relate the external concentration of a toxicant in the environment to the internal dose of the toxicant in the target tissues of an exposed organism. The exposure concentration of a toxic substance is usually not the same as the concentration of the active form of the toxicant that reaches the target tissues following absorption, distribution, and biotransformation of the parent toxicant. Biotransformation modulates the biological activity of chemicals through bioactivation and detoxication pathways. Many toxicants require biotransformation to exert their adverse biological effects. Considerable species differences in biotransformation and other pharmacokinetic processes can make extrapolation of toxicity data from laboratory animals to humans problematic. Additionally, interindividual differences in biotransformation among human populations with diverse genetics and lifestyles can lead to considerable variability in the bioactivation of toxic chemicals. Compartmental pharmacokinetic models of animals and humans are needed to understand the quantitative relationships between chemical exposure and target tissue dose as well as animal to human differences and interindividual differences in human populations. The data-based compartmental pharmacokinetic models widely used in clinical pharmacology ha
Aziza, Fanny; Mettler, Eric; Daudin, Jean-Jacques; Sanaa, Moez
2006-06-01
Cheese smearing is a complex process and the potential for cross-contamination with pathogenic or undesirable microorganisms is critical. During ripening, cheeses are salted and washed with brine to develop flavor and remove molds that could develop on the surfaces. Considering the potential for cross-contamination of this process in quantitative risk assessments could contribute to a better understanding of this phenomenon and, eventually, improve its control. The purpose of this article is to model the cross-contamination of smear-ripened cheeses due to the smearing operation under industrial conditions. A compartmental, dynamic, and stochastic model is proposed for mechanical brush smearing. This model has been developed to describe the exchange of microorganisms between compartments. Based on the analytical solution of the model equations and on experimental data collected with an industrial smearing machine, we assessed the values of the transfer parameters of the model. Monte Carlo simulations, using the distributions of transfer parameters, provide the final number of contaminated products in a batch and their final level of contamination for a given scenario taking into account the initial number of contaminated cheeses of the batch and their contaminant load. Based on analytical results, the model provides indicators for smearing efficiency and propensity of the process for cross-contamination. Unlike traditional approaches in mechanistic models, our approach captures the variability and uncertainty inherent in the process and the experimental data. More generally, this model could represent a generic base to use in modeling similar processes prone to cross-contamination.
Interpreting experimental data on egg production--applications of dynamic differential equations.
France, J; Lopez, S; Kebreab, E; Dijkstra, J
2013-09-01
This contribution focuses on applying mathematical models based on systems of ordinary first-order differential equations to synthesize and interpret data from egg production experiments. Models based on linear systems of differential equations are contrasted with those based on nonlinear systems. Regression equations arising from analytical solutions to linear compartmental schemes are considered as candidate functions for describing egg production curves, together with aspects of parameter estimation. Extant candidate functions are reviewed, a role for growth functions such as the Gompertz equation suggested, and a function based on a simple new model outlined. Structurally, the new model comprises a single pool with an inflow and an outflow. Compartmental simulation models based on nonlinear systems of differential equations, and thus requiring numerical solution, are next discussed, and aspects of parameter estimation considered. This type of model is illustrated in relation to development and evaluation of a dynamic model of calcium and phosphorus flows in layers. The model consists of 8 state variables representing calcium and phosphorus pools in the crop, stomachs, plasma, and bone. The flow equations are described by Michaelis-Menten or mass action forms. Experiments that measure Ca and P uptake in layers fed different calcium concentrations during shell-forming days are used to evaluate the model. In addition to providing a useful management tool, such a simulation model also provides a means to evaluate feeding strategies aimed at reducing excretion of potential pollutants in poultry manure to the environment.
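The "single pool with an inflow and an outflow" mentioned above is the simplest linear compartmental scheme and has a closed-form solution. A sketch (the function name and the reading of the outflow as the production rate are illustrative assumptions, not the authors' exact new function):

```python
import numpy as np

def single_pool(q0, inflow, k, t):
    """Outflow rate of the one-pool scheme dQ/dt = inflow - k*Q.

    The analytical solution is Q(t) = inflow/k + (q0 - inflow/k) exp(-k t);
    the outflow (e.g. an egg-production rate) is k*Q(t), which rises from
    k*q0 toward the steady-state value equal to the inflow.
    """
    t = np.asarray(t, dtype=float)
    q = inflow / k + (q0 - inflow / k) * np.exp(-k * t)
    return k * q
```

Fitting such an analytical solution to an observed production curve by nonlinear regression is exactly the parameter-estimation setting the abstract contrasts with numerically solved nonlinear schemes.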
Sensitivity analysis in a Lassa fever deterministic mathematical model
NASA Astrophysics Data System (ADS)
Abdullahi, Mohammed Baba; Doko, Umar Chado; Mamuda, Mamman
2015-05-01
Lassa virus, which causes Lassa fever, is on the list of potential bio-weapons agents. It was recently imported into Germany, the Netherlands, the United Kingdom and the United States as a consequence of the rapid growth of international traffic. A model with five mutually exclusive compartments related to Lassa fever is presented and the basic reproduction number analyzed. A sensitivity analysis of the deterministic model is performed. This is done in order to determine the relative importance of the model parameters to the disease transmission. The result of the sensitivity analysis shows that the most sensitive parameter is human immigration, followed by the human recovery rate, then person-to-person contact. This suggests that control strategies should target human immigration, effective drugs for treatment, and education to reduce person-to-person contact.
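Sensitivity rankings of this kind are commonly based on normalized forward sensitivity indices, S_p = (p/R0)·(∂R0/∂p), which can be estimated by finite differences. The R0 expression below is a generic SIR-style stand-in (beta/gamma), since the Lassa model's actual R0 is not given in the abstract:

```python
def sensitivity_index(f, params, name, h=1e-6):
    """Normalized forward sensitivity index S_p = (p / f) * df/dp,
    estimated by a central finite difference with relative step h."""
    p = params[name]
    up = dict(params, **{name: p + h * p})
    dn = dict(params, **{name: p - h * p})
    dfdp = (f(up) - f(dn)) / (2 * h * p)
    return p * dfdp / f(params)

# Generic stand-in R0 = beta / gamma (not the Lassa model's R0).
r0 = lambda q: q["beta"] / q["gamma"]
print(sensitivity_index(r0, {"beta": 0.3, "gamma": 0.1}, "beta"))   # ≈ +1
print(sensitivity_index(r0, {"beta": 0.3, "gamma": 0.1}, "gamma"))  # ≈ -1
```

An index of +1 means a 10% increase in the parameter raises R0 by about 10%; ranking parameters by |S_p| reproduces the kind of ordering reported above.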
A stochastic chemostat model with an inhibitor and noise independent of population sizes
NASA Astrophysics Data System (ADS)
Sun, Shulin; Zhang, Xiaolu
2018-02-01
In this paper, a stochastic chemostat model with an inhibitor is considered; the inhibitor is input from an external source and two organisms in the chemostat compete for a nutrient. Firstly, we show that the system has a unique global positive solution. Secondly, by constructing suitable Lyapunov functions, we show that the time average of the second moment of the solutions of the stochastic model is bounded for relatively small noise. That is, the asymptotic behaviors of the stochastic system around the equilibrium points of the deterministic system are studied. However, sufficiently large noise can make the microorganisms become extinct with probability one, although the solutions to the original deterministic model may be persistent. Finally, the obtained analytical results are illustrated by computer simulations.
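Computer simulations of such models typically use the Euler-Maruyama scheme; "noise independent of population sizes" corresponds to additive noise terms. The one-species sketch below omits the inhibitor and the second competitor for brevity (equations and parameter names are a simplified stand-in, not the paper's exact system):

```python
import numpy as np

def chemostat_em(s_in, d, m, a, sigma, x0, s0, dt, n, rng):
    """Euler-Maruyama simulation of a one-species stochastic chemostat
    with additive (population-independent) noise:
        dS = [D(S_in - S) - m S x/(a + S)] dt + sigma dB1
        dx = [m S/(a + S) - D] x dt       + sigma dB2
    Returns the final (substrate, biomass) state, clipped at zero.
    """
    s, x = s0, x0
    for _ in range(n):
        uptake = m * s / (a + s)  # Monod uptake rate
        s += (d * (s_in - s) - uptake * x) * dt \
             + sigma * np.sqrt(dt) * rng.standard_normal()
        x += (uptake - d) * x * dt \
             + sigma * np.sqrt(dt) * rng.standard_normal()
        s, x = max(s, 0.0), max(x, 0.0)
    return s, x
```

With sigma = 0 this reduces to the deterministic chemostat, so the washout and persistence regimes of the underlying ODE system can be checked directly before turning the noise on.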
NASA Technical Reports Server (NTRS)
Smialek, James L.
2002-01-01
An equation has been developed to model the iterative scale growth and spalling process that occurs during cyclic oxidation of high temperature materials. Parabolic scale growth and spalling of a constant surface area fraction have been assumed. Interfacial spallation of only the thickest segments was also postulated. This simplicity allowed for representation by a simple deterministic summation series. Inputs are the parabolic growth rate constant, the spall area fraction, the oxide stoichiometry, and the cycle duration. Outputs include the net weight change behavior, as well as the total amount of oxygen and metal consumed, the total amount of oxide spalled, and the mass fraction of oxide spalled. The outputs all follow typical well-behaved trends with the inputs and are in good agreement with previous interfacial models.
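The grow-then-spall iteration can be sketched as a per-cycle recursion: parabolic growth continues from the retained scale, then a fixed area fraction spalls at the interface. This is a simplified single-segment reading of the model; the weight bookkeeping below is an illustrative assumption, not the paper's exact formulation:

```python
import math

def cyclic_oxidation(kp, f_spall, sc, n_cycles, dt):
    """Per-cycle net specimen weight change for an interfacial spall model.

    kp      parabolic rate constant (oxygen-gain basis)
    f_spall area fraction of scale spalled each cycle
    sc      stoichiometric oxide/oxygen mass ratio (oxide = sc * oxygen)
    dt      hot-time duration of one cycle
    """
    w_retained = 0.0       # oxygen weight in retained scale, per unit area
    metal_lost = 0.0       # cumulative metal carried away in spalled oxide
    net = []
    for _ in range(n_cycles):
        # parabolic growth resumes from the retained-scale thickness
        w_retained = math.sqrt(w_retained**2 + kp * dt)
        spalled = f_spall * w_retained          # oxygen in spalled oxide
        w_retained -= spalled
        metal_lost += spalled * (sc - 1.0)      # metal fraction of that oxide
        net.append(w_retained - metal_lost)     # oxygen gained minus metal lost
    return net
```

The curve reproduces the characteristic shape of cyclic-oxidation data: an initial weight gain, a maximum once spalling balances growth, and an eventually linear weight loss.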
Population density equations for stochastic processes with memory kernels
NASA Astrophysics Data System (ADS)
Lai, Yi Ming; de Kamps, Marc
2017-06-01
We present a method for solving population density equations (PDEs), a mean-field technique describing homogeneous populations of uncoupled neurons, where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism by a generalization called the generalized Montroll-Weiss equation, a recent result from random network theory, describing a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky and quadratic integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to model jump responses for both models accurately, to both excitatory and inhibitory input, under the assumption that all inputs are generated by one renewal process.
Mai, Tam V-T; Duong, Minh V; Nguyen, Hieu T; Lin, Kuang C; Huynh, Lam K
2017-04-27
An integrated deterministic and stochastic model within the master equation/Rice-Ramsperger-Kassel-Marcus (ME/RRKM) framework was first used to characterize temperature- and pressure-dependent behaviors of thermal decomposition of acetic anhydride in a wide range of conditions (i.e., 300-1500 K and 0.001-100 atm). Particularly, using potential energy surface and molecular properties obtained from high-level electronic structure calculations at CCSD(T)/CBS, macroscopic thermodynamic properties and rate coefficients of the title reaction were derived with corrections for hindered internal rotation and tunneling treatments. Being in excellent agreement with the scattered experimental data, the results from the deterministic and stochastic frameworks confirmed and complemented each other to reveal that the main decomposition pathway proceeds via a 6-membered-ring transition state with a 0 K barrier of 35.2 kcal·mol(-1). This observation was further understood and confirmed by the sensitivity analysis on the time-resolved species profiles and the derived rate coefficients with respect to the ab initio barriers. Such an agreement suggests the integrated model can be confidently used for a wide range of conditions as a powerful post facto and predictive tool in detailed chemical kinetic modeling and simulation for the title reaction and thus can be extended to complex chemical reactions.
A Compartmental Model for Computing Cell Numbers in CFSE-based Lymphocyte Proliferation Assays
2012-01-31
of the “expected relative Kullback-Leibler distance” (information loss) when a model is used to describe a data set [23...deconvolution of the data into cell numbers, it cannot be used to accurately assess the number of cells in a particular generation. This information could be...notation is meant to emphasize the dependence of the estimate on the particular data set used to fit the model. It should be noted that, rather
Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.
Marino, Dale J; Starr, Thomas B
2007-12-01
A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. 
Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case evaluated. Independent draws of PBPK inputs resulted in slightly higher URFs. Results were also comparable to corresponding values from the previously reported deterministic mouse PBPK and dose-response modeling approach that used LED(10)s to derive potency factors. This finding indicated that the adjustment from ED(10) to LED(10) in the deterministic approach for DCM compensated for variability resulting from probabilistic PBPK and dose-response modeling in the mouse. Finally, results show a similar degree of variability in DCM risk estimates from a number of different sources, including the current effort, even though these estimates were developed using very different techniques. Given the variety of different approaches involved, 95th percentile-to-mean risk estimate ratios of 2.1-4.1 represent reasonable bounds on variability estimates regarding probabilistic assessments of DCM.
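The probabilistic potency calculation can be caricatured in a few lines. The sketch below is a toy stand-in, not the published DCM PBPK model: the dose metric is drawn from an assumed lognormal, tumor counts come from a binomial (echoing the study's resampling of NTP incidence data), a one-hit model supplies ED(10), and the potency factor is 0.1/ED(10).

```python
import math
import random

def simulate_urfs(n_iter=5000, seed=1):
    """Toy Monte Carlo potency calculation: draw a dose metric and a
    tumor count, fit a one-hit model, convert ED10 to 0.1/ED10."""
    rng = random.Random(seed)
    n_animals, true_p = 50, 0.30                     # assumed bioassay group
    urfs = []
    for _ in range(n_iter):
        dose = rng.lognormvariate(math.log(100.0), 0.2)  # assumed dose metric
        tumors = sum(rng.random() < true_p for _ in range(n_animals))
        tumors = min(max(tumors, 1), n_animals - 1)      # guard extremes
        p_obs = tumors / n_animals
        b = -math.log(1.0 - p_obs) / dose                # one-hit slope
        ed10 = -math.log(0.9) / b                        # dose giving 10% risk
        urfs.append(0.1 / ed10)
    urfs.sort()
    mean = sum(urfs) / len(urfs)
    p95 = urfs[int(0.95 * len(urfs))]
    return mean, p95
```

Even in this toy version, the 95th percentile sits above the mean by a modest factor, the kind of upper-percentile-to-mean ratio the abstract uses to bound variability.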
Chaotic sources of noise in machine acoustics
NASA Astrophysics Data System (ADS)
Moon, F. C., Prof.; Broschart, Dipl.-Ing. T.
1994-05-01
In this paper a model is posited for deterministic, random-like noise in machines with sliding rigid parts impacting linear continuous machine structures. Such problems occur in gear transmission systems. A mathematical model is proposed to explain the random-like structure-borne and air-borne noise from such systems when the input is a periodic deterministic excitation of the quasi-rigid impacting parts. An experimental study is presented which supports the model. A thin circular plate is impacted by a chaotically vibrating mass excited by a sinusoidal moving base. The results suggest that the plate vibrations might be predicted by replacing the chaotic vibrating mass with a probabilistic forcing function. Prechaotic vibrations of the impacting mass show classical period doubling phenomena.
Statistically Qualified Neuro-Analytic System and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
Models of Cerebral-Body Perfusion and Cerebral Chemical Transport.
1988-03-01
Pressure Waves 22 Conclusion 23 References 36 A Compartmental Brain Model for Chemical Transport and CO2 Controlled Blood Flow Abstract 37 Introduction 38...surrounding the body, e.g., atmospheric pressure, pressure at high and low altitudes, high underwater pressure, vacuum and excessive gravity acceleration...Resistance of the Arteriolar/Venous capillary, accounting for the pressure drop observed between them. RCB Resistance of the Blood-Brain barrier (between
Zhao, Long-shan; Yin, Ran; Wei, Bin-bin; Li, Qing; Jiang, Zhen-yuan; Chen, Xiao-hui; Bi, Kai-shun
2012-11-01
To compare the pharmacokinetic parameters of cefuroxime lysine, a new second-generation cephalosporin antibiotic, after intravenous (IV), intraperitoneal (IP), or intramuscular (IM) administration. Twelve male and 12 virgin female Sprague-Dawley rats, weighing from 200 to 250 g, were divided into three groups (n=4 for each gender in each group). The rats were administered a single dose (67.5 mg/kg) of cefuroxime lysine via IV bolus or IP or IM injection. Blood samples were collected and analyzed with a validated UFLC-MS/MS method. The concentration-time data were then calculated by compartmental and non-compartmental pharmacokinetic methods using DAS software. After IV, IP or IM administration, the plasma cefuroxime lysine disposition was best described by a tri-compartmental, bi-compartmental or mono-compartmental open model, respectively, with first-order elimination. The plasma concentration profiles were similar across the 3 administration routes. The distribution process was rapid after IV administration [t(1/2(d)), 0.10 ± 0.11 h vs 1.36 ± 0.65 and 1.25 ± 1.01 h]. The AUMC(0-∞) was markedly larger, and the mean residence time (MRT) greatly longer, after IP administration than via the IV or IM routes (AUMC(0-∞): 55.33 ± 20.34 vs 16.84 ± 4.85 and 36.17 ± 13.24 mg·h(2)/L; MRT: 0.93 ± 0.10 h vs 0.37 ± 0.07 h and 0.65 ± 0.05 h). The C(max) after IM injection was significantly higher than that after IP injection (73.51 ± 12.46 vs 49.09 ± 7.06 mg/L). The AUC(0-∞) in male rats was significantly higher than that in female rats after IM administration (66.38 ± 16.5 vs 44.23 ± 6.37 mg·h/L). There was no significant sex-related difference in other pharmacokinetic parameters of cefuroxime lysine between male and female rats. Cefuroxime lysine shows quick absorption after IV injection, long retention after IP injection, and a high C(max) after IM injection. After IM administration the AUC(0-∞) in male rats was significantly larger than that in female rats.
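The non-compartmental quantities reported above (AUC, AUMC, MRT, Cmax) can be computed directly from concentration-time data. Here is a minimal sketch using the linear trapezoidal rule over the sampled interval only (no extrapolation to infinity, which DAS and similar software would add):

```python
def nca_params(t, c):
    """Non-compartmental PK metrics from sampled concentration-time
    data: AUC/AUMC by the linear trapezoidal rule, MRT = AUMC/AUC,
    plus Cmax and Tmax."""
    auc = aumc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(t, c), zip(t[1:], c[1:])):
        auc += 0.5 * (c0 + c1) * (t1 - t0)            # trapezoid area
        aumc += 0.5 * (t0 * c0 + t1 * c1) * (t1 - t0) # first-moment area
    cmax = max(c)
    return {"AUC": auc, "AUMC": aumc, "MRT": aumc / auc,
            "Cmax": cmax, "Tmax": t[c.index(cmax)]}
```

A convenient hand check: a constant concentration of 1 over 0-10 h gives AUC = 10, AUMC = 50, and MRT = 5.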
NASA Astrophysics Data System (ADS)
Quarles, C. C.; Gochberg, D. F.; Gore, J. C.; Yankeelov, T. E.
2009-10-01
Dynamic susceptibility contrast (DSC) MRI methods rely on compartmentalization of the contrast agent such that a susceptibility gradient can be induced between the contrast-containing compartment and adjacent spaces, such as between intravascular and extravascular spaces. When there is a disruption of the blood-brain barrier, as is frequently the case with brain tumors, a contrast agent leaks out of the vasculature, resulting in additional T1, T2 and T2* relaxation effects in the extravascular space, thereby affecting the signal intensity time course and reducing the reliability of the computed hemodynamic parameters. In this study, a theoretical model describing these dynamic intra- and extravascular T1, T2 and T2* relaxation interactions is proposed. The applicability of using the proposed model to investigate the influence of relevant MRI pulse sequence (e.g. echo time, flip angle), physical (e.g. susceptibility calibration factors, pre-contrast relaxation rates) and physiological (e.g. permeability, blood flow, compartmental volume fractions) parameters on DSC-MRI signal time curves is demonstrated. Such a model could yield important insights into the biophysical basis of contrast-agent-extravasation-induced effects on measured DSC-MRI signals and provide a means to investigate pulse sequence optimization and appropriate data analysis methods for the extraction of physiologically relevant imaging metrics.
Belciug, Smaranda; Gorunescu, Florin
2015-02-01
Scarce healthcare resources require carefully made policies ensuring optimal bed allocation, quality healthcare service, and adequate financial support. This paper proposes a complex analysis of resource allocation in a hospital department by integrating in the same framework a queuing system, a compartmental model, and an evolutionary-based optimization. The queuing system shapes the flow of patients through the hospital, the compartmental model offers a feasible structure of the hospital department in accordance with the queuing characteristics, and the evolutionary paradigm provides the means to optimize the bed-occupancy management and resource utilization using a genetic algorithm approach. The paper also focuses on a "What-if analysis" providing a flexible tool to explore the effects on the outcomes of the queuing system and resource utilization through systematic changes in the input parameters. The methodology was illustrated using a simulation based on real data collected from a geriatric department of a hospital in London, UK. In addition, the paper explores the possibility of adapting the methodology to different medical departments (surgery, stroke, and mental illness). Moreover, the paper also focuses on the practical use of the model from the healthcare point of view, by presenting a simulated application. Copyright © 2014 Elsevier Inc. All rights reserved.
Shen, Meiyu; Russek-Cohen, Estelle; Slud, Eric V
2016-08-12
Bioequivalence (BE) studies are an essential part of the evaluation of generic drugs. The most common in vivo BE study design is the two-period two-treatment crossover design. AUC (area under the concentration-time curve) and Cmax (maximum concentration) are obtained from the observed concentration-time profiles for each subject from each treatment under each sequence. In the BE evaluation of pharmacokinetic crossover studies, the normality of the univariate response variable, e.g. log(AUC) or log(Cmax), is often assumed in the literature without much evidence. Therefore, we investigate the distributional assumption of the normality of the response variables, log(AUC) and log(Cmax), by simulating concentration-time profiles from two-stage pharmacokinetic models (commonly used in pharmacokinetic research) for a wide range of pharmacokinetic parameters and measurement error structures. Our simulations show that, under reasonable distributional assumptions on the pharmacokinetic parameters, log(AUC) has heavy tails and log(Cmax) is skewed. Sensitivity analyses are conducted to investigate how the distribution of the standardized log(AUC) (or the standardized log(Cmax)) for a large number of simulated subjects deviates from normality if distributions of errors in the pharmacokinetic model for plasma concentrations deviate from normality and if the plasma concentration can be described by different compartmental models.
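The two-stage simulation idea can be sketched as follows. This is an assumed one-compartment oral model with lognormal between-subject parameters and a hypothetical sampling schedule, not the paper's exact models: Cmax is read off a discrete time grid, as in a real study, so it can only underestimate the true continuous-time maximum.

```python
import math
import random

def simulate_cmax(n_subj=500, seed=7):
    """Two-stage PK sketch: per-subject (ka, ke) drawn lognormally,
    one-compartment oral model with dose/V = 1, Cmax taken over a
    discrete sampling grid. Returns (grid Cmax, analytic Cmax) pairs."""
    rng = random.Random(seed)
    times = [0.25, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 8.0, 12.0]  # assumed grid
    pairs = []
    for _ in range(n_subj):
        ka = rng.lognormvariate(math.log(1.0), 0.3)   # absorption rate
        ke = rng.lognormvariate(math.log(0.1), 0.3)   # elimination rate
        if abs(ka - ke) < 1e-9:
            continue                                   # degenerate draw
        coef = ka / (ka - ke)
        def conc(t, ka=ka, ke=ke, coef=coef):
            return coef * (math.exp(-ke * t) - math.exp(-ka * t))
        tmax = math.log(ka / ke) / (ka - ke)          # analytic peak time
        pairs.append((max(conc(t) for t in times), conc(tmax)))
    return pairs
```

Because the grid maximum is bounded by the analytic maximum, discrete sampling alone is one mechanism that distorts the distribution of log(Cmax) relative to the underlying parameters.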
Nonlinear unitary quantum collapse model with self-generated noise
NASA Astrophysics Data System (ADS)
Geszti, Tamás
2018-04-01
Collapse models including some external noise of unknown origin are routinely used to describe phenomena on the quantum-classical border; in particular, quantum measurement. Although containing nonlinear dynamics and thereby exposed to the possibility of superluminal signaling in individual events, such models are widely accepted on the basis of fully reproducing the non-signaling statistical predictions of quantum mechanics. Here we present a deterministic nonlinear model without any external noise, in which randomness—instead of being universally present—emerges in the measurement process, from deterministic irregular dynamics of the detectors. The treatment is based on a minimally nonlinear von Neumann equation for a Stern–Gerlach or Bell-type measuring setup, containing coordinate and momentum operators in a self-adjoint skew-symmetric, split scalar product structure over the configuration space. The microscopic states of the detectors act as a nonlocal set of hidden parameters, controlling individual outcomes. The model is shown to display pumping of weights between setup-defined basis states, with a single winner randomly selected and the rest collapsing to zero. Environmental decoherence has no role in the scenario. Through stochastic modelling, based on Pearle’s ‘gambler’s ruin’ scheme, outcome probabilities are shown to obey Born’s rule under a no-drift or ‘fair-game’ condition. This fully reproduces quantum statistical predictions, implying that the proposed non-linear deterministic model satisfies the non-signaling requirement. Our treatment is still vulnerable to hidden signaling in individual events, which remains to be handled by future research.
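Pearle's 'gambler's ruin' argument invoked above has a compact numerical form. In the sketch below (an illustration, not the paper's von Neumann dynamics), the weight of one basis state performs an unbiased, fair-game random walk until it is absorbed at 0 or 1; because the walk is a martingale, the frequency of reaching 1 reproduces the Born probability, i.e. the initial weight.

```python
import random

def collapse_probability(w0=0.3, step=0.02, trials=5000, seed=3):
    """Fair-game random walk of a basis-state weight with absorbing
    barriers at 0 and 1, tracked in integer units of `step` to avoid
    float drift; returns the observed frequency of collapse onto 1."""
    rng = random.Random(seed)
    top = round(1.0 / step)       # barrier corresponding to weight 1
    start = round(w0 / step)      # initial weight in step units
    wins = 0
    for _ in range(trials):
        k = start
        while 0 < k < top:        # no-drift ('fair game') walk
            k += 1 if rng.random() < 0.5 else -1
        wins += int(k == top)
    return wins / trials
```

The martingale property gives an exact absorption probability of start/top = w0, so the empirical frequency converges to Born's rule without any drift term.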
Lohith, Talakad G; Zoghbi, Sami S; Morse, Cheryl L; Araneta, Maria D Ferraris; Barth, Vanessa N; Goebl, Nancy A; Tauscher, Johannes T; Pike, Victor W; Innis, Robert B; Fujita, Masahiro
2014-02-15
[(11)C]NOP-1A is a novel high-affinity PET ligand for imaging nociceptin/orphanin FQ peptide (NOP) receptors. Here, we report reproducibility and reliability measures of binding parameter estimates for [(11)C]NOP-1A binding in the brain of healthy humans. After intravenous injection of [(11)C]NOP-1A, PET scans were conducted twice on eleven healthy volunteers on the same (10/11 subjects) or different (1/11 subjects) days. Subjects underwent serial sampling of radial arterial blood to measure parent radioligand concentrations. Distribution volume (VT; a measure of receptor density) was determined by compartmental (one- and two-tissue) modeling in large regions and by simpler regression methods (graphical Logan and bilinear MA1) in both large regions and voxel data. Retest variability and intraclass correlation coefficient (ICC) of VT were determined as measures of reproducibility and reliability, respectively. Regional [(11)C]NOP-1A uptake in the brain was high, with a peak radioactivity concentration of 4-7 SUV (standardized uptake value) and a rank order of putamen>cingulate cortex>cerebellum. Brain time-activity curves were fitted well in 10 of 11 subjects by an unconstrained two-tissue compartment model. The retest variability of VT was moderately good across brain regions except cerebellum, and was similar across different modeling methods, averaging 12% for large regions and 14% for voxel-based methods. The retest reliability of VT was also moderately good in most brain regions, except thalamus and cerebellum, and was similar across different modeling methods, averaging 0.46 for large regions and 0.48 for voxels having gray matter probability >20%. The lowest retest variability and highest retest reliability of VT were achieved by compartmental modeling for large regions, and by the parametric Logan method for voxel-based methods.
Moderately good reproducibility and reliability measures of VT for [(11)C]NOP-1A make it a useful PET ligand for comparing NOP receptor binding between different subject groups or under different conditions in the same subject. Copyright © 2013. Published by Elsevier Inc.
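The graphical Logan method mentioned above can be demonstrated on synthetic data. In this sketch (assumed one-tissue kinetics and an assumed mono-exponential plasma input, not the [(11)C]NOP-1A data), the late-time slope of ∫Ct/Ct versus ∫Cp/Ct recovers VT = K1/k2:

```python
import math

def logan_vt(K1=0.1, k2=0.05, dt=0.01, t_end=200.0, t_fit=50.0):
    """Generate a one-tissue tissue curve Ct driven by an assumed
    plasma input Cp(t)=exp(-0.1 t), then estimate VT as the slope of
    the Logan plot y = int(Ct)/Ct versus x = int(Cp)/Ct for t > t_fit."""
    ct, int_cp, int_ct = 0.0, 0.0, 0.0
    xs, ys = [], []
    for i in range(1, int(t_end / dt) + 1):
        t = i * dt
        cp = math.exp(-0.1 * t)          # assumed input function
        ct += dt * (K1 * cp - k2 * ct)   # Euler step of dCt/dt
        int_cp += dt * cp
        int_ct += dt * ct
        if t > t_fit:
            xs.append(int_cp / ct)
            ys.append(int_ct / ct)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx   # least-squares slope = estimated VT
```

For a one-tissue model the Logan relation ∫Ct = VT·∫Cp − Ct/k2 is exact, so the regression slope should land on VT = K1/k2 = 2 for the defaults used here.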
DIETARY EXPOSURES OF YOUNG CHILDREN, PART 3: MODELLING
A deterministic model was used to model dietary exposure of young children. Parameters included pesticide residue on food before handling, surface pesticide loading, transfer efficiencies and children's activity patterns. Three components of dietary pesticide exposure were includ...
Connecting Earth Systems: Developing Holistic Understanding through the Earth-System-Science Model
ERIC Educational Resources Information Center
Gagnon, Valoree; Bradway, Heather
2012-01-01
For many years, Earth science concepts have been taught as thematic units with lessons in nice, neat chapter packages complete with labs and notes. But compartmentalized Earth science no longer exists, and implementing teaching methods that support student development of holistic understandings can be a time-consuming and difficult task. While…
Land Treatment Research and Development Program, Synthesis of Research Results,
1983-08-01
at Pack Forest, Washington .......... 22 8. Infiltration test and the relationship between cumulative water uptake and time...the chemistry of phosphorus in land treatment ..................................... 37 18. Schematic diagram of the compartmental water flow model...39 19. Comparison between predicted and measured water content in slow rate soils .................................................. 39 20
A common phenomenon observed in natural and constructed wetlands is short-circuiting of flow and formation of stagnant zones that are only indirectly connected with the incoming water. Biogeochemistry of passive areas is potentially much different than that of active zones. In ...
Arguments for a Common Set of Principles for Collaborative Inquiry in Evaluation
ERIC Educational Resources Information Center
Cousins, J. Bradley; Whitmore, Elizabeth; Shulha, Lyn
2013-01-01
In this article, we critique two recent theoretical developments about collaborative inquiry in evaluation--using logic models as a means to understand theory, and efforts to compartmentalize versions of collaborative inquiry into discrete genres--as a basis for considering future direction for the field. We argue that collaborative inquiry in…
Mesoscopic and continuum modelling of angiogenesis
Spill, F.; Guerrero, P.; Alarcon, T.; Maini, P. K.; Byrne, H. M.
2016-01-01
Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. PMID:24615007
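The lattice-to-continuum reasoning can be illustrated with the simplest stochastic movement model: unbiased random walkers whose density obeys, in the continuum limit, a diffusion equation. With unit steps and unit time the positional variance should grow as t (i.e. 2Dt with D = 1/2). The parameters below are illustrative, not taken from the angiogenesis model.

```python
import random

def walker_variance(n_walkers=5000, steps=100, seed=2):
    """Simulate unbiased lattice random walkers and return the sample
    variance of their final positions; the diffusion (continuum) limit
    predicts variance = steps for unit steps."""
    rng = random.Random(seed)
    pos = [0] * n_walkers
    for _ in range(steps):
        for i in range(n_walkers):
            pos[i] += 1 if rng.random() < 0.5 else -1
    mean = sum(pos) / n_walkers
    return sum((p - mean) ** 2 for p in pos) / n_walkers
```

Comparing such ensemble statistics against the limiting PDE is exactly the kind of check used to decide when a continuum description is justified and when discrete fluctuations dominate.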
González-Alvarez, I; Fernández-Teruel, C; Garrigues, T M; Casabo, V G; Ruiz-García, A; Bermejo, M
2005-12-01
The purpose was to develop a general mathematical model for estimating passive permeability and efflux transport parameters from in vitro cell culture experiments. The procedure is applicable for linear and non-linear transport of drug with time, <10 or >10% of drug transport, negligible or relevant back flow, and would allow the adequate correction in the case of relevant mass balance problems. A compartmental kinetic approach was used and the transport barriers were described quantitatively in terms of apical and basolateral clearances. The method can be applied when sink conditions are not achieved and it allows the evaluation of the location of the transporter and its binding site. In this work it was possible to demonstrate, from a functional point of view, the higher efflux capacity of the TC7 clone and to identify the apical membrane as the main resistance for the xenobiotic transport. This methodology can be extremely useful as a complementary tool for molecular biology approaches in order to establish meaningful hypotheses about transport mechanisms.
The role of the bi-compartmental stem cell niche in delaying cancer
NASA Astrophysics Data System (ADS)
Shahriyari, Leili; Komarova, Natalia L.
2015-10-01
In recent years, by using modern imaging techniques, scientists have found evidence of collaboration between different types of stem cells (SCs), and proposed a bi-compartmental organization of the SC niche. Here we create a class of stochastic models to simulate the dynamics of such a heterogeneous SC niche. We consider two SC groups: the border compartment, S1, is in direct contact with transit-amplifying (TA) cells, and the central compartment, S2, is hierarchically upstream from S1. The S1 SCs differentiate or divide asymmetrically when the tissue needs TA cells. Both groups proliferate when the tissue requires SCs (thus maintaining homeostasis). There is an influx of S2 cells into the border compartment, either by migration, or by proliferation. We examine this model in the context of double-hit mutant generation, which is a rate-limiting step in the development of many cancers. We discover that this type of a cooperative pattern in the stem niche with two compartments leads to a significantly smaller rate of double-hit mutant production compared with a homogeneous, one-compartmental SC niche. Furthermore, the minimum probability of double-hit mutant generation corresponds to purely symmetric division of SCs, consistent with the literature. Finally, the optimal architecture (which minimizes the rate of double-hit mutant production) requires a large proliferation rate of S1 cells along with a small, but non-zero, proliferation rate of S2 cells. This result is remarkably similar to the niche structure described recently by several authors, where one of the two SC compartments was found more actively engaged in tissue homeostasis and turnover, while the other was characterized by higher levels of quiescence (but contributed strongly to injury recovery). Both numerical and analytical results are presented.
NASA Astrophysics Data System (ADS)
Castaneda-Lopez, Homero
A methodology for detecting and locating defects or discontinuities in the outside coating of coated metal underground pipelines subjected to cathodic protection is addressed. On the basis of wide-range AC impedance signals at various frequencies applied to a steel-coated pipeline system, and by measuring its corresponding transfer function under several laboratory simulation scenarios, a physical laboratory setup of an underground cathodically protected, coated pipeline was built. This model included different variables and elements that exist under real conditions, such as soil resistivity, soil chemical composition, defect (holiday) location in the pipeline covering, defect area and geometry, and level of cathodic protection. The AC impedance data obtained under different working conditions were used to fit an electrical transmission line model. This model was then used as a tool to fit the impedance signal for different experimental conditions and to establish trends in the impedance behavior without the necessity of further experimental work. However, due to the chaotic nature of the transfer function response of this system under several conditions, it is believed that non-deterministic models based on pattern recognition algorithms are suitable for field condition analysis. A non-deterministic approach was used for experimental analysis by applying an artificial neural network (ANN) algorithm based on classification analysis, capable of studying the pipeline system and differentiating the variables that can change impedance conditions. These variables include level of cathodic protection, location of discontinuities (holidays), and severity of corrosion. This work demonstrated a proof of concept: a well-known technique combined with a novel algorithm that classifies experimental impedance data to predict the location of active holidays and defects on buried pipelines.
Laboratory findings from this procedure are promising, and efforts to develop it for field conditions should continue.
Hidden order in crackling noise during peeling of an adhesive tape.
Kumar, Jagadish; Ciccotti, M; Ananthakrishna, G
2008-04-01
We address the longstanding problem of recovering dynamical information from noisy acoustic emission signals arising from peeling of an adhesive tape subject to constant traction velocity. Using the phase space reconstruction procedure, we demonstrate the deterministic chaotic dynamics by establishing the existence of a correlation dimension as well as a positive Lyapunov exponent in a midrange of traction velocities. The results are explained on the basis of the model that also emphasizes the deterministic origin of acoustic emission by clarifying its connection to stick-slip dynamics.
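One diagnostic used here, the positive Lyapunov exponent, is easy to sketch on a toy deterministic system. The logistic map below merely stands in for the tape dynamics (it is not the paper's model); at r = 4 the exponent is known exactly (ln 2), and a positive value is the signature of deterministic chaos.

```python
import math

def lyapunov_logistic(r=4.0, n=200000, x0=0.2):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the trajectory average of ln|f'(x)| = ln|r*(1-2x)|."""
    x, s = x0, 0.0
    for _ in range(n):
        s += math.log(abs(r * (1.0 - 2.0 * x)))  # local stretching rate
        x = r * x * (1.0 - x)                    # iterate the map
    return s / n
```

The same trajectory-average idea underlies Lyapunov estimates from delay-embedded experimental signals, although reconstruction from noisy data requires considerably more care.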
The general situation, (but exemplified in urban areas), where a significant degree of sub-grid variability (SGV) exists in grid models poses problems when comparing gridbased air quality modeling results with observations. Typically, grid models ignore or parameterize processes ...
Extravascular transport in normal and tumor tissues.
Jain, R K; Gerlowski, L E
1986-01-01
The transport characteristics of the normal and tumor tissue extravascular space provide the basis for the determination of the optimal dosage and schedule regimes of various pharmacological agents in detection and treatment of cancer. In order for the drug to reach the cellular space where most therapeutic action takes place, several transport steps must first occur: (1) tissue perfusion; (2) permeation across the capillary wall; (3) transport through interstitial space; and (4) transport across the cell membrane. Any of these steps, including intracellular events such as metabolism, can be the rate-limiting step to uptake of the drug, and these rate-limiting steps may be different in normal and tumor tissues. This review examines these transport limitations, first from an experimental point of view and then from a modeling point of view. Various types of experimental tumor models which have been used in animals to represent human tumors are discussed. Then, mathematical models of extravascular transport are discussed from the perspective of two approaches: compartmental and distributed. Compartmental models lump one or more sections of a tissue or body into a "compartment" to describe the time course of disposition of a substance. These models contain "effective" parameters which represent the entire compartment. Distributed models consider the structural and morphological aspects of the tissue to determine the transport properties of that tissue. These distributed models describe both the temporal and spatial distribution of a substance in tissues. Each of these modeling techniques is described in detail with applications for cancer detection and treatment in mind.
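The compartmental ('lumped') approach contrasted above can be made concrete with a minimal two-compartment example: plasma and tissue amounts exchange with first-order rate constants, and drug is eliminated from plasma. The rate constants here are illustrative, not tissue-specific values.

```python
def two_compartment(k12=0.1, k21=0.05, kel=0.02, dose=1.0,
                    dt=0.01, t_end=100.0):
    """Lumped two-compartment model integrated with Euler steps:
    returns final plasma amount, tissue amount, and the cumulative
    amount eliminated."""
    a_p, a_t, elim = dose, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        flux = k12 * a_p - k21 * a_t   # net plasma -> tissue transfer
        out = kel * a_p                # first-order elimination from plasma
        a_p += dt * (-flux - out)
        a_t += dt * flux
        elim += dt * out
    return a_p, a_t, elim
```

Because every unit that leaves one compartment enters another (or the eliminated pool), the three amounts always sum to the dose; mass balance of this kind is a useful correctness check for any compartmental integrator.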
Koeppe, R A; Holthoff, V A; Frey, K A; Kilbourn, M R; Kuhl, D E
1991-09-01
The in vivo kinetic behavior of [11C]flumazenil ([11C]FMZ), a non-subtype-specific central benzodiazepine antagonist, is characterized using compartmental analysis with the aim of producing an optimized data acquisition protocol and tracer kinetic model configuration for the assessment of [11C]FMZ binding to benzodiazepine receptors (BZRs) in human brain. The approach presented is simple, requiring only a single radioligand injection. Dynamic positron emission tomography data were acquired on 18 normal volunteers using a 60- to 90-min sequence of scans and were analyzed with model configurations that included a three-compartment, four-parameter model; a three-compartment, three-parameter model with a fixed value for free plus nonspecific binding; and a two-compartment, two-parameter model. Statistical analysis indicated that a four-parameter model did not yield significantly better fits than a three-parameter model. Goodness of fit was improved for three- versus two-parameter configurations in regions with low receptor density, but not in regions with moderate to high receptor density. Thus, a two-compartment, two-parameter configuration was found to adequately describe the kinetic behavior of [11C]FMZ in human brain, with stable estimates of the model parameters obtainable from as little as 20-30 min of data. Pixel-by-pixel analysis yields functional images of transport rate (K1) and ligand distribution volume (DV), and thus provides independent estimates of ligand delivery and BZR binding.
Kleiman, Martin B.; Allen, Stephen D.; Neal, Patricia; Reynolds, Janet
1981-01-01
A necrotizing meningoencephalitis complicated by ventricular compartmentalization and abscess formation caused by Enterobacter sakazakii in a previously healthy 5-week-old female is described. A detailed description of the isolate is presented. This communication firmly establishes the pathogenicity of E. sakazakii. PMID:7287892
ERIC Educational Resources Information Center
Huijsmans, Roy
2012-01-01
Based on fieldwork material from Lao People's Democratic Republic, this paper introduces an analytical framework that transcends compartmentalized approaches towards migration involving young people. The notions of fluid and institutionalized forms of migration illuminate key differences and commonalities in the relational fabric underpinning…
Kevin T. Smith
2006-01-01
For more than 30 years, the compartmentalization concept has helped tree care practitioners and land managers interpret patterns of decay in living trees. Understanding these patterns can help guide the selection of treatments that meet the needs of people and communities while respecting the underlying tree biology. At its simplest, compartmentalization resists the...
Programming chemistry in DNA-addressable bioreactors.
Fellermann, Harold; Cardelli, Luca
2014-10-06
We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Bull, Marta E; Heath, Laura M; McKernan-Mullin, Jennifer L; Kraft, Kelli M; Acevedo, Luis; Hitti, Jane E; Cohn, Susan E; Tapia, Kenneth A; Holte, Sarah E; Dragavon, Joan A; Coombs, Robert W; Mullins, James I; Frenkel, Lisa M
2013-04-15
Whether unique human immunodeficiency virus type 1 (HIV) genotypes occur in the genital tract is important for vaccine development and management of drug-resistant viruses. Multiple cross-sectional studies suggest HIV is compartmentalized within the female genital tract. We hypothesize that bursts of HIV replication and/or proliferation of infected cells, captured in cross-sectional analyses, drive compartmentalization, but that over time genital-specific viral lineages do not form; rather, viruses mix between the genital tract and blood. Eight women with ongoing HIV replication were studied over a period of 1.5 to 4.5 years. Multiple viral sequences were derived by single-genome amplification of the HIV C2-V5 region of env from genital secretions and blood plasma. Maximum likelihood phylogenies were evaluated for compartmentalization using 4 statistical tests. In cross-sectional analyses, compartmentalization of genital from blood viruses was detected in three of eight women by all tests; this was associated with tissue-specific clades containing multiple monotypic sequences. In longitudinal analysis, the tissue-specific clades did not persist to form viral lineages. Rather, across women, HIV lineages comprised both genital tract and blood sequences. The observation of genital-specific HIV clades only in cross-sectional analysis, and the absence of genital-specific lineages in longitudinal analyses, suggest a dynamic interchange of HIV variants between the female genital tract and blood.
Lamaye, Françoise; Galliot, Sonia; Alibardi, Lorenzo; Lafontaine, Denis L J; Thiry, Marc
2011-05-01
Two types of nucleolus can be distinguished among eukaryotic cells: a tri-compartmentalized nucleolus in amniotes and a bi-compartmentalized nucleolus in all the others. However, although the nucleolar ultrastructure is well characterized in mammals and birds, it has so far been much less studied in reptiles. In this work, we examined the ultrastructural organization of the nucleolus in various tissues from different reptilian species (three turtles, three lizards, two crocodiles, and three snakes). Using cytochemical and immunocytological methods, we showed that both types of nucleolus are present in reptiles: a bi-compartmentalized nucleolus in turtles and a tri-compartmentalized nucleolus in the other species examined in this study. Furthermore, in a given species the same type of nucleolus is present in all tissues, although the abundance and distribution of the nucleolar components can vary from one tissue to another. We also reveal that, contrary to the mammalian nucleolus, the reptilian fibrillar centers contain small clumps of condensed chromatin and that their surrounding dense fibrillar component is thicker. Finally, we report that Cajal bodies are detected in reptiles. Altogether, we believe that these results have profound evolutionary implications, since they indicate that the point of transition between bipartite and tripartite nucleoli lies at the emergence of the amniotes within the class Reptilia. Copyright © 2011 Elsevier Inc. All rights reserved.
In vivo kinematics of a robot-assisted uni- and multi-compartmental knee arthroplasty.
Watanabe, Toshifumi; Abbasi, Ali Z; Conditt, Michael A; Christopher, Jennifer; Kreuzer, Stefan; Otto, Jason K; Banks, Scott A
2014-07-01
There is great interest in providing reliable and durable treatments for one- and two-compartment arthritic degeneration of the cruciate-ligament intact knee. One approach is to resurface only the diseased compartments with discrete unicompartmental components, retaining the undamaged compartment(s). However, placing multiple small implants into the knee presents a greater surgical challenge than total knee arthroplasty, so it is not certain that the natural knee mechanics can be maintained or restored. The goal of this study was to determine whether near-normal knee kinematics can be obtained with a robot-assisted multi-compartmental knee arthroplasty. Thirteen patients with 15 multi-compartmental knee arthroplasties using haptic robotic-assisted bone preparation were involved in this study. Nine subjects received a medial unicompartmental knee arthroplasty (UKA), three subjects received a medial UKA and patellofemoral (PF) arthroplasty, and three subjects received medial and lateral bi-unicondylar arthroplasty. Knee motions were recorded using video-fluoroscopy an average of 13 months (6-29 months) after surgery during stair and kneeling activities. The three-dimensional position and orientation of the implant components were determined using model-image registration techniques. Knee kinematics during maximum flexion kneeling showed femoral external rotation and posterior lateral condylar translation. All knees showed femoral external rotation and posterior condylar translation with flexion during the step activity. Knees with medial UKA and PF arthroplasty showed the most femoral external rotation and posterior translation, and knees with bicondylar UKA showed the least. Knees with accurately placed uni- or bi-compartmental arthroplasty exhibited stable knee kinematics consistent with intact and functioning cruciate ligaments. 
The patterns of tibiofemoral motion were more similar to natural knees than commonly has been observed in knees with total knee arthroplasty. Larger series are required to confirm these as general observations, but the present results demonstrate the potential to restore or maintain closer-to-normal knee kinematics by retaining intact structures and compartments.
A variational method for analyzing limit cycle oscillations in stochastic hybrid systems
NASA Astrophysics Data System (ADS)
Bressloff, Paul C.; MacLaurin, James
2018-06-01
Many systems in biology can be modeled by ordinary differential equations that are piecewise continuous and switch between different states according to a Markov jump process; such a system is known as a stochastic hybrid system or piecewise deterministic Markov process (PDMP). In the fast switching limit, the dynamics converge to a deterministic ODE. In this paper, we develop a phase reduction method for stochastic hybrid systems that support a stable limit cycle in the deterministic limit. A classic example is the Morris-Lecar model of a neuron, where the switching Markov process is the number of open ion channels and the continuous process is the membrane voltage. We outline a variational principle for the phase reduction, yielding an exact analytic expression for the resulting phase dynamics. We demonstrate that this decomposition is accurate over timescales that are exponential in the inverse switching rate ɛ^-1. That is, we show that for a constant C, the probability that the expected time to leave an O(a) neighborhood of the limit cycle is less than T scales as T exp(-Ca/ɛ).
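A toy PDMP along these lines can be simulated in a few lines. The sketch below is not the Morris-Lecar model: it is a hypothetical scalar ODE dv/dt = s - v whose drift switches with a two-state Markov jump process s, flipping at rate 1/eps, to illustrate the fast-switching limit.

```python
import random

def simulate_pdmp(eps, T, seed=0, dt=1e-3):
    """Piecewise deterministic Markov process sketch: the continuous
    variable v relaxes toward the current discrete state s in {0, 1},
    while s flips after exponential waiting times of mean eps
    (fast switching as eps -> 0)."""
    rng = random.Random(seed)
    t, v, s = 0.0, 0.0, 1
    while t < T:
        # exponential waiting time to the next jump of the Markov process
        t_next = min(t + rng.expovariate(1.0 / eps), T)
        while t < t_next:             # integrate dv/dt = s - v between jumps
            v += dt * (s - v)
            t += dt
        s = 1 - s                     # Markov switch of the discrete state
    return v

# In the fast-switching limit the trajectory hovers near the mean-field
# value v* = 0.5 (equal time in s = 0 and s = 1); slow switching wanders.
v_fast = simulate_pdmp(eps=0.01, T=50.0)
v_slow = simulate_pdmp(eps=5.0, T=50.0)
```

With eps small the fluctuations about v* shrink, which is the convergence to the deterministic ODE that the abstract describes.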
Effects of Noise on Ecological Invasion Processes: Bacteriophage-mediated Competition in Bacteria
NASA Astrophysics Data System (ADS)
Joo, Jaewook; Eric, Harvill; Albert, Reka
2007-03-01
Pathogen-mediated competition, through which an invasive species carrying and transmitting a pathogen can be a superior competitor to a more vulnerable resident species, is one of the principal driving forces influencing biodiversity in nature. Using an experimental system of bacteriophage-mediated competition in bacterial populations and a deterministic model, we have shown in [Joo et al 2005] that the competitive advantage conferred by the phage depends only on the relative phage pathology and is independent of the initial phage concentration and other phage and host parameters such as the infection-causing contact rate, the spontaneous and infection-induced lysis rates, and the phage burst size. Here we investigate the effects of stochastic fluctuations on bacterial invasion facilitated by bacteriophage, and examine the validity of the deterministic approach. We use both numerical and analytical methods of stochastic processes to identify the source of noise and assess its magnitude. We show that the conclusions obtained from the deterministic model are robust against stochastic fluctuations, yet deviations become prominently large when the phage are more pathological to the invading bacterial strain.
Analysis of stochastic model for non-linear volcanic dynamics
NASA Astrophysics Data System (ADS)
Alexandrov, D.; Bashkirtseva, I.; Ryashko, L.
2014-12-01
Motivated by important geophysical applications, we consider a dynamic model of the magma-plug system previously derived by Iverson et al. (2006) under the influence of stochastic forcing. Due to the strong nonlinearity of the friction force for the solid plug along its margins, the initial deterministic system exhibits impulsive oscillations. Two types of dynamic behavior of the system under parametric stochastic forcing have been found: random trajectories are either scattered on both sides of the deterministic cycle or grouped on its internal side only. It is shown that dispersions are highly inhomogeneous along cycles in the presence of noise. The effects of noise-induced shifts, pressure stabilization and localization of random trajectories have been revealed with increasing noise intensity. The plug velocity, pressure and displacement are highly dependent on noise intensity as well. These new stochastic phenomena are related to the nonlinear peculiarities of the deterministic phase portrait. It is demonstrated that the repetitive stick-slip motions of the magma-plug system under stochastic forcing can be connected with drumbeat earthquakes.
Modelling the interaction between flooding events and economic growth
NASA Astrophysics Data System (ADS)
Grames, J.; Prskawetz, A.; Grass, D.; Blöschl, G.
2015-06-01
Socio-hydrology describes the interaction between the socio-economy and water. Recent models analyze the interplay of community risk-coping culture, flooding damage and economic growth (Di Baldassarre et al., 2013; Viglione et al., 2014). These models descriptively explain the feedbacks between socio-economic development and natural disasters like floods. Contrary to these descriptive models, our approach develops an optimization model, where the intertemporal decision of an economic agent interacts with the hydrological system. In order to build this first economic growth model describing the interaction between the consumption and investment decisions of an economic agent and the occurrence of flooding events, we transform an existing descriptive stochastic model into an optimal deterministic model. The intermediate step is to formulate and simulate a descriptive deterministic model. We develop a periodic water function to approximate the former discrete stochastic time series of rainfall events. Due to the non-autonomous, exogenous periodic rainfall function, the long-term path of consumption and investment will be periodic.
Automated Calibration For Numerical Models Of Riverflow
NASA Astrophysics Data System (ADS)
Fernandez, Betsaida; Kopmann, Rebekka; Oladyshkin, Sergey
2017-04-01
Calibration of numerical models has been fundamental since the beginning of all types of hydro-system modeling, to approximate the parameters that can mimic the overall system behavior. Thus, an assessment of different deterministic and stochastic optimization methods is undertaken to compare their robustness, computational feasibility, and global search capacity. Also, the uncertainty of the most suitable methods is analyzed. These optimization methods minimize an objective function that compares synthetic measurements and simulated data. Synthetic measurement data replace the observed data set to guarantee an existing parameter solution. The input data for the objective function derive from a hydro-morphological dynamics numerical model that represents a 180-degree bend channel. The hydro-morphological numerical model shows a high level of ill-posedness in the mathematical problem. The minimization of the objective function by different candidate optimization methods indicates a failure in some of the gradient-based methods, such as Newton Conjugate Gradient and BFGS. Others reveal partial convergence, such as Nelder-Mead, Polak-Ribière, L-BFGS-B, Truncated Newton Conjugate Gradient, and Trust-Region Newton Conjugate Gradient. Further ones yield parameter solutions that range outside the physical limits, such as Levenberg-Marquardt and LeastSquareRoot. Moreover, there is a significant computational demand for genetic optimization methods, such as Differential Evolution and Basin-Hopping, as well as for Brute Force methods. The deterministic Sequential Least Squares Programming and the stochastic Bayesian inference methods present the optimal optimization results. Keywords: automated calibration of hydro-morphological dynamic numerical model, Bayesian inference theory, deterministic optimization methods.
Carbon nanotubes exhibit fibrillar pharmacology in primates
Alidori, Simone; Thorek, Daniel L. J.; Beattie, Bradley J.; ...
2017-08-28
Nanomedicine rests at the nexus of medicine, bioengineering, and biology with great potential for improving health through innovation and development of new drugs and devices. Carbon nanotubes are an example of a fibrillar nanomaterial poised to translate into medical practice. The leading candidate material in this class is ammonium-functionalized carbon nanotubes (fCNT), which exhibit unexpected pharmacological behavior in vivo with important biotechnology applications. Here, we provide a multi-organ evaluation of the distribution, uptake and processing of fCNT in nonhuman primates using quantitative whole body positron emission tomography (PET), compartmental modeling of pharmacokinetic data, serum biomarkers and ex vivo pathology investigation. Kidney and liver are the two major organ systems that accumulate and excrete [86Y]fCNT in nonhuman primates, and accumulation is cell specific as described by compartmental modeling analyses of the quantitative PET data. A serial two-compartment model explains renal processing of tracer-labeled fCNT; hepatic data fit a parallel two-compartment model. These modeling data also reveal significant elimination of the injected activity (>99.8%) from the primate within 3 days (t1/2 = 11.9 hours). Thus, these favorable results in nonhuman primates provide important insight into the fate of fCNT in vivo and pave the way to further engineering design considerations for sophisticated nanomedicines to aid late-stage development and clinical use in man.
Salgia, Ravi; Mambetsariev, Isa; Hewelt, Blake; Achuthan, Srisairam; Li, Haiqing; Poroyko, Valeriy; Wang, Yingyu; Sattler, Martin
2018-05-25
Mathematical cancer models are immensely powerful tools that are based in part on the fractal nature of biological structures, such as the geometry of the lung. Cancers of the lung provide an opportune model for developing and applying algorithms that capture changes and disease phenotypes. We reviewed mathematical models that have been developed for the biological sciences and applied them in the context of small cell lung cancer (SCLC) growth, mutational heterogeneity, and mechanisms of metastasis. The ultimate goal is to characterize the stochastic and deterministic nature of this disease, to link this comprehensive set of tools back to its fractal nature, and to provide a platform for accurate biomarker development. These techniques may be particularly useful in the context of drug development research, such as in combination with existing omics approaches. The integration of these tools will be important to further understand the biology of SCLC and ultimately develop novel therapeutics.
MIMICKING COUNTERFACTUAL OUTCOMES TO ESTIMATE CAUSAL EFFECTS.
Lok, Judith J
2017-04-01
In observational studies, treatment may be adapted to covariates at several times without a fixed protocol, in continuous time. Treatment influences covariates, which influence treatment, which influences covariates, and so on. Then even time-dependent Cox models cannot be used to estimate the net treatment effect. Structural nested models have been applied in this setting. Structural nested models are based on counterfactuals: the outcome a person would have had had treatment been withheld after a certain time. Previous work on continuous-time structural nested models assumes that counterfactuals depend deterministically on observed data, while conjecturing that this assumption can be relaxed. This article proves that one can mimic counterfactuals by constructing random variables, solutions to a differential equation, that have the same distribution as the counterfactuals, even given past observed data. These "mimicking" variables can be used to estimate the parameters of structural nested models without assuming the treatment effect to be deterministic.
NASA Astrophysics Data System (ADS)
Li, Fei; Subramanian, Kartik; Chen, Minghan; Tyson, John J.; Cao, Yang
2016-06-01
The asymmetric cell division cycle in Caulobacter crescentus is controlled by an elaborate molecular mechanism governing the production, activation and spatial localization of a host of interacting proteins. In previous work, we proposed a deterministic mathematical model for the spatiotemporal dynamics of six major regulatory proteins. In this paper, we study a stochastic version of the model, which takes into account molecular fluctuations of these regulatory proteins in space and time during early stages of the cell cycle of wild-type Caulobacter cells. We test the stochastic model with regard to experimental observations of increased variability of cycle time in cells depleted of the divJ gene product. The deterministic model predicts that overexpression of the divK gene blocks cell cycle progression in the stalked stage; however, stochastic simulations suggest that a small fraction of the mutant cells do complete the cell cycle normally.
Pharmacokinetic–pharmacodynamic modelling in anaesthesia
Gambús, Pedro L; Trocóniz, Iñaki F
2015-01-01
Anaesthesiologists adjust drug dosing, the administration system and the kind of drug to the characteristics of the patient. They then observe the response and adjust dosing to the specific requirements according to the difference between observed response, expected response and the context of the surgery and the patient. This approach is possible because, on the one hand, quantification technology has made significant advances, allowing the anaesthesiologist to measure almost any effect using noninvasive, continuous measuring systems. On the other hand, knowledge of the relations between dosing, concentration, biophase dynamics and effect, as well as of the sources of variability, has made anaesthesia a benchmark specialty for pharmacokinetic–pharmacodynamic (PKPD) modelling. The aim of this review is to revisit the most common PKPD models applied in the field of anaesthesia (i.e. effect compartment, turnover, drug–receptor binding and drug interaction models) through representative examples. The effect compartment model has been widely used in this field and there are multiple applications and examples. The use of turnover models has been limited mainly to describing respiratory effects. Similarly, cases in which the dissociation process of the drug–receptor complex is slow compared with other processes relevant to the time course of the anaesthetic effect are not frequent in anaesthesia, where in addition to a rapid onset, a fast offset of the response is required. With respect to the characterization of PD drug interactions, different response surface models are discussed. Relevant applications that have changed the way modern anaesthesia is practiced are also provided. PMID:24251846
Amplification of intrinsic fluctuations by the Lorenz equations
NASA Astrophysics Data System (ADS)
Fox, Ronald F.; Elston, T. C.
1993-07-01
Macroscopic systems (e.g., hydrodynamics, chemical reactions, electrical circuits, etc.) manifest intrinsic fluctuations of molecular and thermal origin. When the macroscopic dynamics is deterministically chaotic, the intrinsic fluctuations may become amplified by several orders of magnitude. Numerical studies of this phenomenon are presented in detail for the Lorenz model. Amplification to macroscopic scales is exhibited, and quantitative methods (binning and a difference-norm) are presented for measuring macroscopically subliminal amplification effects. In order to test the quality of the numerical results, noise induced chaos is studied around a deterministically nonchaotic state, where the scaling law relating the Lyapunov exponent to noise strength obtained for maps is confirmed for the Lorenz model, a system of ordinary differential equations.
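The amplification mechanism described above can be reproduced in a few lines: integrate the deterministic Lorenz equations twice from initial conditions differing by a microscopic amount (a stand-in for an intrinsic fluctuation) and watch the separation grow to macroscopic scale. This is a sketch with the standard parameter values, not the paper's numerical setup.

```python
def lorenz_step(x, y, z, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

dt, steps = 1e-3, 30000           # ~30 time units
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-10, 1.0, 1.0)       # microscopic "intrinsic fluctuation"
for _ in range(steps):
    a = lorenz_step(*a, dt)
    b = lorenz_step(*b, dt)
# separation of the two trajectories after amplification by the chaos
separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
```

With a positive Lyapunov exponent of roughly 0.9, a 1e-10 perturbation is amplified by many orders of magnitude over this interval, saturating at the scale of the attractor itself.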
Uniqueness of Nash equilibrium in vaccination games.
Bai, Fan
2016-12-01
One crucial condition for the uniqueness of the Nash equilibrium set in vaccination games is that the attack ratio decreases monotonically as the vaccine coverage level increases. We consider several deterministic vaccination models in homogeneously mixing and in heterogeneously mixing populations. Based on the final size relations obtained from the deterministic epidemic models, we prove that the attack ratios can be expressed in terms of the vaccine coverage levels, and that the attack ratios are decreasing functions of the vaccine coverage levels. Some thresholds are presented, which depend on the vaccine efficacy. It is proved that for vaccination games in a homogeneously mixing population, there is a unique Nash equilibrium for each game.
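The final size relation underlying this monotonicity condition is easy to sketch numerically. The fragment below is a hedged illustration assuming a perfect vaccine and homogeneous mixing (not the paper's heterogeneous models): it computes the attack ratio among the unvaccinated for increasing coverage p by fixed-point iteration and shows it decreasing.

```python
import math

def attack_ratio(R0, p, tol=1e-12, max_iter=10000):
    """Attack ratio among the unvaccinated from the standard final size
    relation A = 1 - exp(-R0 * (1 - p) * A), solved by fixed-point
    iteration (all-or-nothing vaccine with perfect efficacy assumed)."""
    A = 0.9  # start from a large-outbreak guess
    for _ in range(max_iter):
        A_next = 1.0 - math.exp(-R0 * (1.0 - p) * A)
        if abs(A_next - A) < tol:
            break
        A = A_next
    return A

# Attack ratio for increasing coverage at a hypothetical R0 = 2.5;
# monotone decrease with coverage is the uniqueness condition above.
ratios = [attack_ratio(2.5, p) for p in (0.0, 0.2, 0.4, 0.6)]
```

At p = 0.6 the effective reproduction number drops to 1, and the attack ratio approaches zero, the threshold behavior the abstract refers to.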
Automated Assume-Guarantee Reasoning by Abstraction Refinement
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Giannakopoulou, Dimitra
2008-01-01
Current automated approaches for compositional model checking in the assume-guarantee style are based on learning assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative, not necessarily deterministic, abstractions of some of the components, and refines those abstractions using counterexamples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves performance similar to or better than that of a previous learning-based implementation.
Deterministic Impulsive Vacuum Foundations for Quantum-Mechanical Wavefunctions
NASA Astrophysics Data System (ADS)
Valentine, John S.
2013-09-01
By assuming that a fermion de-constitutes immediately at source, that its constituents, as bosons, propagate uniformly as scalar vacuum terms with phase (radial) symmetry, and that fermions are unique solutions for specific phase conditions, we find a model that self-quantizes matter from continuous waves, unifying bosons and fermion ontologies in a single basis, in a constitution-invariant process. Vacuum energy has a wavefunction context, as a mass-energy term that enables wave collapse and increases its amplitude, with gravitational field as the gradient of the flux density. Gravitational and charge-based force effects emerge as statistics without special treatment. Confinement, entanglement, vacuum statistics, forces, and wavefunction terms emerge from the model's deterministic foundations.
The Constitutive Modeling of Thin Films with Random Material Wrinkles
NASA Technical Reports Server (NTRS)
Murphey, Thomas W.; Mikulas, Martin M.
2001-01-01
Material wrinkles drastically alter the structural constitutive properties of thin films. Normally linear elastic materials, when wrinkled, become highly nonlinear and initially inelastic. Stiffness reductions of 99% and negative Poisson's ratios are typically observed. This paper presents an effective continuum constitutive model for the elastic effects of material wrinkles in thin films. The model considers general two-dimensional stress and strain states (simultaneous biaxial and shear stress/strain) and neglects out-of-plane bending. The constitutive model is derived from a traditional mechanics analysis of an idealized physical model of random material wrinkles. Model parameters are the directly measurable wrinkle characteristics of amplitude and wavelength. For these reasons, the equations are mechanistic and deterministic. The model is compared with biaxial tensile test data for wrinkled Kapton (registered trademark) HN and is shown to deterministically predict strain as a function of stress with an average RMS error of 22%. On average, fitting the model to test data yields an RMS error of 1.2%.
INTEGRATED PLANNING MODEL - EPA APPLICATIONS
The Integrated Planning Model (IPM) is a multi-regional, dynamic, deterministic linear programming (LP) model of the electric power sector in the continental lower 48 states and the District of Columbia. It provides forecasts up to year 2050 of least-cost capacity expansion, elec...
NASA Astrophysics Data System (ADS)
Mannattil, Manu; Pandey, Ambrish; Verma, Mahendra K.; Chakraborty, Sagar
2017-12-01
Constructing simpler models, either stochastic or deterministic, for exploring the phenomenon of flow reversals in fluid systems is in vogue across disciplines. Using direct numerical simulations and nonlinear time series analysis, we illustrate that the basic nature of flow reversals in convecting fluids can depend on the dimensionless parameters describing the system. Specifically, we find evidence of low-dimensional behavior in flow reversals occurring at zero Prandtl number, whereas we fail to find such signatures for reversals at infinite Prandtl number. Thus, even in a single system, as one varies the system parameters, one can encounter reversals that are fundamentally different in nature. Consequently, we conclude that a single general low-dimensional deterministic model cannot faithfully characterize flow reversals for every set of parameter values.
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means (KM) approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM, called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM- and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements, from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty.
Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110
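The Fuzzy C-means step at the heart of the pipeline above can be sketched in a few lines. This is a minimal, generic FCM implementation applied to synthetic data standing in for EEG topographies at GFP peaks; the dimensionality, cluster count, and fuzzifier m = 2 are illustrative assumptions, not the authors' settings.

```python
# Minimal Fuzzy C-means (FCM) sketch: soft clustering of "topographies"
# (here: synthetic 4-dimensional points) into c microstate-like clusters.
import numpy as np

def fcm(X, c, m=2.0, n_iter=100, seed=0):
    """Fuzzy C-means: returns (centers, membership matrix U of shape (n, c))."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # rows are fuzzy memberships
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))     # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated synthetic "map" clusters
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 4)), rng.normal(3.0, 0.1, (50, 4))])
centers, U = fcm(X, c=2)
labels = U.argmax(axis=1)
```

The membership matrix U is the probabilistic analog of hard KM assignments: taking argmax recovers a deterministic labeling, while the full rows quantify assignment uncertainty.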
NASA Astrophysics Data System (ADS)
Song, Yiliao; Qin, Shanshan; Qu, Jiansheng; Liu, Feng
2015-10-01
The issue of air quality regarding PM pollution levels in China is a focus of public attention. To address it, a series of studies is in progress, including PM monitoring programs, PM source apportionment, and the enactment of new ambient air quality index standards. However, computer modeling of future PM trends remains rare, despite its significance to forecasting and early-warning systems. Accordingly, a study of deterministic and interval forecasts of PM is performed. In this study, data on hourly and 12 h-averaged air pollutants are applied to forecast PM concentrations within the Yangtze River Delta (YRD) region of China. The characteristics of PM emissions are first examined and analyzed using different distribution functions. To improve the distribution fitting that is crucial for estimating PM levels, an artificial intelligence algorithm is incorporated to select the optimal parameters. Following that step, an ANF model is used to conduct deterministic forecasts of PM. With the identified distributions and deterministic forecasts, different levels of PM intervals are estimated. The results indicate that the lognormal or gamma distributions are highly representative of the recorded PM data, with a goodness-of-fit R2 of approximately 0.998. Furthermore, the evaluation metrics (MSE, MAPE and CP, AW) also show high accuracy for the deterministic and interval forecasts of PM, indicating that this method enables informative and effective quantification of future PM trends.
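The distribution-fitting step can be illustrated with a small sketch: fit lognormal and gamma candidates to synthetic concentration data, compare them by log-likelihood, and derive a central interval from the better fit. The data and the simple moment/maximum-likelihood estimators below are stand-ins for the paper's AI-assisted parameter selection, not its actual procedure.

```python
# Fit lognormal (MLE) and gamma (method of moments) to synthetic PM-like data
# and pick the better-fitting candidate by log-likelihood.
import math
import numpy as np

rng = np.random.default_rng(0)
pm = rng.lognormal(mean=3.5, sigma=0.5, size=2000)   # synthetic concentrations

logx = np.log(pm)
mu, sigma = logx.mean(), logx.std()                  # lognormal MLE
k = pm.mean() ** 2 / pm.var()                        # gamma shape (moments)
theta = pm.var() / pm.mean()                         # gamma scale (moments)

ll_lognorm = np.sum(-np.log(pm * sigma * math.sqrt(2 * math.pi))
                    - (logx - mu) ** 2 / (2 * sigma ** 2))
ll_gamma = np.sum((k - 1) * logx - pm / theta
                  - k * math.log(theta) - math.lgamma(k))
best = "lognormal" if ll_lognorm > ll_gamma else "gamma"

# 90% central interval from the fitted lognormal, the kind of quantity used
# for interval forecasts of PM levels
z05, z95 = -1.6449, 1.6449                           # standard normal quantiles
lo, hi = np.exp(mu + z05 * sigma), np.exp(mu + z95 * sigma)
```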
Defining Lipid Transport Pathways in Animal Cells
NASA Astrophysics Data System (ADS)
Pagano, Richard E.; Sleight, Richard G.
1985-09-01
A new technique for studying the metabolism and intracellular transport of lipid molecules in living cells based on the use of fluorescent lipid analogs is described. The cellular processing of various intermediates (phosphatidic acid and ceramide) and end products (phosphatidylcholine and phosphatidylethanolamine) in lipid biosynthesis is reviewed and a working model for compartmentalization during lipid biosynthesis is presented.
Global separation of plant transpiration from groundwater and streamflow
Jaivime Evaristo; Scott Jasechko; Jeffrey J. McDonnell
2015-01-01
Current land surface models assume that groundwater, streamflow and plant transpiration are all sourced and mediated by the same well-mixed water reservoir: the soil. However, recent work in Oregon and Mexico has shown evidence of ecohydrological separation, whereby different subsurface compartmentalized pools of water supply either plant transpiration fluxes or the...
Physiological system integrations with emphasis on the respiratory-cardiovascular system
NASA Technical Reports Server (NTRS)
Gallagher, R. R.
1975-01-01
The integration of two types of physiological system simulations is presented. The long term model is a circulatory system model which simulates long term blood flow variations and compartmental fluid shifts. The short term models simulate transient phenomena of the respiratory, thermoregulatory, and pulsatile cardiovascular systems as they respond to stimuli such as LBNP, exercise, and environmental gaseous variations. An overview of the interfacing approach is described. Descriptions of the variable interface for long term to short term and between the three short term models are given.
Lessons from the synthetic chemist nature.
Jürjens, Gerrit; Kirschning, Andreas; Candito, David A
2015-05-01
This conceptual review examines the ideal multistep synthesis from the perspective of nature. We suggest that besides step- and redox economies, one other key to efficiency is steady state processing with intermediates that are immediately transformed to the next intermediate when formed. We discuss four of nature's strategies (multicatalysis, domino reactions, iteration and compartmentation) that commonly proceed via short-lived intermediates and show that these strategies are also part of the chemist's portfolio. We particularly focus on compartmentation which in nature is found microscopically within cells (organelles) and between cells and on a molecular level on multiprotein scaffolds (e.g. in polyketide synthases) and demonstrate how compartmentation is manifested in modern multistep flow synthesis.
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
USDA-ARS?s Scientific Manuscript database
The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...
A random walk on water (Henry Darcy Medal Lecture)
NASA Astrophysics Data System (ADS)
Koutsoyiannis, D.
2009-04-01
Randomness and uncertainty had been well appreciated in hydrology and water resources engineering in their initial steps as scientific disciplines. However, this changed through the years and, following other geosciences, hydrology adopted a naïve view of randomness in natural processes. Such a view separates natural phenomena into two mutually exclusive types, random or stochastic, and deterministic. When a classification of a specific process into one of these two types fails, then a separation of the process into two different, usually additive, parts is typically devised, each of which may be further subdivided into subparts (e.g., deterministic subparts such as periodic and aperiodic or trends). This dichotomous logic is typically combined with a Manichean perception, in which the deterministic part supposedly represents cause-effect relationships and thus is physics and science (the "good"), whereas randomness has little relationship with science and no relationship with understanding (the "evil"). Probability theory and statistics, which traditionally provided the tools for dealing with randomness and uncertainty, have been regarded by some as the "necessary evil" but not as an essential part of hydrology and geophysics. Some took a step further to banish them from hydrology, replacing them with deterministic sensitivity analysis and fuzzy-logic representations. Others attempted to demonstrate that irregular fluctuations observed in natural processes are au fond manifestations of underlying chaotic deterministic dynamics with low dimensionality, thus attempting to render probabilistic descriptions unnecessary. Some of the above recent developments are simply flawed because they make erroneous use of probability and statistics (which, remarkably, provide the tools for such analyses), whereas the entire underlying logic is just a false dichotomy.
To see this, it suffices to recall that Pierre Simon Laplace, perhaps the most famous proponent of determinism in the history of philosophy of science (cf. Laplace's demon), is, at the same time, one of the founders of probability theory, which he regarded as "nothing but common sense reduced to calculation". This harmonizes with James Clerk Maxwell's view that "the true logic for this world is the calculus of Probabilities" and was more recently and epigrammatically formulated in the title of Edwin Thompson Jaynes's book "Probability Theory: The Logic of Science" (2003). Abandoning dichotomous logic, either on ontological or epistemic grounds, we can identify randomness or stochasticity with unpredictability. Admitting that (a) uncertainty is an intrinsic property of nature; (b) causality implies dependence of natural processes in time and thus suggests predictability; but, (c) even the tiniest uncertainty (e.g., in initial conditions) may result in unpredictability after a certain time horizon, we may shape a stochastic representation of natural processes that is consistent with Karl Popper's indeterministic world view. In this representation, probability quantifies uncertainty according to the Kolmogorov system, in which probability is a normalized measure, i.e., a function that maps sets (areas where the initial conditions or the parameter values lie) to real numbers (in the interval [0, 1]). In such a representation, predictability (suggested by deterministic laws) and unpredictability (randomness) coexist, are not separable or additive components, and it is a matter of specifying the time horizon of prediction to decide which of the two dominates. An elementary numerical example has been devised to illustrate the above ideas and demonstrate that they offer a pragmatic and useful guide for practice, rather than just pertaining to philosophical discussions. 
A chaotic model, with fully and a priori known deterministic dynamics and deterministic inputs (without any random agent), is assumed to represent the hydrological balance in an area partly covered by vegetation. Experimentation with this toy model demonstrates, inter alia, that: (1) for short time horizons the deterministic dynamics is able to give good predictions; but (2) these predictions become extremely inaccurate and useless for long time horizons; (3) for such horizons a naïve statistical prediction (average of past data) which fully neglects the deterministic dynamics is more skilful; and (4) if this statistical prediction, in addition to past data, is combined with the probability theory (the principle of maximum entropy, in particular), it can provide a more informative prediction. Also, the toy model shows that the trajectories of the system state (and derivative properties thereof) do not resemble a regular (e.g., periodic) deterministic process nor a purely random process, but exhibit patterns indicating anti-persistence and persistence (where the latter statistically complies with a Hurst-Kolmogorov behaviour). If the process is averaged over long time scales, the anti-persistent behaviour improves predictability, whereas the persistent behaviour substantially deteriorates it. A stochastic representation of this deterministic system, which incorporates dynamics, is not only possible, but also powerful as it provides good predictions for both short and long horizons and helps to decide on when the deterministic dynamics should be considered or neglected. Obviously, a natural system is extremely more complex than this simple toy model and hence unpredictability is naturally even more prominent in the former. 
In addition, in a complex natural system, we can never know the exact dynamics and we must infer it from past data, which implies additional uncertainty and an additional role of stochastics in the process of formulating the system equations and estimating the involved parameters. Data also offer the only solid grounds to test any hypothesis about the dynamics, and failure of performing such testing against evidence from data renders the hypothesised dynamics worthless. If this perception of natural phenomena is adequately plausible, then it may help in studying interesting fundamental questions regarding the current state and the trends of hydrological and water resources research and their promising future paths. For instance: (i) Will it ever be possible to achieve a fully "physically based" modelling of hydrological systems that will not depend on data or stochastic representations? (ii) To what extent can hydrological uncertainty be reduced and what are the effective means for such reduction? (iii) Are current stochastic methods in hydrology consistent with observed natural behaviours? What paths should we explore for their advancement? (iv) Can deterministic methods provide solid scientific grounds for water resources engineering and management? In particular, can there be risk-free hydraulic engineering and water management? (v) Is the current (particularly important) interface between hydrology and climate satisfactory? In particular, should hydrology rely on climate models that are not properly validated (i.e., for periods and scales not used in calibration)? In effect, is the evolution of climate and its impacts on water resources deterministically predictable?
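The lecture's toy-model argument can be reproduced with an even simpler stand-in, the logistic map at r = 4 (fully deterministic, no random input): a forecast from a minutely wrong initial condition is excellent at short horizons, while at long horizons its error saturates and the naive climatological mean becomes competitive. The map, perturbation size, and horizons are illustrative assumptions, not the lecture's hydrological balance model.

```python
# Deterministic chaos: short-horizon predictability, long-horizon loss of it.
import numpy as np

def trajectory(x0, n):
    """Iterate the fully deterministic logistic map x -> 4x(1-x)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        xs[i] = x
    return xs

truth = trajectory(0.3, 160)
forecast = trajectory(0.3 + 1e-9, 160)   # same dynamics, tiny initial error
clim = 0.5                                # long-run mean of the r=4 map

err_dyn_short = abs(truth[4] - forecast[4])                    # horizon 5
err_dyn_long = np.mean(np.abs(truth[100:] - forecast[100:]))   # horizon > 100
err_clim_long = np.mean(np.abs(truth[100:] - clim))            # naive statistic
```

At horizon 5 the deterministic forecast is essentially exact; beyond the error-doubling horizon its error saturates at the size of the attractor, where the purely statistical predictor is of comparable skill.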
NASA Astrophysics Data System (ADS)
Wong, B.; Kilthau, W.; Knopf, D. A.
2017-12-01
Immersion freezing is recognized as the most important ice crystal formation process in mixed-phase cloud environments. It is well established that mineral dust species can act as efficient ice nucleating particles. Previous research has focused on determination of the ice nucleation propensity of individual mineral dust species. In this study, the focus is placed on how different mineral dust species such as illite, kaolinite and feldspar initiate freezing of water droplets when present in internal and external mixtures. The frozen fraction data for single and multicomponent mineral dust droplet mixtures are recorded under identical cooling rates. Additionally, the time dependence of freezing is explored. Externally and internally mixed mineral dust droplet samples are exposed to constant temperatures (isothermal freezing experiments) and frozen fraction data are recorded at time intervals. Analyses of single and multicomponent mineral dust droplet samples include different stochastic and deterministic models such as the derivation of the heterogeneous ice nucleation rate coefficient (Jhet), the single contact angle (α) description, the α-PDF model, active sites representation, and the deterministic model. Parameter sets derived from freezing data of single component mineral dust samples are evaluated for prediction of cooling rate dependent and isothermal freezing of multicomponent externally or internally mixed mineral dust samples. The atmospheric implications of our findings are discussed.
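The stochastic-versus-deterministic distinction exploited by the isothermal experiments can be sketched with the two limiting frozen-fraction expressions: a Jhet-based (stochastic) description predicts continued freezing during a constant-temperature hold, while a singular (active-site) description predicts none. All parameter values below are assumed for illustration, not fitted to the samples.

```python
# Frozen fraction of a droplet population during an isothermal hold:
# stochastic (J_het) vs singular/deterministic (n_s) descriptions.
import numpy as np

A = 1e-5       # immersed surface area per droplet, cm^2 (assumed)
J_het = 1e3    # heterogeneous ice nucleation rate coefficient, cm^-2 s^-1 (assumed)
n_s = 1e5      # active-site density at the hold temperature, cm^-2 (assumed)

t = np.linspace(0.0, 600.0, 61)   # seconds at constant temperature

# Stochastic (time-dependent) description: frozen fraction keeps growing
f_stochastic = 1.0 - np.exp(-J_het * A * t)

# Singular description: freezing is fixed by n_s(T) alone, so the frozen
# fraction is flat during the isothermal hold
f_deterministic = np.full_like(t, 1.0 - np.exp(-n_s * A))
```

Recording frozen fractions over time at constant temperature, as in the isothermal experiments above, discriminates between these limiting behaviors.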
Godt, J.W.; Baum, R.L.; Savage, W.Z.; Salciarini, D.; Schulz, W.H.; Harp, E.L.
2008-01-01
Application of transient deterministic shallow landslide models over broad regions for hazard and susceptibility assessments requires information on rainfall, topography and the distribution and properties of hillside materials. We survey techniques for generating the spatial and temporal input data for such models and present an example using a transient deterministic model that combines an analytic solution to assess the pore-pressure response to rainfall infiltration with an infinite-slope stability calculation. Pore-pressures and factors of safety are computed on a cell-by-cell basis and can be displayed or manipulated in a grid-based GIS. Input data are high-resolution (1.8??m) topographic information derived from LiDAR data and simple descriptions of initial pore-pressure distribution and boundary conditions for a study area north of Seattle, Washington. Rainfall information is taken from a previously defined empirical rainfall intensity-duration threshold and material strength and hydraulic properties were measured both in the field and laboratory. Results are tested by comparison with a shallow landslide inventory. Comparison of results with those from static infinite-slope stability analyses assuming fixed water-table heights shows that the spatial prediction of shallow landslide susceptibility is improved using the transient analyses; moreover, results can be depicted in terms of the rainfall intensity and duration known to trigger shallow landslides in the study area.
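The cell-by-cell stability calculation can be sketched for the infinite-slope case. In the transient model the pore-pressure head psi would come from the infiltration solution; here it is simply assumed, as are all material properties and the tiny 2x2 grid.

```python
# Grid-based infinite-slope factor of safety (FS), computed cell by cell:
# FS = tan(phi)/tan(slope) + (c - psi*gamma_w*tan(phi)) / (gamma_s*z*sin*cos).
import numpy as np

slope = np.deg2rad(np.array([[20.0, 35.0], [45.0, 30.0]]))  # slope angle per cell
z = 2.0                 # depth of potential failure surface, m (assumed)
gamma_s = 19000.0       # soil unit weight, N/m^3 (assumed)
gamma_w = 9810.0        # water unit weight, N/m^3
phi = np.deg2rad(33.0)  # friction angle (assumed)
c = 4000.0              # cohesion, Pa (assumed)
psi = np.array([[0.0, 0.5], [1.5, 0.0]])  # pressure head at depth z, m (assumed)

FS = (np.tan(phi) / np.tan(slope)
      + (c - psi * gamma_w * np.tan(phi))
      / (gamma_s * z * np.sin(slope) * np.cos(slope)))
unstable = FS < 1.0     # cells predicted to fail for this pore-pressure field
```

In a GIS implementation the same vectorized expression runs over the full LiDAR-derived grid, with psi updated through time by the analytic infiltration solution.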
Yang, Zhen; Zan, Yunlong; Zheng, Xiujuan; Hai, Wangxi; Chen, Kewei; Huang, Qiu; Xu, Yuhong; Peng, Jinliang
2015-01-01
[18F]fluoro-2-deoxy-D-glucose positron emission tomography (FDG-PET) has been widely used in oncologic procedures such as tumor diagnosis and staging. However, false-positive rates have been high, unacceptable and mainly caused by inflammatory lesions. Misinterpretations take place especially when non-subcutaneous inflammations appear at the tumor site, for instance in the lung. The aim of the current study is to evaluate the use of dynamic PET imaging procedure to differentiate in situ and subcutaneous non-small cell lung carcinoma (NSCLC) from inflammation, and estimate the kinetics of inflammations in various locations. Dynamic FDG-PET was performed on 33 female mice inoculated with tumor and/or inflammation subcutaneously or inside the lung. Standardized Uptake Values (SUVs) from static imaging (SUVmax) as well as values of influx rate constant (Ki) of compartmental modeling from dynamic imaging were obtained. Static and kinetic data from different lesions (tumor and inflammations) or different locations (subcutaneous, in situ and spontaneous group) were compared. Values of SUVmax showed significant difference in subcutaneous tumor and inflammation (p<0.01), and in inflammations from different locations (p<0.005). However, SUVmax showed no statistical difference between in situ tumor and inflammation (p = 1.0) and among tumors from different locations (subcutaneous and in situ, p = 0.91). Values of Ki calculated from compartmental modeling showed significant difference between tumor and inflammation both subcutaneously (p<0.005) and orthotopically (p<0.01). Ki showed also location specific values for inflammations (subcutaneous, in situ and spontaneous, p<0.015). However, Ki of tumors from different locations (subcutaneous and in situ) showed no significant difference (p = 0.46). In contrast to static PET based SUVmax, both subcutaneous and in situ inflammations and malignancies can be differentiated via dynamic FDG-PET based Ki. 
Moreover, values of the influx rate constant Ki from compartmental modeling can offer an assessment of inflammation at different locations in the body, which also implies that further validation is necessary before replacing in situ inflammation with its subcutaneous counterpart in animal experiments.
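A common way to obtain an influx rate constant Ki from dynamic data is Patlak graphical analysis under the standard irreversible two-tissue-compartment model. The sketch below simulates that model with synthetic rate constants and plasma input and recovers Ki as the late-time Patlak slope; none of the numbers are from the study.

```python
# Irreversible two-tissue-compartment model and Patlak estimation of Ki.
import numpy as np

K1, k2, k3 = 0.1, 0.4, 0.1           # 1/min, illustrative rate constants
Ki_true = K1 * k3 / (k2 + k3)        # analytic influx rate constant

dt = 0.001
t = np.arange(0, 60, dt)             # minutes
Cp = np.exp(-0.05 * t) + 0.2         # synthetic plasma input (stays positive)

C1 = np.zeros_like(t)                # free/exchangeable compartment
C2 = np.zeros_like(t)                # trapped compartment
for i in range(1, t.size):           # forward-Euler integration of the ODEs
    C1[i] = C1[i-1] + dt * (K1 * Cp[i-1] - (k2 + k3) * C1[i-1])
    C2[i] = C2[i-1] + dt * (k3 * C1[i-1])

Ct = C1 + C2                         # total tissue activity
x = np.cumsum(Cp) * dt / Cp          # "Patlak time": integral of Cp over Cp
y = Ct / Cp
late = t > 30                        # linear regime after equilibration
slope = np.polyfit(x[late], y[late], 1)[0]   # Patlak slope approximates Ki
```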
NASA Astrophysics Data System (ADS)
Zarindast, Atousa; Seyed Hosseini, Seyed Mohamad; Pishvaee, Mir Saman
2017-06-01
A robust supplier selection problem is proposed in a scenario-based approach, where demand and exchange rates are subject to uncertainty. First, a deterministic multi-objective mixed integer linear program is developed; then its robust counterpart is presented using a recent extension of robust optimization theory. We determine decision variables, respectively, by a two-stage stochastic planning model, by a robust stochastic optimization planning model that integrates the worst-case scenario into the modeling approach, and finally by an equivalent deterministic planning model. An experimental study is carried out to compare the performance of the three models. The robust model yields remarkable cost savings, illustrating that to cope with such uncertainties, we should account for them in advance in our planning. In our case study, different suppliers were selected because of these uncertainties; since supplier selection is a strategic decision, it is crucial to consider these uncertainties in the planning approach.
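The deterministic-versus-robust contrast can be reduced to a toy scenario model: the expected-value (deterministic) criterion and the worst-case (robust) criterion can pick different suppliers under demand and exchange-rate uncertainty. Prices, demands, and rates below are invented for illustration and are not the case-study data.

```python
# Scenario-based supplier choice: expected cost vs worst-case cost.

suppliers = {"A": 9.0, "B": 12.0}    # unit price in the supplier's currency
scenarios = [                         # (probability, demand, fx rate A, fx rate B)
    (0.6, 100, 1.0, 1.0),
    (0.3, 120, 1.4, 1.1),
    (0.1, 150, 2.2, 1.1),
]

def cost(s, demand, fx_a, fx_b):
    fx = fx_a if s == "A" else fx_b
    return suppliers[s] * demand * fx

expected = {s: sum(p * cost(s, d, fa, fb) for p, d, fa, fb in scenarios)
            for s in suppliers}
worst = {s: max(cost(s, d, fa, fb) for p, d, fa, fb in scenarios)
         for s in suppliers}

choice_deterministic = min(expected, key=expected.get)  # expected-value pick
choice_robust = min(worst, key=worst.get)               # worst-case pick
```

Here the cheap supplier with a volatile exchange rate wins on expected cost, while the robust criterion prefers the stable one, mirroring the paper's finding that different suppliers are selected once uncertainty is considered.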
Robust Planning for Effects-Based Operations
2006-06-01
Table of contents excerpt: ...Algorithm; 2.6 Robust Optimization Literature; 2.6.1 Protecting Against...; ...Model Formulation; 3.1.5 Deterministic EBO Model Example and Performance; 3.1.6 Greedy Algorithm; ...; 4.1.9 Conclusions on Robust EBO Model Performance; 4.2 Greedy Algorithm versus EBO Models.
Application of Wavelet Filters in an Evaluation of Photochemical Model Performance
Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models and its incorporation into model pe...
Macroanatomy of compartmentalization in fire scars of three western conifers
Kevin T. Smith; Elaine Sutherland; Estelle Arbellay; Markus Stoffel; Donald Falk
2013-01-01
Fire scars are visible evidence of compartmentalization and closure processes that contribute to tree survival after fire injury. Preliminary observations of dissected fire scars from trees injured within the last decade showed centripetal development of wound-initiated discoloration (WID) through 2-3 decades of former sapwood in Larix occidentalis and Pseudotsuga...
Thallium (Tl) is an extremely toxic metal which, due to its similarities to K, is readily taken up by plants. Thallium is efficiently hyperaccumulated in Iberis intermedia as Tl(I). Distribution and compartmentalization of Tl in I. intermedia is highes...
Wu, Fei; Pelster, Lindsey N; Minteer, Shelley D
2015-01-25
Dynamics of metabolon formation in mitochondria was probed by studying diffusional motion of two sequential Krebs cycle enzymes in a microfluidic channel. Enhanced directional co-diffusion of both enzymes against a substrate concentration gradient was observed in the presence of intermediate generation. This reveals a metabolite directed compartmentation of metabolic pathways.
Smoothed Particle Hydrodynamic Simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-10-05
This code is a highly modular framework for developing smoothed particle hydrodynamic (SPH) simulations running on parallel platforms. The compartmentalization of the code allows for rapid development of new SPH applications and modifications of existing algorithms. The compartmentalization also allows changes in one part of the code used by many applications to instantly be made available to all applications.
Repurposing the Saccharomyces cerevisiae peroxisome for compartmentalizing multi-enzyme pathways
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeLoache, William; Russ, Zachary; Samson, Jennifer
The peroxisome of Saccharomyces cerevisiae was targeted for repurposing in order to create a synthetic organelle that provides a generalizable compartment for engineered metabolic pathways. Compartmentalization of enzymes into organelles is a promising strategy for limiting metabolic crosstalk, improving pathway efficiency, and ultimately modifying the chemical environment to be distinct from that of the cytoplasm. We focused on the Saccharomyces cerevisiae peroxisome, as this organelle is not required for viability when grown on conventional media. We identified an enhanced peroxisomal targeting signal type 1 (PTS1) for rapidly importing non-native cargo proteins. Additionally, we performed the first systematic in vivo measurements of nonspecific metabolite permeability across the peroxisomal membrane using a polymer exclusion assay and characterized the size dependency of metabolite trafficking. Finally, we applied these new insights to compartmentalize a two-enzyme pathway in the peroxisome and characterize the expression regimes where compartmentalization leads to improved product titer. This work builds a foundation for using the peroxisome as a synthetic organelle, highlighting both promise and future challenges on the way to realizing this goal.
Mazzotti, Eva; Farina, Benedetto; Imperatori, Claudio; Mansutti, Federica; Prunetti, Elena; Speranza, Anna Maria; Barbaranelli, Claudio
2016-01-01
Background: In this study, we explored the ability of the Dissociative Experiences Scale (DES) to capture detachment and compartmentalization symptoms. Participants and methods: The DES factor structure was evaluated in 768 psychiatric patients (546 women and 222 men) and in 2,403 subjects enrolled in nonpsychiatric settings (1,857 women and 546 men). All participants were administered the Italian version of the DES. Twenty senior psychiatric experts in the treatment of dissociative symptoms independently assessed the DES items and categorized each of them as follows: “C” for compartmentalization, “D” for detachment, and “NC” for noncongruence with either C or D. Results: Confirmatory factor analysis supported the three-factor structure of the DES in both clinical and nonclinical samples and its invariance across the two groups. Moreover, the factor analysis results overlapped with those from the expert classification procedure. Conclusion: Our results showed that the DES can be used as a valid instrument for clinicians to assess the frequency of different types of dissociative experiences, including detachment and compartmentalization. PMID:27350746
FACTORS INFLUENCING TOTAL DIETARY EXPOSURES OF YOUNG CHILDREN
A deterministic model was developed to identify the critical input parameters needed to assess dietary intakes of young children. The model was used as a framework for understanding the important factors in data collection and data analysis. Factors incorporated into the model i...
Huttunen, K-L; Mykrä, H; Oksanen, J; Astorga, A; Paavola, R; Muotka, T
2017-05-03
One of the key challenges to understanding patterns of β diversity is to disentangle deterministic patterns from stochastic ones. Stochastic processes may mask the influence of deterministic factors on community dynamics, hindering identification of the mechanisms causing variation in community composition. We studied temporal β diversity (among-year dissimilarity) of macroinvertebrate communities in near-pristine boreal streams across 14 years. To assess whether the observed β diversity deviates from that expected by chance, and to identify processes (deterministic vs. stochastic) through which different explanatory factors affect community variability, we used a null model approach. We observed that at the majority of sites temporal β diversity was low indicating high community stability. When stochastic variation was unaccounted for, connectivity was the only variable explaining temporal β diversity, with weakly connected sites exhibiting higher community variability through time. After accounting for stochastic effects, connectivity lost importance, suggesting that it was related to temporal β diversity via random colonization processes. Instead, β diversity was best explained by in-stream vegetation, community variability decreasing with increasing bryophyte cover. These results highlight the potential of stochastic factors to dampen the influence of deterministic processes, affecting our ability to understand and predict changes in biological communities through time.
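The null-model logic can be sketched as follows: observed among-year dissimilarity is compared against a distribution generated by randomizing composition while preserving each year's total abundance, so a standardized effect size separates deterministic signal from stochastic expectation. The community matrix, taxon pool, and number of randomizations below are illustrative assumptions, not the stream data.

```python
# Null-model test for temporal beta diversity (mean among-year Bray-Curtis).
import numpy as np

rng = np.random.default_rng(42)

def bray_curtis(a, b):
    return np.abs(a - b).sum() / (a + b).sum()

def mean_pairwise(M):
    n = M.shape[0]
    return np.mean([bray_curtis(M[i], M[j])
                    for i in range(n) for j in range(i + 1, n)])

# Years x taxa abundance matrix (14 years, 6 taxa), deliberately stable
obs = rng.poisson(lam=[40, 30, 20, 10, 5, 2], size=(14, 6))
beta_obs = mean_pairwise(obs)

# Null: redistribute each year's individuals over taxa from the pooled
# relative abundances (year totals preserved)
pool = obs.sum(axis=0) / obs.sum()
beta_null = np.empty(199)
for k in range(199):
    null = np.vstack([rng.multinomial(row.sum(), pool) for row in obs])
    beta_null[k] = mean_pairwise(null)

# Standardized effect size: observed beta relative to the null expectation
ses = (beta_obs - beta_null.mean()) / beta_null.std()
```

A strongly negative or positive SES would indicate community variability lower or higher than the stochastic expectation, the deviation the study uses to identify deterministic effects.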
Deterministic generation of remote entanglement with active quantum feedback
Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...
2015-12-10
We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.
Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.
2004-01-01
We successfully applied deterministic deconvolution to real ground-penetrating radar (GPR) data by using the source wavelet that was generated in and transmitted through air as the operator. The GPR data were collected with 400-MHz antennas on a bench adjacent to a cleanly exposed quarry face. The quarry site is characterized by horizontally bedded carbonate strata with shale partings. In order to provide ground truth for this deconvolution approach, 23 conductive rods were drilled into the quarry face at key locations. The steel rods provided critical information for: (1) correlation between reflections on GPR data and geologic features exposed in the quarry face, (2) GPR resolution limits, (3) accuracy of velocities calculated from common midpoint data, and (4) identifying any multiples. Comparing the results of deconvolved data with non-deconvolved data demonstrates the effectiveness of deterministic deconvolution in low dielectric-loss media for increased accuracy of velocity models (improved at least 10-15% in our study after deterministic deconvolution), increased vertical and horizontal resolution of specific geologic features, and more accurate representation of geologic features, as confirmed from detailed study of the adjacent quarry wall. © 2004 Elsevier B.V. All rights reserved.
Bull, Marta E.; Heath, Laura M.; McKernan-Mullin, Jennifer L.; Kraft, Kelli M.; Acevedo, Luis; Hitti, Jane E.; Cohn, Susan E.; Tapia, Kenneth A.; Holte, Sarah E.; Dragavon, Joan A.; Coombs, Robert W.; Mullins, James I.; Frenkel, Lisa M.
2013-01-01
Background. Whether unique human immunodeficiency virus type 1 (HIV) genotypes occur in the genital tract is important for vaccine development and management of drug-resistant viruses. Multiple cross-sectional studies suggest HIV is compartmentalized within the female genital tract. We hypothesize that bursts of HIV replication and/or proliferation of infected cells captured in cross-sectional analyses drive compartmentalization, but that over time genital-specific viral lineages do not form; rather, viruses mix between the genital tract and blood. Methods. Eight women with ongoing HIV replication were studied over a period of 1.5 to 4.5 years. Multiple viral sequences were derived by single-genome amplification of the HIV C2-V5 region of env from genital secretions and blood plasma. Maximum likelihood phylogenies were evaluated for compartmentalization using 4 statistical tests. Results. In cross-sectional analyses, compartmentalization of genital from blood viruses was detected in three of eight women by all tests; this was associated with tissue-specific clades containing multiple monotypic sequences. In longitudinal analysis, the tissue-specific clades did not persist to form viral lineages. Rather, across women, HIV lineages comprised both genital tract and blood sequences. Conclusions. The observation of genital-specific HIV clades only in cross-sectional analysis and the absence of genital-specific lineages in longitudinal analyses suggest a dynamic interchange of HIV variants between the female genital tract and blood. PMID:23315326
Herzog, Sereina A; Blaizot, Stéphanie; Hens, Niel
2017-12-18
Mathematical models offer the possibility to investigate infectious disease dynamics over time and may help inform the design of studies. A systematic review was performed in order to determine the extent to which mathematical models have been incorporated into the process of planning studies, and hence informed study design, for infectious diseases transmitted between humans and/or animals. We searched Ovid Medline and two trial registry platforms (Cochrane, WHO) using search terms related to infection, mathematical model, and study design from the earliest dates to October 2016. Eligible publications and registered trials included mathematical models (compartmental, individual-based, or Markov) which were described and used to inform the design of infectious disease studies. We extracted information about the investigated infection, population, model characteristics, and study design. We identified 28 unique publications but no registered trials. Focusing on compartmental and individual-based models, we found 12 observational/surveillance studies and 11 clinical trials. Infections studied were equally divided between animal and human infectious diseases for the observational/surveillance studies, while all but one of the clinical trials concerned infections transmitted between humans. The mathematical models were used to inform, amongst other things, the required sample size (n = 16), the statistical power (n = 9), the frequency at which samples should be taken (n = 6), and from whom (n = 6). Despite the fact that mathematical models have been advocated for use at the planning stage of studies or surveillance systems, they are scarcely used. With only one exception, the publications described theoretical studies and hence were not utilised in real studies.
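A toy example of the practice this review surveys: a deterministic SIR compartmental model whose predicted attack rate feeds a textbook sample-size formula. All parameter values below are hypothetical, not drawn from any of the reviewed studies.

```python
def simulate_sir(beta, gamma, n, i0, days, dt=0.1):
    """Forward-Euler integration of a deterministic SIR compartmental model.
    Returns the daily incidence curve (new infections per day)."""
    s, i = n - i0, float(i0)
    incidence = []
    steps_per_day = int(round(1 / dt))
    for _ in range(days):
        new_today = 0.0
        for _ in range(steps_per_day):
            new_inf = beta * s * i / n * dt   # S -> I flow this step
            s -= new_inf
            i += new_inf - gamma * i * dt     # I -> R removal
            new_today += new_inf
        incidence.append(new_today)
    return incidence

def sample_size_for_attack_rate(incidence, n, precision=0.05, z=1.96):
    """Crude sample size so a survey estimates the model-predicted final
    attack rate to within +/- `precision` (normal approximation)."""
    p = sum(incidence) / n
    return int(z ** 2 * p * (1 - p) / precision ** 2) + 1
```

With a basic reproduction number of 3 (beta = 0.3, gamma = 0.1 per day), the predicted attack rate is large, and the implied survey sample size correspondingly modest.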
NASA Astrophysics Data System (ADS)
Scolari, Vittore F.; Cosentino Lagomarsino, Marco
Recent experimental results suggest that the E. coli chromosome feels a self-attracting interaction of osmotic origin, and is condensed in foci by bridging interactions. Motivated by these findings, we explore a generic modeling framework combining solely these two ingredients, in order to characterize their joint effects. Specifically, we study a simple polymer physics computational model with weak ubiquitous short-ranged self attraction and stronger sparse bridging interactions. Combining theoretical arguments and simulations, we study the general phenomenology of polymer collapse induced by these dual contributions, in the case of regularly-spaced bridging. Our results distinguish a regime of classical Flory-like coil-globule collapse dictated by the interplay of excluded volume and attractive energy, and a switch-like collapse where bridging interactions compete with entropy-loss terms from the looped arms of a star-like rosette. Additionally, we show that bridging can induce stable compartmentalized domains. In these configurations, different "cores" of bridging proteins are kept separated by star-like polymer loops in an entropically favorable multi-domain configuration, with a mechanism that parallels micellar polysoaps. Such compartmentalized domains are stable, and do not need any intra-specific interactions driving their segregation. Domains can be stable also in the presence of uniform attraction, as long as the uniform collapse is above its theta point.
Choquette, Amélie; Troncy, Eric; Guillot, Martin; Varin, France; Del Castillo, Jérôme R E
2017-01-01
Adrenaline is known to prolong the duration of local anesthesia but its effects on the pharmacokinetic processes of local anesthetic drugs are not fully understood. Our objective was to develop a compartmental model for quantification of adrenaline's impact on the pharmacokinetics of perineurally-injected lidocaine in the dog. Dogs were subjected to paravertebral brachial plexus block using lidocaine alone or adrenalinated lidocaine. Data were collected through a prospective, randomised, blinded crossover protocol performed over three periods. Blood samples were collected during 180 minutes following block execution. Compartmental pharmacokinetic models were developed and their goodness-of-fit compared. The lowering effects of adrenaline on the absorption of lidocaine were statistically determined with one-sided tests. A one-compartment disposition model with two successive zero-order absorption processes best fitted our experimental data. Adrenaline decreased the peak plasma lidocaine concentration by approximately 60% (P < 0.001) and decreased this local anesthetic's fast and slow zero-order absorption rates by 50% and 90%, respectively (P = 0.046 and P < 0.001), while their respective durations were prolonged by 90% and 1300% (P < 0.020 and P < 0.001). Lidocaine demonstrated a previously unreported atypical absorption profile following its paravertebral injection in dogs. Adrenaline decreased the absorption rate of lidocaine and prolonged the duration of its absorption.
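The absorption structure described here (one-compartment disposition with two successive zero-order inputs) can be written down directly by superposing zero-order infusions into a one-compartment model with first-order elimination. The parameter values in the sketch are illustrative placeholders, not the fitted dog estimates.

```python
import math

def lidocaine_cp(t, dose_fast, dur_fast, dose_slow, dur_slow, v, ke):
    """Plasma concentration for a one-compartment disposition model with two
    successive zero-order absorption phases: a fast phase over [0, dur_fast],
    then a slow phase over [dur_fast, dur_fast + dur_slow]."""
    def zero_order(t, dose, dur, t_start):
        # Contribution of one zero-order input of `dose` over `dur` starting
        # at t_start, with first-order elimination rate constant ke.
        if t <= t_start:
            return 0.0
        k0 = dose / dur                       # zero-order input rate
        te = min(t - t_start, dur)            # time the input has been running
        # during input: C = k0/(V*ke) * (1 - exp(-ke*t'))
        c = k0 / (v * ke) * (1 - math.exp(-ke * te))
        # after the input stops, decay mono-exponentially
        return c * math.exp(-ke * max(0.0, (t - t_start) - dur))
    return (zero_order(t, dose_fast, dur_fast, 0.0)
            + zero_order(t, dose_slow, dur_slow, dur_fast))
```

Slowing the absorption rates while stretching their durations, as adrenaline did, flattens and broadens this concentration profile without changing the total dose absorbed.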
NASA Astrophysics Data System (ADS)
Khaki, M.; Hoteit, I.; Kuhn, M.; Awange, J.; Forootan, E.; van Dijk, A. I. J. M.; Schumacher, M.; Pattiaratchi, C.
2017-09-01
The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as Particle filters (PF) using two different resampling approaches of Multinomial Resampling and Systematic Resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and also accounting for error distributions. Particularly, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is done in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations. Therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS are assimilated into W3RA covering all of Australia. To evaluate the filters' performances and analyze their impact on model simulations, their estimates are validated against independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of water storage simulations of W3RA. The best results are obtained using two versions of deterministic EnKF, i.e. the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), respectively, improving the model groundwater estimation errors by 34% and 31% compared to a model run without assimilation. Applying the PF along with Systematic Resampling successfully decreases the model estimation error by 23%.
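The distinction between stochastic and deterministic EnKF variants hinges on whether observations are perturbed before each ensemble member's update. A scalar sketch (direct observation, observation operator H = 1; illustrative only, far simpler than the W3RA setting):

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_err_sd, perturb=True, seed=0):
    """One Kalman analysis step for a scalar state observed directly.
    perturb=True gives the classic stochastic EnKF, where each member sees a
    randomly perturbed observation; perturb=False omits the perturbation,
    which is the issue that motivates deterministic (square-root) variants."""
    rng = random.Random(seed)
    var = statistics.variance(ensemble)          # ensemble forecast variance
    gain = var / (var + obs_err_sd ** 2)         # Kalman gain for H = 1
    updated = []
    for x in ensemble:
        y = obs + rng.gauss(0.0, obs_err_sd) if perturb else obs
        updated.append(x + gain * (y - x))
    return updated
```

Note that with perturb=False the analysis ensemble spread is systematically too small; square-root filters such as EnSRF apply a deterministic transform that recovers the correct posterior variance without perturbing observations.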
Relation Between the Cell Volume and the Cell Cycle Dynamics in Mammalian cell
NASA Astrophysics Data System (ADS)
Magno, A. C. G.; Oliveira, I. L.; Hauck, J. V. S.
2016-08-01
The main goal of this work is to add and analyze an equation that represents the volume in a dynamical model of the mammalian cell cycle proposed by Gérard and Goldbeter (2011) [1]. Cell division occurs when the cyclin B/Cdk1 complex is totally degraded (Tyson and Novak, 2011) [2] and reaches a minimum value. At this point, the cell is divided into two newborn daughter cells, each containing half of the cytoplasmic content of the mother cell. The equations of our base model are only valid if the cell volume, where the reactions occur, is constant. If the cell volume is not constant, that is, if the rate of change of its volume with respect to time is explicitly taken into account in the mathematical model, then the equations of the original model are no longer valid. Therefore, all equations were modified, using the mass conservation principle, to account for a volume that changes with time. Through this approach, the cell volume affects all model variables. Two different dynamic simulation methods were employed: deterministic and stochastic. In the stochastic simulation, the volume affects all model parameters with molar units, whereas in the deterministic one it is incorporated into the differential equations. In deterministic simulation, the biochemical species may be expressed in concentration units, while in stochastic simulation such species must be converted to numbers of molecules, which are directly proportional to the cell volume. In an effort to understand the influence of the new equation, a stability analysis was performed. This elucidates how the growth factor impacts the stability of the model's limit cycles. In conclusion, a more precise model, in comparison to the base model, was created for the cell cycle, as it now takes the cell volume variation into consideration.
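Switching between the deterministic (concentration) and stochastic (molecule-number) representations requires volume-dependent conversions, which is why a time-varying volume touches every parameter with molar units. A sketch of the standard conversions:

```python
AVOGADRO = 6.02214076e23  # molecules per mole

def conc_to_molecules(conc_molar, volume_litres):
    """Convert a concentration (mol/L) to a molecule count for a given cell
    volume -- the conversion needed when moving from a deterministic
    (concentration-based) to a stochastic (molecule-number) simulation."""
    return conc_molar * volume_litres * AVOGADRO

def scale_bimolecular_rate(k_det, volume_litres):
    """A deterministic second-order rate constant k (in 1/(M*s)) becomes a
    stochastic propensity constant c = k / (N_A * V), in 1/s per reactant
    pair, so bimolecular propensities shrink as the cell volume grows."""
    return k_det / (AVOGADRO * volume_litres)
```

For a 1 fL cell, a 1 µM species corresponds to only about 600 molecules, which is why stochastic effects can matter at cell-cycle scales.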
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
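The bootstrap-based probabilistic sensitivity analysis can be sketched in a few lines: resample patient-level cost/effect pairs and summarise how often one strategy yields positive incremental net monetary benefit. The data and willingness-to-pay value below are invented for illustration and are unrelated to the H. pylori model.

```python
import random

def bootstrap_psa(costs_a, effects_a, costs_b, effects_b, wtp,
                  n_boot=2000, seed=1):
    """Probabilistic sensitivity analysis by bootstrap: resample patient-level
    (cost, effect) data for two strategies and report the fraction of
    replicates in which strategy B is cost-effective at willingness-to-pay
    `wtp`, i.e. has positive incremental net monetary benefit."""
    rng = random.Random(seed)
    n_a, n_b = len(costs_a), len(costs_b)
    favourable = 0
    for _ in range(n_boot):
        ia = [rng.randrange(n_a) for _ in range(n_a)]   # resample indices
        ib = [rng.randrange(n_b) for _ in range(n_b)]
        dc = (sum(costs_b[i] for i in ib) / n_b
              - sum(costs_a[i] for i in ia) / n_a)       # incremental cost
        de = (sum(effects_b[i] for i in ib) / n_b
              - sum(effects_a[i] for i in ia) / n_a)     # incremental effect
        if wtp * de - dc > 0:                            # net monetary benefit
            favourable += 1
    return favourable / n_boot
```

Sweeping `wtp` over a range of thresholds turns this fraction into a cost-effectiveness acceptability curve, which is the usual way such results are reported.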
Romps, David M.
2016-03-01
Convective entrainment is a process that is poorly represented in existing convective parameterizations. By many estimates, convective entrainment is the leading source of error in global climate models. As a potential remedy, an Eulerian implementation of the Stochastic Parcel Model (SPM) is presented here as a convective parameterization that treats entrainment in a physically realistic and computationally efficient way. Drawing on evidence that convecting clouds comprise air parcels subject to Poisson-process entrainment events, the SPM calculates the deterministic limit of an infinite number of such parcels. For computational efficiency, the SPM groups parcels at each height by their purity, which is a measure of their total entrainment up to that height. This reduces the calculation of convective fluxes to a sequence of matrix multiplications. The SPM is implemented in a single-column model and compared with a large-eddy simulation of deep convection.
Spatial scaling patterns and functional redundancies in a changing boreal lake landscape
Angeler, David G.; Allen, Craig R.; Uden, Daniel R.; Johnson, Richard K.
2015-01-01
Global transformations extend beyond local habitats; therefore, larger-scale approaches are needed to assess community-level responses and resilience to unfolding environmental changes. Using long-term data (1996–2011), we evaluated spatial patterns and functional redundancies in the littoral invertebrate communities of 85 Swedish lakes, with the objective of assessing their potential resilience to environmental change at regional scales (that is, spatial resilience). Multivariate spatial modeling was used to differentiate groups of invertebrate species exhibiting spatial patterns in composition and abundance (that is, deterministic species) from those lacking spatial patterns (that is, stochastic species). We then determined the functional feeding attributes of the deterministic and stochastic invertebrate species, to infer resilience. Between one and three distinct spatial patterns in invertebrate composition and abundance were identified in approximately one-third of the species; the remainder were stochastic. We observed substantial differences in metrics between deterministic and stochastic species. Functional richness and diversity decreased over time in the deterministic group, suggesting a loss of resilience in regional invertebrate communities. However, taxon richness and redundancy increased monotonically in the stochastic group, indicating the capacity of regional invertebrate communities to adapt to change. Our results suggest that a refined picture of spatial resilience emerges if patterns of both the deterministic and stochastic species are accounted for. Spatially extensive monitoring may help increase our mechanistic understanding of community-level responses and resilience to regional environmental change, insights that are critical for developing management and conservation agendas in this current period of rapid environmental transformation.
Production scheduling and rescheduling with genetic algorithms.
Bierwirth, C; Mattfeld, D C
1999-01-01
A general model for job shop scheduling is described which applies to static, dynamic and non-deterministic production environments. Next, a Genetic Algorithm is presented which solves the job shop scheduling problem. This algorithm is tested in a dynamic environment under different workload situations. Thereby, a highly efficient decoding procedure is proposed which strongly improves the quality of schedules. Finally, this technique is tested for scheduling and rescheduling in a non-deterministic environment. It is shown by experiment that conventional methods of production control are clearly outperformed at reasonable run-time costs.
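As a much-reduced illustration of the approach (permutation chromosome, order-style crossover, swap mutation, elitist selection), here is a tiny genetic algorithm for single-machine tardiness minimisation rather than the full job shop problem; problem data and GA settings are invented for the example.

```python
import random

def ga_schedule(proc_times, due_dates, pop_size=30, gens=60, seed=3):
    """Tiny GA for single-machine sequencing (minimise total tardiness).
    Chromosome = job permutation; order-style crossover + swap mutation."""
    rng = random.Random(seed)
    n = len(proc_times)

    def tardiness(perm):
        # Decode: run jobs in order, summing lateness past each due date.
        t, total = 0, 0
        for j in perm:
            t += proc_times[j]
            total += max(0, t - due_dates[j])
        return total

    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=tardiness)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)             # order-style crossover
            child = a[:cut] + [j for j in b if j not in a[:cut]]
            if rng.random() < 0.3:                # swap mutation
                i, k = rng.sample(range(n), 2)
                child[i], child[k] = child[k], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=tardiness)
    return best, tardiness(best)
```

The decoding step, mapping a permutation to a schedule and its objective value, is where the paper's "highly efficient decoding procedure" would slot in for the richer job shop setting.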
NASA Astrophysics Data System (ADS)
Lemarchand, A.; Lesne, A.; Mareschal, M.
1995-05-01
The reaction-diffusion equation associated with the Fisher chemical model A+B-->2A admits wave-front solutions by replacing an unstable stationary state with a stable one. The deterministic analysis concludes that their propagation velocity is not prescribed by the dynamics. For a large class of initial conditions the velocity which is spontaneously selected is equal to the minimum allowed velocity vmin, as predicted by the marginal stability criterion. In order to test the relevance of this deterministic description we investigate the macroscopic consequences, on the velocity and the width of the front, of the intrinsic stochasticity due to the underlying microscopic dynamics. We solve numerically the Langevin equations, deduced analytically from the master equation within a system size expansion procedure. We show that the mean profile associated with the stochastic solution propagates faster than the deterministic solution at a velocity up to 25% greater than vmin.
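The marginal-stability prediction v_min = 2*sqrt(D*k) can be checked numerically by integrating the deterministic Fisher equation u_t = D u_xx + k u(1 - u) from a step initial condition and timing the front. A finite-difference sketch, with grid parameters chosen for the example (discretization shaves a few percent off the continuum speed):

```python
def fisher_front_speed(d=1.0, k=1.0, dx=0.5, dt=0.05, nx=600, steps=2000):
    """Euler/finite-difference integration of the Fisher-KPP equation from a
    step profile; returns the measured late-time front speed, to be compared
    with the marginal-stability value v_min = 2*sqrt(d*k)."""
    u = [1.0 if i < 20 else 0.0 for i in range(nx)]

    def front_pos(profile):
        # Position where the profile first drops below 0.5.
        for i, v in enumerate(profile):
            if v < 0.5:
                return i * dx
        return nx * dx

    positions = []
    for step in range(1, steps + 1):
        lap = [0.0] * nx
        for i in range(1, nx - 1):
            lap[i] = (u[i - 1] - 2 * u[i] + u[i + 1]) / dx ** 2
        u = [min(1.0, u[i] + dt * (d * lap[i] + k * u[i] * (1.0 - u[i])))
             for i in range(nx)]
        if step % 200 == 0:
            positions.append(front_pos(u))
    # average the speed over late intervals, after the initial transient
    speeds = [(b - a) / (200 * dt)
              for a, b in zip(positions[4:-1], positions[5:])]
    return sum(speeds) / len(speeds)
```

The stochastic correction discussed in the abstract (a mean front up to 25% faster than v_min) would require adding the Langevin noise terms on top of this deterministic core.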
NASA Astrophysics Data System (ADS)
Han, Jiang; Chen, Ye-Hwa; Zhao, Xiaomin; Dong, Fangfang
2018-04-01
A novel fuzzy dynamical system approach to the control design of flexible joint manipulators with mismatched uncertainty is proposed. Uncertainties of the system are assumed to lie within prescribed fuzzy sets. The desired system performance includes a deterministic phase and a fuzzy phase. First, by creatively implanting a fictitious control, a robust control scheme is constructed to render the system uniformly bounded and uniformly ultimately bounded. Both the manipulator modelling and control scheme are deterministic and not IF-THEN heuristic rules-based. Next, a fuzzy-based performance index is proposed. An optimal design problem for a control design parameter is formulated as a constrained optimisation problem. The global solution to this problem can be obtained from solving two quartic equations. The fuzzy dynamical system approach is systematic and is able to assure the deterministic performance as well as to minimise the fuzzy performance index.
Impact and cost-effectiveness of chlamydia testing in Scotland: a mathematical modelling study.
Looker, Katharine J; Wallace, Lesley A; Turner, Katherine M E
2015-01-15
Chlamydia is the most common sexually transmitted bacterial infection in Scotland, and is associated with potentially serious reproductive outcomes, including pelvic inflammatory disease (PID) and tubal factor infertility (TFI) in women. Chlamydia testing in Scotland is currently targeted towards symptomatic individuals, individuals at high risk of existing undetected infection, and young people. The cost-effectiveness of testing and treatment to prevent PID and TFI in Scotland is uncertain. A compartmental deterministic dynamic model of chlamydia infection in 15-24 year olds in Scotland was developed. The model was used to estimate the impact of a change in testing strategy from baseline (16.8% overall testing coverage; 0.4 partners notified and tested/treated per treated positive index) on PID and TFI cases. Cost-effectiveness calculations informed by best-available estimates of the quality-adjusted life years (QALYs) lost due to PID and TFI were also performed. Increasing overall testing coverage by 50% from baseline to 25.2% is estimated to result in 21% fewer cases in young women each year (PID: 703 fewer; TFI: 88 fewer). A 50% decrease to 8.4% would result in 20% more PID (669 additional) and TFI (84 additional) cases occurring annually. The cost per QALY gained of current testing activities compared to no testing is £40,034, which is above the £20,000-£30,000 cost-effectiveness threshold. However, calculations are hampered by lack of reliable data. Any increase in partner notification from baseline would be cost-effective (incremental cost per QALY gained for a partner notification efficacy of 1 compared to baseline: £5,119), and would increase the cost-effectiveness of current testing strategy compared to no testing, with threshold cost-effectiveness reached at a partner notification efficacy of 1.5. However, there is uncertainty in the extent to which partner notification is currently done, and hence the amount by which it could potentially be increased. 
Current chlamydia testing strategy in Scotland is not cost-effective under the conservative model assumptions applied. However, with better data enabling some of these assumptions to be relaxed, current coverage could be cost-effective. Meanwhile, increasing partner notification efficacy on its own would be a cost-effective way of preventing PID and TFI from current strategy.
Abstract: Two physically based and deterministic models, CASC2-D and KINEROS are evaluated and compared for their performances on modeling sediment movement on a small agricultural watershed over several events. Each model has different conceptualization of a watershed. CASC...
Identifiability Of Systems With Modeling Errors
NASA Technical Reports Server (NTRS)
Hadaegh, Yadolah "Fred"
1988-01-01
Advances in theory of modeling errors reported. Recent paper on errors in mathematical models of deterministic linear or weakly nonlinear systems. Extends theoretical work described in NPO-16661 and NPO-16785. Presents concrete way of accounting for difference in structure between mathematical model and physical process or system that it represents.
The goal of achieving verisimilitude of air quality simulations to observations is problematic. Chemical transport models such as the Community Multi-Scale Air Quality (CMAQ) modeling system produce volume averages of pollutant concentration fields. When grid sizes are such tha...
Sasakawa, Tomoki; Masui, Kenichi; Kazama, Tomiei; Iwasaki, Hiroshi
2016-08-01
Rocuronium concentration prediction using pharmacokinetic (PK) models would be useful for controlling rocuronium effects because neuromuscular monitoring throughout anesthesia can be difficult. This study assessed whether six different compartmental PK models developed from data obtained after bolus administration only could predict the measured plasma concentration (Cp) values of rocuronium delivered by bolus followed by continuous infusion. Rocuronium Cp values from 19 healthy subjects who received a bolus dose followed by continuous infusion in a phase III multicenter trial in Japan were used retrospectively as evaluation datasets. Six different compartmental PK models of rocuronium were used to simulate rocuronium Cp time course values, which were compared with measured Cp values. Prediction error (PE) derivatives of median absolute PE (MDAPE), median PE (MDPE), wobble, divergence absolute PE, and divergence PE were used to assess inaccuracy, bias, intra-individual variability, and time-related trends in APE and PE values. MDAPE and MDPE values were acceptable only for the Magorian and Kleijn models. The divergence PE value for the Kleijn model was lower than -10 %/h, indicating unstable prediction over time. The Szenohradszky model had the lowest divergence PE (-2.7 %/h) and wobble (5.4 %) values with negative bias (MDPE = -25.9 %). These three models were developed using the mixed-effects modeling approach. The Magorian model showed the best PE derivatives among the models assessed. A PK model developed from data obtained after single-bolus dosing can predict Cp values during bolus and continuous infusion. Thus, a mixed-effects modeling approach may be preferable in extrapolating such data.
Understanding Rasch Measurement: Rasch Models Overview.
ERIC Educational Resources Information Center
Wright, Benjamin D.; Mok, Magdalena
2000-01-01
Presents an overview of Rasch measurement models that begins with a conceptualization of continuous experiences often captured as discrete observations. Discusses the mathematical properties of the Rasch family of models that allow the transformation of discrete deterministic counts into continuous probabilistic abstractions. Also discusses six of…
NASA Astrophysics Data System (ADS)
Gao, Yi
The development and utilization of wind energy for satisfying electrical demand has received considerable attention in recent years due to its tremendous environmental, social and economic benefits, together with public support and government incentives. Electric power generation from wind energy behaves quite differently from that of conventional sources. The fundamentally different operating characteristics of wind energy facilities therefore affect power system reliability in a different manner than those of conventional systems. The reliability impact of such a highly variable energy source is an important aspect that must be assessed when the wind power penetration is significant. The focus of the research described in this thesis is on the utilization of state sampling Monte Carlo simulation in wind integrated bulk electric system reliability analysis and the application of these concepts in system planning and decision making. Load forecast uncertainty is an important factor in long range planning and system development. This thesis describes two approximate approaches developed to reduce the number of steps in a load duration curve which includes load forecast uncertainty, and to provide reasonably accurate generating and bulk system reliability index predictions. The developed approaches are illustrated by application to two composite test systems. A method of generating correlated random numbers with uniform distributions and a specified correlation coefficient in the state sampling method is proposed and used to conduct adequacy assessment in generating systems and in bulk electric systems containing correlated wind farms in this thesis. The studies described show that it is possible to use the state sampling Monte Carlo simulation technique to quantitatively assess the reliability implications associated with adding wind power to a composite generation and transmission system including the effects of multiple correlated wind sites. 
This is an important development as it permits correlated wind farms to be incorporated in large practical system studies without requiring excessive increases in computer solution time. The procedures described in this thesis for creating monthly and seasonal wind farm models should prove useful in situations where time period models are required to incorporate scheduled maintenance of generation and transmission facilities. There is growing interest in combining deterministic considerations with probabilistic assessment in order to evaluate the quantitative system risk and conduct bulk power system planning. A relatively new approach that incorporates deterministic and probabilistic considerations in a single risk assessment framework has been designated as the joint deterministic-probabilistic approach. The research work described in this thesis illustrates that the joint deterministic-probabilistic approach can be effectively used to integrate wind power in bulk electric system planning. The studies described in this thesis show that the application of the joint deterministic-probabilistic method provides more stringent results for a system with wind power than the traditional deterministic N-1 method because the joint deterministic-probabilistic technique is driven by the deterministic N-1 criterion with an added probabilistic perspective which recognizes the power output characteristics of a wind turbine generator.
Huisman, J.A.; Breuer, L.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.; Willems, P.
2009-01-01
An ensemble of 10 hydrological models was applied to the same set of land use change scenarios. There was general agreement about the direction of changes in the mean annual discharge and 90% discharge percentile predicted by the ensemble members, although a considerable range in the magnitude of predictions for the scenarios and catchments under consideration was obvious. Differences in the magnitude of the increase were attributed to the different mean annual actual evapotranspiration rates for each land use type. The ensemble of model runs was further analyzed with deterministic and probabilistic ensemble methods. The deterministic ensemble method based on a trimmed mean resulted in a single somewhat more reliable scenario prediction. The probabilistic reliability ensemble averaging (REA) method allowed a quantification of the model structure uncertainty in the scenario predictions. It was concluded that the use of a model ensemble has greatly increased our confidence in the reliability of the model predictions. © 2008 Elsevier Ltd.
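The trimmed-mean deterministic ensemble method mentioned above is simple to state: at each time step, discard the most extreme member predictions and average the rest, which blunts the influence of outlying models. A sketch with invented data:

```python
def trimmed_mean_ensemble(predictions, trim=1):
    """Deterministic ensemble combination by a trimmed mean: at each time
    step, drop the `trim` lowest and `trim` highest member predictions and
    average the remainder. `predictions` is a list of per-model time series
    of equal length."""
    n_steps = len(predictions[0])
    combined = []
    for t in range(n_steps):
        vals = sorted(m[t] for m in predictions)
        kept = vals[trim: len(vals) - trim]
        combined.append(sum(kept) / len(kept))
    return combined
```

With 10 ensemble members, trim=1 or trim=2 keeps most of the ensemble information while discarding the outliers that would dominate a plain mean.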
NASA Astrophysics Data System (ADS)
Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.
2016-12-01
Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to reliably describe streamflow forecast uncertainty, particularly at shorter lead-times. We characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show that ensemble predictions of floods generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the needs for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.
Wu, Fei; Sioshansi, Ramteen
2017-05-04
Here, we develop a model to optimize the location of public fast charging stations for electric vehicles (EVs). A difficulty in planning the placement of charging stations is uncertainty in where EV charging demands appear. For this reason, we use a stochastic flow-capturing location model (SFCLM). A sample-average approximation method and an averaged two-replication procedure are used to solve the problem and estimate the solution quality. We demonstrate the use of the SFCLM using a Central-Ohio based case study. We find that most of the stations built are concentrated around the urban core of the region. As the number of stations built increases, some appear on the outskirts of the region to provide an extended charging network. We find that the sets of optimal charging station locations as a function of the number of stations built are approximately nested. We demonstrate the benefits of the charging-station network in terms of how many EVs are able to complete their daily trips by charging midday: six public charging stations allow at least 60% of EVs that could not otherwise complete their daily tours to do so. We finally compare the SFCLM to a deterministic model, in which EV flows are set equal to their expected values. We show that if a limited number of charging stations are to be built, the SFCLM outperforms the deterministic model. As the number of stations to be built increases, the SFCLM and deterministic model select very similar station locations.
Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)
NASA Astrophysics Data System (ADS)
Kędra, Mariola
2014-02-01
Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministic chaotic? This issue is still controversial. The application of several independent methods, techniques and tools for studying daily river flow data gives consistent, reliable and clear-cut answers to this question. The outcomes indicate that the investigated discharge dynamics is not random but deterministic. Moreover, the results fully confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge data from two selected gauging stations on a mountain river in southern Poland, the Raba River.
Macroanatomy and compartmentalization of recent fire scars in three North American conifers
Kevin T. Smith; Estelle Arbellay; Donald A. Falk; Elaine Kennedy Sutherland
2016-01-01
Fire scars are initiated by cambial necrosis caused by localized lethal heating of the tree stem. Scars develop as part of the linked survival processes of compartmentalization and wound closure. The position of scars within dated tree ring series is the basis for dendrochronological reconstruction of fire history. Macroanatomical features were described for western...
USDA-ARS?s Scientific Manuscript database
Eukaryotic cells compartmentalize neutral lipids into organelles called lipid droplets (LDs), and while much is known about the role of LDs in storing triacylglycerols (TAGs) in seeds, their biogenesis and function in non-seed tissues is poorly understood. Recently, we identified a class of plant-sp...
NASA Astrophysics Data System (ADS)
Turnbull, Heather; Omenzetter, Piotr
2018-03-01
Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the requirement for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36 m blade from a 5 kW turbine was used as an experimental specimen, with operational modal analysis techniques utilised to identify the modal properties of the system. Modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, and a numerical model of the blade was created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly algorithm and virus optimisation algorithm were again utilised, this time in the solution of fuzzy objective functions, enabling the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity were simulated experimentally through the addition of small masses to the structure, intended to cause a structural alteration. A damaged model was created, modelling four variable-magnitude non-structural masses at predefined points, and updated to provide a deterministic damage prediction and information on the parameters' uncertainty via fuzzy updating.
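The fuzzy-propagation idea above, triangular membership functions on updating parameters mapped to response intervals, can be sketched with alpha-cuts and the vertex method. The response function and the fuzzy moduli below are hypothetical stand-ins for illustration, not the blade's calibrated finite element model:

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (left, peak, right) at level alpha."""
    l, m, r = tri
    return (l + alpha * (m - l), r - alpha * (r - m))

def propagate(f, tri_params, alphas):
    """Vertex method: at each alpha level, evaluate the model at every corner
    of the input box and keep the min/max response (exact for monotone f)."""
    out = {}
    for a in alphas:
        cuts = [alpha_cut(t, a) for t in tri_params]
        corners = np.array(np.meshgrid(*cuts)).T.reshape(-1, len(cuts))
        vals = [f(c) for c in corners]
        out[a] = (min(vals), max(vals))
    return out

# Illustrative monotone response: a frequency-like quantity in sqrt(E), plus a
# shear term. E and G are hypothetical triangular fuzzy moduli in GPa.
f = lambda p: 12.0 * np.sqrt(p[0] / 40.0) + 0.1 * p[1]
memberships = propagate(f, [(35, 40, 45), (3, 4, 5)], alphas=[0.0, 0.5, 1.0])
print(memberships)  # alpha = 1 degenerates to a crisp value
```

The nested intervals at decreasing alpha levels trace out the membership function of the response, which is how uncertainty in the updating parameters shows up in the predicted modal properties.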
Simulation of anaerobic digestion processes using stochastic algorithm.
Palanichamy, Jegathambal; Palani, Sundarambal
2014-01-01
Anaerobic Digestion (AD) processes involve numerous complex biological and chemical reactions occurring simultaneously, and appropriate, efficient models need to be developed to simulate anaerobic digestion systems. Although several models have been developed, most suffer from a lack of knowledge of constants, from complexity, and from weak generalization. The basis of the deterministic approach for modelling the physico- and bio-chemical reactions occurring in the AD system is the law of mass action, which gives a simple relationship between reaction rates and species concentrations. The assumptions made in deterministic models do not hold true for reactions involving chemical species at low concentration. The stochastic behaviour of the physicochemical processes can be modelled at the mesoscopic level by applying stochastic algorithms. In this paper a stochastic algorithm (the Gillespie tau-leap method) developed in MATLAB was applied to predict the concentrations of glucose, acids and methane at different time intervals, by which the performance of the digester system can be controlled. The processes given by ADM1 (Anaerobic Digestion Model 1) were taken for verification of the model. The proposed model was verified by comparing the results of Gillespie's algorithm with the deterministic solution for the conversion of glucose into methane through degraders. At higher values of τ (the time step), the computational time required to reach steady state is longer, since fewer reactions are chosen per step. When the simulation time step is reduced, the results are similar to those of an ODE solver. It was concluded that the stochastic algorithm is a suitable approach for the simulation of complex anaerobic digestion processes. The accuracy of the results depends on the optimum selection of the tau value.
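The tau-leap scheme referenced above can be sketched on a toy surrogate of the digestion chain. The two-step first-order chain and its rate constants below are illustrative assumptions, not ADM1's kinetics:

```python
import numpy as np

def tau_leap(x0, rates, stoich, propensity, tau, t_end, rng):
    """Gillespie tau-leaping: in each leap of length tau, fire each reaction
    a Poisson number of times with mean a_j(x) * tau."""
    x, t, traj = np.array(x0, float), 0.0, [np.array(x0, float)]
    while t < t_end:
        a = propensity(x, rates)
        fires = rng.poisson(a * tau)             # events per reaction channel
        x = np.maximum(x + stoich.T @ fires, 0)  # clip to keep counts >= 0
        t += tau
        traj.append(x.copy())
    return np.array(traj)

# Toy surrogate chain: glucose -> acids -> methane (first-order steps;
# the rate constants are illustrative only).
stoich = np.array([[-1, 1, 0],    # glucose -> acids
                   [0, -1, 1]])   # acids   -> methane
prop = lambda x, k: np.array([k[0] * x[0], k[1] * x[1]])
traj = tau_leap([1000, 0, 0], np.array([0.5, 0.3]), stoich, prop,
                tau=0.05, t_end=30.0, rng=np.random.default_rng(1))
print(traj[-1])  # by t = 30 nearly all mass has converted to methane
```

Shrinking `tau` recovers the ODE-like limit the abstract describes, at the cost of more leaps per unit time; enlarging it speeds the run but coarsens the Poisson approximation.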
Kim, Sung-Cheol; Wunsch, Benjamin H; Hu, Huan; Smith, Joshua T; Austin, Robert H; Stolovitzky, Gustavo
2017-06-27
Deterministic lateral displacement (DLD) is a technique for size fractionation of particles in continuous flow that has shown great potential for biological applications. Several theoretical models have been proposed, but experimental evidence has demonstrated that a rich class of intermediate migration behavior exists, which is not predicted. We present a unified theoretical framework to infer the path of particles in the whole array on the basis of trajectories in a unit cell. This framework explains many of the unexpected particle trajectories reported and can be used to design arrays for even nanoscale particle fractionation. We performed experiments that verify these predictions and used our model to develop a condenser array that achieves full particle separation with a single fluidic input.
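As context for the size fractionation described above, here is a minimal sketch of the widely used empirical critical-diameter fit for DLD arrays (Davis's fit, D_c = 1.4 g ε^0.48). Note this empirical rule is background context, not the unified theoretical framework the abstract proposes, which precisely addresses behaviors this fit does not predict:

```python
def dld_critical_diameter(gap_um, row_shift_fraction):
    """Empirical DLD critical diameter (Davis's fit, D_c = 1.4 * g * eps^0.48):
    particles larger than D_c are laterally displaced ("bumped"); smaller
    particles follow the flow ("zigzag")."""
    return 1.4 * gap_um * row_shift_fraction ** 0.48

# A 10 um gap with a 1/10 row-shift fraction gives a cutoff near 4.6 um.
dc = dld_critical_diameter(10.0, 0.1)
print(round(dc, 2))
```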
The effect of boron deficiency on gene expression and boron compartmentalization in sugarbeet
USDA-ARS?s Scientific Manuscript database
NIP5, BOR1, NIP6, and WRKY6 genes were investigated for their role in boron deficiency in sugar beet, each with a proposed role in boron use in model plant species. All genes showed evidence of polymorphism in fragment size and gene expression in the target genomic DNA and cDNA libraries, with no co...
Developing Stochastic Models as Inputs for High-Frequency Ground Motion Simulations
NASA Astrophysics Data System (ADS)
Savran, William Harvey
High-frequency (10 Hz) deterministic ground motion simulations are challenged by our understanding of the small-scale structure of the earth's crust and the rupture process during an earthquake. We will likely never obtain deterministic models that can accurately describe these processes down to the meter-scale lengths required for broadband wave propagation. Instead, we can attempt to explain the behavior, in a statistical sense, by including stochastic models defined by correlations observed in the natural earth and through physics-based simulations of the earthquake rupture process. Toward this goal, we develop stochastic models to address both of the primary considerations for deterministic ground motion simulations: namely, the description of the material properties in the crust, and broadband earthquake source descriptions. Using borehole sonic log data recorded in the Los Angeles basin, we estimate the spatial correlation structure of the small-scale fluctuations in P-wave velocities by determining the best-fitting parameters of a von Karman correlation function. We find that Hurst exponents, ν, between 0.0 and 0.2, vertical correlation lengths, a_z, of 15-150 m, and a standard deviation, σ, of about 5% characterize the variability in the borehole data. Using these parameters, we generated a stochastic model of velocity and density perturbations, combined it with leading seismic velocity models, and performed a validation exercise for the 2008 Chino Hills, CA, earthquake using heterogeneous media. We find that models of velocity and density perturbations can have significant effects on the wavefield at frequencies as low as 0.3 Hz, with ensemble median values of various ground motion metrics varying up to +/-50% at certain stations compared to those computed solely from the CVM. Finally, we develop a kinematic rupture generator based on dynamic rupture simulations on geometrically complex faults.
We analyze 100 dynamic rupture simulations on strike-slip faults ranging from Mw 6.4 to 7.2. We find that our dynamic simulations follow empirical scaling relationships for inter-plate strike-slip events and provide source spectra comparable with an ω⁻² model. Our rupture generator reproduces GMPE medians and intra-event standard deviations of spectral accelerations for an ensemble of 10 Hz fully deterministic ground motion simulations, as compared to NGA-West2 GMPE relationships up to 0.2 seconds.
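The von Karman media statistics reported above can be illustrated by spectral synthesis of a 1-D perturbation profile. This numpy-only sketch is an assumption-laden simplification: the 1-D power-spectrum exponent ν + 1/2 and the parameter choices are for illustration, not the study's actual 3-D procedure:

```python
import numpy as np

def von_karman_field_1d(n, dz, az, nu, sigma, rng):
    """Synthesize a 1-D random velocity-perturbation profile whose power
    spectrum follows a von Karman form, P(k) ~ (1 + (k*az)^2)^-(nu + 1/2),
    by spectrally filtering white Gaussian noise and rescaling to sigma."""
    k = np.fft.rfftfreq(n, d=dz) * 2 * np.pi       # angular wavenumbers
    psd = (1.0 + (k * az) ** 2) ** -(nu + 0.5)     # von Karman PSD shape
    spec = np.fft.rfft(rng.normal(size=n)) * np.sqrt(psd)
    field = np.fft.irfft(spec, n=n)
    return field * (sigma / field.std())           # enforce target std dev

# Parameters in the range reported above: nu ~ 0.1, a_z ~ 50 m, sigma = 5%.
profile = von_karman_field_1d(n=4096, dz=1.0, az=50.0, nu=0.1,
                              sigma=0.05, rng=np.random.default_rng(0))
print(profile.std())
```

Superimposing a profile like this on a smooth velocity model is the basic idea behind the heterogeneous-media validation exercise described in the abstract.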
Stochastic and deterministic causes of streamer branching in liquid dielectrics
NASA Astrophysics Data System (ADS)
Jadidian, Jouya; Zahn, Markus; Lavesson, Nils; Widlund, Ola; Borg, Karl
2013-08-01
Streamer branching in liquid dielectrics is driven by stochastic and deterministic factors. The presence of stochastic causes of streamer branching such as inhomogeneities inherited from noisy initial states, impurities, or charge carrier density fluctuations is inevitable in any dielectric. A fully three-dimensional streamer model presented in this paper indicates that deterministic origins of branching are intrinsic attributes of streamers, which in some cases make the branching inevitable depending on shape and velocity of the volume charge at the streamer frontier. Specifically, any given inhomogeneous perturbation can result in streamer branching if the volume charge layer at the original streamer head is relatively thin and slow enough. Furthermore, discrete nature of electrons at the leading edge of an ionization front always guarantees the existence of a non-zero inhomogeneous perturbation ahead of the streamer head propagating even in perfectly homogeneous dielectric. Based on the modeling results for streamers propagating in a liquid dielectric, a gauge on the streamer head geometry is introduced that determines whether the branching occurs under particular inhomogeneous circumstances. Estimated number, diameter, and velocity of the born branches agree qualitatively with experimental images of the streamer branching.
Burbrink, Frank T; Chen, Xin; Myers, Edward A; Brandley, Matthew C; Pyron, R Alexander
2012-12-07
Adaptive radiation (AR) theory predicts that groups sharing the same source of ecological opportunity (EO) will experience deterministic species diversification and morphological evolution. Thus, deterministic ecological and morphological evolution should be correlated with deterministic patterns in the tempo and mode of speciation for groups in similar habitats and time periods. We test this hypothesis using well-sampled phylogenies of four squamate groups that colonized the New World (NW) in the Late Oligocene. We use both standard and coalescent models to assess species diversification, as well as likelihood models to examine morphological evolution. All squamate groups show similar early pulses of speciation, as well as diversity-dependent ecological limits on clade size at a continental scale. In contrast, processes of morphological evolution are not easily predictable and do not show similar pulses of early and rapid change. Patterns of morphological and species diversification thus appear uncoupled across these groups. This indicates that the processes that drive diversification and disparification are not mechanistically linked, even among similar groups of taxa experiencing the same sources of EO. It also suggests that processes of phenotypic diversification cannot be predicted solely from the existence of an AR or knowledge of the process of diversification.
Automated Flight Routing Using Stochastic Dynamic Programming
NASA Technical Reports Server (NTRS)
Ng, Hok K.; Morando, Alex; Grabbe, Shon
2010-01-01
Airspace capacity reduction due to convective weather impedes air traffic flows and causes traffic congestion. This study presents an algorithm, based on stochastic dynamic programming, that reroutes flights in the presence of winds, en-route convective weather, and congested airspace. A stochastic disturbance model incorporates capacity uncertainty into the reroute design process. A trajectory-based airspace demand model is employed for calculating current and future airspace demand. The optimal routes minimize the total expected travel time, weather incursion, and induced congestion costs. They are compared to weather-avoidance routes calculated using deterministic dynamic programming. The stochastic reroutes have a smaller deviation probability than their deterministic counterparts when both reroutes have similar total flight distance. The stochastic rerouting algorithm takes into account all convective weather fields at all severity levels, while the deterministic algorithm only accounts for convective weather systems exceeding a specified level of severity. When the stochastic reroutes are compared to the actual flight routes, they have similar total flight times, and both spend about 1% of travel time crossing congested en-route sectors on average. The actual flight routes induce slightly less traffic congestion than the stochastic reroutes but intercept more severe convective weather.
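The core idea of routing against expected weather cost can be sketched as backward dynamic programming on a stage graph. Everything below (the grid abstraction, the weather-probability field, the penalty value) is a toy assumption, not the study's trajectory-based demand or disturbance models:

```python
import numpy as np

def stochastic_route(stage_cost, p_weather, weather_penalty):
    """Backward DP over a stage graph (columns = stages, rows = lateral
    positions).  Each cell's cost is its nominal travel time plus the
    *expected* weather-incursion penalty p * penalty, so the optimal
    policy minimizes expected total cost rather than worst-case cost."""
    n_rows, n_stages = stage_cost.shape
    exp_cost = stage_cost + p_weather * weather_penalty
    value = np.full((n_rows, n_stages), np.inf)
    value[:, -1] = exp_cost[:, -1]
    for s in range(n_stages - 2, -1, -1):
        for r in range(n_rows):
            moves = [r2 for r2 in (r - 1, r, r + 1) if 0 <= r2 < n_rows]
            value[r, s] = exp_cost[r, s] + min(value[r2, s + 1] for r2 in moves)
    # Greedy rollout of the optimal policy, starting from the middle row.
    route, r = [n_rows // 2], n_rows // 2
    for s in range(1, n_stages):
        r = min((r2 for r2 in (r - 1, r, r + 1) if 0 <= r2 < n_rows),
                key=lambda r2: value[r2, s])
        route.append(r)
    return route, value

rng = np.random.default_rng(3)
cost = np.ones((5, 8))                     # uniform nominal travel time
p_wx = rng.uniform(0, 0.6, size=(5, 8))   # illustrative capacity-loss odds
route, value = stochastic_route(cost, p_wx, weather_penalty=10.0)
print(route)
```

A deterministic weather-avoidance variant would instead threshold `p_wx` into blocked/open cells, which is the contrast the abstract draws.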
Zulkifley, Mohd Asyraf; Rawlinson, David; Moran, Bill
2012-01-01
In video analytics, robust observation detection is very important, as the content of videos varies greatly, especially for tracking implementations. In contrast to the image processing field, problems of blurring, moderate deformation, low-illumination surroundings, illumination change and homogeneous texture are commonly encountered in video analytics. Patch-Based Observation Detection (PBOD) is developed to improve detection robustness in complex scenes by fusing both feature- and template-based recognition methods. While feature-based detectors are more distinctive, matching between frames is best achieved by a collection of points, as in template-based detectors. Two variants of PBOD—the deterministic and probabilistic approaches—have been tested to find the best mode of detection. Both algorithms start by building comparison vectors at each detected point of interest. The vectors are matched to build candidate patches based on their respective coordinates. For the deterministic method, patch matching is done in a 2-level test where threshold-based position and size smoothing are applied to the patch with the highest correlation value. For the second approach, patch matching is done probabilistically by modelling the histograms of the patches with Poisson distributions for both RGB and HSV colour models. Maximum likelihood is then applied for position smoothing, while a Bayesian approach is applied for size smoothing. The results showed that probabilistic PBOD outperforms the deterministic approach, with an average distance error of 10.03% compared with 21.03%. This algorithm is best implemented as a complement to other, simpler detection methods due to its heavy processing requirements. PMID:23202226
The Office of Pesticide Programs models daily aquatic pesticide exposure values for 30 years in its risk assessments. However, only a fraction of that information is typically used in these assessments. The population model employed herein is a deterministic, density-dependent pe...
Gulf of Mexico dissolved oxygen model (GoMDOM) research and quality assurance project plan
An integrated high resolution mathematical modeling framework is being developed that will link hydrodynamic, atmospheric, and water quality models for the northern Gulf of Mexico. This Research and Quality Assurance Project Plan primarily focuses on the deterministic Gulf of Me...
A combinatorial model of malware diffusion via bluetooth connections.
Merler, Stefano; Jurman, Giuseppe
2013-01-01
We outline here the mathematical expression of a diffusion model for cellphone malware transmitted through Bluetooth channels. In particular, we provide the deterministic formula underlying the proposed infection model, in its equivalent recursive (simple but computationally heavy) and closed-form (more complex but efficiently computable) expressions.
Chaotic Lagrangian models for turbulent relative dispersion.
Lacorata, Guglielmo; Vulpiani, Angelo
2017-04-01
A deterministic multiscale dynamical system is introduced and discussed as a prototype model for relative dispersion in stationary, homogeneous, and isotropic turbulence. Unlike stochastic diffusion models, here trajectory transport and mixing properties are entirely controlled by Lagrangian chaos. The anomalous "sweeping effect," a known drawback common to kinematic simulations, is removed through the use of quasi-Lagrangian coordinates. Lagrangian dispersion statistics of the model are accurately analyzed by computing the finite-scale Lyapunov exponent (FSLE), which is the optimal measure of the scaling properties of dispersion. FSLE scaling exponents provide a severe test to decide whether model simulations are in agreement with theoretical expectations and/or observation. The results of our numerical experiments cover a wide range of "Reynolds numbers" and show that chaotic deterministic flows can be very efficient, and numerically low-cost, models of turbulent trajectories in stationary, homogeneous, and isotropic conditions. The mathematics of the model is relatively simple, and, in a geophysical context, potential applications may regard small-scale parametrization issues in general circulation models, mixed layer, and/or boundary layer turbulence models as well as Lagrangian predictability studies.
Kemsawasd, Varongsiri; Branco, Patrícia; Almeida, Maria Gabriela; Caldeira, Jorge; Albergaria, Helena; Arneborg, Nils
2015-07-01
The roles of cell-to-cell contact and antimicrobial peptides in the early death of Lachancea thermotolerans CBS2803 during anaerobic, mixed-culture fermentations with Saccharomyces cerevisiae S101 were investigated using a commercially available, double-compartment fermentation system separated by cellulose membranes with different pore sizes, i.e. 1000 kDa for mixed- and single-culture fermentations, and 1000 and 3.5-5 kDa for compartmentalized-culture fermentations. SDS-PAGE and gel filtration chromatography were used to determine an antimicrobial peptidic fraction in the fermentations. Our results showed comparable amounts of the antimicrobial peptidic fraction in the inner compartments of the mixed-culture and 1000 kDa compartmentalized-culture fermentations containing L. thermotolerans after 4 days of fermentation, but a lower death rate of L. thermotolerans in the 1000 kDa compartmentalized-culture fermentation than in the mixed-culture fermentation. Furthermore, L. thermotolerans died off even more slowly in the 3.5-5 kDa than in the 1000 kDa compartmentalized-culture fermentation, which coincided with the presence of less of the antimicrobial peptidic fraction in the inner compartment of that fermentation than of the 1000 kDa compartmentalized-culture fermentation. Taken together, these results indicate that the death of L. thermotolerans in mixed cultures with S. cerevisiae is caused by a combination of cell-to-cell contact and antimicrobial peptides. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Metals and lipid oxidation. Contemporary issues.
Schaich, K M
1992-03-01
Lipid oxidation is now recognized to be a critically important reaction in physiological and toxicological processes as well as in food products. This provides compelling reasons to understand what causes lipid oxidation in order to be able to prevent or control the reactions. Redox-active metals are major factors catalyzing lipid oxidation in biological systems. Classical mechanisms of direct electron transfer to double bonds by higher valence metals and of reduction of hydroperoxides by lower valence metals do not always account for patterns of metal catalysis of lipid oxidation in multiphasic or compartmentalized biological systems. To explain why oxidation kinetics, mechanisms, and products in molecular environments which are both chemically and physically complex often do not follow classical patterns predicted by model system studies, increased consideration must be given to five contemporary issues regarding metal catalysis of lipid oxidation: hypervalent non-heme iron or iron-oxygen complexes, heme catalysis mechanism(s), compartmentalization of reactions and lipid phase reactions of metals, effects of metals on product mixes, and factors affecting the mode of metal catalytic action.
Manteca, Angel; Sanchez, Jesus; Jung, Hye R.; Schwämmle, Veit; Jensen, Ole N.
2010-01-01
Streptomyces species produce many clinically important secondary metabolites, including antibiotics and antitumorals. They have a complex developmental cycle, including programmed cell death phenomena, that makes this bacterium a multicellular prokaryotic model. There are two differentiated mycelial stages: an early compartmentalized vegetative mycelium (first mycelium) and a multinucleated reproductive mycelium (second mycelium) arising after programmed cell death processes. In the present study, we made a detailed proteomics analysis of the distinct developmental stages of solid confluent Streptomyces coelicolor cultures using iTRAQ (isobaric tags for relative and absolute quantitation) labeling and LC-MS/MS. A new experimental approach was developed to obtain homogeneous samples at each developmental stage (temporal protein analysis) and also to obtain membrane and cytosolic protein fractions (spatial protein analysis). A total of 345 proteins were quantified in two biological replicates. Comparative bioinformatics analyses revealed the switch from primary to secondary metabolism between the initial compartmentalized mycelium and the multinucleated hyphae. PMID:20224110
An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory
Yen, Chung-Cheng; Guymon, Gary L.
1990-01-01
An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is only generally valid for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method, provided the number of uncertain variables is less than eight.
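The efficiency argument above is easy to see in miniature: a two-point (Rosenblueth-style) estimate uses 2^n model runs at mean ± one standard deviation, versus thousands for Monte Carlo. The response function below is a hypothetical stand-in, not the paper's groundwater model:

```python
import itertools
import numpy as np

def two_point_estimate(g, means, sds):
    """Rosenblueth-style two-point estimate: evaluate the model at every
    combination of mean +/- one standard deviation (2^n points, equal
    weights) and form the first two moments of the output."""
    pts = np.array([[m + s * e for m, s, e in zip(means, sds, signs)]
                    for signs in itertools.product((-1, 1), repeat=len(means))])
    y = np.array([g(p) for p in pts])
    return y.mean(), y.std()

# Toy "water-table" response: linear in a storage-like coefficient S and
# mildly nonlinear (log) in a conductivity-like parameter K. Illustrative only.
g = lambda p: 100.0 - 20.0 * p[0] + 5.0 * np.log(p[1])
mean_tp, sd_tp = two_point_estimate(g, means=[0.2, 10.0], sds=[0.02, 1.0])

# Monte Carlo reference: 100,000 model runs instead of 2^2 = 4.
rng = np.random.default_rng(7)
samples = rng.normal([0.2, 10.0], [0.02, 1.0], size=(100_000, 2))
y_mc = np.array([g(p) for p in samples])
print(mean_tp, y_mc.mean())  # close, since g is only mildly nonlinear
```

For strongly nonlinear responses or large coefficients of variation the two estimates diverge, which matches the validity limits the abstract reports.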
Low-frequency fluctuations in vertical cavity lasers: Experiments versus Lang-Kobayashi dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torcini, Alessandro; Barland, Stephane
2006-12-15
The limits of applicability of the Lang-Kobayashi (LK) model for a semiconductor laser with optical feedback are analyzed. The model equations, equipped with realistic values of the parameters, are investigated below the solitary laser threshold, where low-frequency fluctuations (LFFs) are usually observed. The numerical findings are compared with experimental data obtained for the selected polarization mode from a vertical cavity surface emitting laser (VCSEL) subject to polarization-selective external feedback. The comparison reveals the bounds within which the dynamics of the LK model can be considered realistic. In particular, it clearly demonstrates that the deterministic LK model, for realistic values of the linewidth enhancement factor α, reproduces the LFFs only as a transient dynamics towards one of the stationary modes with maximal gain. A reasonable reproduction of real data from VCSELs can be obtained only by considering the noisy LK model or, alternatively, the deterministic LK model for extremely high α values.
NASA Astrophysics Data System (ADS)
Rodríguez, Clara Rojas; Fernández Calvo, Gabriel; Ramis-Conde, Ignacio; Belmonte-Beitia, Juan
2017-08-01
Tumor-normal cell interplay defines the course of a neoplastic malignancy. The outcome of this dual relation is the ultimate prevailing of one of the cell types and the death or retreat of the other. In this paper we study the mathematical principles that underlie one important scenario: that of slow-progressing cancers. For this, we develop, within a stochastic framework, a mathematical model to account for tumor-normal cell interaction in such a clinically relevant situation and derive a number of deterministic approximations from the stochastic model. We consider in detail the existence and uniqueness of the solutions of the deterministic model and carry out a stability analysis. We then apply the model to the specific case of low-grade gliomas, where we introduce an optimal control problem for different objective functionals under the administration of chemotherapy. We derive the conditions under which singular and bang-bang controls exist and calculate the optimal controls and states.
FACTORS INFLUENCING TOTAL DIETARY EXPOSURE OF YOUNG CHILDREN
A deterministic model was developed to identify critical input parameters to assess dietary intake of young children. The model was used as a framework for understanding important factors in data collection and analysis. Factors incorporated included transfer efficiencies of pest...
Field Evaluation of an Avian Risk Assessment Model
We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...
Optimization Testbed Cometboards Extended into Stochastic Domain
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.
2010-01-01
COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software. It was originally developed for deterministic calculations and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk or reliability. The optimum solution, including the weight of a structure, is also obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to a 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity would be required for a near-zero rate of failure, corresponding to a reliability of unity. Weight can be reduced to a small value for the most failure-prone design, with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code as the deterministic analysis tool, the fast probabilistic integrator (the FPI module of the NESSUS software) as the probabilistic calculator, and CometBoards as the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life airframe component made of metallic and composite materials.
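The weight-versus-reliability behavior described above (weight diverging as reliability approaches unity) can be illustrated with a one-member sizing toy problem. The bar, load, and normally distributed yield strength below are hypothetical and have nothing to do with the CometBoards/NESSUS codes themselves; the sketch only shows why the curve is unbounded at the reliability-of-unity end.

```python
from statistics import NormalDist

# Hypothetical single-member design: a bar of fixed length carries load F;
# its yield strength is Normal(mu_S, sigma_S). We size the cross-section
# so that P(strength > stress) equals a target reliability R.
F = 10_000.0                    # applied load, N (illustrative)
mu_S, sigma_S = 250.0, 25.0     # yield strength mean and sd, MPa (illustrative)
rho, L = 2.7e-6, 500.0          # density (kg/mm^3) and length (mm)

def weight_for_reliability(R):
    """Weight of the lightest design meeting target reliability R."""
    z = NormalDist().inv_cdf(R)          # standard-normal quantile of R
    allowable = mu_S - sigma_S * z       # stress the member may be allowed to see
    if allowable <= 0:
        return float("inf")              # no finite cross-section achieves R
    area = F / allowable                 # required area, mm^2
    return rho * area * L                # member weight, kg

w50 = weight_for_reliability(0.50)       # 50% probability of success
w99 = weight_for_reliability(0.99)
```

Weight grows monotonically with the reliability target, and the required area (hence weight) blows up as the allowable stress is pushed toward zero, matching the near-zero-failure-rate limit described in the abstract.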
NASA Astrophysics Data System (ADS)
Roirand, Q.; Missoum-Benziane, D.; Thionnet, A.; Laiarinandrasana, L.
2017-09-01
Textile composites have a complex 3D architecture. To assess the durability of such engineering structures, the failure mechanisms must be identified; the degradation has been examined here by means of tomography. The present work addresses a numerical damage model dedicated to simulating crack initiation and propagation at the scale of the warp yarns. For the 3D woven composites under study, loadings in tension and in combined tension and bending were considered. Based on an erosion procedure for broken elements, the failure mechanisms were modelled on 3D periodic cells by finite element calculations. The breakage of an element was determined using a failure criterion at the mesoscopic scale based on the yarn stress at failure. The results were found to be in good agreement with the experimental data for the two kinds of macroscopic loading. The deterministic approach assumed a homogeneously distributed stress at failure over all the integration points in the meshes of the woven composites. A stochastic approach was then applied to a simple representative elementary periodic cell: the distribution of the Weibull stress at failure was assigned to the integration points using a Monte Carlo simulation. This stochastic approach allowed more realistic failure simulations, avoiding the idealised symmetry produced by the deterministic modelling. In particular, the stochastic simulations showed variation in the stress and strain at failure and in the failure modes of the yarns.
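The Monte Carlo assignment of a Weibull stress at failure to integration points, as described above, can be sketched with inverse-transform sampling. The Weibull modulus, scale, and applied stress below are illustrative assumptions, not the paper's calibrated values.

```python
import math
import random

# Hedged sketch: each integration point of a yarn mesh draws its own
# stress at failure from a Weibull distribution instead of sharing a
# single deterministic value. Parameters are illustrative only.
random.seed(42)
shape_m, scale_s0 = 8.0, 1200.0    # Weibull modulus m and scale s0 (MPa)

def draw_failure_stresses(n_points):
    """Inverse-transform sampling: sigma = s0 * (-ln U)^(1/m), U in (0, 1]."""
    return [scale_s0 * (-math.log(1.0 - random.random())) ** (1.0 / shape_m)
            for _ in range(n_points)]

def broken_elements(stresses, applied_stress):
    """Indices of integration points whose sampled strength is exceeded
    (candidates for the element-erosion procedure)."""
    return [i for i, s in enumerate(stresses) if applied_stress >= s]

strengths = draw_failure_stresses(1000)
failed = broken_elements(strengths, applied_stress=1000.0)
```

Because each point receives an independent draw, the set of broken elements is no longer symmetric under the geometric symmetries of the periodic cell, which is precisely what breaks the idealised symmetry of the deterministic simulations.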
CTCF and Cohesin in Genome Folding and Transcriptional Gene Regulation.
Merkenschlager, Matthias; Nora, Elphège P
2016-08-31
Genome function, replication, integrity, and propagation rely on the dynamic structural organization of chromosomes during the cell cycle. Genome folding in interphase provides regulatory segmentation for appropriate transcriptional control, facilitates ordered genome replication, and contributes to genome integrity by limiting illegitimate recombination. Here, we review recent high-resolution chromosome conformation capture and functional studies that have informed models of the spatial and regulatory compartmentalization of mammalian genomes, and discuss mechanistic models for how CTCF and cohesin control the functional architecture of mammalian chromosomes.
Evaluation of a Compartmental Model for Prediction of Nitrate Leaching Losses,
1981-12-01
...model results limit their utility. The calculated total dissolved solids (TDS) of the soil solution (7146 mg L-1) and the measured TDS of tile...measured values of plant uptake, residual inorganic N, and average annual...In eq 1, the term on the left-hand side represents the soil solution N concentration...the soil solution concentration below which the uptake efficiency decreases sharply.
Compartmentalization: a conceptual framework for understanding how trees grow and defend themselves
Alex L. Shigo
1984-01-01
The purpose of this chapter is to describe a conceptual framework for understanding how trees grow and how they and other perennial plants defend themselves. The concept of compartmentalization has developed over many years, a synthesis of ideas from a number of investigators. It is derived from detailed studies of the gross morphology and cellular anatomy of the wood...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varney, Peter J.
2002-04-23
This research established the Dakota-outcrop sequence stratigraphy in part of the eastern San Juan Basin, New Mexico, and relates reservoir quality lithologies in depositional sequences to structure and reservoir compartmentalization in the South Lindrith Field area. The result was a predictive tool that will help guide further exploration and development.
Youping Zhou; Benli Zhang; Hilary Stuart-Williams; Kliti Grice; Charles H. Hocart; Arthur Gessler; Zachary E. Kayler; Graham D. Farquhar
2018-01-01
Compartmentation of C4 photosynthetic biochemistry into bundle sheath (BS) and mesophyll (M) cells, and photorespiration in C3 plants is predicted to have hydrogen isotopic consequences for metabolites at both molecular and site-specific levels. Molecular-level evidence was recently reported (Zhou et al., 2016), but...
Harouaka, Djamila; Engle, Ronald E; Wollenberg, Kurt; Diaz, Giacomo; Tice, Ashley B; Zamboni, Fausto; Govindarajan, Sugantha; Alter, Harvey; Kleiner, David E; Farci, Patrizia
2016-02-02
Analysis of hepatitis C virus (HCV) replication and quasispecies distribution within the tumor of patients with HCV-associated hepatocellular carcinoma (HCC) can provide insight into the role of HCV in hepatocarcinogenesis and, conversely, the effect of HCC on the HCV lifecycle. In a comprehensive study of serum and multiple liver specimens from patients with HCC who underwent liver transplantation, we found a sharp and significant decrease in HCV RNA in the tumor compared with surrounding nontumorous tissues, but found no differences in multiple areas of control non-HCC cirrhotic livers. Diminished HCV replication was not associated with changes in miR-122 expression. HCV genetic diversity was significantly higher in livers containing HCC compared with control non-HCC cirrhotic livers. Tracking of individual variants demonstrated changes in the viral population between tumorous and nontumorous areas, the extent of which correlated with the decline in HCV RNA, suggesting HCV compartmentalization within the tumor. In contrast, compartmentalization was not observed between nontumorous areas and serum, or in controls between different areas of the cirrhotic liver or between liver and serum. Our findings indicate that HCV replication within the tumor is restricted and compartmentalized, suggesting segregation of specific viral variants in malignant hepatocytes.
The Simplest Complete Model of Choice Response Time: Linear Ballistic Accumulation
ERIC Educational Resources Information Center
Brown, Scott D.; Heathcote, Andrew
2008-01-01
We propose a linear ballistic accumulator (LBA) model of decision making and reaction time. The LBA is simpler than other models of choice response time, with independent accumulators that race towards a common response threshold. Activity in the accumulators increases in a linear and deterministic manner. The simplicity of the model allows…
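A single LBA trial, as described above, is simple to simulate: each accumulator gets a uniform start point and a normally distributed drift rate drawn once per trial, then rises linearly and deterministically to a common threshold. The sketch below uses illustrative parameter values, not fitted ones.

```python
import random

# One LBA trial: start point k ~ Uniform(0, A), drift d ~ Normal(v, s),
# linear deterministic rise to threshold b; response = fastest accumulator,
# RT = rise time plus non-decision time t0. Values are illustrative.
random.seed(1)
A, b, s, t0 = 0.5, 1.0, 0.3, 0.2
mean_drifts = [1.0, 0.7]   # accumulator 0 corresponds to the "correct" response

def lba_trial():
    times = []
    for v in mean_drifts:
        k = random.uniform(0.0, A)            # start point for this accumulator
        d = random.gauss(v, s)                # trial-to-trial drift variability
        # an accumulator with non-positive drift never reaches threshold
        times.append((b - k) / d if d > 0 else float("inf"))
    t = min(times)
    return times.index(t), t + t0             # (choice, response time)

trials = [lba_trial() for _ in range(5000)]
p_correct = sum(1 for choice, _ in trials if choice == 0) / len(trials)
```

All between-trial variability comes from the start points and drift rates; within a trial the race is fully deterministic, which is what makes the model's choice probabilities and RT distributions analytically tractable.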
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
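The probabilistic uncertainty analysis described above amounts to running a deterministic model repeatedly with inputs drawn from assumed distributions and summarizing the spread of the output. The toy carbon-stock model and parameter distributions below are hypothetical, chosen only to show the mechanics.

```python
import random
import statistics

# Hedged sketch of Monte Carlo uncertainty propagation through a
# deterministic model. The model and input distributions are hypothetical.
random.seed(7)

def carbon_stock(biomass, carbon_fraction, area):
    """Deterministic toy model: total carbon = biomass density x fraction x area."""
    return biomass * carbon_fraction * area

def monte_carlo(n=10_000):
    outputs = []
    for _ in range(n):
        biomass = random.gauss(120.0, 15.0)   # Mg/ha, assumed distribution
        frac = random.uniform(0.45, 0.52)     # carbon fraction, assumed range
        outputs.append(carbon_stock(biomass, frac, area=1000.0))
    return outputs

samples = monte_carlo()
mean_c = statistics.fmean(samples)            # central estimate, Mg C
sd_c = statistics.stdev(samples)              # quantitative uncertainty estimate
```

Comparing how `sd_c` responds when one input distribution is tightened while the others are held fixed is a simple way to identify which inputs dominate the output uncertainty, the kind of influence analysis the abstract refers to.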